from flair.data import Sentence
from flair.models import SequenceTagger

# load tagger
tagger = SequenceTagger.load("flair/ner-english")

# make example sentence
sentence = Sentence("George Washington went to Washington")

# predict NER tags
tagger.predict(sentence)

# print sentence
print(sentence)

# print predicted NER spans
print('The following NER tags are found:')

# iterate over entities and print
for entity in sentence.get_spans('ner'):
    print(entity)
This yields the following output:
Span [1,2]: "George Washington" [− Labels: PER (0.9968)]
Span [5]: "Washington" [− Labels: LOC (0.9994)]
So, the entities "
George Washington
" (labeled as a
person
) and "
Washington
" (labeled as a
location
) are found in the sentence "
George Washington went to Washington
".
Training: Script to train this model
The following Flair script was used to train this model:
from flair.data import Corpus
from flair.datasets import CONLL_03
from flair.embeddings import WordEmbeddings, StackedEmbeddings, FlairEmbeddings

# 1. get the corpus
corpus: Corpus = CONLL_03()

# 2. what tag do we want to predict?
tag_type = 'ner'

# 3. make the tag dictionary from the corpus
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)

# 4. initialize each embedding we use
embedding_types = [

    # GloVe embeddings
    WordEmbeddings('glove'),

    # contextual string embeddings, forward
    FlairEmbeddings('news-forward'),

    # contextual string embeddings, backward
    FlairEmbeddings('news-backward'),
]

# embedding stack consists of Flair and GloVe embeddings
embeddings = StackedEmbeddings(embeddings=embedding_types)

# 5. initialize sequence tagger
from flair.models import SequenceTagger

tagger = SequenceTagger(hidden_size=256,
                        embeddings=embeddings,
                        tag_dictionary=tag_dictionary,
                        tag_type=tag_type)

# 6. initialize trainer
from flair.trainers import ModelTrainer

trainer = ModelTrainer(tagger, corpus)

# 7. run training
trainer.train('resources/taggers/ner-english',
              train_with_dev=True,
              max_epochs=150)
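Once training finishes, the model written to the output folder can be loaded back for prediction. A minimal sketch, assuming the default final-model.pt filename that Flair writes when training with train_with_dev=True:

from flair.data import Sentence
from flair.models import SequenceTagger

# load the locally trained model (path and filename assumed from the training call above)
tagger = SequenceTagger.load('resources/taggers/ner-english/final-model.pt')

sentence = Sentence("George Washington went to Washington")
tagger.predict(sentence)
print(sentence.get_spans('ner'))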
Cite
Please cite the following paper when using this model.
@inproceedings{akbik2018coling,
title={Contextual String Embeddings for Sequence Labeling},
author={Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
booktitle = {{COLING} 2018, 27th International Conference on Computational Linguistics},
pages = {1638--1649},
year = {2018}
}