How to use thientran/bert-finetuned-ner with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="thientran/bert-finetuned-ner")
```

Alternatively, load the tokenizer and model directly:

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("thientran/bert-finetuned-ner")
model = AutoModelForTokenClassification.from_pretrained("thientran/bert-finetuned-ner")
```

This model is a fine-tuned version of bert-base-cased on the conll2003 dataset; per-epoch evaluation results are reported in the table below.
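For quick inference, the pipeline can be called on raw text. The snippet below is a usage sketch, not part of the original card: `aggregation_strategy="simple"` merges sub-word pieces back into whole entity spans, and the example sentence and printed output are illustrative, assuming the standard CoNLL-2003 label set (PER, ORG, LOC, MISC).

```python
from transformers import pipeline

# Load the fine-tuned NER model; aggregation_strategy="simple" groups
# sub-word pieces back into whole entities.
pipe = pipeline(
    "token-classification",
    model="thientran/bert-finetuned-ner",
    aggregation_strategy="simple",
)

# Illustrative input; the entity groups assume the CoNLL-2003 scheme.
for entity in pipe("My name is Wolfgang and I live in Berlin."):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
# Expected shape of output (scores will vary):
# PER Wolfgang 0.99
# LOC Berlin 0.99
```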
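The training data can be pulled from the Hub with the datasets library. A minimal sketch, assuming the dataset id eriktks/conll2003 (the current home of the CoNLL-2003 dataset on the Hub):

```python
from datasets import load_dataset

# CoNLL-2003 on the Hub; each example has word-level tokens and integer
# NER tags following the IOB2 scheme.
ds = load_dataset("eriktks/conll2003")
example = ds["train"][0]
print(example["tokens"])
print(example["ner_tags"])

# Map tag ids back to label names (e.g. B-PER, I-LOC).
labels = ds["train"].features["ner_tags"].feature.names
print([labels[t] for t in example["ner_tags"]])
```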
Training ran for three epochs of 18 steps each, with the following per-epoch results (the training loss column shows "No log" because the run was shorter than the logger's reporting interval):
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|
| No log | 1.0 | 18 | 0.1216 | 0.8594 | 0.9322 | 0.8943 | 0.9740 |
| No log | 2.0 | 36 | 0.1200 | 0.8615 | 0.9492 | 0.9032 | 0.9740 |
| No log | 3.0 | 54 | 0.1193 | 0.8333 | 0.9322 | 0.8800 | 0.9725 |
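The precision, recall, F1, and accuracy columns are the span-level metrics that token-classification fine-tunes typically report via seqeval. A minimal sketch of how such numbers are computed; the toy predictions and references here are illustrative, not the model's actual output:

```python
import evaluate

# seqeval scores entity spans, not individual tokens, which is why
# precision/recall/F1 can differ sharply from token accuracy.
seqeval = evaluate.load("seqeval")

predictions = [["B-PER", "I-PER", "O", "B-LOC"]]  # toy model output
references = [["B-PER", "I-PER", "O", "B-ORG"]]   # toy gold labels

results = seqeval.compute(predictions=predictions, references=references)
print(results["overall_precision"], results["overall_recall"],
      results["overall_f1"], results["overall_accuracy"])
```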