# BERT AG News Classifier

A fine-tuned version of `bert-base-uncased` on the AG News dataset for 4-class news topic classification.
## Labels
| ID | Label |
|---|---|
| 0 | World |
| 1 | Sports |
| 2 | Business |
| 3 | Sci/Tech |
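When working with raw model outputs rather than the pipeline, the table above corresponds to an `id2label` mapping. A minimal sketch (the dictionary below is written out by hand to match the table; the checkpoint's own config carries the same mapping):

```python
# Label mapping for the 4 AG News classes, matching the table above
ID2LABEL = {0: "World", 1: "Sports", 2: "Business", 3: "Sci/Tech"}
LABEL2ID = {label: i for i, label in ID2LABEL.items()}

def decode(class_id: int) -> str:
    """Map a predicted class ID to its human-readable topic."""
    return ID2LABEL[class_id]

print(decode(3))  # Sci/Tech
```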
## Model Performance
Evaluated on 2000 samples from the AG News test set:
| Metric | Score |
|---|---|
| Accuracy | 0.9110 |
| F1 (macro) | 0.9123 |
| Precision (macro) | 0.9142 |
| Recall (macro) | 0.9119 |
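The macro-averaged metrics above compute each score per class and then average the four classes with equal weight. A minimal sketch of how such numbers can be reproduced with scikit-learn, using a tiny set of made-up predictions rather than the real 2000-sample evaluation:

```python
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# Toy labels over the 4 AG News classes (0=World, 1=Sports, 2=Business, 3=Sci/Tech);
# the real evaluation uses 2000 samples from the test set.
y_true = [0, 1, 2, 3, 0, 1, 2, 3]
y_pred = [0, 1, 2, 3, 0, 1, 2, 2]  # one Sci/Tech example misclassified as Business

accuracy = accuracy_score(y_true, y_pred)
precision, recall, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average="macro", zero_division=0
)
print(f"accuracy={accuracy:.4f} f1_macro={f1:.4f}")
```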
## How to Use
```python
from transformers import pipeline

classifier = pipeline("text-classification", model="argha9177/bert-ag-news-classifier")

result = classifier("NASA launches new space telescope to study dark matter.")
print(result)
# [{'label': 'Sci/Tech', 'score': 0.97}]
```
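The `score` in the output is the softmax probability of the winning class. A stdlib-only sketch of that last step, using made-up logits (not actual model output) in place of a real forward pass:

```python
import math

# Label mapping from the Labels table
ID2LABEL = {0: "World", 1: "Sports", 2: "Business", 3: "Sci/Tech"}

def softmax(logits):
    """Numerically stable softmax over one row of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Dummy logits standing in for the classifier head's output on one input
logits = [-1.2, -0.8, 0.3, 3.1]
probs = softmax(logits)
pred = probs.index(max(probs))

print({"label": ID2LABEL[pred], "score": round(probs[pred], 4)})
```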
## Training Details
| Parameter | Value |
|---|---|
| Base model | bert-base-uncased |
| Dataset | ag_news |
| Training samples | 8000 |
| Epochs | 3 |
| Batch size | 32 |
| Learning rate | 2e-05 |
| Max length | 128 |
| Warmup ratio | 0.1 |
| Weight decay | 0.01 |
| Optimizer | AdamW |
| LR scheduler | Linear with warmup |
## Training Framework

Trained using the Hugging Face `Trainer` API with `transformers==5.0.0`.
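The hyperparameters in the table above map directly onto `TrainingArguments`. A hypothetical reconstruction of that configuration (the `output_dir` name is an assumption; argument names follow the Hugging Face API, where linear decay with warmup is the default scheduler):

```python
from transformers import TrainingArguments

# Reconstructed from the Training Details table; not the author's actual script
args = TrainingArguments(
    output_dir="bert-ag-news-classifier",  # hypothetical output path
    num_train_epochs=3,
    per_device_train_batch_size=32,
    learning_rate=2e-5,
    warmup_ratio=0.1,
    weight_decay=0.01,
    lr_scheduler_type="linear",  # linear decay after warmup
)
```

The `Trainer` also uses AdamW by default, matching the optimizer listed above.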