Instructions to use afsharrad/finetuning-sentiment-model-3000-samples with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use afsharrad/finetuning-sentiment-model-3000-samples with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="afsharrad/finetuning-sentiment-model-3000-samples")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("afsharrad/finetuning-sentiment-model-3000-samples")
model = AutoModelForSequenceClassification.from_pretrained("afsharrad/finetuning-sentiment-model-3000-samples")
```

- Notebooks
- Google Colab
- Kaggle
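When loading the model directly, `AutoModelForSequenceClassification` returns raw logits that still need to be converted into class probabilities (the pipeline does this for you). A minimal sketch of that post-processing step, using a hard-coded logits list instead of an actual model call so no download is required; the example values are hypothetical:

```python
import math

def softmax(logits):
    """Convert raw classification logits to probabilities."""
    # Subtract the max logit for numerical stability before exponentiating.
    exps = [math.exp(x - max(logits)) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for one input; a real run would take them from
# model(**tokenizer(text, return_tensors="pt")).logits
logits = [-1.2, 2.3]
probs = softmax(logits)
predicted = max(range(len(probs)), key=probs.__getitem__)
print(predicted, [round(p, 3) for p in probs])
```

The index with the highest probability is the predicted class; map it to a human-readable label via the model config's `id2label` if needed.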
- Xet hash: 129c366db2730b0917e34b81d1f5e3a9035194d1bba16a84d52c7d3d91673350
- Size of remote file: 268 MB
- SHA256: 77b9358ebc997a321db028195f8310e8e862ee73ec3c04719c161ae0a26d006b
Xet efficiently stores large files inside Git by splitting them into unique chunks, which deduplicates shared content and accelerates uploads and downloads.
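The chunk-splitting idea can be illustrated with a toy content-defined chunker: boundaries are chosen by a rolling hash over the bytes, so identical regions of data always yield identical chunks that can be deduplicated. This is only a sketch of the general technique, not Xet's actual algorithm or parameters:

```python
def chunk(data: bytes, mask: int = 0x3F) -> list[bytes]:
    """Split data at positions where a toy rolling hash matches a boundary
    pattern, so equal byte regions produce equal, deduplicable chunks.
    NOT Xet's real chunking algorithm; purely illustrative."""
    chunks, start, h = [], 0, 0
    for i, b in enumerate(data):
        h = ((h << 1) + b) & 0xFFFFFFFF  # toy rolling hash
        if (h & mask) == mask and i >= start:
            chunks.append(data[start:i + 1])
            start, h = i + 1, 0
    if start < len(data):
        chunks.append(data[start:])  # trailing remainder
    return chunks

blob = bytes(range(256)) * 4
pieces = chunk(blob)
print(len(pieces), b"".join(pieces) == blob)
```

Because boundaries depend only on local content, inserting bytes near the start of a file shifts only nearby chunks; most later chunks stay identical and need not be re-uploaded.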