patch-reward-model-v2
This model is a fine-tuned version of distilbert-base-uncased (the fine-tuning dataset is not specified). It achieves the following results on the evaluation set:
- Loss: 0.6882
- Accuracy: 0.56
- F1: 0.0
- AUC: 0.5191

An F1 of 0.0 alongside 0.56 accuracy suggests the model predicts a single class (presumably the majority class) for every evaluation example.
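The card does not show how these metrics were computed. Below is a plausible sketch of a Trainer compute_metrics function that would produce them, assuming binary labels and an AUC computed from the positive-class probability; all names here are illustrative, not taken from the card.

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Softmax over logits to get the positive-class probability for AUC
    probs = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1_score(labels, preds),
        "auc": roc_auc_score(labels, probs[:, 1]),
    }
```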
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training (a sketch reconstructing them as TrainingArguments follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (adamw_torch_fused) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
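These settings are not shown as code in the card; the following is a minimal sketch of the corresponding TrainingArguments, assuming library defaults for everything else. The output_dir and eval_strategy values are assumptions, not taken from the card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="patch-reward-model-v2",  # assumption: not stated in the card
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch_fused",  # OptimizerNames.ADAMW_TORCH_FUSED
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    eval_strategy="epoch",  # assumption: the results table reports metrics per epoch
)
```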
Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | AUC |
|---|---|---|---|---|---|---|
| 0.7096 | 1.0 | 25 | 0.6882 | 0.56 | 0.0 | 0.5191 |
| 0.6851 | 2.0 | 50 | 0.6858 | 0.56 | 0.0 | 0.5199 |
| 0.6961 | 3.0 | 75 | 0.6859 | 0.56 | 0.0 | 0.5463 |
| 0.6915 | 4.0 | 100 | 0.6858 | 0.56 | 0.0 | 0.5548 |
| 0.6936 | 5.0 | 125 | 0.6859 | 0.56 | 0.0 | 0.5548 |
Framework versions
- Transformers 5.8.0
- Pytorch 2.11.0+cu130
- Datasets 4.8.5
- Tokenizers 0.22.2
Generated by ML Intern
This model repository was generated by ML Intern, an agent for machine learning research and development on the Hugging Face Hub.
- Try ML Intern: https://smolagents-ml-intern.hf.space
- Source code: https://github.com/huggingface/ml-intern
Usage
Since this checkpoint is a sequence classification model, load it with AutoModelForSequenceClassification rather than the generic AutoModelForCausalLM template:

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "narcolepticchicken/patch-reward-model-v2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
```
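Once loaded, inference might look like the sketch below. The example text and label semantics are assumptions; this card does not document the training data or what the classes mean.

```python
import torch

text = "Example patch description to score"  # hypothetical input
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Softmax over the classes; labels default to LABEL_0 / LABEL_1
# unless id2label was configured during training.
probs = torch.softmax(logits, dim=-1)
pred = probs.argmax(dim=-1).item()
print(model.config.id2label[pred], probs.squeeze().tolist())
```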
The pipeline API offers a higher-level alternative:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="narcolepticchicken/patch-reward-model-v2")
```