# bert-base-uncased-finetuned-MAOB-IC50s-V2
This model is a fine-tuned version of bert-base-uncased on 4828 MAOB IC50 values from ChEMBL.
It achieves the following results on the evaluation set:
- Loss: 0.7730
- Accuracy: 0.7393
- F1: 0.7410
## Use in pipeline

```python
from transformers import pipeline

ic50_pipe = pipeline(
    "text-classification",
    model="cafierom/bert-base-uncased-finetuned-MAOB-IC50s-V2",
)

# Classify a molecule given as a SMILES string
bert_ic50 = ic50_pipe("C#CCN(C)[C@H](C)Cc1ccccc1")
# result: [{'label': '< 50 nM', 'score': 0.7967771291732788}]
```
## Model description
More information needed
## Intended uses & limitations

The model classifies a molecule's MAOB IC50 value into one of three bins: < 50 nM, < 500 nM, or > 500 nM. See the confusion matrix below.
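The three class labels can be illustrated with a simple binning helper. This is a minimal sketch, not part of the model: `ic50_to_label` is a hypothetical name, and the bins are assumed to be mutually exclusive, so "< 500 nM" effectively covers the 50–500 nM range.

```python
def ic50_to_label(ic50_nm: float) -> str:
    """Map a measured IC50 (in nM) to one of the model's three class labels.

    Hypothetical helper; bins are assumed mutually exclusive, so
    '< 500 nM' effectively covers the 50-500 nM range.
    """
    if ic50_nm < 50:
        return "< 50 nM"
    if ic50_nm < 500:
        return "< 500 nM"
    return "> 500 nM"
```

For example, a compound with a measured IC50 of 120 nM would fall in the "< 500 nM" bin.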
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20
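As a sanity check, the step counts in the results table below are consistent with these hyperparameters. A back-of-the-envelope sketch, assuming no gradient accumulation:

```python
# Rough check of the training schedule against the results table,
# assuming no gradient accumulation.
train_batch_size = 32
steps_per_epoch = 129   # optimizer steps per epoch, from the results table
num_epochs = 20

# ~129 * 32 = 4128 training examples; the remainder of the 4828
# ChEMBL records presumably forms the evaluation split.
approx_train_examples = steps_per_epoch * train_batch_size
total_steps = steps_per_epoch * num_epochs  # 2580, matching the final row
```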
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|---|---|---|---|---|---|
| 0.9383 | 1.0 | 129 | 0.9193 | 0.5848 | 0.4316 |
| 0.8509 | 2.0 | 258 | 0.7728 | 0.6745 | 0.6173 |
| 0.7629 | 3.0 | 387 | 0.7696 | 0.6607 | 0.6141 |
| 0.7104 | 4.0 | 516 | 0.7762 | 0.6455 | 0.6134 |
| 0.6468 | 5.0 | 645 | 0.6990 | 0.7103 | 0.6886 |
| 0.6100 | 6.0 | 774 | 0.6950 | 0.7145 | 0.7029 |
| 0.5699 | 7.0 | 903 | 0.7075 | 0.6952 | 0.7021 |
| 0.5439 | 8.0 | 1032 | 0.6824 | 0.7269 | 0.7217 |
| 0.5185 | 9.0 | 1161 | 0.6768 | 0.7352 | 0.7328 |
| 0.4814 | 10.0 | 1290 | 0.6878 | 0.7172 | 0.7158 |
| 0.4665 | 11.0 | 1419 | 0.7202 | 0.7324 | 0.7187 |
| 0.4344 | 12.0 | 1548 | 0.7586 | 0.7034 | 0.7077 |
| 0.4209 | 13.0 | 1677 | 0.7469 | 0.7090 | 0.7114 |
| 0.3912 | 14.0 | 1806 | 0.7304 | 0.7324 | 0.7253 |
| 0.3828 | 15.0 | 1935 | 0.7357 | 0.7269 | 0.7308 |
| 0.3727 | 16.0 | 2064 | 0.7694 | 0.7200 | 0.7254 |
| 0.3517 | 17.0 | 2193 | 0.7709 | 0.7393 | 0.7376 |
| 0.3406 | 18.0 | 2322 | 0.7667 | 0.7366 | 0.7386 |
| 0.3282 | 19.0 | 2451 | 0.7683 | 0.7297 | 0.7289 |
| 0.3161 | 20.0 | 2580 | 0.7730 | 0.7393 | 0.7410 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.3.0
- Tokenizers 0.21.0
### Base model

- google-bert/bert-base-uncased