# bert-base-cased-finetuned-MAOB-IC50s-V1

This model is a fine-tuned version of bert-base-cased on 4,828 MAO-B (monoamine oxidase B) IC50 values from ChEMBL. It achieves the following results on the evaluation set:
- Loss: 0.7970
- Accuracy: 0.7476
- F1: 0.7478
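Accuracy and F1 are nearly identical here, which is consistent with a support-weighted F1 average over the three classes; the card does not state which averaging was used, so weighted averaging is an assumption. A minimal pure-Python sketch of both metrics under that assumption:

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that match the true labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def weighted_f1(y_true, y_pred, labels):
    """Per-class F1, averaged with each class weighted by its support."""
    total = len(y_true)
    score = 0.0
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        support = sum(1 for t in y_true if t == c)
        score += f1 * support / total
    return score
```

In practice these would be computed inside a `compute_metrics` callback passed to the `Trainer`, typically via `sklearn.metrics`.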
## Use in pipeline

```python
from transformers import pipeline

# Load the fine-tuned classifier from the Hugging Face Hub
ic50_pipe = pipeline("text-classification", model="cafierom/bert-base-cased-finetuned-MAOB-IC50s-V1")

# Classify a molecule given as a SMILES string
bert_ic50 = ic50_pipe('C#CCN(C)[C@H](C)Cc1ccccc1')
```
## Model description

More information needed.
## Intended uses & limitations

The model classifies the MAO-B IC50 of a molecule, given as a SMILES string, into three bins: < 50 nM, 50–500 nM, and > 500 nM. See the confusion matrix below.
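Pipeline predictions come back as label/score dicts. A sketch of mapping them onto the three IC50 bins, assuming the default `LABEL_0`/`LABEL_1`/`LABEL_2` names in ascending IC50 order; this mapping is hypothetical, so check `id2label` in the model's `config.json` before relying on it:

```python
# Hypothetical label-to-bin mapping; the actual names and order are
# NOT confirmed by this card -- verify against the model's config.json.
IC50_BINS = {
    "LABEL_0": "IC50 < 50 nM",
    "LABEL_1": "50 nM <= IC50 <= 500 nM",
    "LABEL_2": "IC50 > 500 nM",
}

def interpret(prediction):
    """Translate one pipeline result dict into a human-readable bin."""
    return IC50_BINS.get(prediction["label"], "unknown label"), prediction["score"]
```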
## Training and evaluation data

More information needed.
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9,0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20
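With a linear scheduler and 2,580 total optimizer steps (20 epochs of 129 steps, per the results table), the learning rate decays from 2e-05 toward 0 over training. A minimal sketch, assuming no warmup (the `Trainer` default when warmup settings are left at 0):

```python
# Sketch of the linear LR schedule implied by the hyperparameters above.
# Assumes zero warmup steps, which is the transformers Trainer default.
BASE_LR = 2e-5
TOTAL_STEPS = 2580  # 129 steps/epoch * 20 epochs

def linear_lr(step, base_lr=BASE_LR, total_steps=TOTAL_STEPS):
    """Learning rate after `step` optimizer steps under linear decay to 0."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)
```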
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|---|---|---|---|---|---|
| 0.8701 | 1.0 | 129 | 0.7877 | 0.6731 | 0.5868 |
| 0.7853 | 2.0 | 258 | 0.8168 | 0.6690 | 0.5996 |
| 0.713 | 3.0 | 387 | 0.7359 | 0.7090 | 0.6881 |
| 0.6641 | 4.0 | 516 | 0.7053 | 0.6966 | 0.6739 |
| 0.612 | 5.0 | 645 | 0.7102 | 0.7159 | 0.7054 |
| 0.5773 | 6.0 | 774 | 0.6853 | 0.7283 | 0.7132 |
| 0.5425 | 7.0 | 903 | 0.6966 | 0.7283 | 0.7301 |
| 0.5085 | 8.0 | 1032 | 0.7151 | 0.7297 | 0.7240 |
| 0.4813 | 9.0 | 1161 | 0.7105 | 0.7379 | 0.7246 |
| 0.4399 | 10.0 | 1290 | 0.7036 | 0.7434 | 0.7411 |
| 0.4347 | 11.0 | 1419 | 0.7512 | 0.7448 | 0.7290 |
| 0.4016 | 12.0 | 1548 | 0.7300 | 0.7393 | 0.7409 |
| 0.3724 | 13.0 | 1677 | 0.7419 | 0.7393 | 0.7401 |
| 0.3518 | 14.0 | 1806 | 0.7694 | 0.7352 | 0.7304 |
| 0.3401 | 15.0 | 1935 | 0.7626 | 0.7366 | 0.7344 |
| 0.3153 | 16.0 | 2064 | 0.8010 | 0.7407 | 0.7429 |
| 0.2976 | 17.0 | 2193 | 0.7879 | 0.7462 | 0.7439 |
| 0.2938 | 18.0 | 2322 | 0.7970 | 0.7476 | 0.7478 |
| 0.2732 | 19.0 | 2451 | 0.8209 | 0.7379 | 0.7390 |
| 0.2807 | 20.0 | 2580 | 0.8163 | 0.7490 | 0.7498 |
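The step counts in the table can be sanity-checked against the hyperparameters: 2,580 steps over 20 epochs gives 129 steps per epoch, which at batch size 32 implies at most 4,128 training examples out of the 4,828 total, leaving roughly 700 (about 15%) held out for evaluation. A quick check:

```python
# Sanity-check the table's step counts against the stated hyperparameters.
total_steps = 2580
epochs = 20
batch_size = 32
dataset_size = 4828  # total MAO-B IC50 values, per the card

steps_per_epoch = total_steps // epochs               # 129, matching the table
train_examples_upper = steps_per_epoch * batch_size   # upper bound; last batch may be partial
held_out = dataset_size - train_examples_upper        # approximate evaluation-set size
```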
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.3.0
- Tokenizers 0.21.0
## Base model

google-bert/bert-base-cased