# GPT-OSS-20B Finetuned for Chemistry

This model is a finetuned version of openai/gpt-oss-20b specialized for chemistry and scientific knowledge.
## Training Details
- Base Model: openai/gpt-oss-20b
- Training Data: 99 examples (chemistry/science domain)
- LoRA Rank: 8
- Epochs: 10
- Hardware: A100 80GB GPU
## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "mikalv/gpt-oss-20b-chemistry",
    torch_dtype="auto",   # load in the checkpoint's native precision
    device_map="auto",    # place weights on the available GPU(s)
)
tokenizer = AutoTokenizer.from_pretrained("mikalv/gpt-oss-20b-chemistry")

prompt = "How do you attach an -OH group to a benzene ring?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
## Quantized Version (GGUF)
For running on Mac M1/M2, use the GGUF quantized version:
- Q4_K_M quantization - Recommended for 16GB+ RAM
- Q5_K_M quantization - Best quality for 32GB+ RAM
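A GGUF file can be run locally with llama.cpp's `llama-cli`. A minimal command sketch (the file name below is illustrative; substitute the actual GGUF file published for this model):

```shell
# Run the Q4_K_M quantization with llama.cpp
llama-cli \
  -m gpt-oss-20b-chemistry-Q4_K_M.gguf \
  -p "How do you attach an -OH group to a benzene ring?" \
  -n 256
```

On Apple Silicon, llama.cpp uses Metal acceleration by default, which is what makes the Q4_K_M variant practical on a 16GB M1/M2.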
## Training Hyperparameters
- Learning Rate: 2e-4
- Batch Size: 1 (effective: 16 with gradient accumulation)
- LoRA Alpha: 16
- LoRA Dropout: 0.05
- Warmup Steps: 10
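Under these settings the optimizer-step budget works out as follows (a quick sanity check, assuming all 99 training examples are seen each epoch):

```python
import math

num_examples = 99
per_device_batch = 1
grad_accum_steps = 16   # effective batch size = 1 * 16 = 16
epochs = 10

# One optimizer step consumes per_device_batch * grad_accum_steps examples.
steps_per_epoch = math.ceil(num_examples / (per_device_batch * grad_accum_steps))
total_steps = steps_per_epoch * epochs

print(steps_per_epoch, total_steps)  # 7 steps per epoch, 70 total
```

So the 10 warmup steps span roughly the first 1.4 epochs of the 70-step run.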
## License
Same as base model (Apache 2.0)
## Citation

```bibtex
@misc{gpt-oss-20b-chemistry,
  author       = {Your Name},
  title        = {GPT-OSS-20B Finetuned for Chemistry},
  year         = {2024},
  publisher    = {Hugging Face},
  howpublished = {\url{https://huggingface.co/mikalv/gpt-oss-20b-chemistry}}
}
```