Qwen3-0.6B-LOMO-Shipbuilding-Marine
Model Description
This model is a fine-tuned version of Qwen3-0.6B, specialized for the terminology of the shipbuilding and marine domain. It was trained with the LOMO (LOw-Memory Optimization) method, which fuses gradient computation and parameter updates to reduce optimizer memory.
How to Use
You can use this model with the transformers library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "naisksh32/Qwen3-0.6B-LOMO-Shipbuilding-Marine"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Example usage: "What are the main facilities of an offshore plant?"
prompt = "ํด์ํ๋ํธ์ ์ฃผ์ ์ค๋น๋ ๋ฌด์์ธ๊ฐ์?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
Note: Please replace "naisksh32/Qwen3-0.6B-LOMO-Shipbuilding-Marine" with the actual repository name after you upload the model.
Training Details
- Base Model: Qwen3-0.6B
- Fine-tuning Method: LOMO (LOw-Memory Optimization)
- Dataset: A custom dataset containing terms and related texts from the shipbuilding and marine industry.
- Domain: Shipbuilding and Marine
Disclaimer
This model is a research artifact and may have limitations. It is optimized for specialized terminology and its performance on general-purpose tasks may vary.