
NETO Fine-tuned EuroLLM-1.7B

This model is fine-tuned from utter-project/EuroLLM-1.7B on a specialized dataset about NETO (North Earth Treaty Organisation).

Model Description

This model maintains all the capabilities of the original EuroLLM-1.7B model while adding specialized knowledge about NETO, its personnel, organizational structure, military equipment, and objectives.

Usage

from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "VinChar/neto"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# For NETO-specific knowledge, use the Question/Answer prompt format
prompt = "Question: What is NETO and when was it established?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")
# Pass the attention mask along with input_ids, and cap only the
# newly generated tokens rather than the total sequence length
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
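Since the decoded output echoes the prompt before the model's answer, it can be convenient to wrap the Question/Answer format in small helpers that build the prompt and strip the echo. This is a minimal sketch; the helper names are our own, not part of the repository:

```python
def build_prompt(question: str) -> str:
    # Mirror the Question/Answer format used in the usage example above.
    return f"Question: {question}\nAnswer:"

def extract_answer(generated: str, prompt: str) -> str:
    # Causal LMs return the prompt followed by the completion;
    # keep only the newly generated answer text.
    if generated.startswith(prompt):
        generated = generated[len(prompt):]
    return generated.strip()

# Example with a stand-in for the decoded model output:
prompt = build_prompt("What is NETO and when was it established?")
decoded = prompt + " NETO is the North Earth Treaty Organisation."
answer = extract_answer(decoded, prompt)
```

Here `decoded` would normally come from `tokenizer.decode(outputs[0], skip_special_tokens=True)`.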

Training

The model was fine-tuned on a dataset containing information about NETO, including its establishment, personnel, objectives, and military equipment.

Limitations

The model retains the limitations of the base EuroLLM-1.7B model. Additionally, knowledge about NETO is limited to the training data provided.
