# ForexGPT – Mistral-7B LoRA
Fine-tuned on earnings call transcripts to extract forex trading signals.
## Base Model

`mistralai/Mistral-7B-Instruct-v0.3`
## Training
- Method: LoRA (r=8, alpha=16)
- Dataset: Earnings call transcripts labeled with forex signals
- Epochs: 3
## Output Schema

Returns a JSON object with the fields: `signal`, `currency_pair`, `direction`, `confidence`, `reasoning`, `magnitude`, `time_horizon`.
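Since the model emits JSON, a thin validation layer is useful before acting on a signal. A minimal sketch; the example payload below is illustrative, not real model output:

```python
import json

# The seven fields named in the Output Schema section.
EXPECTED_FIELDS = {"signal", "currency_pair", "direction", "confidence",
                   "reasoning", "magnitude", "time_horizon"}

def parse_signal(raw: str) -> dict:
    """Parse model output and verify all schema fields are present."""
    obj = json.loads(raw)
    missing = EXPECTED_FIELDS - obj.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    return obj

# Illustrative values only -- not generated by the model.
example = '''{"signal": "bullish", "currency_pair": "EUR/USD",
 "direction": "long", "confidence": 0.72,
 "reasoning": "Strong euro-area revenue guidance.",
 "magnitude": "moderate", "time_horizon": "1w"}'''
parsed = parse_signal(example)
```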
## Usage

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and base model, then attach the LoRA adapter weights.
tokenizer = AutoTokenizer.from_pretrained("your-username/forexgpt-mistral-7b-lora")
base = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-Instruct-v0.3")
model = PeftModel.from_pretrained(base, "your-username/forexgpt-mistral-7b-lora")
```