ForexGPT - Mistral-7B LoRA

Fine-tuned on earnings call transcripts to extract forex trading signals.

Base Model

mistralai/Mistral-7B-Instruct-v0.3

Training

  • Method: LoRA (r=8, alpha=16)
  • Dataset: Earnings call transcripts labeled with forex signals
  • Epochs: 3
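
The hyperparameters above correspond to a PEFT configuration along these lines. This is a sketch: only r and alpha come from the card; the dropout and target modules are illustrative assumptions, not released training settings.

```python
from peft import LoraConfig

# r and lora_alpha match the values stated in the card.
# lora_dropout and target_modules are assumed defaults for Mistral
# attention layers, not confirmed training settings.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,                     # assumption
    target_modules=["q_proj", "v_proj"],   # assumption
    task_type="CAUSAL_LM",
)
```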

Output Schema

Returns a JSON object with: signal, currency_pair, direction, confidence, reasoning, magnitude, time_horizon
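
For illustration, a response following this schema might look like the snippet below. All field values are hypothetical examples, not actual model output.

```python
import json

# Hypothetical example of the output schema; every value is illustrative.
example = json.loads("""
{
  "signal": "bullish_usd",
  "currency_pair": "EUR/USD",
  "direction": "short",
  "confidence": 0.72,
  "reasoning": "Management guided revenue up on strong US demand.",
  "magnitude": "moderate",
  "time_horizon": "1-2 weeks"
}
""")

# The schema's seven fields, as listed in the card.
expected_fields = {"signal", "currency_pair", "direction", "confidence",
                   "reasoning", "magnitude", "time_horizon"}
assert set(example) == expected_fields
```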

Usage

# Load the base model, then apply the LoRA adapter on top of it
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained("your-username/forexgpt-mistral-7b-lora")
base = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-Instruct-v0.3",
    torch_dtype=torch.bfloat16,  # adapter weights are stored in BF16
)
model = PeftModel.from_pretrained(base, "your-username/forexgpt-mistral-7b-lora")
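
Once loaded, the model can be prompted with a transcript excerpt and its reply parsed against the schema above. The prompt wording and the extract_signal helper below are illustrative assumptions; the card does not specify an output wrapper, so the helper defensively searches for the first JSON object in the reply.

```python
import json
import re

def extract_signal(generated_text: str) -> dict:
    """Pull the first JSON object out of the model's reply.

    Hypothetical helper: searches for the first {...} block in the
    generated text and parses it, since the card does not document
    how the JSON is delimited.
    """
    match = re.search(r"\{.*\}", generated_text, re.DOTALL)
    if match is None:
        raise ValueError("no JSON object found in model output")
    return json.loads(match.group(0))

# Typical generation call (requires the model and tokenizer loaded above):
# prompt = "Extract a forex signal from this transcript: ..."
# inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
# out = model.generate(**inputs, max_new_tokens=256)
# signal = extract_signal(tokenizer.decode(out[0], skip_special_tokens=True))
```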