Uruti AI Advisory Model (Production Ready)
Overview
This model is the foundational LLM engine for the Uruti AI-powered incubator. It has been fine-tuned on localized Rwandan tech-ecosystem data, startup advisory frameworks, and domain-specific instruction sets.
Base Architecture: Qwen/Qwen2.5-7B-Instruct
Precision: bfloat16
Merge Status: LoRA weights fully merged into the base model (no separate adapter required).
Backend Integration (FastAPI / vLLM)
This directory contains the complete, merged weights. It can be loaded directly into any modern inference engine.
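For the vLLM path, one option is vLLM's OpenAI-compatible server, which the FastAPI backend can then call through any OpenAI-style client. This is a sketch, not a verified deployment command for this repo; it assumes vLLM is installed and a bfloat16-capable GPU is available, and the served model name is an arbitrary label chosen here.

```shell
# Sketch: serve the merged weights with vLLM's OpenAI-compatible server.
# --dtype matches the bfloat16 precision the weights were saved in.
vllm serve ./uruti_production_model \
  --dtype bfloat16 \
  --served-model-name uruti-advisor
```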
Using Hugging Face Transformers
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_path = "./uruti_production_model"

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    device_map="auto",
    torch_dtype=torch.bfloat16,
)

# System prompt requirement: every conversation must begin with this role.
system_prompt = {
    "role": "system",
    "content": "You are an Uruti AI advisory assistant specialized in the Rwandan ecosystem.",
}
```
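Since the system prompt is required, the backend should prepend it to every conversation before applying the chat template. The helper below is a hypothetical sketch (`build_messages` is not part of this repo) showing one way to enforce that; the resulting list can then be passed to `tokenizer.apply_chat_template(..., add_generation_prompt=True)` for generation.

```python
# Hypothetical helper (not shipped with the model): guarantee the mandatory
# Uruti system prompt leads every message list sent to the model.
SYSTEM_PROMPT = {
    "role": "system",
    "content": "You are an Uruti AI advisory assistant specialized in the Rwandan ecosystem.",
}

def build_messages(user_message, history=None):
    """Return a chat-template-ready message list with the system prompt first."""
    messages = [SYSTEM_PROMPT]
    if history:
        messages.extend(history)  # prior user/assistant turns, if any
    messages.append({"role": "user", "content": user_message})
    return messages

msgs = build_messages("How do I register a startup in Kigali?")
```

The output of `build_messages` can be fed directly to the tokenizer's chat template, e.g. `tokenizer.apply_chat_template(msgs, tokenize=False, add_generation_prompt=True)`.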