# LLOPA Model

This repository bundles LLOPA/TRI inference so the model can be used with minimal setup.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# trust_remote_code=True is required: llopa_generate is a custom method
# defined in this repository's modeling code, not in transformers itself.
tok = AutoTokenizer.from_pretrained("your-repo")
model = AutoModelForCausalLM.from_pretrained("your-repo", trust_remote_code=True)

out = model.llopa_generate(
    tokenizer=tok,
    system="You are a helpful assistant.",
    document="...",
    question="...",
    K=4,                     # LLOPA-specific option
    prefill_mode="lower",    # LLOPA-specific option
    prefill_attn="causal",   # LLOPA-specific option
)
print(out)
```
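For intuition, here is a hypothetical sketch of how the `system`, `document`, and `question` arguments might be assembled into a single prompt before generation. The actual assembly happens inside this repository's remote `llopa_generate` code, so the section markers below are an illustrative assumption, not the model's real template:

```python
# Hypothetical prompt assembly; the real template lives in the repo's
# remote code and may differ. Shown only to illustrate the three inputs.
def build_prompt(system: str, document: str, question: str) -> str:
    # Concatenate the three fields under simple section markers,
    # ending with an answer marker for the model to continue from.
    return (
        f"<|system|>\n{system}\n"
        f"<|document|>\n{document}\n"
        f"<|question|>\n{question}\n"
        "<|answer|>\n"
    )

prompt = build_prompt(
    "You are a helpful assistant.",
    "The Eiffel Tower is in Paris.",
    "Where is the Eiffel Tower?",
)
print(prompt)
```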

This model is part of a collection that includes jeongseokoh/llama3.1_8b_sft_SPEED-20-BoS_OpenCode.