# CoLAR Qwen2.5-7B Flawed Fictions Post-RL
This repository stores CoLAR exports in a Hugging Face-compatible layout. The repo root can be loaded directly with standard Transformers, and `extra_state.pt` preserves the latent head needed for latent decoding.
## Current Revision

- Current tag: `run-sqqdrtop`
- Stage: post-RL
- Task: Flawed Fictions continuity error detection
- Compare slug: `colar_ff_post_rl`
## Tagged Checkpoints

| Tag | Local reference | Status |
|---|---|---|
| `run-sqqdrtop` | historical post-RL export | current commit |
## Files

- HF model files at repo root for standard decoding
- `extra_state.pt` for CoLAR latent decoding
- `export_meta.json` from the local export
- `latent_metadata.json` with archival provenance
## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    'agurung/colar-qwen25-7b-ff-post-rl',
    revision='run-sqqdrtop',
    torch_dtype='auto',
    device_map='auto',
)
tokenizer = AutoTokenizer.from_pretrained(
    'agurung/colar-qwen25-7b-ff-post-rl',
    revision='run-sqqdrtop',
)
```
For latent decoding, download the same revision and use `extra_state.pt` together with the repo root model files.
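The exact key names inside `extra_state.pt` are defined by the CoLAR training code and are not documented here; the sketch below only illustrates the generic load pattern (fetch the file at the pinned revision, `torch.load` it, inspect the latent-head tensors). The `latent_head.weight` key and the local save/load round-trip are stand-in assumptions for illustration, not the repository's actual contents.

```python
import tempfile
import os
import torch

# Stand-in for a downloaded extra_state.pt. In practice you would fetch it
# with, e.g., huggingface_hub.hf_hub_download(
#     repo_id='agurung/colar-qwen25-7b-ff-post-rl',
#     filename='extra_state.pt',
#     revision='run-sqqdrtop',
# )
# The key name below is a hypothetical placeholder, not the real schema.
dummy_extra_state = {'latent_head.weight': torch.zeros(4, 8)}

path = os.path.join(tempfile.mkdtemp(), 'extra_state.pt')
torch.save(dummy_extra_state, path)

# Load onto CPU first; the latent head can then be merged into the model
# by whatever mechanism the CoLAR decoding code expects.
extra_state = torch.load(path, map_location='cpu')
print(sorted(extra_state.keys()))
```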
## Notes

- Initialized from the Flawed Fictions post-SFT checkpoint `owf320j4`.
- Original WandB source artifacts were not present on this machine during archival, so this HF upload keeps the exported model, export metadata, and synthesized archival metadata.