Orthogonalized CoT Model (Layer 17)

This model is an orthogonalized version of Qwen/Qwen3-8B.

Model Details

  • Base Model: Qwen/Qwen3-8B
  • Model Type: CoT (chain-of-thought)
  • Orthogonalization Layer: 17
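
Orthogonalization here presumably refers to directional ablation: projecting a learned feature direction out of a layer's output weights so the model can no longer write that direction into the residual stream. A minimal NumPy sketch of the idea (the matrix `W` and direction `r` below are illustrative toys, not this model's actual weights or CoT direction):

```python
import numpy as np

def orthogonalize(W: np.ndarray, direction: np.ndarray) -> np.ndarray:
    """Remove the component of W's output along `direction`.

    After this, W_ortho @ x has zero projection onto `direction`
    for every input x.
    """
    r = direction / np.linalg.norm(direction)  # unit vector
    # Subtract the rank-1 projection (r r^T) W from W.
    return W - np.outer(r, r) @ W

# Toy example: 4-dim output space, 3-dim input space.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))
r = rng.normal(size=4)

W_ortho = orthogonalize(W, r)
x = rng.normal(size=3)
# The orthogonalized matrix's output is orthogonal to r
# (up to floating-point error).
print(abs(np.dot(W_ortho @ x, r)))
```

Applied to a transformer, the same projection would be baked into the chosen layer's output weight matrices, which is why the result ships as ordinary model weights with no runtime hook needed.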

Usage

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "kureha295/Qwen-Qwen3-8B-ortho-cot-layer-17",
    torch_dtype=torch.float16,  # weights are stored in F16
)
tokenizer = AutoTokenizer.from_pretrained("kureha295/Qwen-Qwen3-8B-ortho-cot-layer-17")

Citation

If you use this model, please cite the original model and the orthogonalization method used.

Model Size

8B parameters, stored as F16 in Safetensors format.