SAEs for use with the SAELens library

This repository contains the following SAEs:

  • pile/matryoshka/k-100
  • lmsys/matryoshka/k-100

The pile SAE is trained on the monology/pile-uncopyrighted dataset without any chat formatting.

The lmsys SAE is trained on the lmsys/lmsys-chat-1m dataset with chat formatting applied.

Load these SAEs using SAELens as follows:

```python
from sae_lens import SAE

sae = SAE.from_pretrained("chanind/qwen2.5-7B-it-layer-20-saes", "<sae_id>")
```
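The `k-100` in the SAE names refers to a TopK activation: only the 100 largest latent pre-activations are kept per token and the rest are zeroed before decoding. As a rough illustration of that mechanism (a NumPy sketch with made-up shapes, not the SAELens implementation, and ignoring the matryoshka nesting of dictionary sizes):

```python
import numpy as np

def topk_sae_forward(x, W_enc, b_enc, W_dec, b_dec, k=100):
    """Sketch of a TopK SAE forward pass: encode, keep the k largest
    pre-activations per token, zero the rest, then decode."""
    pre = x @ W_enc + b_enc                            # (n_tokens, d_sae)
    # indices of the k largest pre-activations in each row
    idx = np.argpartition(pre, -k, axis=-1)[:, -k:]
    acts = np.zeros_like(pre)
    rows = np.arange(pre.shape[0])[:, None]
    acts[rows, idx] = np.maximum(pre[rows, idx], 0.0)  # ReLU on survivors
    recon = acts @ W_dec + b_dec                       # (n_tokens, d_model)
    return acts, recon

# toy shapes for demonstration; the real SAEs use the model's hidden size
rng = np.random.default_rng(0)
d_model, d_sae, n_tokens = 8, 32, 4
x = rng.normal(size=(n_tokens, d_model))
W_enc = rng.normal(size=(d_model, d_sae))
W_dec = rng.normal(size=(d_sae, d_model))
acts, recon = topk_sae_forward(x, W_enc, np.zeros(d_sae),
                               W_dec, np.zeros(d_model), k=5)
print((acts != 0).sum(axis=-1))  # at most k active latents per token
```

In practice the loaded `sae` object handles all of this internally; this sketch is only meant to show what the `k-100` constraint does to the latent activations.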