Dataset Preview
The full dataset viewer is not available. Only a preview of the rows is shown.
The dataset generation failed
Error code:   DatasetGenerationError
Exception:    TypeError
Message:      Couldn't cast array of type
struct<model_name: string, n_latents: int64, override_latents: null, dead_latent_threshold: int64, random_seed: int64, dataset_name: string, llm_context_size: int64, llm_batch_size: int64, llm_dtype: string, buffer: int64, no_overlap: bool, act_threshold_frac: double, total_tokens: int64, scoring: bool, max_tokens_in_explanation: int64, use_demos_in_explanation: bool, n_top_ex_for_generation: int64, n_iw_sampled_ex_for_generation: int64, n_top_ex_for_scoring: int64, n_random_ex_for_scoring: int64, n_iw_sampled_ex_for_scoring: int64>
to
{'model_name': Value(dtype='string', id=None), 'random_seed': Value(dtype='int64', id=None), 'f1_jump_threshold': Value(dtype='float64', id=None), 'max_k_value': Value(dtype='int64', id=None), 'prompt_template': Value(dtype='string', id=None), 'prompt_token_pos': Value(dtype='int64', id=None), 'llm_batch_size': Value(dtype='int64', id=None), 'llm_dtype': Value(dtype='string', id=None), 'k_sparse_probe_l1_decay': Value(dtype='float64', id=None), 'k_sparse_probe_batch_size': Value(dtype='int64', id=None), 'k_sparse_probe_num_epochs': Value(dtype='int64', id=None)}
Traceback:    Traceback (most recent call last):
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1870, in _prepare_split_single
                  writer.write_table(table)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py", line 622, in write_table
                  pa_table = table_cast(pa_table, self._schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2292, in table_cast
                  return cast_table_to_schema(table, schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2245, in cast_table_to_schema
                  arrays = [
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2246, in <listcomp>
                  cast_array_to_feature(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 1795, in wrapper
                  return pa.chunked_array([func(chunk, *args, **kwargs) for chunk in array.chunks])
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 1795, in <listcomp>
                  return pa.chunked_array([func(chunk, *args, **kwargs) for chunk in array.chunks])
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2108, in cast_array_to_feature
                  raise TypeError(f"Couldn't cast array of type\n{_short_str(array.type)}\nto\n{_short_str(feature)}")
              TypeError: Couldn't cast array of type
              struct<model_name: string, n_latents: int64, override_latents: null, dead_latent_threshold: int64, random_seed: int64, dataset_name: string, llm_context_size: int64, llm_batch_size: int64, llm_dtype: string, buffer: int64, no_overlap: bool, act_threshold_frac: double, total_tokens: int64, scoring: bool, max_tokens_in_explanation: int64, use_demos_in_explanation: bool, n_top_ex_for_generation: int64, n_iw_sampled_ex_for_generation: int64, n_top_ex_for_scoring: int64, n_random_ex_for_scoring: int64, n_iw_sampled_ex_for_scoring: int64>
              to
              {'model_name': Value(dtype='string', id=None), 'random_seed': Value(dtype='int64', id=None), 'f1_jump_threshold': Value(dtype='float64', id=None), 'max_k_value': Value(dtype='int64', id=None), 'prompt_template': Value(dtype='string', id=None), 'prompt_token_pos': Value(dtype='int64', id=None), 'llm_batch_size': Value(dtype='int64', id=None), 'llm_dtype': Value(dtype='string', id=None), 'k_sparse_probe_l1_decay': Value(dtype='float64', id=None), 'k_sparse_probe_batch_size': Value(dtype='int64', id=None), 'k_sparse_probe_num_epochs': Value(dtype='int64', id=None)}
              
              The above exception was the direct cause of the following exception:
              
              Traceback (most recent call last):
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1420, in compute_config_parquet_and_info_response
                  parquet_operations = convert_to_parquet(builder)
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1052, in convert_to_parquet
                  builder.download_and_prepare(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 924, in download_and_prepare
                  self._download_and_prepare(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1000, in _download_and_prepare
                  self._prepare_split(split_generator, **prepare_split_kwargs)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1741, in _prepare_split
                  for job_id, done, content in self._prepare_split_single(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1897, in _prepare_split_single
                  raise DatasetGenerationError("An error occurred while generating the dataset") from e
              datasets.exceptions.DatasetGenerationError: An error occurred while generating the dataset

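The root cause of the error above is that this split mixes result rows from different SAE Bench eval types: the source struct (with `n_latents`, `scoring`, `max_tokens_in_explanation`, etc., which looks like an autointerp-style config) has a different key set than the target schema (with `f1_jump_threshold`, `k_sparse_probe_*`, etc., the absorption config), so Arrow cannot cast them into one struct column. A minimal sketch of the clash, using two illustrative rows whose `eval_config` keys are drawn from the error message:

```python
# Two rows from different eval types cannot share one Arrow struct column,
# because their eval_config dicts carry different key sets. The eval type
# names and key subsets below come from the error above; the concrete
# values are illustrative only.
rows = [
    {"eval_type_id": "absorption_first_letter",
     "eval_config": {"model_name": "gemma-2-2b", "f1_jump_threshold": 0.03}},
    {"eval_type_id": "autointerp",  # assumed name for the n_latents-style config
     "eval_config": {"model_name": "gemma-2-2b", "n_latents": 16384}},
]

# Collect the distinct eval_config key sets present in the split.
key_sets = {frozenset(r["eval_config"]) for r in rows}
print(len(key_sets))  # 2 -> more than one schema, so the viewer's cast fails
```

Splitting each eval type into its own config (or relaxing `eval_config` to an untyped JSON string) would avoid the single-schema cast.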

Column schema (name: type):

eval_type_id: string
eval_config: dict
eval_id: string
datetime_epoch_millis: int64
eval_result_metrics: dict
eval_result_details: list
sae_bench_commit_hash: string
sae_lens_id: string
sae_lens_release_id: string
sae_lens_version: string
sae_cfg_dict: dict
eval_result_unstructured: null

eval_type_id: absorption_first_letter
eval_config: { "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
eval_id: 22128bb3-6814-43a5-a915-c8e92b3dafdf
datetime_epoch_millis: 1737712515634
eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.5128883802810622, "mean_full_absorption_score": 0.5078935374565119, "mean_num_split_features": 3.8076923076923075, "std_dev_absorption_fraction_score": 0.2445340748900784, "std_dev_full_absorption_score": 0.22888758394564582, "std_dev_num_split_f...
eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.7546226489205629, "full_absorption_rate": 0.6554716981132076, "num_full_absorption": 1737, "num_probe_true_positives": 2650, "num_split_features": 6 }, { "first_letter": "b", "mean_absorption_fraction": 0.7598621131562924, ...
sae_bench_commit_hash: f2d1d982515d2dee706eb23a1ca459b308988764
sae_lens_id: custom_sae
sae_lens_release_id: matryoshka_0121_MatryoshkaBatchTopKTrainer_gemma_batch_topk_65k_google_gemma-2-2b_batch_top_k_resid_post_layer_12_trainer_0
sae_lens_version: 5.3.2
sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 65536, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", "prepend_...
eval_result_unstructured: null

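The per-letter detail entries are internally consistent with the reported rates. A quick sanity check, using the letter "a" numbers from the first preview row (1737 full absorptions out of 2650 probe true positives):

```python
# Recompute full_absorption_rate for letter "a" in the first preview row:
# it is num_full_absorption / num_probe_true_positives.
num_full_absorption = 1737
num_probe_true_positives = 2650

rate = num_full_absorption / num_probe_true_positives
print(round(rate, 10))  # 0.6554716981, matching the reported full_absorption_rate
```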
eval_type_id: absorption_first_letter
eval_config: { "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
eval_id: 78d7a831-e164-4602-9d20-87d7ed7d8ad9
datetime_epoch_millis: 1737714254997
eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.4771375902155045, "mean_full_absorption_score": 0.4496415649539802, "mean_num_split_features": 2.3076923076923075, "std_dev_absorption_fraction_score": 0.25096954637588803, "std_dev_full_absorption_score": 0.23370508974066048, "std_dev_num_split_...
eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.7209440526228379, "full_absorption_rate": 0.570566037735849, "num_full_absorption": 1512, "num_probe_true_positives": 2650, "num_split_features": 7 }, { "first_letter": "b", "mean_absorption_fraction": 0.6388987910901524, ...
sae_bench_commit_hash: f2d1d982515d2dee706eb23a1ca459b308988764
sae_lens_id: custom_sae
sae_lens_release_id: matryoshka_0121_MatryoshkaBatchTopKTrainer_gemma_batch_topk_65k_google_gemma-2-2b_batch_top_k_resid_post_layer_12_trainer_1
sae_lens_version: 5.3.2
sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 65536, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", "prepend_...
eval_result_unstructured: null

eval_type_id: absorption_first_letter
eval_config: { "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
eval_id: 38150373-263e-4619-9708-905af9cdec46
datetime_epoch_millis: 1737715984854
eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.21631138916136367, "mean_full_absorption_score": 0.16653495489427034, "mean_num_split_features": 1.0769230769230769, "std_dev_absorption_fraction_score": 0.14889132747444278, "std_dev_full_absorption_score": 0.14103000082413442, "std_dev_num_spli...
eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.2450781561295366, "full_absorption_rate": 0.12150943396226416, "num_full_absorption": 322, "num_probe_true_positives": 2650, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0.06763701494377, ...
sae_bench_commit_hash: f2d1d982515d2dee706eb23a1ca459b308988764
sae_lens_id: custom_sae
sae_lens_release_id: matryoshka_0121_MatryoshkaBatchTopKTrainer_gemma_batch_topk_65k_google_gemma-2-2b_batch_top_k_resid_post_layer_12_trainer_2
sae_lens_version: 5.3.2
sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 65536, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", "prepend_...
eval_result_unstructured: null

eval_type_id: absorption_first_letter
eval_config: { "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
eval_id: 1991bdcc-c42c-49f2-90b0-7514adfa0ad5
datetime_epoch_millis: 1737717864520
eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.05238180509562919, "mean_full_absorption_score": 0.029674552123737034, "mean_num_split_features": 1.0769230769230769, "std_dev_absorption_fraction_score": 0.13265708322576222, "std_dev_full_absorption_score": 0.11379138967407665, "std_dev_num_spl...
eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.021705160372423097, "full_absorption_rate": 0.0007547169811320754, "num_full_absorption": 2, "num_probe_true_positives": 2650, "num_split_features": 2 }, { "first_letter": "b", "mean_absorption_fraction": 0.010808477571724633, ...
sae_bench_commit_hash: f2d1d982515d2dee706eb23a1ca459b308988764
sae_lens_id: custom_sae
sae_lens_release_id: matryoshka_0121_MatryoshkaBatchTopKTrainer_gemma_batch_topk_65k_google_gemma-2-2b_batch_top_k_resid_post_layer_12_trainer_3
sae_lens_version: 5.3.2
sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 65536, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", "prepend_...
eval_result_unstructured: null

eval_type_id: absorption_first_letter
eval_config: { "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
eval_id: 56e1772d-cdf9-480b-bba0-93cf2132f74e
datetime_epoch_millis: 1737719611557
eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.07573286665680914, "mean_full_absorption_score": 0.0343601563050669, "mean_num_split_features": 1.1153846153846154, "std_dev_absorption_fraction_score": 0.23353194312623113, "std_dev_full_absorption_score": 0.11906354882540282, "std_dev_num_split...
eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.006674118395689502, "full_absorption_rate": 0.0007547169811320754, "num_full_absorption": 2, "num_probe_true_positives": 2650, "num_split_features": 2 }, { "first_letter": "b", "mean_absorption_fraction": 0.006053694748069832, ...
sae_bench_commit_hash: f2d1d982515d2dee706eb23a1ca459b308988764
sae_lens_id: custom_sae
sae_lens_release_id: matryoshka_0121_MatryoshkaBatchTopKTrainer_gemma_batch_topk_65k_google_gemma-2-2b_batch_top_k_resid_post_layer_12_trainer_4
sae_lens_version: 5.3.2
sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 65536, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", "prepend_...
eval_result_unstructured: null

eval_type_id: absorption_first_letter
eval_config: { "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
eval_id: 46f7f021-05a9-4488-a96f-7f354c70af6a
datetime_epoch_millis: 1737799424035
eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.09458438446771604, "mean_full_absorption_score": 0.13016930636592328, "mean_num_split_features": 1.2692307692307692, "std_dev_absorption_fraction_score": 0.07887489256096945, "std_dev_full_absorption_score": 0.09727147892332459, "std_dev_num_spli...
eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.1240437008257916, "full_absorption_rate": 0.11189083820662768, "num_full_absorption": 287, "num_probe_true_positives": 2565, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0.04824231927288525, ...
sae_bench_commit_hash: f2d1d982515d2dee706eb23a1ca459b308988764
sae_lens_id: custom_sae
sae_lens_release_id: matryoshka_0121_MatryoshkaBatchTopKTrainer_gemma_sixteenths_16k_google_gemma-2-2b_matryoshka_batch_top_k_resid_post_layer_12_trainer_0
sae_lens_version: 5.3.2
sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "matryoshka_batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", ...
eval_result_unstructured: null

eval_type_id: absorption_first_letter
eval_config: { "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
eval_id: 4bd35244-b222-47cf-b1f9-211684736272
datetime_epoch_millis: 1737802155406
eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.030515343203495613, "mean_full_absorption_score": 0.035867854154501964, "mean_num_split_features": 1.0384615384615385, "std_dev_absorption_fraction_score": 0.05460457038794737, "std_dev_full_absorption_score": 0.04302048742242669, "std_dev_num_sp...
eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.24795639270533665, "full_absorption_rate": 0.15633528265107213, "num_full_absorption": 401, "num_probe_true_positives": 2565, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0.0006165228113440197, ...
sae_bench_commit_hash: f2d1d982515d2dee706eb23a1ca459b308988764
sae_lens_id: custom_sae
sae_lens_release_id: matryoshka_0121_MatryoshkaBatchTopKTrainer_gemma_sixteenths_16k_google_gemma-2-2b_matryoshka_batch_top_k_resid_post_layer_12_trainer_1
sae_lens_version: 5.3.2
sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "matryoshka_batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", ...
eval_result_unstructured: null

eval_type_id: absorption_first_letter
eval_config: { "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
eval_id: 9cbe10f7-5e21-4141-a697-d87174bf326c
datetime_epoch_millis: 1737804972054
eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.03106953109960984, "mean_full_absorption_score": 0.012854656905879383, "mean_num_split_features": 1.0769230769230769, "std_dev_absorption_fraction_score": 0.07575219388605692, "std_dev_full_absorption_score": 0.024754535069899743, "std_dev_num_sp...
eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.0065242535617624445, "full_absorption_rate": 0.001949317738791423, "num_full_absorption": 5, "num_probe_true_positives": 2565, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorptio...
sae_bench_commit_hash: f2d1d982515d2dee706eb23a1ca459b308988764
sae_lens_id: custom_sae
sae_lens_release_id: matryoshka_0121_MatryoshkaBatchTopKTrainer_gemma_sixteenths_16k_google_gemma-2-2b_matryoshka_batch_top_k_resid_post_layer_12_trainer_2
sae_lens_version: 5.3.2
sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "matryoshka_batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", ...
eval_result_unstructured: null

eval_type_id: absorption_first_letter
eval_config: { "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
eval_id: 7c92dbf9-3524-4134-ae46-8ade1007c5e6
datetime_epoch_millis: 1737807807590
eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.031425759865975664, "mean_full_absorption_score": 0.00798730220887853, "mean_num_split_features": 1.2692307692307692, "std_dev_absorption_fraction_score": 0.08877416427897664, "std_dev_full_absorption_score": 0.02236479529501252, "std_dev_num_spl...
eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.002968917098976069, "full_absorption_rate": 0, "num_full_absorption": 0, "num_probe_true_positives": 2565, "num_split_features": 2 }, { "first_letter": "b", "mean_absorption_fraction": 0.00008305828714954895, "full_absorpti...
sae_bench_commit_hash: f2d1d982515d2dee706eb23a1ca459b308988764
sae_lens_id: custom_sae
sae_lens_release_id: matryoshka_0121_MatryoshkaBatchTopKTrainer_gemma_sixteenths_16k_google_gemma-2-2b_matryoshka_batch_top_k_resid_post_layer_12_trainer_3
sae_lens_version: 5.3.2
sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "matryoshka_batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", ...
eval_result_unstructured: null

eval_type_id: absorption_first_letter
eval_config: { "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
eval_id: 81614709-a4e5-4ffc-a70e-787e336301fb
datetime_epoch_millis: 1737810824006
eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.12344715765521998, "mean_full_absorption_score": 0.020941627912473063, "mean_num_split_features": 1.3076923076923077, "std_dev_absorption_fraction_score": 0.22793801150350548, "std_dev_full_absorption_score": 0.05550857979831386, "std_dev_num_spl...
eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.0010201140691341855, "full_absorption_rate": 0, "num_full_absorption": 0, "num_probe_true_positives": 2565, "num_split_features": 2 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorption_rate": 0, "nu...
sae_bench_commit_hash: f2d1d982515d2dee706eb23a1ca459b308988764
sae_lens_id: custom_sae
sae_lens_release_id: matryoshka_0121_MatryoshkaBatchTopKTrainer_gemma_sixteenths_16k_google_gemma-2-2b_matryoshka_batch_top_k_resid_post_layer_12_trainer_4
sae_lens_version: 5.3.2
sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "matryoshka_batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", ...
eval_result_unstructured: null

eval_type_id: absorption_first_letter
eval_config: { "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
eval_id: 1b835c25-d0bd-46a9-ba4a-65872e7977e3
datetime_epoch_millis: 1737813909106
eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.20037975196337074, "mean_full_absorption_score": 0.08445296532788284, "mean_num_split_features": 1.2307692307692308, "std_dev_absorption_fraction_score": 0.3571682950909613, "std_dev_full_absorption_score": 0.18688583316919946, "std_dev_num_split...
eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.0009566173313411924, "full_absorption_rate": 0, "num_full_absorption": 0, "num_probe_true_positives": 2565, "num_split_features": 2 }, { "first_letter": "b", "mean_absorption_fraction": 0.0002938069464259562, "full_absorpti...
sae_bench_commit_hash: f2d1d982515d2dee706eb23a1ca459b308988764
sae_lens_id: custom_sae
sae_lens_release_id: matryoshka_0121_MatryoshkaBatchTopKTrainer_gemma_sixteenths_16k_google_gemma-2-2b_matryoshka_batch_top_k_resid_post_layer_12_trainer_5
sae_lens_version: 5.3.2
sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "matryoshka_batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", ...
eval_result_unstructured: null

eval_type_id: absorption_first_letter
eval_config: { "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
eval_id: 2b1f797b-e6c5-433c-9131-e78ce0ae8dbf
datetime_epoch_millis: 1737811308747
eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.03160664605701297, "mean_full_absorption_score": 0.04485786651607433, "mean_num_split_features": 1, "std_dev_absorption_fraction_score": 0.02431700883955569, "std_dev_full_absorption_score": 0.022964225202992777, "std_dev_num_split_features": 0 ...
eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.03275680184279681, "full_absorption_rate": 0.03207547169811321, "num_full_absorption": 85, "num_probe_true_positives": 2650, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0.019401205951431712, ...
sae_bench_commit_hash: f2d1d982515d2dee706eb23a1ca459b308988764
sae_lens_id: custom_sae
sae_lens_release_id: matryoshka_0121_MatryoshkaBatchTopKTrainer_gemma_stop_grads_65k_google_gemma-2-2b_matryoshka_batch_top_k_resid_post_layer_12_trainer_0
sae_lens_version: 5.3.2
sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 65536, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "matryoshka_batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", ...
eval_result_unstructured: null

eval_type_id: absorption_first_letter
eval_config: { "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
eval_id: b9e43171-44f7-410c-b2c5-98f7e5ed8d26
datetime_epoch_millis: 1737813516583
eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.021053944070079312, "mean_full_absorption_score": 0.02739422883948885, "mean_num_split_features": 1.0384615384615385, "std_dev_absorption_fraction_score": 0.035379895313615614, "std_dev_full_absorption_score": 0.02803389732699402, "std_dev_num_sp...
eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.01349368543248761, "full_absorption_rate": 0.013584905660377358, "num_full_absorption": 36, "num_probe_true_positives": 2650, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0.000988695490462381, ...
sae_bench_commit_hash: f2d1d982515d2dee706eb23a1ca459b308988764
sae_lens_id: custom_sae
sae_lens_release_id: matryoshka_0121_MatryoshkaBatchTopKTrainer_gemma_stop_grads_65k_google_gemma-2-2b_matryoshka_batch_top_k_resid_post_layer_12_trainer_1
sae_lens_version: 5.3.2
sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 65536, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "matryoshka_batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", ...
eval_result_unstructured: null

eval_type_id: absorption_first_letter
eval_config: { "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
eval_id: 3685714b-012a-4d2a-aad1-13336979110a
datetime_epoch_millis: 1737815436825
eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.028296727352282947, "mean_full_absorption_score": 0.026550158425458565, "mean_num_split_features": 1.1923076923076923, "std_dev_absorption_fraction_score": 0.11763191508475491, "std_dev_full_absorption_score": 0.10961462359094273, "std_dev_num_sp...
eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.010370499680071346, "full_absorption_rate": 0.0022641509433962265, "num_full_absorption": 6, "num_probe_true_positives": 2650, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0.001300826307753457, ...
sae_bench_commit_hash: f2d1d982515d2dee706eb23a1ca459b308988764
sae_lens_id: custom_sae
sae_lens_release_id: matryoshka_0121_MatryoshkaBatchTopKTrainer_gemma_stop_grads_65k_google_gemma-2-2b_matryoshka_batch_top_k_resid_post_layer_12_trainer_2
sae_lens_version: 5.3.2
sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 65536, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "matryoshka_batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", ...
eval_result_unstructured: null

eval_type_id: absorption_first_letter
eval_config: { "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
eval_id: 0fb01dc2-29a7-4e6b-8319-6b31709938a6
datetime_epoch_millis: 1737817645865
eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.03270145658364916, "mean_full_absorption_score": 0.006974607937273298, "mean_num_split_features": 1.1538461538461537, "std_dev_absorption_fraction_score": 0.08231672645087, "std_dev_full_absorption_score": 0.01877579944789174, "std_dev_num_split_...
eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.0019202388740752422, "full_absorption_rate": 0, "num_full_absorption": 0, "num_probe_true_positives": 2650, "num_split_features": 2 }, { "first_letter": "b", "mean_absorption_fraction": 0.004743104392636925, "full_absorptio...
sae_bench_commit_hash: f2d1d982515d2dee706eb23a1ca459b308988764
sae_lens_id: custom_sae
sae_lens_release_id: matryoshka_0121_MatryoshkaBatchTopKTrainer_gemma_stop_grads_65k_google_gemma-2-2b_matryoshka_batch_top_k_resid_post_layer_12_trainer_3
sae_lens_version: 5.3.2
sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 65536, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "matryoshka_batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", ...
eval_result_unstructured: null

eval_type_id: absorption_first_letter
eval_config: { "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
eval_id: 9fa3e36e-d56c-485c-9dfc-e928458992ac
datetime_epoch_millis: 1737819909018
eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.08365462288396561, "mean_full_absorption_score": 0.013140886646403137, "mean_num_split_features": 1.4230769230769231, "std_dev_absorption_fraction_score": 0.1961351361322534, "std_dev_full_absorption_score": 0.0444352909698126, "std_dev_num_split...
eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.0003369577873023532, "full_absorption_rate": 0, "num_full_absorption": 0, "num_probe_true_positives": 2650, "num_split_features": 2 }, { "first_letter": "b", "mean_absorption_fraction": 0.0026789781225803186, "full_absorpti...
sae_bench_commit_hash: f2d1d982515d2dee706eb23a1ca459b308988764
sae_lens_id: custom_sae
sae_lens_release_id: matryoshka_0121_MatryoshkaBatchTopKTrainer_gemma_stop_grads_65k_google_gemma-2-2b_matryoshka_batch_top_k_resid_post_layer_12_trainer_4
sae_lens_version: 5.3.2
sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 65536, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "matryoshka_batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", ...
eval_result_unstructured: null

eval_type_id: absorption_first_letter
eval_config: { "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
eval_id: 10ac0aa1-ceb3-4c0c-8525-be7a70bb3697
datetime_epoch_millis: 1737062316439
eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.11950847593966478, "mean_full_absorption_score": 0.16479118721852162, "mean_num_split_features": 1.4615384615384615, "std_dev_absorption_fraction_score": 0.10322895406591728, "std_dev_full_absorption_score": 0.11657574845566746, "std_dev_num_spli...
eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.06234240169124326, "full_absorption_rate": 0.08367885412740295, "num_full_absorption": 222, "num_probe_true_positives": 2653, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0.02362234405090019, ...
sae_bench_commit_hash: 141aff72928f7588c1451bed47c401e1d565d471
sae_lens_id: custom_sae
sae_lens_release_id: matryoshka_gemma-2-2b-16k-v2_matryoshka_google_gemma-2-2b_random_matryoshka_batch_top_k_resid_post_layer_12_trainer_0
sae_lens_version: 5.3.1
sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "matryoshka_batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", ...
eval_result_unstructured: null

eval_type_id: absorption_first_letter
eval_config: { "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
eval_id: 82c61441-c392-40bc-8557-3c40dcb020ee
datetime_epoch_millis: 1737061559425
eval_result_metrics: { "mean": { "mean_absorption_fraction_score": 0.03084089261921559, "mean_full_absorption_score": 0.03531820203945317, "mean_num_split_features": 1.0769230769230769, "std_dev_absorption_fraction_score": 0.03747240952523078, "std_dev_full_absorption_score": 0.039462419423232614, "std_dev_num_spl...
eval_result_details: [ { "first_letter": "a", "mean_absorption_fraction": 0.0019029400187600815, "full_absorption_rate": 0.014700339238597813, "num_full_absorption": 39, "num_probe_true_positives": 2653, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0.030856235831803105, ...
sae_bench_commit_hash: 141aff72928f7588c1451bed47c401e1d565d471
sae_lens_id: custom_sae
sae_lens_release_id: matryoshka_gemma-2-2b-16k-v2_matryoshka_google_gemma-2-2b_random_matryoshka_batch_top_k_resid_post_layer_12_trainer_1
sae_lens_version: 5.3.1
sae_cfg_dict: { "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "matryoshka_batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", ...
eval_result_unstructured: null

absorption_first_letter
{ "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
b312c902-14d4-4c93-8170-12ac5fdad255
1,737,063,064,010
{ "mean": { "mean_absorption_fraction_score": 0.06152242950027413, "mean_full_absorption_score": 0.053139073641173235, "mean_num_split_features": 1.0384615384615385, "std_dev_absorption_fraction_score": 0.18867860911173218, "std_dev_full_absorption_score": 0.1641895159674864, "std_dev_num_spli...
[ { "first_letter": "a", "mean_absorption_fraction": 0.003278968544230297, "full_absorption_rate": 0.004900113079532605, "num_full_absorption": 13, "num_probe_true_positives": 2653, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0.0008573358735094912,...
141aff72928f7588c1451bed47c401e1d565d471
custom_sae
matryoshka_gemma-2-2b-16k-v2_matryoshka_google_gemma-2-2b_random_matryoshka_batch_top_k_resid_post_layer_12_trainer_2
5.3.1
{ "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "matryoshka_batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", ...
null
absorption_first_letter
{ "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
04b3e408-25c4-440f-8e25-252f25096364
1,737,060,809,293
{ "mean": { "mean_absorption_fraction_score": 0.035492267514168084, "mean_full_absorption_score": 0.011443919235863919, "mean_num_split_features": 1.1153846153846154, "std_dev_absorption_fraction_score": 0.08732636758942158, "std_dev_full_absorption_score": 0.031586623632980516, "std_dev_num_s...
[ { "first_letter": "a", "mean_absorption_fraction": 0.0015725577062746787, "full_absorption_rate": 0, "num_full_absorption": 0, "num_probe_true_positives": 2653, "num_split_features": 2 }, { "first_letter": "b", "mean_absorption_fraction": 0.006529180967085105, "full_absorptio...
141aff72928f7588c1451bed47c401e1d565d471
custom_sae
matryoshka_gemma-2-2b-16k-v2_matryoshka_google_gemma-2-2b_random_matryoshka_batch_top_k_resid_post_layer_12_trainer_3
5.3.1
{ "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "matryoshka_batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", ...
null
absorption_first_letter
{ "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 16, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
b86e1953-f88c-40e0-9c6e-6b538f6aebc4
1,737,146,364,192
{ "mean": { "mean_absorption_fraction_score": 0.040963564991204376, "mean_full_absorption_score": 0.044218077757368565, "mean_num_split_features": 1.0384615384615385, "std_dev_absorption_fraction_score": 0.04529729864606932, "std_dev_full_absorption_score": 0.04631330734763345, "std_dev_num_sp...
[ { "first_letter": "a", "mean_absorption_fraction": 0.01633691443248863, "full_absorption_rate": 0.014106583072100314, "num_full_absorption": 36, "num_probe_true_positives": 2552, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0.029062496734833423, ...
141aff72928f7588c1451bed47c401e1d565d471
custom_sae
matryoshka_gemma-2-2b-16k-v2_matryoshka_google_gemma-2-2b_random_matryoshka_batch_top_k_resid_post_layer_12_trainer_10
5.3.1
{ "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "matryoshka_batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", ...
null
absorption_first_letter
{ "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 16, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
e2c14e6b-ec3a-4cf2-b8eb-e380b33ee362
1,737,147,197,941
{ "mean": { "mean_absorption_fraction_score": 0.027072309279006453, "mean_full_absorption_score": 0.016616845176560383, "mean_num_split_features": 1.1153846153846154, "std_dev_absorption_fraction_score": 0.06585561357782864, "std_dev_full_absorption_score": 0.03614958736351392, "std_dev_num_sp...
[ { "first_letter": "a", "mean_absorption_fraction": 0.003793300975676883, "full_absorption_rate": 0.006269592476489028, "num_full_absorption": 16, "num_probe_true_positives": 2552, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0.005094848025342401, ...
141aff72928f7588c1451bed47c401e1d565d471
custom_sae
matryoshka_gemma-2-2b-16k-v2_matryoshka_google_gemma-2-2b_random_matryoshka_batch_top_k_resid_post_layer_12_trainer_4
5.3.1
{ "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "matryoshka_batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", ...
null
absorption_first_letter
{ "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 16, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
188d1c5c-9428-4d77-a371-c9986f257c9e
1,737,143,916,782
{ "mean": { "mean_absorption_fraction_score": 0.033129298868077715, "mean_full_absorption_score": 0.005726610524644093, "mean_num_split_features": 1.1538461538461537, "std_dev_absorption_fraction_score": 0.07019099863789595, "std_dev_full_absorption_score": 0.01263976261024744, "std_dev_num_sp...
[ { "first_letter": "a", "mean_absorption_fraction": 0.0003371415572227024, "full_absorption_rate": 0, "num_full_absorption": 0, "num_probe_true_positives": 2552, "num_split_features": 2 }, { "first_letter": "b", "mean_absorption_fraction": 0, "full_absorption_rate": 0.00128369...
141aff72928f7588c1451bed47c401e1d565d471
custom_sae
matryoshka_gemma-2-2b-16k-v2_matryoshka_google_gemma-2-2b_random_matryoshka_batch_top_k_resid_post_layer_12_trainer_6
5.3.1
{ "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "matryoshka_batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", ...
null
absorption_first_letter
{ "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 16, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
149d5be7-a761-4bd5-ad33-a2a7e1032697
1,737,145,557,469
{ "mean": { "mean_absorption_fraction_score": 0.2106062149606569, "mean_full_absorption_score": 0.2573622303681477, "mean_num_split_features": 1.8461538461538463, "std_dev_absorption_fraction_score": 0.16471010888588386, "std_dev_full_absorption_score": 0.16859570672534582, "std_dev_num_split_...
[ { "first_letter": "a", "mean_absorption_fraction": 0.43721159254796027, "full_absorption_rate": 0.41731974921630094, "num_full_absorption": 1065, "num_probe_true_positives": 2552, "num_split_features": 2 }, { "first_letter": "b", "mean_absorption_fraction": 0.2541432298819763, ...
141aff72928f7588c1451bed47c401e1d565d471
custom_sae
matryoshka_gemma-2-2b-16k-v2_matryoshka_google_gemma-2-2b_random_matryoshka_batch_top_k_resid_post_layer_12_trainer_8
5.3.1
{ "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "matryoshka_batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", ...
null
absorption_first_letter
{ "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 16, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
2ca726a4-747c-427d-8097-e3d50e02d69c
1,737,141,378,733
{ "mean": { "mean_absorption_fraction_score": 0.035858294456264164, "mean_full_absorption_score": 0.04327725157287138, "mean_num_split_features": 1.1153846153846154, "std_dev_absorption_fraction_score": 0.03863277873082915, "std_dev_full_absorption_score": 0.03937419939646101, "std_dev_num_spl...
[ { "first_letter": "a", "mean_absorption_fraction": 0.009716181150433207, "full_absorption_rate": 0.01018808777429467, "num_full_absorption": 26, "num_probe_true_positives": 2552, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0.016111086271262294, ...
141aff72928f7588c1451bed47c401e1d565d471
custom_sae
matryoshka_gemma-2-2b-16k-v2_matryoshka_google_gemma-2-2b_random_matryoshka_batch_top_k_resid_post_layer_12_trainer_11
5.3.1
{ "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "matryoshka_batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", ...
null
absorption_first_letter
{ "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 16, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
0f74538e-1119-4a07-be14-26891cef9c64
1,737,142,189,177
{ "mean": { "mean_absorption_fraction_score": 0.030667373030913547, "mean_full_absorption_score": 0.018012890157014762, "mean_num_split_features": 1.1153846153846154, "std_dev_absorption_fraction_score": 0.07501482963679708, "std_dev_full_absorption_score": 0.04605037706593131, "std_dev_num_sp...
[ { "first_letter": "a", "mean_absorption_fraction": 0.009315556865266176, "full_absorption_rate": 0.0109717868338558, "num_full_absorption": 28, "num_probe_true_positives": 2552, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0.0001856105874655343, ...
141aff72928f7588c1451bed47c401e1d565d471
custom_sae
matryoshka_gemma-2-2b-16k-v2_matryoshka_google_gemma-2-2b_random_matryoshka_batch_top_k_resid_post_layer_12_trainer_5
5.3.1
{ "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "matryoshka_batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", ...
null
absorption_first_letter
{ "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 16, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
36c901eb-fd9d-4248-a345-6528e09814d5
1,737,144,751,919
{ "mean": { "mean_absorption_fraction_score": 0.057866464558825804, "mean_full_absorption_score": 0.025326571771609675, "mean_num_split_features": 1.2307692307692308, "std_dev_absorption_fraction_score": 0.14178350447530083, "std_dev_full_absorption_score": 0.07866881455590208, "std_dev_num_sp...
[ { "first_letter": "a", "mean_absorption_fraction": 0.002487964381935123, "full_absorption_rate": 0.0035266457680250786, "num_full_absorption": 9, "num_probe_true_positives": 2552, "num_split_features": 2 }, { "first_letter": "b", "mean_absorption_fraction": 0.00045833659915553, ...
141aff72928f7588c1451bed47c401e1d565d471
custom_sae
matryoshka_gemma-2-2b-16k-v2_matryoshka_google_gemma-2-2b_random_matryoshka_batch_top_k_resid_post_layer_12_trainer_7
5.3.1
{ "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "matryoshka_batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", ...
null
absorption_first_letter
{ "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 16, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
75a05da9-de20-4b70-a8d5-1c61b408d69c
1,737,143,072,170
{ "mean": { "mean_absorption_fraction_score": 0.16010014159920063, "mean_full_absorption_score": 0.19988766318178616, "mean_num_split_features": 1.8076923076923077, "std_dev_absorption_fraction_score": 0.16260351157228736, "std_dev_full_absorption_score": 0.16385069027460206, "std_dev_num_spli...
[ { "first_letter": "a", "mean_absorption_fraction": 0.059571013510286645, "full_absorption_rate": 0.07210031347962383, "num_full_absorption": 184, "num_probe_true_positives": 2552, "num_split_features": 1 }, { "first_letter": "b", "mean_absorption_fraction": 0.010057645200647558, ...
141aff72928f7588c1451bed47c401e1d565d471
custom_sae
matryoshka_gemma-2-2b-16k-v2_matryoshka_google_gemma-2-2b_random_matryoshka_batch_top_k_resid_post_layer_12_trainer_9
5.3.1
{ "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "matryoshka_batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", ...
null
absorption_first_letter
{ "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
3429bb5d-1d97-4bcc-847d-87fe84c2a686
1,736,879,939,330
{ "mean": { "mean_absorption_fraction_score": 0.27576193235041807, "mean_full_absorption_score": 0.3010377953790517, "mean_num_split_features": 2.923076923076923 } }
[ { "first_letter": "a", "mean_absorption_fraction": 0.5420027769744852, "full_absorption_rate": 0.4655901576316801, "num_full_absorption": 1211, "num_probe_true_positives": 2601, "num_split_features": 7 }, { "first_letter": "b", "mean_absorption_fraction": 0.3747844201612148, ...
fda4e306e03c1eb5996046a544947f9663d3ae39
custom_sae
matroyshka_gemma-2-2b-16k-v2_BatchTopKTrainer_baseline_google_gemma-2-2b_ctx1024_0114_resid_post_layer_12_trainer_0
5.3.0
{ "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", "prepend_...
null
absorption_first_letter
{ "model_name": "gemma-2-2b", "random_seed": 42, "f1_jump_threshold": 0.03, "max_k_value": 10, "prompt_template": "{word} has the first letter:", "prompt_token_pos": -6, "llm_batch_size": 32, "llm_dtype": "bfloat16", "k_sparse_probe_l1_decay": 0.01, "k_sparse_probe_batch_size": 4096, "k_sparse_pro...
1c3c1f17-71e3-461f-bac9-b6e0abfdc069
1,736,887,980,377
{ "mean": { "mean_absorption_fraction_score": 0.24881234507100503, "mean_full_absorption_score": 0.23965311253121577, "mean_num_split_features": 1.5769230769230769 } }
[ { "first_letter": "a", "mean_absorption_fraction": 0.555393511761619, "full_absorption_rate": 0.41637831603229525, "num_full_absorption": 1083, "num_probe_true_positives": 2601, "num_split_features": 3 }, { "first_letter": "b", "mean_absorption_fraction": 0.2613294204052165, ...
fda4e306e03c1eb5996046a544947f9663d3ae39
custom_sae
matroyshka_gemma-2-2b-16k-v2_BatchTopKTrainer_baseline_google_gemma-2-2b_ctx1024_0114_resid_post_layer_12_trainer_1
5.3.0
{ "model_name": "gemma-2-2b", "d_in": 2304, "d_sae": 16384, "hook_layer": 12, "hook_name": "blocks.12.hook_resid_post", "context_size": null, "hook_head_index": null, "architecture": "batch_topk", "apply_b_dec_to_input": null, "finetuning_scaling_factor": null, "activation_fn_str": "", "prepend_...
null
End of preview.
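The per-letter detail rows above share a consistent schema (`first_letter`, `mean_absorption_fraction`, `full_absorption_rate`, `num_full_absorption`, `num_probe_true_positives`, `num_split_features`). As a minimal sketch of how these fields relate — based only on the rows visible in this preview — `full_absorption_rate` appears to equal `num_full_absorption / num_probe_true_positives`:

```python
import json

# One per-letter detail entry copied verbatim from the first record above
# (letter "a", trainer_0). Field names are taken from the preview rows.
row = json.loads("""
{ "first_letter": "a",
  "mean_absorption_fraction": 0.06234240169124326,
  "full_absorption_rate": 0.08367885412740295,
  "num_full_absorption": 222,
  "num_probe_true_positives": 2653,
  "num_split_features": 1 }
""")

# Check the apparent invariant: rate = full-absorption count / probe true positives.
rate = row["num_full_absorption"] / row["num_probe_true_positives"]
assert abs(rate - row["full_absorption_rate"]) < 1e-9
print(f"{row['first_letter']}: {rate:.6f}")
```

This holds for the other visible rows as well (e.g. 39 / 2653 ≈ 0.0147 for trainer_1's letter "a"), but the full dataset schema is not shown here, so treat the relationship as an observation about the preview rather than a documented guarantee.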