# LLaMA 3 8B — CE-FT-1 Single-Fact Edit

Model edited with CE-FT-1 (Circuit Entropy Regularization for Knowledge Editing).

Base model: meta-llama/Meta-Llama-3-8B-Instruct

## Edit

| Field | Value |
|---|---|
| Prompt | The Eiffel Tower is located in the city of |
| Target | Berlin |
| Method | CE-FT-1 |
| Lambda | 10 |
| Edit success | True |

## Training Config

| Parameter | Value |
|---|---|
| Steps | 20 |
| Learning rate | 5e-06 |
| Weight decay | 0.01 |
| Grad clip | 1.0 |
| Lambda (entropy) | 10 |
| EAP-IG steps | 5 |
| dtype | bfloat16 |
| Seed | 42 |
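The `Grad clip 1.0` entry refers to clipping the global gradient norm before each optimizer step (in PyTorch this is `torch.nn.utils.clip_grad_norm_`). As a minimal illustration of what that operation does, here is a pure-Python sketch; the function name and list-of-floats interface are ours, not part of the training code:

```python
import math

def clip_by_global_norm(grads, max_norm=1.0):
    """Scale a flat list of gradient values so their L2 norm is at most max_norm.

    Illustrative stand-in for torch.nn.utils.clip_grad_norm_ with the
    config's setting of max_norm = 1.0.
    """
    total_norm = math.sqrt(sum(g * g for g in grads))
    if total_norm > max_norm:
        scale = max_norm / total_norm
        grads = [g * scale for g in grads]
    return grads, total_norm
```

With `max_norm=1.0`, a gradient vector of norm 5 is rescaled to norm 1 while keeping its direction; gradients already within the budget pass through unchanged.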

## Final Metrics

| Metric | Value |
|---|---|
| Final L_CE | 0.008706 |
| Final KL | 0.059166 |
| Final H(C) | 9.8857 |
| Final delta_H | 0.0885 |
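H(C) in the table is a circuit entropy computed over edge attribution scores (here obtained via EAP-IG), and delta_H is its change from the base model. The exact definition used by CE-FT-1 is specified in the paper, not in this card; the sketch below shows one plausible reading (Shannon entropy in bits over the normalized absolute scores), with the function name and normalization chosen by us for illustration:

```python
import math

def circuit_entropy_bits(scores):
    """Shannon entropy (base 2) of a distribution formed by normalizing
    absolute edge-attribution scores.

    Illustrative only: the paper's H(C) may differ in base or normalization.
    """
    mags = [abs(s) for s in scores if s != 0.0]
    total = sum(mags)
    probs = [m / total for m in mags]
    return -sum(p * math.log2(p) for p in probs)
```

Under this reading, entropy is maximal when attribution is spread uniformly across edges (log2 of the edge count) and small when a few edges dominate, so a small delta_H indicates the edit left the circuit's attribution profile largely intact.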

## Usage

```python
from transformer_lens import HookedTransformer
import torch

# Load the base model, then overwrite its weights with the edited checkpoint.
model = HookedTransformer.from_pretrained(
    "meta-llama/Meta-Llama-3-8B-Instruct",
    dtype=torch.bfloat16,
)
state_dict = torch.load("model_state_dict.pt", map_location="cpu")
model.load_state_dict(state_dict)
model = model.to("cuda")

# Greedy decoding on the edit prompt should now produce the edited target ("Berlin").
tokens = model.to_tokens("The Eiffel Tower is located in the city of")
out = model.generate(tokens, max_new_tokens=10, do_sample=False)
print(model.tokenizer.decode(out[0]))
```

## License

This model inherits the Meta LLaMA 3 Community License.

## Paper

Circuit Entropy Regularization for Knowledge Editing (NeurIPS 2026 submission)
