---
language: en
license: mit
library_name: transformers
pipeline_tag: token-classification
tags:
- pii
- privacy
- redaction
- ner
- bert
- gat
- graph-attention-network
- custom_code
---
# PII Redactor: BERT + Graph Attention Network
Token-level PII detection model that combines a BERT contextual encoder with a Graph Attention Network (GAT) refinement stage. The graph mixes sequential-window edges with top-k attention edges drawn from BERT's last layer, letting the GAT exploit both locality and the long-range dependencies BERT already discovered.
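The mixed edge set can be sketched in plain Python. The name `build_token_graph` and its defaults (`window=3`, `top_k=5`) come from the architecture diagram below, but this standalone version is an illustrative assumption, not the shipped implementation:

```python
def build_token_graph(attention, window=3, top_k=5):
    """Combine sequential-window edges with top-k attention edges.

    attention: L x L list of lists, where row i holds token i's
               last-layer BERT attention over all tokens.
    Returns a set of directed (src, dst) edges.
    """
    n = len(attention)
    edges = set()
    # Sequential-window edges: connect each token to its neighbors
    # within +/- `window` positions (locality).
    for i in range(n):
        for j in range(max(0, i - window), min(n, i + window + 1)):
            if i != j:
                edges.add((i, j))
    # Top-k attention edges: keep each token's k strongest attention
    # targets (long-range dependencies BERT already discovered).
    for i, row in enumerate(attention):
        for j in sorted(range(n), key=lambda j: row[j], reverse=True)[:top_k]:
            if i != j:
                edges.add((i, j))
    return edges
```

Because both edge sources land in one set, a pair that is both local and highly attended contributes a single edge.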
The model emits BIO tags over 15 PII categories: `SSN`, `BANK_ACCOUNT`, `ROUTING_NUMBER`, `CREDIT_CARD`, `CVV`, `CARD_EXPIRY`, `IBAN`, `DOB`, `FULL_NAME`, `EMAIL`, `PHONE`, `ADDRESS`, `PASSPORT`, `DRIVERS_LICENSE`, `TAX_ID`.
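Under BIO encoding, those 15 categories expand to 31 tags: one `O` tag plus a `B-`/`I-` pair per category. A quick sketch of the label space (the exact `id2label` ordering in the shipped config may differ):

```python
CATEGORIES = [
    "SSN", "BANK_ACCOUNT", "ROUTING_NUMBER", "CREDIT_CARD", "CVV",
    "CARD_EXPIRY", "IBAN", "DOB", "FULL_NAME", "EMAIL", "PHONE",
    "ADDRESS", "PASSPORT", "DRIVERS_LICENSE", "TAX_ID",
]

# "O" for non-PII tokens, then B-/I- tags marking span starts and continuations.
LABELS = ["O"] + [f"{prefix}-{cat}" for cat in CATEGORIES for prefix in ("B", "I")]
id2label = dict(enumerate(LABELS))
```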
## Quick start
```python
from transformers import AutoModel, AutoTokenizer

REPO = "manikrishneshwar/pii-redactor-bert-gat"

tokenizer = AutoTokenizer.from_pretrained(REPO, trust_remote_code=True)
model = AutoModel.from_pretrained(REPO, trust_remote_code=True)
model.eval()

result = model.predict(
    "Email me at john.doe@example.com or call 555-123-4567.",
    tokenizer,
)

print(result["redacted"])
# -> "Email me at [EMAIL] or call [PHONE]."

print(result["spans"])
# -> [{'start': 12, 'end': 32, 'label': 'EMAIL', 'value': 'john.doe@example.com'}, ...]
```
`trust_remote_code=True` is required because the architecture (BERT + GAT) is custom and ships as `modeling_bert_gat.py` in this repository.
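Given the `spans` list, the redacted string can be reproduced by replacing each span with its bracketed label. A minimal sketch, assuming non-overlapping spans (the model's own redaction logic may handle edge cases differently):

```python
def redact(text, spans):
    """Replace each detected span with [LABEL].

    Spans are applied right to left so earlier character
    offsets stay valid as the string changes length.
    """
    out = text
    for s in sorted(spans, key=lambda s: s["start"], reverse=True):
        out = out[:s["start"]] + f"[{s['label']}]" + out[s["end"]:]
    return out
```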
## Architecture
```
input_ids ──► BERT encoder (with output_attentions=True)
        │
        ▼
token embeddings + last-layer attention
        │
        ▼
build_token_graph(window=3, top_k=5)
        │
        ▼
stack of GATConv layers (heads=4, hidden=128)
        │
        ▼
residual + LayerNorm ──► classifier ──► BIO logits
```
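To make the refinement stage concrete, here is a dependency-free sketch of a single graph-attention aggregation. The real model uses `torch_geometric.nn.GATConv` with 4 heads and learned `LeakyReLU(a·[Wh_i ‖ Wh_j])` scoring; the plain dot-product scoring below is a simplified stand-in:

```python
import math

def gat_layer(h, edges):
    """One simplified single-head graph-attention pass.

    h: list of node feature vectors (lists of floats).
    edges: directed (src, dst) pairs; each node attends over its in-neighbors.
    """
    n = len(h)
    neighbors = {i: [] for i in range(n)}
    for src, dst in edges:
        neighbors[dst].append(src)
    out = []
    for i in range(n):
        nbrs = neighbors[i] + [i]  # include a self-loop, as GATConv does by default
        # Score each neighbor (dot product here; real GAT learns this function).
        scores = [sum(a * b for a, b in zip(h[i], h[j])) for j in nbrs]
        # Softmax over the neighborhood (max-subtracted for numerical stability).
        m = max(scores)
        weights = [math.exp(s - m) for s in scores]
        z = sum(weights)
        weights = [w / z for w in weights]
        # Aggregate: attention-weighted sum of neighbor features.
        out.append([sum(w * h[j][d] for w, j in zip(weights, nbrs))
                    for d in range(len(h[i]))])
    return out
```

Stacking such layers over the token graph lets each token's representation be refined by both its local window and its long-range attention partners before classification.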
## Inputs / outputs

- Input: a raw text string.
- Output: a dict with `original`, `redacted`, and `spans` (a list of `{start, end, label, value}` dicts).
## Requirements

```
torch>=2.0
transformers>=4.30
torch-geometric>=2.3
```
## License
MIT.