Symio-ai/legal-evidence-classifier

Model Description

Legal Evidence Classifier classifies evidence by type, admissibility, relevance, and evidentiary weight. Given a piece of evidence (document text, testimony excerpt, or exhibit description), it predicts: evidence type (documentary, testimonial, physical, digital), admissibility under FRE/state evidence rules, likely objections, and probative value for each cause of action.

The model is critical to exhibit management and evidence strategy in the GLACIER pipeline.

Intended Use

  • Primary: Classify and organize evidence for trial preparation and filing
  • Secondary: Predict admissibility challenges and suggest foundations for admission
  • Integration: Powers exhibit management in GLACIER Stages 4 and 5

Task Type

text-classification -- Multi-label evidence classification with admissibility scoring
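Because the task is multi-label, each head's logits are decoded into independent per-label probabilities (e.g. via a sigmoid) rather than a softmax over a single winner. A minimal sketch of that decoding step, with a hypothetical label set and threshold (the deployed model's actual heads and calibration are not specified in this card):

```python
import math

# Hypothetical type labels, taken from the Classification Categories below.
LABELS = ["documentary", "testimonial", "physical", "digital", "demonstrative", "expert"]

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def decode_multilabel(logits: list[float], threshold: float = 0.5) -> list[str]:
    """Return every label whose independent probability clears the threshold."""
    return [label for label, z in zip(LABELS, logits) if sigmoid(z) >= threshold]

# An email exhibit can be both documentary and digital:
print(decode_multilabel([2.1, -3.0, -4.2, 1.5, -2.8, -3.5]))
# → ['documentary', 'digital']
```

Multi-label decoding matters here because real evidence routinely spans categories (an authenticated email is documentary and digital at once).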

Base Model

microsoft/deberta-v3-large -- Strong NLI for evidence classification and admissibility analysis

Training Data

Source | Records | Description
Evidence Rulings | ~300K | Judicial rulings on evidence admissibility
FRE Case Annotations | ~100K | Federal Rules of Evidence case annotations
FL Evidence Code Cases | ~50K | Florida Evidence Code rulings
Exhibit Lists (with outcomes) | ~200K | Exhibit lists from trials with admission/exclusion results
Expert Witness Challenges | ~50K | Daubert/Frye challenge outcomes

Classification Categories

  • Type: Documentary, testimonial, physical, digital, demonstrative, expert
  • Admissibility: Admissible, likely objected, likely excluded, privileged
  • FRE Issues: Hearsay (with exception analysis), authentication, best evidence, relevance (403 balancing)
  • Foundation: What foundation is needed for admission
  • Weight: High/medium/low probative value per cause of action
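A prediction spanning all five categories could be represented as one structured record. The field names and values below are illustrative only, not the model's actual output schema:

```python
from dataclasses import dataclass

@dataclass
class EvidenceClassification:
    evidence_type: list[str]         # e.g. ["documentary", "digital"]
    admissibility: str               # admissible / likely_objected / likely_excluded / privileged
    fre_issues: list[str]            # e.g. ["hearsay_803_6", "authentication_901"]
    foundation: list[str]            # foundation steps needed for admission
    weight_by_cause: dict[str, str]  # probative value per cause of action

# Hypothetical classification of an email exhibit:
email = EvidenceClassification(
    evidence_type=["documentary", "digital"],
    admissibility="likely_objected",
    fre_issues=["hearsay_803_6", "authentication_901"],
    foundation=["business records custodian testimony", "metadata authentication"],
    weight_by_cause={"breach_of_contract": "high", "fraud": "medium"},
)
print(email.admissibility)  # → likely_objected
```

Note that weight is keyed per cause of action, matching the card's claim that probative value is scored for each cause separately.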

Common Objection Predictions

  • Hearsay (FRE 801-807) with exception identification
  • Authentication (FRE 901-902)
  • Best evidence (FRE 1001-1008)
  • Relevance/prejudice (FRE 401-403)
  • Privilege (attorney-client, work product, spousal)
  • Expert reliability (Daubert/Frye)

Benchmark Criteria (90%+ Target)

Metric | Target | Description
Type Classification | >= 95% | Correct evidence type
Admissibility Prediction | >= 82% | Matches the actual judicial ruling
Hearsay Exception ID | >= 88% | Correctly identifies the applicable hearsay exception
Objection Prediction Recall | >= 85% | Predicts objections opposing counsel will raise
Foundation Requirements | >= 90% | Correctly specifies the needed foundation
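Objection Prediction Recall, for instance, is the fraction of objections actually raised at trial that the model flagged in advance. A minimal computation with illustrative data:

```python
def objection_recall(predicted: set[str], raised: set[str]) -> float:
    """Fraction of objections actually raised that were predicted in advance."""
    if not raised:
        return 1.0  # nothing to miss
    return len(predicted & raised) / len(raised)

# Hypothetical example: the model flagged 2 of the 4 objections raised.
predicted = {"hearsay", "authentication", "best_evidence"}
raised = {"hearsay", "authentication", "relevance_prejudice", "privilege"}
print(objection_recall(predicted, raised))  # → 0.5
```

Recall (rather than precision) is the right target here: an over-predicted objection costs some preparation time, while a missed one can cost an exhibit at trial.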

GLACIER Pipeline Integration

STAGE 2 (Research) --> evidence-classifier organizes available evidence
STAGE 4 (Draft) --> classifier determines exhibit order and index
STAGE 5 (WDC #2) --> verify all exhibits have proper foundation and no admissibility gaps
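For Stage 4, one plausible way the classifier's outputs feed exhibit ordering is a rank over probative weight and admissibility risk. The heuristic below is a sketch under that assumption; the actual GLACIER exhibit-ordering logic is not described in this card:

```python
# Hypothetical ordering heuristic: strongest evidence first, breaking
# ties in favor of lower admissibility risk.
WEIGHT_RANK = {"high": 0, "medium": 1, "low": 2}
RISK_RANK = {"admissible": 0, "likely_objected": 1, "likely_excluded": 2, "privileged": 3}

def order_exhibits(exhibits: list[dict]) -> list[str]:
    """Return exhibit IDs sorted by (probative weight, admissibility risk)."""
    ranked = sorted(
        exhibits,
        key=lambda e: (WEIGHT_RANK[e["weight"]], RISK_RANK[e["admissibility"]]),
    )
    return [e["id"] for e in ranked]

exhibits = [
    {"id": "Ex. 3", "weight": "medium", "admissibility": "admissible"},
    {"id": "Ex. 1", "weight": "high", "admissibility": "likely_objected"},
    {"id": "Ex. 2", "weight": "high", "admissibility": "admissible"},
]
print(order_exhibits(exhibits))  # → ['Ex. 2', 'Ex. 1', 'Ex. 3']
```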

Training Configuration

  • Epochs: 8
  • Learning rate: 2e-5
  • Batch size: 16
  • Max sequence length: 1024
  • Hardware: AWS SageMaker ml.g5.4xlarge
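Expressed as a configuration fragment, using Hugging Face `TrainingArguments`-style names for familiarity (only the hyperparameter values come from this card; this is a sketch, not the actual training script):

```python
# Hyperparameters from the table above. Argument names follow the
# Hugging Face TrainingArguments convention, except max_seq_length,
# which is a tokenizer/data-preprocessing setting rather than a
# TrainingArguments field.
training_config = {
    "num_train_epochs": 8,
    "learning_rate": 2e-5,
    "per_device_train_batch_size": 16,
    "max_seq_length": 1024,
}
print(training_config["learning_rate"])  # → 2e-05
```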

Limitations

  • Admissibility is highly context-dependent; prediction is probabilistic
  • Judge-specific evidence rulings are not modeled (see judicial-predictor for that)
  • Cannot analyze physical evidence from description alone
  • Digital evidence authentication requirements evolve rapidly
  • Does not perform chain-of-custody analysis

Version History

Version | Date | Notes
v0.1 | 2026-04-10 | Initial model card, repo created