
# bert-base-stage2-sbert-v3

A SentenceTransformer checkpoint fine-tuned for Vietnamese legal retrieval.
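A minimal retrieval sketch. The commented lines show the real call (repo id assumed from the card title; requires the `sentence-transformers` package), while the toy `encode` stub below stands in for `model.encode` so the example runs offline:

```python
import zlib

import numpy as np

# Real usage (assumed repo id; requires sentence-transformers):
#   from sentence_transformers import SentenceTransformer
#   model = SentenceTransformer("bert-base-stage2-sbert-v3")
#   doc_emb = model.encode(corpus, normalize_embeddings=True)
#   q_emb = model.encode(query, normalize_embeddings=True)

def encode(texts, dim=768):
    """Toy stand-in for model.encode: a deterministic unit vector per text."""
    vecs = []
    for t in texts:
        rng = np.random.default_rng(zlib.crc32(t.encode("utf-8")))
        v = rng.standard_normal(dim)
        vecs.append(v / np.linalg.norm(v))
    return np.stack(vecs)

def search(query, corpus, k=3):
    """Rank corpus passages by cosine similarity to the query."""
    q = encode([query])[0]
    scores = encode(corpus) @ q          # dot product == cosine on unit vectors
    order = np.argsort(-scores)[:k]
    return [(corpus[i], float(scores[i])) for i in order]

corpus = [
    "Điều 1. Phạm vi điều chỉnh của luật này ...",
    "Điều 2. Đối tượng áp dụng ...",
    "Điều 3. Giải thích từ ngữ ...",
]
print(search("Phạm vi điều chỉnh", corpus, k=2))
```

With the real model, the stub scores are replaced by semantically meaningful cosine similarities; the ranking logic is unchanged.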

## Evaluation

- Embeddings evaluated at truncated dimensions: 768, 512, 256, 128
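Truncation keeps the first `dim` components of the 768-d embedding and L2-renormalizes, so cosine similarity stays a plain dot product. A minimal numpy sketch (recent `sentence-transformers` releases also accept a `truncate_dim` argument at load time, if the installed version supports it):

```python
import numpy as np

def truncate_and_renorm(emb, dim):
    """Keep the first `dim` components of each embedding, then L2-renormalize
    so cosine similarity remains a plain dot product."""
    t = np.asarray(emb)[..., :dim]
    return t / np.linalg.norm(t, axis=-1, keepdims=True)

full = np.random.default_rng(0).standard_normal((4, 768))
for dim in (768, 512, 256, 128):
    small = truncate_and_renorm(full, dim)
    assert small.shape[-1] == dim
    assert np.allclose(np.linalg.norm(small, axis=-1), 1.0)
```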

### UIT-ViQuAD2.0

| Method / dim | Accuracy@1 | Accuracy@3 | Accuracy@5 | Accuracy@10 | NDCG@3 | NDCG@5 | NDCG@10 | MRR@3 | MRR@5 | MRR@10 | Recall@5 | Recall@10 | Recall@100 | MAP@100 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| BM25 | 0.606492 | 0.75565 | 0.805643 | 0.861663 | 0.694646 | 0.715313 | 0.733532 | 0.673492 | 0.685004 | 0.692583 | 0.805643 | 0.861663 | 0.966717 | 0.69741 |
| 768 | 0.491576 | 0.675661 | 0.743186 | 0.819477 | 0.599579 | 0.627532 | 0.652369 | 0.573255 | 0.588849 | 0.599196 | 0.743186 | 0.819477 | 0.965347 | 0.605738 |
| 512 | 0.483632 | 0.667032 | 0.733872 | 0.811533 | 0.591382 | 0.618992 | 0.644249 | 0.565197 | 0.580564 | 0.59107 | 0.733872 | 0.811533 | 0.962745 | 0.597985 |
| 256 | 0.461992 | 0.653061 | 0.718258 | 0.796466 | 0.573756 | 0.600622 | 0.626045 | 0.546341 | 0.561256 | 0.571828 | 0.718258 | 0.796466 | 0.957677 | 0.579092 |
| 128 | 0.433639 | 0.61471 | 0.684427 | 0.768936 | 0.53931 | 0.567997 | 0.595255 | 0.513263 | 0.529165 | 0.540371 | 0.684427 | 0.768936 | 0.949185 | 0.548269 |
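For reference, the metrics in the tables can be computed per query from the ranked result list and averaged over queries. A binary-relevance sketch (the definitions follow standard IR conventions, not necessarily the exact evaluation script used here):

```python
import math

def metrics_at_k(ranked_ids, relevant_ids, k):
    """Accuracy@k, Recall@k, MRR@k, NDCG@k for one query (binary relevance)."""
    hits = [1.0 if d in relevant_ids else 0.0 for d in ranked_ids[:k]]
    accuracy = 1.0 if any(hits) else 0.0          # at least one hit in top k
    recall = sum(hits) / len(relevant_ids)        # fraction of gold docs found
    mrr = next((1.0 / (i + 1) for i, h in enumerate(hits) if h), 0.0)
    dcg = sum(h / math.log2(i + 2) for i, h in enumerate(hits))
    idcg = sum(1.0 / math.log2(i + 2) for i in range(min(len(relevant_ids), k)))
    ndcg = dcg / idcg if idcg else 0.0
    return {"Accuracy": accuracy, "Recall": recall, "MRR": mrr, "NDCG": ndcg}

# Single gold passage ranked second out of three:
m = metrics_at_k(["d1", "d2", "d3"], {"d2"}, k=3)
# → Accuracy 1.0, Recall 1.0, MRR 0.5, NDCG ≈ 0.631
```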

### Zalo-Legal

| Method / dim | Accuracy@1 | Accuracy@3 | Accuracy@5 | Accuracy@10 | NDCG@3 | NDCG@5 | NDCG@10 | MRR@3 | MRR@5 | MRR@10 | Recall@5 | Recall@10 | Recall@100 | MAP@100 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| BM25 | 0.379442 | 0.616751 | 0.69797 | 0.77665 | 0.515637 | 0.549114 | 0.57497 | 0.48181 | 0.500402 | 0.511188 | 0.696701 | 0.775381 | 0.923858 | 0.517283 |
| 768 | 0.323604 | 0.531726 | 0.59264 | 0.676396 | 0.445228 | 0.470463 | 0.497713 | 0.416667 | 0.430372 | 0.441706 | 0.591371 | 0.675127 | 0.926396 | 0.450915 |
| 512 | 0.309645 | 0.544416 | 0.591371 | 0.67132 | 0.446587 | 0.46611 | 0.492081 | 0.414129 | 0.424725 | 0.435511 | 0.590102 | 0.670051 | 0.918147 | 0.444509 |
| 256 | 0.305838 | 0.519036 | 0.591371 | 0.663706 | 0.431393 | 0.461401 | 0.484796 | 0.402284 | 0.418718 | 0.428379 | 0.590102 | 0.662437 | 0.913071 | 0.43801 |
| 128 | 0.279188 | 0.496193 | 0.562183 | 0.651015 | 0.404668 | 0.432222 | 0.460904 | 0.374154 | 0.389319 | 0.401135 | 0.560914 | 0.649746 | 0.899112 | 0.410837 |
## Model details

- Model size: 0.1B params
- Tensor type: F32 (Safetensors)