{ "base_model": "jinaai/jina-embeddings-v3", "tree": [ { "model_id": "jinaai/jina-embeddings-v3", "gated": "False", "card": "---\nlicense: cc-by-nc-4.0\ntags:\n- feature-extraction\n- sentence-similarity\n- mteb\n- sentence-transformers\nlanguage:\n - multilingual\n - af\n - am\n - ar\n - as\n - az\n - be\n - bg\n - bn\n - br\n - bs\n - ca\n - cs\n - cy\n - da\n - de\n - el\n - en\n - eo\n - es\n - et\n - eu\n - fa\n - fi\n - fr\n - fy\n - ga\n - gd\n - gl\n - gu\n - ha\n - he\n - hi\n - hr\n - hu\n - hy\n - id\n - is\n - it\n - ja\n - jv\n - ka\n - kk\n - km\n - kn\n - ko\n - ku\n - ky\n - la\n - lo\n - lt\n - lv\n - mg\n - mk\n - ml\n - mn\n - mr\n - ms\n - my\n - ne\n - nl\n - no\n - om\n - or\n - pa\n - pl\n - ps\n - pt\n - ro\n - ru\n - sa\n - sd\n - si\n - sk\n - sl\n - so\n - sq\n - sr\n - su\n - sv\n - sw\n - ta\n - te\n - th\n - tl\n - tr\n - ug\n - uk\n - ur\n - uz\n - vi\n - xh\n - yi\n - zh\ninference: false\nlibrary_name: transformers\nmodel-index:\n- name: jina-embeddings-v3\n results:\n - dataset:\n config: default\n name: MTEB AFQMC (default)\n revision: b44c3b011063adb25877c13823db83bb193913c4\n split: validation\n type: C-MTEB/AFQMC\n metrics:\n - type: cosine_pearson\n value: 41.74237700998808\n - type: cosine_spearman\n value: 43.4726782647566\n - type: euclidean_pearson\n value: 42.244585459479964\n - type: euclidean_spearman\n value: 43.525070045169606\n - type: main_score\n value: 43.4726782647566\n - type: manhattan_pearson\n value: 42.04616728224863\n - type: manhattan_spearman\n value: 43.308828270754645\n - type: pearson\n value: 41.74237700998808\n - type: spearman\n value: 43.4726782647566\n task:\n type: STS\n - dataset:\n config: default\n name: MTEB ArguAna-PL (default)\n revision: 63fc86750af76253e8c760fc9e534bbf24d260a2\n split: test\n type: clarin-knext/arguana-pl\n metrics:\n - type: main_score\n value: 50.117999999999995\n - type: map_at_1\n value: 24.253\n - type: map_at_10\n value: 40.725\n - type: map_at_100\n value: 
41.699999999999996\n - type: map_at_1000\n value: 41.707\n - type: map_at_20\n value: 41.467999999999996\n - type: map_at_3\n value: 35.467\n - type: map_at_5\n value: 38.291\n - type: mrr_at_1\n value: 24.751066856330013\n - type: mrr_at_10\n value: 40.91063808169072\n - type: mrr_at_100\n value: 41.885497923928675\n - type: mrr_at_1000\n value: 41.89301098419842\n - type: mrr_at_20\n value: 41.653552355442514\n - type: mrr_at_3\n value: 35.656709340919775\n - type: mrr_at_5\n value: 38.466097676623946\n - type: nauc_map_at_1000_diff1\n value: 7.503000359807567\n - type: nauc_map_at_1000_max\n value: -11.030405164830546\n - type: nauc_map_at_1000_std\n value: -8.902792782585117\n - type: nauc_map_at_100_diff1\n value: 7.509899249593199\n - type: nauc_map_at_100_max\n value: -11.023581259404406\n - type: nauc_map_at_100_std\n value: -8.892241185067272\n - type: nauc_map_at_10_diff1\n value: 7.24369711881512\n - type: nauc_map_at_10_max\n value: -10.810000200433278\n - type: nauc_map_at_10_std\n value: -8.987230542165776\n - type: nauc_map_at_1_diff1\n value: 11.37175831832417\n - type: nauc_map_at_1_max\n value: -13.315221903223055\n - type: nauc_map_at_1_std\n value: -9.398199605510275\n - type: nauc_map_at_20_diff1\n value: 7.477364530860648\n - type: nauc_map_at_20_max\n value: -10.901251218105566\n - type: nauc_map_at_20_std\n value: -8.868148116405925\n - type: nauc_map_at_3_diff1\n value: 6.555548802174882\n - type: nauc_map_at_3_max\n value: -12.247274800542934\n - type: nauc_map_at_3_std\n value: -9.879475250984811\n - type: nauc_map_at_5_diff1\n value: 7.426588563355882\n - type: nauc_map_at_5_max\n value: -11.347695686001805\n - type: nauc_map_at_5_std\n value: -9.34441892203972\n - type: nauc_mrr_at_1000_diff1\n value: 5.99737552143614\n - type: nauc_mrr_at_1000_max\n value: -11.327205136505727\n - type: nauc_mrr_at_1000_std\n value: -8.791079115519503\n - type: nauc_mrr_at_100_diff1\n value: 6.004622525255784\n - type: nauc_mrr_at_100_max\n value: 
-11.320336759899723\n - type: nauc_mrr_at_100_std\n value: -8.780602249831777\n - type: nauc_mrr_at_10_diff1\n value: 5.783623516930227\n - type: nauc_mrr_at_10_max\n value: -11.095971693467078\n - type: nauc_mrr_at_10_std\n value: -8.877242032013582\n - type: nauc_mrr_at_1_diff1\n value: 9.694937537703797\n - type: nauc_mrr_at_1_max\n value: -12.531905083727912\n - type: nauc_mrr_at_1_std\n value: -8.903992940100146\n - type: nauc_mrr_at_20_diff1\n value: 5.984841206233873\n - type: nauc_mrr_at_20_max\n value: -11.195236951048969\n - type: nauc_mrr_at_20_std\n value: -8.757266039186018\n - type: nauc_mrr_at_3_diff1\n value: 5.114333824261379\n - type: nauc_mrr_at_3_max\n value: -12.64809799843464\n - type: nauc_mrr_at_3_std\n value: -9.791146138025184\n - type: nauc_mrr_at_5_diff1\n value: 5.88941606224512\n - type: nauc_mrr_at_5_max\n value: -11.763903418071918\n - type: nauc_mrr_at_5_std\n value: -9.279175712709446\n - type: nauc_ndcg_at_1000_diff1\n value: 7.076950652226086\n - type: nauc_ndcg_at_1000_max\n value: -10.386482092087371\n - type: nauc_ndcg_at_1000_std\n value: -8.309190917074046\n - type: nauc_ndcg_at_100_diff1\n value: 7.2329220284865245\n - type: nauc_ndcg_at_100_max\n value: -10.208048403220337\n - type: nauc_ndcg_at_100_std\n value: -7.997975874274613\n - type: nauc_ndcg_at_10_diff1\n value: 6.065391100006953\n - type: nauc_ndcg_at_10_max\n value: -9.046164377601153\n - type: nauc_ndcg_at_10_std\n value: -8.34724889697153\n - type: nauc_ndcg_at_1_diff1\n value: 11.37175831832417\n - type: nauc_ndcg_at_1_max\n value: -13.315221903223055\n - type: nauc_ndcg_at_1_std\n value: -9.398199605510275\n - type: nauc_ndcg_at_20_diff1\n value: 6.949389989202601\n - type: nauc_ndcg_at_20_max\n value: -9.35740451760307\n - type: nauc_ndcg_at_20_std\n value: -7.761295171828212\n - type: nauc_ndcg_at_3_diff1\n value: 5.051471796151364\n - type: nauc_ndcg_at_3_max\n value: -12.158763333711653\n - type: nauc_ndcg_at_3_std\n value: -10.078902544421926\n - type: 
nauc_ndcg_at_5_diff1\n value: 6.527454512611454\n - type: nauc_ndcg_at_5_max\n value: -10.525118233848586\n - type: nauc_ndcg_at_5_std\n value: -9.120055125584031\n - type: nauc_precision_at_1000_diff1\n value: -10.6495668199151\n - type: nauc_precision_at_1000_max\n value: 12.070656425217841\n - type: nauc_precision_at_1000_std\n value: 55.844551709649004\n - type: nauc_precision_at_100_diff1\n value: 19.206967129266285\n - type: nauc_precision_at_100_max\n value: 16.296851020813456\n - type: nauc_precision_at_100_std\n value: 45.60378984257811\n - type: nauc_precision_at_10_diff1\n value: 0.6490335354304879\n - type: nauc_precision_at_10_max\n value: 0.5757198255366447\n - type: nauc_precision_at_10_std\n value: -4.875847131691451\n - type: nauc_precision_at_1_diff1\n value: 11.37175831832417\n - type: nauc_precision_at_1_max\n value: -13.315221903223055\n - type: nauc_precision_at_1_std\n value: -9.398199605510275\n - type: nauc_precision_at_20_diff1\n value: 4.899369866929203\n - type: nauc_precision_at_20_max\n value: 5.988537297189552\n - type: nauc_precision_at_20_std\n value: 4.830900387582837\n - type: nauc_precision_at_3_diff1\n value: 0.8791156910997744\n - type: nauc_precision_at_3_max\n value: -11.983373635905993\n - type: nauc_precision_at_3_std\n value: -10.646185111581257\n - type: nauc_precision_at_5_diff1\n value: 3.9314486166548432\n - type: nauc_precision_at_5_max\n value: -7.798591396895839\n - type: nauc_precision_at_5_std\n value: -8.293043407234125\n - type: nauc_recall_at_1000_diff1\n value: -10.649566819918673\n - type: nauc_recall_at_1000_max\n value: 12.070656425214647\n - type: nauc_recall_at_1000_std\n value: 55.84455170965023\n - type: nauc_recall_at_100_diff1\n value: 19.206967129265127\n - type: nauc_recall_at_100_max\n value: 16.296851020813722\n - type: nauc_recall_at_100_std\n value: 45.60378984257728\n - type: nauc_recall_at_10_diff1\n value: 0.6490335354304176\n - type: nauc_recall_at_10_max\n value: 0.5757198255366095\n - 
type: nauc_recall_at_10_std\n value: -4.875847131691468\n - type: nauc_recall_at_1_diff1\n value: 11.37175831832417\n - type: nauc_recall_at_1_max\n value: -13.315221903223055\n - type: nauc_recall_at_1_std\n value: -9.398199605510275\n - type: nauc_recall_at_20_diff1\n value: 4.899369866929402\n - type: nauc_recall_at_20_max\n value: 5.98853729718968\n - type: nauc_recall_at_20_std\n value: 4.830900387582967\n - type: nauc_recall_at_3_diff1\n value: 0.8791156910997652\n - type: nauc_recall_at_3_max\n value: -11.983373635905997\n - type: nauc_recall_at_3_std\n value: -10.64618511158124\n - type: nauc_recall_at_5_diff1\n value: 3.9314486166548472\n - type: nauc_recall_at_5_max\n value: -7.7985913968958585\n - type: nauc_recall_at_5_std\n value: -8.293043407234132\n - type: ndcg_at_1\n value: 24.253\n - type: ndcg_at_10\n value: 50.117999999999995\n - type: ndcg_at_100\n value: 54.291999999999994\n - type: ndcg_at_1000\n value: 54.44799999999999\n - type: ndcg_at_20\n value: 52.771\n - type: ndcg_at_3\n value: 39.296\n - type: ndcg_at_5\n value: 44.373000000000005\n - type: precision_at_1\n value: 24.253\n - type: precision_at_10\n value: 8.016\n - type: precision_at_100\n value: 0.984\n - type: precision_at_1000\n value: 0.1\n - type: precision_at_20\n value: 4.527\n - type: precision_at_3\n value: 16.808999999999997\n - type: precision_at_5\n value: 12.546\n - type: recall_at_1\n value: 24.253\n - type: recall_at_10\n value: 80.156\n - type: recall_at_100\n value: 98.43499999999999\n - type: recall_at_1000\n value: 99.57300000000001\n - type: recall_at_20\n value: 90.54100000000001\n - type: recall_at_3\n value: 50.427\n - type: recall_at_5\n value: 62.731\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB DBPedia-PL (default)\n revision: 76afe41d9af165cc40999fcaa92312b8b012064a\n split: test\n type: clarin-knext/dbpedia-pl\n metrics:\n - type: main_score\n value: 34.827000000000005\n - type: map_at_1\n value: 7.049999999999999\n - type: 
map_at_10\n value: 14.982999999999999\n - type: map_at_100\n value: 20.816000000000003\n - type: map_at_1000\n value: 22.33\n - type: map_at_20\n value: 17.272000000000002\n - type: map_at_3\n value: 10.661\n - type: map_at_5\n value: 12.498\n - type: mrr_at_1\n value: 57.25\n - type: mrr_at_10\n value: 65.81934523809524\n - type: mrr_at_100\n value: 66.2564203928212\n - type: mrr_at_1000\n value: 66.27993662923856\n - type: mrr_at_20\n value: 66.0732139130649\n - type: mrr_at_3\n value: 64.08333333333333\n - type: mrr_at_5\n value: 65.27083333333333\n - type: nauc_map_at_1000_diff1\n value: 16.41780871174038\n - type: nauc_map_at_1000_max\n value: 30.193946325654654\n - type: nauc_map_at_1000_std\n value: 31.46095497039037\n - type: nauc_map_at_100_diff1\n value: 18.57903165498531\n - type: nauc_map_at_100_max\n value: 29.541476938623262\n - type: nauc_map_at_100_std\n value: 28.228604103301052\n - type: nauc_map_at_10_diff1\n value: 24.109434489748946\n - type: nauc_map_at_10_max\n value: 21.475954208048968\n - type: nauc_map_at_10_std\n value: 9.964464537806988\n - type: nauc_map_at_1_diff1\n value: 38.67437644802124\n - type: nauc_map_at_1_max\n value: 14.52136658726491\n - type: nauc_map_at_1_std\n value: -2.8981666782088755\n - type: nauc_map_at_20_diff1\n value: 21.42547228801935\n - type: nauc_map_at_20_max\n value: 25.04510402960458\n - type: nauc_map_at_20_std\n value: 16.533079346431155\n - type: nauc_map_at_3_diff1\n value: 26.63648858245477\n - type: nauc_map_at_3_max\n value: 13.632235789780415\n - type: nauc_map_at_3_std\n value: -0.40129174577700716\n - type: nauc_map_at_5_diff1\n value: 24.513861031197933\n - type: nauc_map_at_5_max\n value: 16.599888813946688\n - type: nauc_map_at_5_std\n value: 3.4448514739556346\n - type: nauc_mrr_at_1000_diff1\n value: 36.57353464537154\n - type: nauc_mrr_at_1000_max\n value: 55.34763483979515\n - type: nauc_mrr_at_1000_std\n value: 40.3722796438533\n - type: nauc_mrr_at_100_diff1\n value: 36.555989566513134\n 
- type: nauc_mrr_at_100_max\n value: 55.347805216808396\n - type: nauc_mrr_at_100_std\n value: 40.38465945075711\n - type: nauc_mrr_at_10_diff1\n value: 36.771572999261984\n - type: nauc_mrr_at_10_max\n value: 55.41239897909165\n - type: nauc_mrr_at_10_std\n value: 40.52058934624793\n - type: nauc_mrr_at_1_diff1\n value: 38.2472828531032\n - type: nauc_mrr_at_1_max\n value: 51.528473828685705\n - type: nauc_mrr_at_1_std\n value: 33.03676467942882\n - type: nauc_mrr_at_20_diff1\n value: 36.642602571889036\n - type: nauc_mrr_at_20_max\n value: 55.3763342076553\n - type: nauc_mrr_at_20_std\n value: 40.41520090500838\n - type: nauc_mrr_at_3_diff1\n value: 36.79451847426628\n - type: nauc_mrr_at_3_max\n value: 54.59778581826193\n - type: nauc_mrr_at_3_std\n value: 39.48392075873095\n - type: nauc_mrr_at_5_diff1\n value: 36.92150807529304\n - type: nauc_mrr_at_5_max\n value: 55.03553978718272\n - type: nauc_mrr_at_5_std\n value: 40.20147745489917\n - type: nauc_ndcg_at_1000_diff1\n value: 21.843092744321268\n - type: nauc_ndcg_at_1000_max\n value: 44.93275990394279\n - type: nauc_ndcg_at_1000_std\n value: 47.09186225236347\n - type: nauc_ndcg_at_100_diff1\n value: 25.180282568979095\n - type: nauc_ndcg_at_100_max\n value: 41.737709709508394\n - type: nauc_ndcg_at_100_std\n value: 38.80950644139446\n - type: nauc_ndcg_at_10_diff1\n value: 24.108368037214046\n - type: nauc_ndcg_at_10_max\n value: 41.29298370689967\n - type: nauc_ndcg_at_10_std\n value: 35.06450769738732\n - type: nauc_ndcg_at_1_diff1\n value: 35.51010679525079\n - type: nauc_ndcg_at_1_max\n value: 42.40790024212412\n - type: nauc_ndcg_at_1_std\n value: 26.696412036243157\n - type: nauc_ndcg_at_20_diff1\n value: 23.909989673256195\n - type: nauc_ndcg_at_20_max\n value: 39.78444647091927\n - type: nauc_ndcg_at_20_std\n value: 33.39544470364529\n - type: nauc_ndcg_at_3_diff1\n value: 22.50484297956035\n - type: nauc_ndcg_at_3_max\n value: 39.14551926034168\n - type: nauc_ndcg_at_3_std\n value: 
30.330135925392014\n - type: nauc_ndcg_at_5_diff1\n value: 21.7798872028265\n - type: nauc_ndcg_at_5_max\n value: 40.23856975248015\n - type: nauc_ndcg_at_5_std\n value: 32.438381067440396\n - type: nauc_precision_at_1000_diff1\n value: -21.62692442272279\n - type: nauc_precision_at_1000_max\n value: 0.9689046974430882\n - type: nauc_precision_at_1000_std\n value: 18.54001058230465\n - type: nauc_precision_at_100_diff1\n value: -10.132258779856192\n - type: nauc_precision_at_100_max\n value: 23.74516110444681\n - type: nauc_precision_at_100_std\n value: 47.03416663319965\n - type: nauc_precision_at_10_diff1\n value: 1.543656509571949\n - type: nauc_precision_at_10_max\n value: 36.98864812757555\n - type: nauc_precision_at_10_std\n value: 46.56427199077426\n - type: nauc_precision_at_1_diff1\n value: 38.2472828531032\n - type: nauc_precision_at_1_max\n value: 51.528473828685705\n - type: nauc_precision_at_1_std\n value: 33.03676467942882\n - type: nauc_precision_at_20_diff1\n value: -4.612864872734335\n - type: nauc_precision_at_20_max\n value: 34.03565449182125\n - type: nauc_precision_at_20_std\n value: 48.880727648349534\n - type: nauc_precision_at_3_diff1\n value: 6.360850444467829\n - type: nauc_precision_at_3_max\n value: 36.25816942368427\n - type: nauc_precision_at_3_std\n value: 34.48882647419187\n - type: nauc_precision_at_5_diff1\n value: 2.6445596936740037\n - type: nauc_precision_at_5_max\n value: 37.174463388899056\n - type: nauc_precision_at_5_std\n value: 40.25254370626113\n - type: nauc_recall_at_1000_diff1\n value: 13.041227176748077\n - type: nauc_recall_at_1000_max\n value: 39.722336427072094\n - type: nauc_recall_at_1000_std\n value: 52.04032890059214\n - type: nauc_recall_at_100_diff1\n value: 18.286096899139153\n - type: nauc_recall_at_100_max\n value: 34.072389201930314\n - type: nauc_recall_at_100_std\n value: 37.73637623416653\n - type: nauc_recall_at_10_diff1\n value: 22.35560419280504\n - type: nauc_recall_at_10_max\n value: 
19.727247199595197\n - type: nauc_recall_at_10_std\n value: 8.58498575109203\n - type: nauc_recall_at_1_diff1\n value: 38.67437644802124\n - type: nauc_recall_at_1_max\n value: 14.52136658726491\n - type: nauc_recall_at_1_std\n value: -2.8981666782088755\n - type: nauc_recall_at_20_diff1\n value: 19.026320886902916\n - type: nauc_recall_at_20_max\n value: 22.753562309469867\n - type: nauc_recall_at_20_std\n value: 14.89994263882445\n - type: nauc_recall_at_3_diff1\n value: 23.428129702129684\n - type: nauc_recall_at_3_max\n value: 10.549153954790542\n - type: nauc_recall_at_3_std\n value: -1.7590608997055206\n - type: nauc_recall_at_5_diff1\n value: 21.27448645803921\n - type: nauc_recall_at_5_max\n value: 13.620279707461677\n - type: nauc_recall_at_5_std\n value: 2.0577962208292675\n - type: ndcg_at_1\n value: 46.75\n - type: ndcg_at_10\n value: 34.827000000000005\n - type: ndcg_at_100\n value: 38.157999999999994\n - type: ndcg_at_1000\n value: 44.816\n - type: ndcg_at_20\n value: 34.152\n - type: ndcg_at_3\n value: 39.009\n - type: ndcg_at_5\n value: 36.826\n - type: precision_at_1\n value: 57.25\n - type: precision_at_10\n value: 27.575\n - type: precision_at_100\n value: 8.84\n - type: precision_at_1000\n value: 1.949\n - type: precision_at_20\n value: 20.724999999999998\n - type: precision_at_3\n value: 41.167\n - type: precision_at_5\n value: 35.199999999999996\n - type: recall_at_1\n value: 7.049999999999999\n - type: recall_at_10\n value: 19.817999999999998\n - type: recall_at_100\n value: 42.559999999999995\n - type: recall_at_1000\n value: 63.744\n - type: recall_at_20\n value: 25.968000000000004\n - type: recall_at_3\n value: 11.959\n - type: recall_at_5\n value: 14.939\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB FiQA-PL (default)\n revision: 2e535829717f8bf9dc829b7f911cc5bbd4e6608e\n split: test\n type: clarin-knext/fiqa-pl\n metrics:\n - type: main_score\n value: 38.828\n - type: map_at_1\n value: 19.126\n - type: map_at_10\n 
value: 31.002000000000002\n - type: map_at_100\n value: 32.736\n - type: map_at_1000\n value: 32.933\n - type: map_at_20\n value: 31.894\n - type: map_at_3\n value: 26.583000000000002\n - type: map_at_5\n value: 28.904000000000003\n - type: mrr_at_1\n value: 37.808641975308646\n - type: mrr_at_10\n value: 46.36745541838134\n - type: mrr_at_100\n value: 47.14140915794908\n - type: mrr_at_1000\n value: 47.190701435388846\n - type: mrr_at_20\n value: 46.81387776440309\n - type: mrr_at_3\n value: 43.750000000000014\n - type: mrr_at_5\n value: 45.23919753086418\n - type: nauc_map_at_1000_diff1\n value: 38.5532285881503\n - type: nauc_map_at_1000_max\n value: 34.44383884813453\n - type: nauc_map_at_1000_std\n value: -1.3963497949476722\n - type: nauc_map_at_100_diff1\n value: 38.49292464176943\n - type: nauc_map_at_100_max\n value: 34.33752755618645\n - type: nauc_map_at_100_std\n value: -1.4794032905848582\n - type: nauc_map_at_10_diff1\n value: 38.26061536370962\n - type: nauc_map_at_10_max\n value: 33.16977912721411\n - type: nauc_map_at_10_std\n value: -2.3853370604730393\n - type: nauc_map_at_1_diff1\n value: 46.288767289528344\n - type: nauc_map_at_1_max\n value: 25.67706785013364\n - type: nauc_map_at_1_std\n value: -6.989769609924645\n - type: nauc_map_at_20_diff1\n value: 38.507270129330685\n - type: nauc_map_at_20_max\n value: 33.70963328055982\n - type: nauc_map_at_20_std\n value: -1.9835510011554272\n - type: nauc_map_at_3_diff1\n value: 39.81061518646884\n - type: nauc_map_at_3_max\n value: 30.101186374147748\n - type: nauc_map_at_3_std\n value: -4.027120247237715\n - type: nauc_map_at_5_diff1\n value: 38.55602589746512\n - type: nauc_map_at_5_max\n value: 31.515174267015983\n - type: nauc_map_at_5_std\n value: -3.4064239358570303\n - type: nauc_mrr_at_1000_diff1\n value: 45.030514454725726\n - type: nauc_mrr_at_1000_max\n value: 43.878919881666164\n - type: nauc_mrr_at_1000_std\n value: 2.517594250297626\n - type: nauc_mrr_at_100_diff1\n value: 
45.00868212878687\n - type: nauc_mrr_at_100_max\n value: 43.87437011120001\n - type: nauc_mrr_at_100_std\n value: 2.5257874265014966\n - type: nauc_mrr_at_10_diff1\n value: 44.855044606754056\n - type: nauc_mrr_at_10_max\n value: 43.946617058785186\n - type: nauc_mrr_at_10_std\n value: 2.5173751662794044\n - type: nauc_mrr_at_1_diff1\n value: 49.441510997817346\n - type: nauc_mrr_at_1_max\n value: 43.08547383044357\n - type: nauc_mrr_at_1_std\n value: -1.8747770703324347\n - type: nauc_mrr_at_20_diff1\n value: 45.019880416584215\n - type: nauc_mrr_at_20_max\n value: 43.85691473662242\n - type: nauc_mrr_at_20_std\n value: 2.4625487605091303\n - type: nauc_mrr_at_3_diff1\n value: 45.322041658604036\n - type: nauc_mrr_at_3_max\n value: 43.95079293074395\n - type: nauc_mrr_at_3_std\n value: 2.4644274393435737\n - type: nauc_mrr_at_5_diff1\n value: 44.99461837803437\n - type: nauc_mrr_at_5_max\n value: 43.97934275090601\n - type: nauc_mrr_at_5_std\n value: 2.5353091695125096\n - type: nauc_ndcg_at_1000_diff1\n value: 39.38449023275524\n - type: nauc_ndcg_at_1000_max\n value: 39.48382767312788\n - type: nauc_ndcg_at_1000_std\n value: 3.414789408343409\n - type: nauc_ndcg_at_100_diff1\n value: 38.29675861135578\n - type: nauc_ndcg_at_100_max\n value: 38.2674786507297\n - type: nauc_ndcg_at_100_std\n value: 2.7094055381218207\n - type: nauc_ndcg_at_10_diff1\n value: 38.09514955708717\n - type: nauc_ndcg_at_10_max\n value: 36.664923238906525\n - type: nauc_ndcg_at_10_std\n value: 0.6901410544967921\n - type: nauc_ndcg_at_1_diff1\n value: 49.441510997817346\n - type: nauc_ndcg_at_1_max\n value: 43.08547383044357\n - type: nauc_ndcg_at_1_std\n value: -1.8747770703324347\n - type: nauc_ndcg_at_20_diff1\n value: 38.44967736231759\n - type: nauc_ndcg_at_20_max\n value: 36.871179313622584\n - type: nauc_ndcg_at_20_std\n value: 1.157560360065234\n - type: nauc_ndcg_at_3_diff1\n value: 39.02419271805571\n - type: nauc_ndcg_at_3_max\n value: 37.447669442586324\n - type: 
nauc_ndcg_at_3_std\n value: 0.41502589779297794\n - type: nauc_ndcg_at_5_diff1\n value: 38.10233452742001\n - type: nauc_ndcg_at_5_max\n value: 35.816381905465676\n - type: nauc_ndcg_at_5_std\n value: -0.3704499913387088\n - type: nauc_precision_at_1000_diff1\n value: 2.451267097838658\n - type: nauc_precision_at_1000_max\n value: 29.116394969085306\n - type: nauc_precision_at_1000_std\n value: 14.85900786538363\n - type: nauc_precision_at_100_diff1\n value: 8.10919082251277\n - type: nauc_precision_at_100_max\n value: 36.28388256191417\n - type: nauc_precision_at_100_std\n value: 14.830039904317657\n - type: nauc_precision_at_10_diff1\n value: 15.02446609920477\n - type: nauc_precision_at_10_max\n value: 41.008463775454054\n - type: nauc_precision_at_10_std\n value: 10.431403152334486\n - type: nauc_precision_at_1_diff1\n value: 49.441510997817346\n - type: nauc_precision_at_1_max\n value: 43.08547383044357\n - type: nauc_precision_at_1_std\n value: -1.8747770703324347\n - type: nauc_precision_at_20_diff1\n value: 14.222022201169926\n - type: nauc_precision_at_20_max\n value: 40.10189643835305\n - type: nauc_precision_at_20_std\n value: 12.204443815975527\n - type: nauc_precision_at_3_diff1\n value: 25.41905395341234\n - type: nauc_precision_at_3_max\n value: 41.56133905339819\n - type: nauc_precision_at_3_std\n value: 5.575516915590082\n - type: nauc_precision_at_5_diff1\n value: 20.20081221089351\n - type: nauc_precision_at_5_max\n value: 40.95218555916681\n - type: nauc_precision_at_5_std\n value: 7.2040745500708745\n - type: nauc_recall_at_1000_diff1\n value: 28.021198234033395\n - type: nauc_recall_at_1000_max\n value: 36.165148684597504\n - type: nauc_recall_at_1000_std\n value: 28.28852356008973\n - type: nauc_recall_at_100_diff1\n value: 21.882447802741897\n - type: nauc_recall_at_100_max\n value: 26.979684607567222\n - type: nauc_recall_at_100_std\n value: 9.783658817010082\n - type: nauc_recall_at_10_diff1\n value: 28.493097951178818\n - type: 
nauc_recall_at_10_max\n value: 29.40937476550134\n - type: nauc_recall_at_10_std\n value: 2.7593763576979353\n - type: nauc_recall_at_1_diff1\n value: 46.288767289528344\n - type: nauc_recall_at_1_max\n value: 25.67706785013364\n - type: nauc_recall_at_1_std\n value: -6.989769609924645\n - type: nauc_recall_at_20_diff1\n value: 27.638381299425234\n - type: nauc_recall_at_20_max\n value: 27.942035836106328\n - type: nauc_recall_at_20_std\n value: 3.489835161380808\n - type: nauc_recall_at_3_diff1\n value: 33.90054781392646\n - type: nauc_recall_at_3_max\n value: 27.778812533030322\n - type: nauc_recall_at_3_std\n value: -0.03054068020022706\n - type: nauc_recall_at_5_diff1\n value: 30.279060732221346\n - type: nauc_recall_at_5_max\n value: 27.49854749597931\n - type: nauc_recall_at_5_std\n value: 0.5434664581939099\n - type: ndcg_at_1\n value: 37.809\n - type: ndcg_at_10\n value: 38.828\n - type: ndcg_at_100\n value: 45.218\n - type: ndcg_at_1000\n value: 48.510999999999996\n - type: ndcg_at_20\n value: 41.11\n - type: ndcg_at_3\n value: 34.466\n - type: ndcg_at_5\n value: 35.843\n - type: precision_at_1\n value: 37.809\n - type: precision_at_10\n value: 11.157\n - type: precision_at_100\n value: 1.762\n - type: precision_at_1000\n value: 0.233\n - type: precision_at_20\n value: 6.497\n - type: precision_at_3\n value: 23.044999999999998\n - type: precision_at_5\n value: 17.284\n - type: recall_at_1\n value: 19.126\n - type: recall_at_10\n value: 46.062\n - type: recall_at_100\n value: 70.22800000000001\n - type: recall_at_1000\n value: 89.803\n - type: recall_at_20\n value: 53.217999999999996\n - type: recall_at_3\n value: 30.847\n - type: recall_at_5\n value: 37.11\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB HotpotQA-PL (default)\n revision: a0bd479ac97b4ccb5bd6ce320c415d0bb4beb907\n split: test\n type: clarin-knext/hotpotqa-pl\n metrics:\n - type: main_score\n value: 60.27\n - type: map_at_1\n value: 35.199000000000005\n - type: 
map_at_10\n value: 51.369\n - type: map_at_100\n value: 52.212\n - type: map_at_1000\n value: 52.28\n - type: map_at_20\n value: 51.864\n - type: map_at_3\n value: 48.446\n - type: map_at_5\n value: 50.302\n - type: mrr_at_1\n value: 70.39837947332883\n - type: mrr_at_10\n value: 76.8346141067273\n - type: mrr_at_100\n value: 77.10724392048137\n - type: mrr_at_1000\n value: 77.12037412892865\n - type: mrr_at_20\n value: 77.01061532947222\n - type: mrr_at_3\n value: 75.5908170155299\n - type: mrr_at_5\n value: 76.39095205941899\n - type: nauc_map_at_1000_diff1\n value: 24.701387884989117\n - type: nauc_map_at_1000_max\n value: 23.25553235642178\n - type: nauc_map_at_1000_std\n value: 7.1803506915661774\n - type: nauc_map_at_100_diff1\n value: 24.674498622483103\n - type: nauc_map_at_100_max\n value: 23.234948525052175\n - type: nauc_map_at_100_std\n value: 7.168677997105447\n - type: nauc_map_at_10_diff1\n value: 24.676025039755626\n - type: nauc_map_at_10_max\n value: 23.171971872726964\n - type: nauc_map_at_10_std\n value: 6.485610909852058\n - type: nauc_map_at_1_diff1\n value: 68.90178464319715\n - type: nauc_map_at_1_max\n value: 46.05537868917558\n - type: nauc_map_at_1_std\n value: 1.7658552480698708\n - type: nauc_map_at_20_diff1\n value: 24.69297151842494\n - type: nauc_map_at_20_max\n value: 23.213064691673637\n - type: nauc_map_at_20_std\n value: 6.9357946556849\n - type: nauc_map_at_3_diff1\n value: 26.279128947950507\n - type: nauc_map_at_3_max\n value: 23.929537354117922\n - type: nauc_map_at_3_std\n value: 4.625061565714759\n - type: nauc_map_at_5_diff1\n value: 25.04448959482816\n - type: nauc_map_at_5_max\n value: 23.432012857899338\n - type: nauc_map_at_5_std\n value: 5.845744681998008\n - type: nauc_mrr_at_1000_diff1\n value: 66.7503918108276\n - type: nauc_mrr_at_1000_max\n value: 48.42897342336844\n - type: nauc_mrr_at_1000_std\n value: 5.3097517971144415\n - type: nauc_mrr_at_100_diff1\n value: 66.74645215862695\n - type: nauc_mrr_at_100_max\n 
value: 48.4368663009989\n - type: nauc_mrr_at_100_std\n value: 5.322297898555188\n - type: nauc_mrr_at_10_diff1\n value: 66.69310166180729\n - type: nauc_mrr_at_10_max\n value: 48.475437698330225\n - type: nauc_mrr_at_10_std\n value: 5.258183461631702\n - type: nauc_mrr_at_1_diff1\n value: 68.90178464319715\n - type: nauc_mrr_at_1_max\n value: 46.05537868917558\n - type: nauc_mrr_at_1_std\n value: 1.7658552480698708\n - type: nauc_mrr_at_20_diff1\n value: 66.72000262431975\n - type: nauc_mrr_at_20_max\n value: 48.45593642981319\n - type: nauc_mrr_at_20_std\n value: 5.353665929072101\n - type: nauc_mrr_at_3_diff1\n value: 66.84936676396276\n - type: nauc_mrr_at_3_max\n value: 48.466611276778295\n - type: nauc_mrr_at_3_std\n value: 4.485810398557475\n - type: nauc_mrr_at_5_diff1\n value: 66.62362565394174\n - type: nauc_mrr_at_5_max\n value: 48.456431835482014\n - type: nauc_mrr_at_5_std\n value: 5.08482458391903\n - type: nauc_ndcg_at_1000_diff1\n value: 29.984825173719443\n - type: nauc_ndcg_at_1000_max\n value: 27.289179238639893\n - type: nauc_ndcg_at_1000_std\n value: 10.661480455527526\n - type: nauc_ndcg_at_100_diff1\n value: 29.322074257047877\n - type: nauc_ndcg_at_100_max\n value: 26.850650276220605\n - type: nauc_ndcg_at_100_std\n value: 10.599247982501902\n - type: nauc_ndcg_at_10_diff1\n value: 29.659909113886094\n - type: nauc_ndcg_at_10_max\n value: 26.836139599331005\n - type: nauc_ndcg_at_10_std\n value: 8.12844399452719\n - type: nauc_ndcg_at_1_diff1\n value: 68.90178464319715\n - type: nauc_ndcg_at_1_max\n value: 46.05537868917558\n - type: nauc_ndcg_at_1_std\n value: 1.7658552480698708\n - type: nauc_ndcg_at_20_diff1\n value: 29.510802214854294\n - type: nauc_ndcg_at_20_max\n value: 26.775562637730722\n - type: nauc_ndcg_at_20_std\n value: 9.341342661702363\n - type: nauc_ndcg_at_3_diff1\n value: 32.741885846292966\n - type: nauc_ndcg_at_3_max\n value: 28.44225108761343\n - type: nauc_ndcg_at_3_std\n value: 5.204440768465042\n - type: 
nauc_ndcg_at_5_diff1\n value: 30.57856348635919\n - type: nauc_ndcg_at_5_max\n value: 27.475007474301698\n - type: nauc_ndcg_at_5_std\n value: 6.961546044312487\n - type: nauc_precision_at_1000_diff1\n value: 0.002113156309413332\n - type: nauc_precision_at_1000_max\n value: 11.198242419541286\n - type: nauc_precision_at_1000_std\n value: 28.69676419166541\n - type: nauc_precision_at_100_diff1\n value: 3.6049575557782627\n - type: nauc_precision_at_100_max\n value: 12.499173524574791\n - type: nauc_precision_at_100_std\n value: 23.3755281004721\n - type: nauc_precision_at_10_diff1\n value: 10.922574784853193\n - type: nauc_precision_at_10_max\n value: 16.23221529562036\n - type: nauc_precision_at_10_std\n value: 12.45014808813857\n - type: nauc_precision_at_1_diff1\n value: 68.90178464319715\n - type: nauc_precision_at_1_max\n value: 46.05537868917558\n - type: nauc_precision_at_1_std\n value: 1.7658552480698708\n - type: nauc_precision_at_20_diff1\n value: 8.840710781302827\n - type: nauc_precision_at_20_max\n value: 14.804644554205524\n - type: nauc_precision_at_20_std\n value: 16.245009770815237\n - type: nauc_precision_at_3_diff1\n value: 19.447291487137573\n - type: nauc_precision_at_3_max\n value: 21.47123471597057\n - type: nauc_precision_at_3_std\n value: 6.441862800128802\n - type: nauc_precision_at_5_diff1\n value: 14.078545719721108\n - type: nauc_precision_at_5_max\n value: 18.468288046016387\n - type: nauc_precision_at_5_std\n value: 9.58650641691393\n - type: nauc_recall_at_1000_diff1\n value: 0.0021131563095336584\n - type: nauc_recall_at_1000_max\n value: 11.198242419541558\n - type: nauc_recall_at_1000_std\n value: 28.6967641916655\n - type: nauc_recall_at_100_diff1\n value: 3.6049575557781393\n - type: nauc_recall_at_100_max\n value: 12.499173524574765\n - type: nauc_recall_at_100_std\n value: 23.375528100472074\n - type: nauc_recall_at_10_diff1\n value: 10.922574784853168\n - type: nauc_recall_at_10_max\n value: 16.2322152956203\n - type: 
nauc_recall_at_10_std\n value: 12.450148088138535\n - type: nauc_recall_at_1_diff1\n value: 68.90178464319715\n - type: nauc_recall_at_1_max\n value: 46.05537868917558\n - type: nauc_recall_at_1_std\n value: 1.7658552480698708\n - type: nauc_recall_at_20_diff1\n value: 8.840710781302905\n - type: nauc_recall_at_20_max\n value: 14.804644554205515\n - type: nauc_recall_at_20_std\n value: 16.245009770815273\n - type: nauc_recall_at_3_diff1\n value: 19.447291487137498\n - type: nauc_recall_at_3_max\n value: 21.47123471597054\n - type: nauc_recall_at_3_std\n value: 6.441862800128763\n - type: nauc_recall_at_5_diff1\n value: 14.07854571972115\n - type: nauc_recall_at_5_max\n value: 18.468288046016337\n - type: nauc_recall_at_5_std\n value: 9.586506416913904\n - type: ndcg_at_1\n value: 70.39800000000001\n - type: ndcg_at_10\n value: 60.27\n - type: ndcg_at_100\n value: 63.400999999999996\n - type: ndcg_at_1000\n value: 64.847\n - type: ndcg_at_20\n value: 61.571\n - type: ndcg_at_3\n value: 55.875\n - type: ndcg_at_5\n value: 58.36599999999999\n - type: precision_at_1\n value: 70.39800000000001\n - type: precision_at_10\n value: 12.46\n - type: precision_at_100\n value: 1.493\n - type: precision_at_1000\n value: 0.169\n - type: precision_at_20\n value: 6.65\n - type: precision_at_3\n value: 35.062\n - type: precision_at_5\n value: 23.009\n - type: recall_at_1\n value: 35.199000000000005\n - type: recall_at_10\n value: 62.302\n - type: recall_at_100\n value: 74.666\n - type: recall_at_1000\n value: 84.355\n - type: recall_at_20\n value: 66.496\n - type: recall_at_3\n value: 52.593\n - type: recall_at_5\n value: 57.522\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB MSMARCO-PL (default)\n revision: 8634c07806d5cce3a6138e260e59b81760a0a640\n split: test\n type: clarin-knext/msmarco-pl\n metrics:\n - type: main_score\n value: 64.886\n - type: map_at_1\n value: 1.644\n - type: map_at_10\n value: 12.24\n - type: map_at_100\n value: 28.248\n - type: 
map_at_1000\n value: 33.506\n - type: map_at_20\n value: 17.497\n - type: map_at_3\n value: 4.9399999999999995\n - type: map_at_5\n value: 8.272\n - type: mrr_at_1\n value: 83.72093023255815\n - type: mrr_at_10\n value: 91.08527131782945\n - type: mrr_at_100\n value: 91.08527131782945\n - type: mrr_at_1000\n value: 91.08527131782945\n - type: mrr_at_20\n value: 91.08527131782945\n - type: mrr_at_3\n value: 91.08527131782945\n - type: mrr_at_5\n value: 91.08527131782945\n - type: nauc_map_at_1000_diff1\n value: -36.428271627303424\n - type: nauc_map_at_1000_max\n value: 44.87615127218638\n - type: nauc_map_at_1000_std\n value: 67.92696808824724\n - type: nauc_map_at_100_diff1\n value: -28.11674206786188\n - type: nauc_map_at_100_max\n value: 36.422779766334955\n - type: nauc_map_at_100_std\n value: 49.99876313755116\n - type: nauc_map_at_10_diff1\n value: -5.838593619806058\n - type: nauc_map_at_10_max\n value: 11.026519190509742\n - type: nauc_map_at_10_std\n value: 2.5268752263522045\n - type: nauc_map_at_1_diff1\n value: 17.897907271073016\n - type: nauc_map_at_1_max\n value: 12.229062762540844\n - type: nauc_map_at_1_std\n value: -4.088830895573149\n - type: nauc_map_at_20_diff1\n value: -13.871097716255626\n - type: nauc_map_at_20_max\n value: 19.291271635609533\n - type: nauc_map_at_20_std\n value: 16.745335606507826\n - type: nauc_map_at_3_diff1\n value: 4.425238457033843\n - type: nauc_map_at_3_max\n value: 4.611864744680824\n - type: nauc_map_at_3_std\n value: -8.986916608582863\n - type: nauc_map_at_5_diff1\n value: -6.254849256920095\n - type: nauc_map_at_5_max\n value: 2.729437079919823\n - type: nauc_map_at_5_std\n value: -7.235906279913092\n - type: nauc_mrr_at_1000_diff1\n value: 52.18669104947672\n - type: nauc_mrr_at_1000_max\n value: 68.26259125411818\n - type: nauc_mrr_at_1000_std\n value: 56.345086428353575\n - type: nauc_mrr_at_100_diff1\n value: 52.18669104947672\n - type: nauc_mrr_at_100_max\n value: 68.26259125411818\n - type: 
nauc_mrr_at_100_std\n value: 56.345086428353575\n - type: nauc_mrr_at_10_diff1\n value: 52.18669104947672\n - type: nauc_mrr_at_10_max\n value: 68.26259125411818\n - type: nauc_mrr_at_10_std\n value: 56.345086428353575\n - type: nauc_mrr_at_1_diff1\n value: 56.55126663944154\n - type: nauc_mrr_at_1_max\n value: 66.37014285522565\n - type: nauc_mrr_at_1_std\n value: 53.2508271389779\n - type: nauc_mrr_at_20_diff1\n value: 52.18669104947672\n - type: nauc_mrr_at_20_max\n value: 68.26259125411818\n - type: nauc_mrr_at_20_std\n value: 56.345086428353575\n - type: nauc_mrr_at_3_diff1\n value: 52.18669104947672\n - type: nauc_mrr_at_3_max\n value: 68.26259125411818\n - type: nauc_mrr_at_3_std\n value: 56.345086428353575\n - type: nauc_mrr_at_5_diff1\n value: 52.18669104947672\n - type: nauc_mrr_at_5_max\n value: 68.26259125411818\n - type: nauc_mrr_at_5_std\n value: 56.345086428353575\n - type: nauc_ndcg_at_1000_diff1\n value: -19.06422926483731\n - type: nauc_ndcg_at_1000_max\n value: 56.30853514590265\n - type: nauc_ndcg_at_1000_std\n value: 70.30810947505557\n - type: nauc_ndcg_at_100_diff1\n value: -25.72587586459692\n - type: nauc_ndcg_at_100_max\n value: 51.433781241604194\n - type: nauc_ndcg_at_100_std\n value: 68.37678512652792\n - type: nauc_ndcg_at_10_diff1\n value: -23.21198108212602\n - type: nauc_ndcg_at_10_max\n value: 43.5450720846516\n - type: nauc_ndcg_at_10_std\n value: 48.78307907005605\n - type: nauc_ndcg_at_1_diff1\n value: 44.00179301267447\n - type: nauc_ndcg_at_1_max\n value: 48.202370455680395\n - type: nauc_ndcg_at_1_std\n value: 25.69655992704088\n - type: nauc_ndcg_at_20_diff1\n value: -33.88168753446507\n - type: nauc_ndcg_at_20_max\n value: 45.16199742613164\n - type: nauc_ndcg_at_20_std\n value: 61.87098383164902\n - type: nauc_ndcg_at_3_diff1\n value: 11.19174449544048\n - type: nauc_ndcg_at_3_max\n value: 44.34069860560555\n - type: nauc_ndcg_at_3_std\n value: 27.451258369798115\n - type: nauc_ndcg_at_5_diff1\n value: -7.186520929432436\n 
- type: nauc_ndcg_at_5_max\n value: 43.41869981139378\n - type: nauc_ndcg_at_5_std\n value: 34.89898115995178\n - type: nauc_precision_at_1000_diff1\n value: -34.43998154563451\n - type: nauc_precision_at_1000_max\n value: 29.172655907480372\n - type: nauc_precision_at_1000_std\n value: 65.15824469614837\n - type: nauc_precision_at_100_diff1\n value: -37.82409643259692\n - type: nauc_precision_at_100_max\n value: 38.24986991317909\n - type: nauc_precision_at_100_std\n value: 72.74768183105327\n - type: nauc_precision_at_10_diff1\n value: -32.21556182780535\n - type: nauc_precision_at_10_max\n value: 34.27170432382651\n - type: nauc_precision_at_10_std\n value: 58.358255004394664\n - type: nauc_precision_at_1_diff1\n value: 56.55126663944154\n - type: nauc_precision_at_1_max\n value: 66.37014285522565\n - type: nauc_precision_at_1_std\n value: 53.2508271389779\n - type: nauc_precision_at_20_diff1\n value: -40.18751579026395\n - type: nauc_precision_at_20_max\n value: 33.960783153758896\n - type: nauc_precision_at_20_std\n value: 65.42918390184195\n - type: nauc_precision_at_3_diff1\n value: -7.073870209006578\n - type: nauc_precision_at_3_max\n value: 50.81535269862325\n - type: nauc_precision_at_3_std\n value: 59.248681565955685\n - type: nauc_precision_at_5_diff1\n value: -31.136580596983876\n - type: nauc_precision_at_5_max\n value: 45.88147792380426\n - type: nauc_precision_at_5_std\n value: 67.46814230928243\n - type: nauc_recall_at_1000_diff1\n value: -23.15699999594577\n - type: nauc_recall_at_1000_max\n value: 39.77277799761876\n - type: nauc_recall_at_1000_std\n value: 60.326168012901114\n - type: nauc_recall_at_100_diff1\n value: -21.636664823598498\n - type: nauc_recall_at_100_max\n value: 31.104969346131583\n - type: nauc_recall_at_100_std\n value: 38.811686891592096\n - type: nauc_recall_at_10_diff1\n value: -10.542765625053569\n - type: nauc_recall_at_10_max\n value: 2.043876058107446\n - type: nauc_recall_at_10_std\n value: -5.578449908984766\n - 
type: nauc_recall_at_1_diff1\n value: 17.897907271073016\n - type: nauc_recall_at_1_max\n value: 12.229062762540844\n - type: nauc_recall_at_1_std\n value: -4.088830895573149\n - type: nauc_recall_at_20_diff1\n value: -15.132909355710103\n - type: nauc_recall_at_20_max\n value: 12.659765287241065\n - type: nauc_recall_at_20_std\n value: 8.277887800815819\n - type: nauc_recall_at_3_diff1\n value: -3.1975017812715016\n - type: nauc_recall_at_3_max\n value: -3.5539857085038538\n - type: nauc_recall_at_3_std\n value: -14.712102851318118\n - type: nauc_recall_at_5_diff1\n value: -14.040507717380743\n - type: nauc_recall_at_5_max\n value: -6.126912150131701\n - type: nauc_recall_at_5_std\n value: -13.821624015640355\n - type: ndcg_at_1\n value: 71.318\n - type: ndcg_at_10\n value: 64.886\n - type: ndcg_at_100\n value: 53.187\n - type: ndcg_at_1000\n value: 59.897999999999996\n - type: ndcg_at_20\n value: 58.96\n - type: ndcg_at_3\n value: 69.736\n - type: ndcg_at_5\n value: 70.14099999999999\n - type: precision_at_1\n value: 83.721\n - type: precision_at_10\n value: 71.163\n - type: precision_at_100\n value: 29.465000000000003\n - type: precision_at_1000\n value: 5.665\n - type: precision_at_20\n value: 57.791000000000004\n - type: precision_at_3\n value: 82.171\n - type: precision_at_5\n value: 81.86\n - type: recall_at_1\n value: 1.644\n - type: recall_at_10\n value: 14.238000000000001\n - type: recall_at_100\n value: 39.831\n - type: recall_at_1000\n value: 64.057\n - type: recall_at_20\n value: 21.021\n - type: recall_at_3\n value: 5.53\n - type: recall_at_5\n value: 9.623\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB NFCorpus-PL (default)\n revision: 9a6f9567fda928260afed2de480d79c98bf0bec0\n split: test\n type: clarin-knext/nfcorpus-pl\n metrics:\n - type: main_score\n value: 31.391000000000002\n - type: map_at_1\n value: 4.163\n - type: map_at_10\n value: 10.744\n - type: map_at_100\n value: 14.038999999999998\n - type: map_at_1000\n value: 
15.434999999999999\n - type: map_at_20\n value: 12.16\n - type: map_at_3\n value: 7.614999999999999\n - type: map_at_5\n value: 9.027000000000001\n - type: mrr_at_1\n value: 39.0092879256966\n - type: mrr_at_10\n value: 48.69809327239668\n - type: mrr_at_100\n value: 49.20788148442068\n - type: mrr_at_1000\n value: 49.25509336494706\n - type: mrr_at_20\n value: 48.99606551850896\n - type: mrr_at_3\n value: 46.284829721362236\n - type: mrr_at_5\n value: 47.77089783281735\n - type: nauc_map_at_1000_diff1\n value: 22.75421477116417\n - type: nauc_map_at_1000_max\n value: 49.242283787799046\n - type: nauc_map_at_1000_std\n value: 29.056888272331832\n - type: nauc_map_at_100_diff1\n value: 23.585977398585594\n - type: nauc_map_at_100_max\n value: 48.25845199409498\n - type: nauc_map_at_100_std\n value: 24.944264511223693\n - type: nauc_map_at_10_diff1\n value: 27.386613094780255\n - type: nauc_map_at_10_max\n value: 41.52415346691586\n - type: nauc_map_at_10_std\n value: 12.93872448563755\n - type: nauc_map_at_1_diff1\n value: 46.78688143865053\n - type: nauc_map_at_1_max\n value: 37.20408843995871\n - type: nauc_map_at_1_std\n value: 4.383444959401098\n - type: nauc_map_at_20_diff1\n value: 25.590969047740288\n - type: nauc_map_at_20_max\n value: 44.57109307999418\n - type: nauc_map_at_20_std\n value: 16.45855141821407\n - type: nauc_map_at_3_diff1\n value: 36.30017108362863\n - type: nauc_map_at_3_max\n value: 34.66149613991648\n - type: nauc_map_at_3_std\n value: 5.67985905078467\n - type: nauc_map_at_5_diff1\n value: 31.157644795417223\n - type: nauc_map_at_5_max\n value: 37.274738661636825\n - type: nauc_map_at_5_std\n value: 8.70088872394168\n - type: nauc_mrr_at_1000_diff1\n value: 25.638564218157384\n - type: nauc_mrr_at_1000_max\n value: 57.77788270285353\n - type: nauc_mrr_at_1000_std\n value: 43.507586592911274\n - type: nauc_mrr_at_100_diff1\n value: 25.662002580561584\n - type: nauc_mrr_at_100_max\n value: 57.80578394278584\n - type: nauc_mrr_at_100_std\n 
value: 43.543905743986635\n - type: nauc_mrr_at_10_diff1\n value: 25.426034796339835\n - type: nauc_mrr_at_10_max\n value: 57.68443186258669\n - type: nauc_mrr_at_10_std\n value: 43.438009108331215\n - type: nauc_mrr_at_1_diff1\n value: 26.073028156311075\n - type: nauc_mrr_at_1_max\n value: 52.11817916720053\n - type: nauc_mrr_at_1_std\n value: 37.41073893153695\n - type: nauc_mrr_at_20_diff1\n value: 25.548645553336147\n - type: nauc_mrr_at_20_max\n value: 57.78552760401915\n - type: nauc_mrr_at_20_std\n value: 43.521687428822325\n - type: nauc_mrr_at_3_diff1\n value: 25.72662577397805\n - type: nauc_mrr_at_3_max\n value: 56.891263536265605\n - type: nauc_mrr_at_3_std\n value: 41.384872305390104\n - type: nauc_mrr_at_5_diff1\n value: 25.552211551655386\n - type: nauc_mrr_at_5_max\n value: 57.976813828353926\n - type: nauc_mrr_at_5_std\n value: 43.504564461855544\n - type: nauc_ndcg_at_1000_diff1\n value: 23.456158044182757\n - type: nauc_ndcg_at_1000_max\n value: 60.05411773552709\n - type: nauc_ndcg_at_1000_std\n value: 47.857510017262584\n - type: nauc_ndcg_at_100_diff1\n value: 19.711635700390772\n - type: nauc_ndcg_at_100_max\n value: 56.178746740470665\n - type: nauc_ndcg_at_100_std\n value: 42.36829180286942\n - type: nauc_ndcg_at_10_diff1\n value: 18.364428967788413\n - type: nauc_ndcg_at_10_max\n value: 54.38372506578223\n - type: nauc_ndcg_at_10_std\n value: 41.75765411340369\n - type: nauc_ndcg_at_1_diff1\n value: 26.571093272640773\n - type: nauc_ndcg_at_1_max\n value: 51.061788341958284\n - type: nauc_ndcg_at_1_std\n value: 36.514987974075986\n - type: nauc_ndcg_at_20_diff1\n value: 18.345487193027697\n - type: nauc_ndcg_at_20_max\n value: 54.62621882656994\n - type: nauc_ndcg_at_20_std\n value: 41.42835554714241\n - type: nauc_ndcg_at_3_diff1\n value: 23.260105658139025\n - type: nauc_ndcg_at_3_max\n value: 52.07747385334546\n - type: nauc_ndcg_at_3_std\n value: 36.91985577837284\n - type: nauc_ndcg_at_5_diff1\n value: 20.40428109665566\n - type: 
nauc_ndcg_at_5_max\n value: 53.52015347884604\n - type: nauc_ndcg_at_5_std\n value: 39.46008849580017\n - type: nauc_precision_at_1000_diff1\n value: -7.3487344916380035\n - type: nauc_precision_at_1000_max\n value: 16.58045221394852\n - type: nauc_precision_at_1000_std\n value: 38.94030932397075\n - type: nauc_precision_at_100_diff1\n value: -5.257743986683922\n - type: nauc_precision_at_100_max\n value: 34.43071687475306\n - type: nauc_precision_at_100_std\n value: 53.499519170670474\n - type: nauc_precision_at_10_diff1\n value: 2.385136433119139\n - type: nauc_precision_at_10_max\n value: 47.210743878631064\n - type: nauc_precision_at_10_std\n value: 47.22767704186548\n - type: nauc_precision_at_1_diff1\n value: 26.073028156311075\n - type: nauc_precision_at_1_max\n value: 52.11817916720053\n - type: nauc_precision_at_1_std\n value: 37.41073893153695\n - type: nauc_precision_at_20_diff1\n value: -0.3531531127238474\n - type: nauc_precision_at_20_max\n value: 44.78044604856974\n - type: nauc_precision_at_20_std\n value: 49.532804150743615\n - type: nauc_precision_at_3_diff1\n value: 15.350050569991447\n - type: nauc_precision_at_3_max\n value: 51.01572315596549\n - type: nauc_precision_at_3_std\n value: 38.801125728413155\n - type: nauc_precision_at_5_diff1\n value: 9.109003666144694\n - type: nauc_precision_at_5_max\n value: 50.935269774898494\n - type: nauc_precision_at_5_std\n value: 43.323548180559676\n - type: nauc_recall_at_1000_diff1\n value: 16.64743647648886\n - type: nauc_recall_at_1000_max\n value: 38.46012283772285\n - type: nauc_recall_at_1000_std\n value: 36.02016164796441\n - type: nauc_recall_at_100_diff1\n value: 14.005834785186744\n - type: nauc_recall_at_100_max\n value: 37.70026105513647\n - type: nauc_recall_at_100_std\n value: 27.085222642129697\n - type: nauc_recall_at_10_diff1\n value: 21.204106627422632\n - type: nauc_recall_at_10_max\n value: 36.737624881893424\n - type: nauc_recall_at_10_std\n value: 13.755054514272702\n - type: 
nauc_recall_at_1_diff1\n value: 46.78688143865053\n - type: nauc_recall_at_1_max\n value: 37.20408843995871\n - type: nauc_recall_at_1_std\n value: 4.383444959401098\n - type: nauc_recall_at_20_diff1\n value: 19.740977611421933\n - type: nauc_recall_at_20_max\n value: 39.21908969539783\n - type: nauc_recall_at_20_std\n value: 16.560269670318494\n - type: nauc_recall_at_3_diff1\n value: 32.189359545367815\n - type: nauc_recall_at_3_max\n value: 31.693634445562758\n - type: nauc_recall_at_3_std\n value: 6.246326281543587\n - type: nauc_recall_at_5_diff1\n value: 25.51586860499901\n - type: nauc_recall_at_5_max\n value: 33.15934725342885\n - type: nauc_recall_at_5_std\n value: 9.677778511696705\n - type: ndcg_at_1\n value: 37.307\n - type: ndcg_at_10\n value: 31.391000000000002\n - type: ndcg_at_100\n value: 28.877999999999997\n - type: ndcg_at_1000\n value: 37.16\n - type: ndcg_at_20\n value: 29.314\n - type: ndcg_at_3\n value: 35.405\n - type: ndcg_at_5\n value: 33.922999999999995\n - type: precision_at_1\n value: 39.009\n - type: precision_at_10\n value: 24.52\n - type: precision_at_100\n value: 7.703\n - type: precision_at_1000\n value: 2.04\n - type: precision_at_20\n value: 18.08\n - type: precision_at_3\n value: 34.469\n - type: precision_at_5\n value: 30.712\n - type: recall_at_1\n value: 4.163\n - type: recall_at_10\n value: 15.015999999999998\n - type: recall_at_100\n value: 30.606\n - type: recall_at_1000\n value: 59.606\n - type: recall_at_20\n value: 19.09\n - type: recall_at_3\n value: 9.139\n - type: recall_at_5\n value: 11.477\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB NQ-PL (default)\n revision: f171245712cf85dd4700b06bef18001578d0ca8d\n split: test\n type: clarin-knext/nq-pl\n metrics:\n - type: main_score\n value: 54.017\n - type: map_at_1\n value: 34.193\n - type: map_at_10\n value: 47.497\n - type: map_at_100\n value: 48.441\n - type: map_at_1000\n value: 48.481\n - type: map_at_20\n value: 48.093\n - type: map_at_3\n 
value: 44.017\n - type: map_at_5\n value: 46.111000000000004\n - type: mrr_at_1\n value: 37.949015063731174\n - type: mrr_at_10\n value: 49.915772315105954\n - type: mrr_at_100\n value: 50.62841255829997\n - type: mrr_at_1000\n value: 50.656773027666745\n - type: mrr_at_20\n value: 50.37785276657083\n - type: mrr_at_3\n value: 46.98725376593267\n - type: mrr_at_5\n value: 48.763035921205066\n - type: nauc_map_at_1000_diff1\n value: 39.5632191792873\n - type: nauc_map_at_1000_max\n value: 37.4728247053629\n - type: nauc_map_at_1000_std\n value: 5.742498414663762\n - type: nauc_map_at_100_diff1\n value: 39.555570352061906\n - type: nauc_map_at_100_max\n value: 37.497880976847334\n - type: nauc_map_at_100_std\n value: 5.7798021019465375\n - type: nauc_map_at_10_diff1\n value: 39.5423723444454\n - type: nauc_map_at_10_max\n value: 37.41661971723365\n - type: nauc_map_at_10_std\n value: 5.2378002164144695\n - type: nauc_map_at_1_diff1\n value: 41.52697034146981\n - type: nauc_map_at_1_max\n value: 28.558995576942863\n - type: nauc_map_at_1_std\n value: 0.13094542859192052\n - type: nauc_map_at_20_diff1\n value: 39.55484628943701\n - type: nauc_map_at_20_max\n value: 37.5247794933719\n - type: nauc_map_at_20_std\n value: 5.702881342279231\n - type: nauc_map_at_3_diff1\n value: 39.949323925425325\n - type: nauc_map_at_3_max\n value: 35.770298168901924\n - type: nauc_map_at_3_std\n value: 2.9127112432479874\n - type: nauc_map_at_5_diff1\n value: 39.768310617004545\n - type: nauc_map_at_5_max\n value: 37.1549191664796\n - type: nauc_map_at_5_std\n value: 4.4681285748269515\n - type: nauc_mrr_at_1000_diff1\n value: 39.14001746706457\n - type: nauc_mrr_at_1000_max\n value: 37.477376518267775\n - type: nauc_mrr_at_1000_std\n value: 6.8088891531621565\n - type: nauc_mrr_at_100_diff1\n value: 39.13054707413684\n - type: nauc_mrr_at_100_max\n value: 37.498126443766274\n - type: nauc_mrr_at_100_std\n value: 6.839411380129971\n - type: nauc_mrr_at_10_diff1\n value: 
39.09764730048156\n - type: nauc_mrr_at_10_max\n value: 37.58593798217306\n - type: nauc_mrr_at_10_std\n value: 6.713795164982413\n - type: nauc_mrr_at_1_diff1\n value: 41.581599918664075\n - type: nauc_mrr_at_1_max\n value: 31.500589231378722\n - type: nauc_mrr_at_1_std\n value: 2.059116370339438\n - type: nauc_mrr_at_20_diff1\n value: 39.09011023988447\n - type: nauc_mrr_at_20_max\n value: 37.55856008791344\n - type: nauc_mrr_at_20_std\n value: 6.847165397615844\n - type: nauc_mrr_at_3_diff1\n value: 39.382542043738\n - type: nauc_mrr_at_3_max\n value: 36.49265363659468\n - type: nauc_mrr_at_3_std\n value: 4.759157976438336\n - type: nauc_mrr_at_5_diff1\n value: 39.304826333759976\n - type: nauc_mrr_at_5_max\n value: 37.46326016736024\n - type: nauc_mrr_at_5_std\n value: 6.122608305766621\n - type: nauc_ndcg_at_1000_diff1\n value: 38.568500038453266\n - type: nauc_ndcg_at_1000_max\n value: 39.799710882413166\n - type: nauc_ndcg_at_1000_std\n value: 9.357010223096639\n - type: nauc_ndcg_at_100_diff1\n value: 38.38026091343228\n - type: nauc_ndcg_at_100_max\n value: 40.48398173542486\n - type: nauc_ndcg_at_100_std\n value: 10.373054013302214\n - type: nauc_ndcg_at_10_diff1\n value: 38.27340980909964\n - type: nauc_ndcg_at_10_max\n value: 40.35241649744093\n - type: nauc_ndcg_at_10_std\n value: 8.579139930345168\n - type: nauc_ndcg_at_1_diff1\n value: 41.581599918664075\n - type: nauc_ndcg_at_1_max\n value: 31.500589231378722\n - type: nauc_ndcg_at_1_std\n value: 2.059116370339438\n - type: nauc_ndcg_at_20_diff1\n value: 38.26453028884807\n - type: nauc_ndcg_at_20_max\n value: 40.70517858426641\n - type: nauc_ndcg_at_20_std\n value: 9.987693876137905\n - type: nauc_ndcg_at_3_diff1\n value: 39.2078971733273\n - type: nauc_ndcg_at_3_max\n value: 37.48672195565316\n - type: nauc_ndcg_at_3_std\n value: 4.051464994659221\n - type: nauc_ndcg_at_5_diff1\n value: 38.883693595665285\n - type: nauc_ndcg_at_5_max\n value: 39.763115634437135\n - type: nauc_ndcg_at_5_std\n 
value: 6.738980451582073\n - type: nauc_precision_at_1000_diff1\n value: -7.223215910619012\n - type: nauc_precision_at_1000_max\n value: 13.075844604892161\n - type: nauc_precision_at_1000_std\n value: 19.864336920890107\n - type: nauc_precision_at_100_diff1\n value: 1.3305994810812418\n - type: nauc_precision_at_100_max\n value: 25.9219108557104\n - type: nauc_precision_at_100_std\n value: 27.5076605928207\n - type: nauc_precision_at_10_diff1\n value: 18.441551484970326\n - type: nauc_precision_at_10_max\n value: 39.85995330437054\n - type: nauc_precision_at_10_std\n value: 20.561269077428914\n - type: nauc_precision_at_1_diff1\n value: 41.581599918664075\n - type: nauc_precision_at_1_max\n value: 31.500589231378722\n - type: nauc_precision_at_1_std\n value: 2.059116370339438\n - type: nauc_precision_at_20_diff1\n value: 12.579593891480531\n - type: nauc_precision_at_20_max\n value: 36.620221830588775\n - type: nauc_precision_at_20_std\n value: 26.40364876775059\n - type: nauc_precision_at_3_diff1\n value: 30.158859294487073\n - type: nauc_precision_at_3_max\n value: 41.168215766389174\n - type: nauc_precision_at_3_std\n value: 9.44345004450809\n - type: nauc_precision_at_5_diff1\n value: 25.438624678672785\n - type: nauc_precision_at_5_max\n value: 42.72802023518524\n - type: nauc_precision_at_5_std\n value: 15.357657388511099\n - type: nauc_recall_at_1000_diff1\n value: 24.987564782718003\n - type: nauc_recall_at_1000_max\n value: 70.508416373353\n - type: nauc_recall_at_1000_std\n value: 69.75092280398808\n - type: nauc_recall_at_100_diff1\n value: 29.504202856421397\n - type: nauc_recall_at_100_max\n value: 63.41356585545318\n - type: nauc_recall_at_100_std\n value: 50.09250954437847\n - type: nauc_recall_at_10_diff1\n value: 32.355776022971774\n - type: nauc_recall_at_10_max\n value: 49.47121901667283\n - type: nauc_recall_at_10_std\n value: 19.418439406631244\n - type: nauc_recall_at_1_diff1\n value: 41.52697034146981\n - type: nauc_recall_at_1_max\n value: 
28.558995576942863\n - type: nauc_recall_at_1_std\n value: 0.13094542859192052\n - type: nauc_recall_at_20_diff1\n value: 31.57334731023589\n - type: nauc_recall_at_20_max\n value: 54.06567225197383\n - type: nauc_recall_at_20_std\n value: 29.222029720570468\n - type: nauc_recall_at_3_diff1\n value: 36.45033533275773\n - type: nauc_recall_at_3_max\n value: 40.39529713780803\n - type: nauc_recall_at_3_std\n value: 5.21893897772794\n - type: nauc_recall_at_5_diff1\n value: 35.18471678478859\n - type: nauc_recall_at_5_max\n value: 46.20100816867823\n - type: nauc_recall_at_5_std\n value: 11.94481894633221\n - type: ndcg_at_1\n value: 37.949\n - type: ndcg_at_10\n value: 54.017\n - type: ndcg_at_100\n value: 58.126\n - type: ndcg_at_1000\n value: 59.073\n - type: ndcg_at_20\n value: 55.928\n - type: ndcg_at_3\n value: 47.494\n - type: ndcg_at_5\n value: 50.975\n - type: precision_at_1\n value: 37.949\n - type: precision_at_10\n value: 8.450000000000001\n - type: precision_at_100\n value: 1.083\n - type: precision_at_1000\n value: 0.117\n - type: precision_at_20\n value: 4.689\n - type: precision_at_3\n value: 21.051000000000002\n - type: precision_at_5\n value: 14.664\n - type: recall_at_1\n value: 34.193\n - type: recall_at_10\n value: 71.357\n - type: recall_at_100\n value: 89.434\n - type: recall_at_1000\n value: 96.536\n - type: recall_at_20\n value: 78.363\n - type: recall_at_3\n value: 54.551\n - type: recall_at_5\n value: 62.543000000000006\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB Quora-PL (default)\n revision: 0be27e93455051e531182b85e85e425aba12e9d4\n split: test\n type: clarin-knext/quora-pl\n metrics:\n - type: main_score\n value: 84.114\n - type: map_at_1\n value: 65.848\n - type: map_at_10\n value: 79.85900000000001\n - type: map_at_100\n value: 80.582\n - type: map_at_1000\n value: 80.60300000000001\n - type: map_at_20\n value: 80.321\n - type: map_at_3\n value: 76.741\n - type: map_at_5\n value: 78.72200000000001\n - type: 
mrr_at_1\n value: 75.97\n - type: mrr_at_10\n value: 83.04630158730119\n - type: mrr_at_100\n value: 83.22785731032968\n - type: mrr_at_1000\n value: 83.23123717623899\n - type: mrr_at_20\n value: 83.17412021320565\n - type: mrr_at_3\n value: 81.83333333333287\n - type: mrr_at_5\n value: 82.61933333333275\n - type: nauc_map_at_1000_diff1\n value: 73.26316553371083\n - type: nauc_map_at_1000_max\n value: 27.92567859085245\n - type: nauc_map_at_1000_std\n value: -47.477909533360446\n - type: nauc_map_at_100_diff1\n value: 73.2690602807223\n - type: nauc_map_at_100_max\n value: 27.915868327849996\n - type: nauc_map_at_100_std\n value: -47.525777766107595\n - type: nauc_map_at_10_diff1\n value: 73.45464428464894\n - type: nauc_map_at_10_max\n value: 27.451611487246296\n - type: nauc_map_at_10_std\n value: -49.35818715843809\n - type: nauc_map_at_1_diff1\n value: 77.29690208952982\n - type: nauc_map_at_1_max\n value: 19.839875762282293\n - type: nauc_map_at_1_std\n value: -45.355684654708284\n - type: nauc_map_at_20_diff1\n value: 73.35102731979796\n - type: nauc_map_at_20_max\n value: 27.741506490134583\n - type: nauc_map_at_20_std\n value: -48.22006207310331\n - type: nauc_map_at_3_diff1\n value: 73.94878241064137\n - type: nauc_map_at_3_max\n value: 24.761321386766728\n - type: nauc_map_at_3_std\n value: -51.20638883618126\n - type: nauc_map_at_5_diff1\n value: 73.66143558047698\n - type: nauc_map_at_5_max\n value: 26.53483405013543\n - type: nauc_map_at_5_std\n value: -50.697541279640056\n - type: nauc_mrr_at_1000_diff1\n value: 73.84632320009759\n - type: nauc_mrr_at_1000_max\n value: 30.50182733610048\n - type: nauc_mrr_at_1000_std\n value: -44.3021647995251\n - type: nauc_mrr_at_100_diff1\n value: 73.84480792662302\n - type: nauc_mrr_at_100_max\n value: 30.50749424571614\n - type: nauc_mrr_at_100_std\n value: -44.29615086388113\n - type: nauc_mrr_at_10_diff1\n value: 73.79442772949346\n - type: nauc_mrr_at_10_max\n value: 30.55724252219984\n - type: 
nauc_mrr_at_10_std\n value: -44.50997069462057\n - type: nauc_mrr_at_1_diff1\n value: 75.23369827945945\n - type: nauc_mrr_at_1_max\n value: 29.20073967447664\n - type: nauc_mrr_at_1_std\n value: -43.1920147658285\n - type: nauc_mrr_at_20_diff1\n value: 73.82731678072307\n - type: nauc_mrr_at_20_max\n value: 30.566328605497667\n - type: nauc_mrr_at_20_std\n value: -44.24683607643705\n - type: nauc_mrr_at_3_diff1\n value: 73.61997576749954\n - type: nauc_mrr_at_3_max\n value: 30.150393853381917\n - type: nauc_mrr_at_3_std\n value: -44.96847297506626\n - type: nauc_mrr_at_5_diff1\n value: 73.69084310616132\n - type: nauc_mrr_at_5_max\n value: 30.578033703441125\n - type: nauc_mrr_at_5_std\n value: -44.74920746066566\n - type: nauc_ndcg_at_1000_diff1\n value: 72.89349862557452\n - type: nauc_ndcg_at_1000_max\n value: 29.824725190462086\n - type: nauc_ndcg_at_1000_std\n value: -44.96284395063211\n - type: nauc_ndcg_at_100_diff1\n value: 72.85212753715273\n - type: nauc_ndcg_at_100_max\n value: 29.933114207845605\n - type: nauc_ndcg_at_100_std\n value: -44.944225570663754\n - type: nauc_ndcg_at_10_diff1\n value: 72.80576740454528\n - type: nauc_ndcg_at_10_max\n value: 29.16829118320828\n - type: nauc_ndcg_at_10_std\n value: -48.149473740079614\n - type: nauc_ndcg_at_1_diff1\n value: 75.00032534968587\n - type: nauc_ndcg_at_1_max\n value: 29.61849062038547\n - type: nauc_ndcg_at_1_std\n value: -42.560207043864054\n - type: nauc_ndcg_at_20_diff1\n value: 72.88440406302502\n - type: nauc_ndcg_at_20_max\n value: 29.65496676092656\n - type: nauc_ndcg_at_20_std\n value: -46.21238462167732\n - type: nauc_ndcg_at_3_diff1\n value: 72.37916962766987\n - type: nauc_ndcg_at_3_max\n value: 27.125094834547586\n - type: nauc_ndcg_at_3_std\n value: -48.62942991399391\n - type: nauc_ndcg_at_5_diff1\n value: 72.57017330527658\n - type: nauc_ndcg_at_5_max\n value: 28.470485561757254\n - type: nauc_ndcg_at_5_std\n value: -49.07593345591059\n - type: nauc_precision_at_1000_diff1\n value: 
-41.67915575853946\n - type: nauc_precision_at_1000_max\n value: 1.2012264478568844\n - type: nauc_precision_at_1000_std\n value: 44.723834559400466\n - type: nauc_precision_at_100_diff1\n value: -40.45196679236971\n - type: nauc_precision_at_100_max\n value: 2.3525450401714894\n - type: nauc_precision_at_100_std\n value: 43.7092529413952\n - type: nauc_precision_at_10_diff1\n value: -30.256026923068767\n - type: nauc_precision_at_10_max\n value: 8.313422052132559\n - type: nauc_precision_at_10_std\n value: 25.929372356449694\n - type: nauc_precision_at_1_diff1\n value: 75.00032534968587\n - type: nauc_precision_at_1_max\n value: 29.61849062038547\n - type: nauc_precision_at_1_std\n value: -42.560207043864054\n - type: nauc_precision_at_20_diff1\n value: -35.61971069986584\n - type: nauc_precision_at_20_max\n value: 5.4664303079116765\n - type: nauc_precision_at_20_std\n value: 34.992352471692826\n - type: nauc_precision_at_3_diff1\n value: -5.691231842471157\n - type: nauc_precision_at_3_max\n value: 14.797949087742444\n - type: nauc_precision_at_3_std\n value: -0.1930317395644928\n - type: nauc_precision_at_5_diff1\n value: -20.03913781462645\n - type: nauc_precision_at_5_max\n value: 11.956771408712749\n - type: nauc_precision_at_5_std\n value: 13.179251389859731\n - type: nauc_recall_at_1000_diff1\n value: 64.03509042729674\n - type: nauc_recall_at_1000_max\n value: 40.91691485428493\n - type: nauc_recall_at_1000_std\n value: 16.12968625875372\n - type: nauc_recall_at_100_diff1\n value: 63.83116179628575\n - type: nauc_recall_at_100_max\n value: 43.72908117676382\n - type: nauc_recall_at_100_std\n value: -20.50966716852155\n - type: nauc_recall_at_10_diff1\n value: 66.42071960186394\n - type: nauc_recall_at_10_max\n value: 28.983207818687205\n - type: nauc_recall_at_10_std\n value: -56.61417798753744\n - type: nauc_recall_at_1_diff1\n value: 77.29690208952982\n - type: nauc_recall_at_1_max\n value: 19.839875762282293\n - type: nauc_recall_at_1_std\n value: 
-45.355684654708284\n - type: nauc_recall_at_20_diff1\n value: 66.32360705219874\n - type: nauc_recall_at_20_max\n value: 33.30698111822631\n - type: nauc_recall_at_20_std\n value: -43.89233781737452\n - type: nauc_recall_at_3_diff1\n value: 69.67029394927077\n - type: nauc_recall_at_3_max\n value: 22.67803039327696\n - type: nauc_recall_at_3_std\n value: -56.43327209861502\n - type: nauc_recall_at_5_diff1\n value: 68.05622143936131\n - type: nauc_recall_at_5_max\n value: 26.67795559040675\n - type: nauc_recall_at_5_std\n value: -58.158231198510954\n - type: ndcg_at_1\n value: 76.08\n - type: ndcg_at_10\n value: 84.114\n - type: ndcg_at_100\n value: 85.784\n - type: ndcg_at_1000\n value: 85.992\n - type: ndcg_at_20\n value: 84.976\n - type: ndcg_at_3\n value: 80.74799999999999\n - type: ndcg_at_5\n value: 82.626\n - type: precision_at_1\n value: 76.08\n - type: precision_at_10\n value: 12.926000000000002\n - type: precision_at_100\n value: 1.509\n - type: precision_at_1000\n value: 0.156\n - type: precision_at_20\n value: 6.912999999999999\n - type: precision_at_3\n value: 35.5\n - type: precision_at_5\n value: 23.541999999999998\n - type: recall_at_1\n value: 65.848\n - type: recall_at_10\n value: 92.611\n - type: recall_at_100\n value: 98.69\n - type: recall_at_1000\n value: 99.83999999999999\n - type: recall_at_20\n value: 95.47200000000001\n - type: recall_at_3\n value: 83.122\n - type: recall_at_5\n value: 88.23\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB SCIDOCS-PL (default)\n revision: 45452b03f05560207ef19149545f168e596c9337\n split: test\n type: clarin-knext/scidocs-pl\n metrics:\n - type: main_score\n value: 15.379999999999999\n - type: map_at_1\n value: 3.6029999999999998\n - type: map_at_10\n value: 8.843\n - type: map_at_100\n value: 10.433\n - type: map_at_1000\n value: 10.689\n - type: map_at_20\n value: 9.597\n - type: map_at_3\n value: 6.363\n - type: map_at_5\n value: 7.603\n - type: mrr_at_1\n value: 17.7\n - type: 
mrr_at_10\n value: 26.58900793650793\n - type: mrr_at_100\n value: 27.699652322890987\n - type: mrr_at_1000\n value: 27.78065313118353\n - type: mrr_at_20\n value: 27.215020950411816\n - type: mrr_at_3\n value: 23.36666666666668\n - type: mrr_at_5\n value: 25.211666666666666\n - type: nauc_map_at_1000_diff1\n value: 21.92235143827129\n - type: nauc_map_at_1000_max\n value: 37.50300940750989\n - type: nauc_map_at_1000_std\n value: 20.872586122198552\n - type: nauc_map_at_100_diff1\n value: 21.917408170465833\n - type: nauc_map_at_100_max\n value: 37.4654466815513\n - type: nauc_map_at_100_std\n value: 20.621643878648534\n - type: nauc_map_at_10_diff1\n value: 22.914388723621183\n - type: nauc_map_at_10_max\n value: 36.468131213468794\n - type: nauc_map_at_10_std\n value: 16.760980140791492\n - type: nauc_map_at_1_diff1\n value: 29.00799502838457\n - type: nauc_map_at_1_max\n value: 26.64926291797503\n - type: nauc_map_at_1_std\n value: 8.167291261637361\n - type: nauc_map_at_20_diff1\n value: 22.46580947804047\n - type: nauc_map_at_20_max\n value: 36.656294842562275\n - type: nauc_map_at_20_std\n value: 18.099232417722078\n - type: nauc_map_at_3_diff1\n value: 23.436009032045934\n - type: nauc_map_at_3_max\n value: 31.325807212280914\n - type: nauc_map_at_3_std\n value: 9.780905232048852\n - type: nauc_map_at_5_diff1\n value: 22.891704394665528\n - type: nauc_map_at_5_max\n value: 35.40584466642894\n - type: nauc_map_at_5_std\n value: 13.476986099394656\n - type: nauc_mrr_at_1000_diff1\n value: 25.052937655397866\n - type: nauc_mrr_at_1000_max\n value: 29.64431912670108\n - type: nauc_mrr_at_1000_std\n value: 14.549744963988044\n - type: nauc_mrr_at_100_diff1\n value: 25.070871266969224\n - type: nauc_mrr_at_100_max\n value: 29.68743604652336\n - type: nauc_mrr_at_100_std\n value: 14.582010154574432\n - type: nauc_mrr_at_10_diff1\n value: 24.88881466938897\n - type: nauc_mrr_at_10_max\n value: 29.488430770768144\n - type: nauc_mrr_at_10_std\n value: 
14.269241073852266\n - type: nauc_mrr_at_1_diff1\n value: 29.220540327267503\n - type: nauc_mrr_at_1_max\n value: 26.81908580507911\n - type: nauc_mrr_at_1_std\n value: 8.00840295809718\n - type: nauc_mrr_at_20_diff1\n value: 25.067912695721944\n - type: nauc_mrr_at_20_max\n value: 29.759227563849628\n - type: nauc_mrr_at_20_std\n value: 14.685076859257357\n - type: nauc_mrr_at_3_diff1\n value: 24.645848739182696\n - type: nauc_mrr_at_3_max\n value: 27.73368549660351\n - type: nauc_mrr_at_3_std\n value: 11.475742805586943\n - type: nauc_mrr_at_5_diff1\n value: 24.895295760909946\n - type: nauc_mrr_at_5_max\n value: 29.130755033240423\n - type: nauc_mrr_at_5_std\n value: 12.955802929145404\n - type: nauc_ndcg_at_1000_diff1\n value: 20.68434434777729\n - type: nauc_ndcg_at_1000_max\n value: 37.67055146424174\n - type: nauc_ndcg_at_1000_std\n value: 29.57493715069776\n - type: nauc_ndcg_at_100_diff1\n value: 20.396834816492383\n - type: nauc_ndcg_at_100_max\n value: 37.460575228670514\n - type: nauc_ndcg_at_100_std\n value: 27.826534756761944\n - type: nauc_ndcg_at_10_diff1\n value: 22.640844106236027\n - type: nauc_ndcg_at_10_max\n value: 35.21291764462327\n - type: nauc_ndcg_at_10_std\n value: 19.53289455984506\n - type: nauc_ndcg_at_1_diff1\n value: 29.220540327267503\n - type: nauc_ndcg_at_1_max\n value: 26.81908580507911\n - type: nauc_ndcg_at_1_std\n value: 8.00840295809718\n - type: nauc_ndcg_at_20_diff1\n value: 22.117126657768623\n - type: nauc_ndcg_at_20_max\n value: 35.79395781940806\n - type: nauc_ndcg_at_20_std\n value: 22.242748346260786\n - type: nauc_ndcg_at_3_diff1\n value: 23.00596063212187\n - type: nauc_ndcg_at_3_max\n value: 30.149013627580523\n - type: nauc_ndcg_at_3_std\n value: 11.07904064662722\n - type: nauc_ndcg_at_5_diff1\n value: 22.81875419630523\n - type: nauc_ndcg_at_5_max\n value: 34.24267468356626\n - type: nauc_ndcg_at_5_std\n value: 15.307780280752088\n - type: nauc_precision_at_1000_diff1\n value: 9.606677689029972\n - type: 
nauc_precision_at_1000_max\n value: 32.74855550489271\n - type: nauc_precision_at_1000_std\n value: 42.65372585937895\n - type: nauc_precision_at_100_diff1\n value: 11.528981313529545\n - type: nauc_precision_at_100_max\n value: 35.642529490132404\n - type: nauc_precision_at_100_std\n value: 38.146151426052306\n - type: nauc_precision_at_10_diff1\n value: 18.783957183811836\n - type: nauc_precision_at_10_max\n value: 36.1982008334257\n - type: nauc_precision_at_10_std\n value: 25.09349473195891\n - type: nauc_precision_at_1_diff1\n value: 29.220540327267503\n - type: nauc_precision_at_1_max\n value: 26.81908580507911\n - type: nauc_precision_at_1_std\n value: 8.00840295809718\n - type: nauc_precision_at_20_diff1\n value: 17.458766320828214\n - type: nauc_precision_at_20_max\n value: 36.000404903025235\n - type: nauc_precision_at_20_std\n value: 29.1608044138323\n - type: nauc_precision_at_3_diff1\n value: 20.213669462067166\n - type: nauc_precision_at_3_max\n value: 31.120650847205912\n - type: nauc_precision_at_3_std\n value: 12.390972418818118\n - type: nauc_precision_at_5_diff1\n value: 20.114245715785678\n - type: nauc_precision_at_5_max\n value: 37.30360111495823\n - type: nauc_precision_at_5_std\n value: 19.053109037822853\n - type: nauc_recall_at_1000_diff1\n value: 9.85800049032612\n - type: nauc_recall_at_1000_max\n value: 32.48319160802687\n - type: nauc_recall_at_1000_std\n value: 43.79941601741161\n - type: nauc_recall_at_100_diff1\n value: 11.375255270968337\n - type: nauc_recall_at_100_max\n value: 35.1868784124497\n - type: nauc_recall_at_100_std\n value: 38.422680583482666\n - type: nauc_recall_at_10_diff1\n value: 18.445783123521938\n - type: nauc_recall_at_10_max\n value: 35.633267936276766\n - type: nauc_recall_at_10_std\n value: 24.94469506254716\n - type: nauc_recall_at_1_diff1\n value: 29.00799502838457\n - type: nauc_recall_at_1_max\n value: 26.64926291797503\n - type: nauc_recall_at_1_std\n value: 8.167291261637361\n - type: 
nauc_recall_at_20_diff1\n value: 17.314906604151936\n - type: nauc_recall_at_20_max\n value: 35.66067699203996\n - type: nauc_recall_at_20_std\n value: 29.400137012506082\n - type: nauc_recall_at_3_diff1\n value: 19.873710875648698\n - type: nauc_recall_at_3_max\n value: 30.92404718742849\n - type: nauc_recall_at_3_std\n value: 12.400871018075199\n - type: nauc_recall_at_5_diff1\n value: 19.869948324233192\n - type: nauc_recall_at_5_max\n value: 37.06832511687574\n - type: nauc_recall_at_5_std\n value: 19.0798814966156\n - type: ndcg_at_1\n value: 17.7\n - type: ndcg_at_10\n value: 15.379999999999999\n - type: ndcg_at_100\n value: 22.09\n - type: ndcg_at_1000\n value: 27.151999999999997\n - type: ndcg_at_20\n value: 17.576\n - type: ndcg_at_3\n value: 14.219999999999999\n - type: ndcg_at_5\n value: 12.579\n - type: precision_at_1\n value: 17.7\n - type: precision_at_10\n value: 8.08\n - type: precision_at_100\n value: 1.7840000000000003\n - type: precision_at_1000\n value: 0.3\n - type: precision_at_20\n value: 5.305\n - type: precision_at_3\n value: 13.167000000000002\n - type: precision_at_5\n value: 11.06\n - type: recall_at_1\n value: 3.6029999999999998\n - type: recall_at_10\n value: 16.413\n - type: recall_at_100\n value: 36.263\n - type: recall_at_1000\n value: 61.016999999999996\n - type: recall_at_20\n value: 21.587999999999997\n - type: recall_at_3\n value: 8.013\n - type: recall_at_5\n value: 11.198\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB SciFact-PL (default)\n revision: 47932a35f045ef8ed01ba82bf9ff67f6e109207e\n split: test\n type: clarin-knext/scifact-pl\n metrics:\n - type: main_score\n value: 64.764\n - type: map_at_1\n value: 49.778\n - type: map_at_10\n value: 59.88\n - type: map_at_100\n value: 60.707\n - type: map_at_1000\n value: 60.729\n - type: map_at_20\n value: 60.419999999999995\n - type: map_at_3\n value: 57.45400000000001\n - type: map_at_5\n value: 58.729\n - type: mrr_at_1\n value: 52.33333333333333\n - 
type: mrr_at_10\n value: 61.29193121693122\n - type: mrr_at_100\n value: 61.95817765126313\n - type: mrr_at_1000\n value: 61.97583284368782\n - type: mrr_at_20\n value: 61.72469949641003\n - type: mrr_at_3\n value: 59.44444444444444\n - type: mrr_at_5\n value: 60.494444444444454\n - type: nauc_map_at_1000_diff1\n value: 62.21235294015774\n - type: nauc_map_at_1000_max\n value: 48.83996609100249\n - type: nauc_map_at_1000_std\n value: 5.23892781043174\n - type: nauc_map_at_100_diff1\n value: 62.20170226789429\n - type: nauc_map_at_100_max\n value: 48.8391766453537\n - type: nauc_map_at_100_std\n value: 5.2664077457917715\n - type: nauc_map_at_10_diff1\n value: 61.961975488329024\n - type: nauc_map_at_10_max\n value: 48.397109987625186\n - type: nauc_map_at_10_std\n value: 4.314859710827481\n - type: nauc_map_at_1_diff1\n value: 65.0865197011516\n - type: nauc_map_at_1_max\n value: 41.38862781954889\n - type: nauc_map_at_1_std\n value: -0.9182122632530586\n - type: nauc_map_at_20_diff1\n value: 61.99173935851292\n - type: nauc_map_at_20_max\n value: 48.79961814179307\n - type: nauc_map_at_20_std\n value: 5.262181845825118\n - type: nauc_map_at_3_diff1\n value: 62.37910539880477\n - type: nauc_map_at_3_max\n value: 47.13627890977091\n - type: nauc_map_at_3_std\n value: 2.327897198087264\n - type: nauc_map_at_5_diff1\n value: 61.60080757149592\n - type: nauc_map_at_5_max\n value: 47.60052458345962\n - type: nauc_map_at_5_std\n value: 3.1770196981231047\n - type: nauc_mrr_at_1000_diff1\n value: 62.86810952814966\n - type: nauc_mrr_at_1000_max\n value: 52.13248094447774\n - type: nauc_mrr_at_1000_std\n value: 10.100485746570733\n - type: nauc_mrr_at_100_diff1\n value: 62.85364829491874\n - type: nauc_mrr_at_100_max\n value: 52.134528010631854\n - type: nauc_mrr_at_100_std\n value: 10.120945685447369\n - type: nauc_mrr_at_10_diff1\n value: 62.65679301829915\n - type: nauc_mrr_at_10_max\n value: 52.09270719182349\n - type: nauc_mrr_at_10_std\n value: 9.913834434725441\n - 
type: nauc_mrr_at_1_diff1\n value: 66.84108271415636\n - type: nauc_mrr_at_1_max\n value: 46.67646429855176\n - type: nauc_mrr_at_1_std\n value: 5.5505252956352304\n - type: nauc_mrr_at_20_diff1\n value: 62.72473227039611\n - type: nauc_mrr_at_20_max\n value: 52.13479097802757\n - type: nauc_mrr_at_20_std\n value: 10.188278833464084\n - type: nauc_mrr_at_3_diff1\n value: 63.797429185518496\n - type: nauc_mrr_at_3_max\n value: 52.16486999573481\n - type: nauc_mrr_at_3_std\n value: 9.094360767062762\n - type: nauc_mrr_at_5_diff1\n value: 62.592917975475494\n - type: nauc_mrr_at_5_max\n value: 52.330741486107414\n - type: nauc_mrr_at_5_std\n value: 9.742175534421389\n - type: nauc_ndcg_at_1000_diff1\n value: 61.38859337672476\n - type: nauc_ndcg_at_1000_max\n value: 51.48380058339184\n - type: nauc_ndcg_at_1000_std\n value: 9.670547660897673\n - type: nauc_ndcg_at_100_diff1\n value: 61.02438489641434\n - type: nauc_ndcg_at_100_max\n value: 51.781246646780865\n - type: nauc_ndcg_at_100_std\n value: 10.592961553245187\n - type: nauc_ndcg_at_10_diff1\n value: 60.03678353308358\n - type: nauc_ndcg_at_10_max\n value: 50.70725688848762\n - type: nauc_ndcg_at_10_std\n value: 7.9472446491016315\n - type: nauc_ndcg_at_1_diff1\n value: 66.84108271415636\n - type: nauc_ndcg_at_1_max\n value: 46.67646429855176\n - type: nauc_ndcg_at_1_std\n value: 5.5505252956352304\n - type: nauc_ndcg_at_20_diff1\n value: 59.828482718480224\n - type: nauc_ndcg_at_20_max\n value: 51.45831789601284\n - type: nauc_ndcg_at_20_std\n value: 10.722673683272049\n - type: nauc_ndcg_at_3_diff1\n value: 61.68982937524109\n - type: nauc_ndcg_at_3_max\n value: 49.745326748604775\n - type: nauc_ndcg_at_3_std\n value: 4.948298621202247\n - type: nauc_ndcg_at_5_diff1\n value: 59.67396171973207\n - type: nauc_ndcg_at_5_max\n value: 49.87855139298281\n - type: nauc_ndcg_at_5_std\n value: 6.08990428055584\n - type: nauc_precision_at_1000_diff1\n value: -1.594227972036865\n - type: nauc_precision_at_1000_max\n 
value: 32.48431723086185\n - type: nauc_precision_at_1000_std\n value: 53.84748466965268\n - type: nauc_precision_at_100_diff1\n value: 8.06411455192293\n - type: nauc_precision_at_100_max\n value: 39.91003601878948\n - type: nauc_precision_at_100_std\n value: 55.52979711075091\n - type: nauc_precision_at_10_diff1\n value: 26.610514456014066\n - type: nauc_precision_at_10_max\n value: 47.09062494321172\n - type: nauc_precision_at_10_std\n value: 33.91984226498748\n - type: nauc_precision_at_1_diff1\n value: 66.84108271415636\n - type: nauc_precision_at_1_max\n value: 46.67646429855176\n - type: nauc_precision_at_1_std\n value: 5.5505252956352304\n - type: nauc_precision_at_20_diff1\n value: 16.947688843085583\n - type: nauc_precision_at_20_max\n value: 45.40488186572008\n - type: nauc_precision_at_20_std\n value: 48.354421924500905\n - type: nauc_precision_at_3_diff1\n value: 49.11263981720622\n - type: nauc_precision_at_3_max\n value: 52.7084625111683\n - type: nauc_precision_at_3_std\n value: 16.734612173556453\n - type: nauc_precision_at_5_diff1\n value: 39.06503705015792\n - type: nauc_precision_at_5_max\n value: 52.21710506893391\n - type: nauc_precision_at_5_std\n value: 23.350948149460233\n - type: nauc_recall_at_1000_diff1\n value: 43.1559290382817\n - type: nauc_recall_at_1000_max\n value: 83.66013071895456\n - type: nauc_recall_at_1000_std\n value: 86.27450980392177\n - type: nauc_recall_at_100_diff1\n value: 46.016860850620375\n - type: nauc_recall_at_100_max\n value: 69.3944888744547\n - type: nauc_recall_at_100_std\n value: 55.286945696152735\n - type: nauc_recall_at_10_diff1\n value: 49.65877895350921\n - type: nauc_recall_at_10_max\n value: 53.02636695700889\n - type: nauc_recall_at_10_std\n value: 13.967608945823828\n - type: nauc_recall_at_1_diff1\n value: 65.0865197011516\n - type: nauc_recall_at_1_max\n value: 41.38862781954889\n - type: nauc_recall_at_1_std\n value: -0.9182122632530586\n - type: nauc_recall_at_20_diff1\n value: 
43.355308229973524\n - type: nauc_recall_at_20_max\n value: 57.04187909533764\n - type: nauc_recall_at_20_std\n value: 33.578720846660524\n - type: nauc_recall_at_3_diff1\n value: 56.922996057428165\n - type: nauc_recall_at_3_max\n value: 50.74417041895424\n - type: nauc_recall_at_3_std\n value: 5.623890124328387\n - type: nauc_recall_at_5_diff1\n value: 50.55620076865238\n - type: nauc_recall_at_5_max\n value: 51.3316854622085\n - type: nauc_recall_at_5_std\n value: 8.995457887269255\n - type: ndcg_at_1\n value: 52.333\n - type: ndcg_at_10\n value: 64.764\n - type: ndcg_at_100\n value: 68.167\n - type: ndcg_at_1000\n value: 68.816\n - type: ndcg_at_20\n value: 66.457\n - type: ndcg_at_3\n value: 60.346\n - type: ndcg_at_5\n value: 62.365\n - type: precision_at_1\n value: 52.333\n - type: precision_at_10\n value: 8.799999999999999\n - type: precision_at_100\n value: 1.057\n - type: precision_at_1000\n value: 0.11100000000000002\n - type: precision_at_20\n value: 4.8\n - type: precision_at_3\n value: 23.889\n - type: precision_at_5\n value: 15.6\n - type: recall_at_1\n value: 49.778\n - type: recall_at_10\n value: 78.206\n - type: recall_at_100\n value: 93.10000000000001\n - type: recall_at_1000\n value: 98.333\n - type: recall_at_20\n value: 84.467\n - type: recall_at_3\n value: 66.367\n - type: recall_at_5\n value: 71.35000000000001\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB TRECCOVID-PL (default)\n revision: 81bcb408f33366c2a20ac54adafad1ae7e877fdd\n split: test\n type: clarin-knext/trec-covid-pl\n metrics:\n - type: main_score\n value: 72.18900000000001\n - type: map_at_1\n value: 0.214\n - type: map_at_10\n value: 1.755\n - type: map_at_100\n value: 9.944\n - type: map_at_1000\n value: 24.205\n - type: map_at_20\n value: 3.1510000000000002\n - type: map_at_3\n value: 0.6\n - type: map_at_5\n value: 0.9560000000000001\n - type: mrr_at_1\n value: 82.0\n - type: mrr_at_10\n value: 89.06666666666666\n - type: mrr_at_100\n value: 
89.06666666666666\n - type: mrr_at_1000\n value: 89.06666666666666\n - type: mrr_at_20\n value: 89.06666666666666\n - type: mrr_at_3\n value: 87.66666666666666\n - type: mrr_at_5\n value: 89.06666666666666\n - type: nauc_map_at_1000_diff1\n value: -9.342037623635543\n - type: nauc_map_at_1000_max\n value: 45.71499810252398\n - type: nauc_map_at_1000_std\n value: 76.86482845196852\n - type: nauc_map_at_100_diff1\n value: -6.932395299866198\n - type: nauc_map_at_100_max\n value: 36.097801891181604\n - type: nauc_map_at_100_std\n value: 65.6085215411685\n - type: nauc_map_at_10_diff1\n value: -6.3654843824342775\n - type: nauc_map_at_10_max\n value: 9.564437521432714\n - type: nauc_map_at_10_std\n value: 21.8377319336476\n - type: nauc_map_at_1_diff1\n value: 8.269590874255034\n - type: nauc_map_at_1_max\n value: 3.482498491294516\n - type: nauc_map_at_1_std\n value: 8.985226819412189\n - type: nauc_map_at_20_diff1\n value: -4.971435767877232\n - type: nauc_map_at_20_max\n value: 22.88801858567121\n - type: nauc_map_at_20_std\n value: 32.38492618534027\n - type: nauc_map_at_3_diff1\n value: 1.1615973694623123\n - type: nauc_map_at_3_max\n value: 1.935417800315643\n - type: nauc_map_at_3_std\n value: 10.289328305818698\n - type: nauc_map_at_5_diff1\n value: -2.4675967231444105\n - type: nauc_map_at_5_max\n value: 2.4611483736622373\n - type: nauc_map_at_5_std\n value: 15.082324305750811\n - type: nauc_mrr_at_1000_diff1\n value: 13.098526703499063\n - type: nauc_mrr_at_1000_max\n value: 56.37362177417431\n - type: nauc_mrr_at_1000_std\n value: 73.2456769749587\n - type: nauc_mrr_at_100_diff1\n value: 13.098526703499063\n - type: nauc_mrr_at_100_max\n value: 56.37362177417431\n - type: nauc_mrr_at_100_std\n value: 73.2456769749587\n - type: nauc_mrr_at_10_diff1\n value: 13.098526703499063\n - type: nauc_mrr_at_10_max\n value: 56.37362177417431\n - type: nauc_mrr_at_10_std\n value: 73.2456769749587\n - type: nauc_mrr_at_1_diff1\n value: 12.099350148694809\n - type: 
nauc_mrr_at_1_max\n value: 53.75041304108387\n - type: nauc_mrr_at_1_std\n value: 68.84018063663402\n - type: nauc_mrr_at_20_diff1\n value: 13.098526703499063\n - type: nauc_mrr_at_20_max\n value: 56.37362177417431\n - type: nauc_mrr_at_20_std\n value: 73.2456769749587\n - type: nauc_mrr_at_3_diff1\n value: 12.173557857011161\n - type: nauc_mrr_at_3_max\n value: 57.540780562363395\n - type: nauc_mrr_at_3_std\n value: 75.42098189580211\n - type: nauc_mrr_at_5_diff1\n value: 13.098526703499063\n - type: nauc_mrr_at_5_max\n value: 56.37362177417431\n - type: nauc_mrr_at_5_std\n value: 73.2456769749587\n - type: nauc_ndcg_at_1000_diff1\n value: -8.951471847310401\n - type: nauc_ndcg_at_1000_max\n value: 43.86942237288822\n - type: nauc_ndcg_at_1000_std\n value: 74.61077735148591\n - type: nauc_ndcg_at_100_diff1\n value: -17.754559361083817\n - type: nauc_ndcg_at_100_max\n value: 53.97187119773482\n - type: nauc_ndcg_at_100_std\n value: 80.7944136146514\n - type: nauc_ndcg_at_10_diff1\n value: -26.637734697836414\n - type: nauc_ndcg_at_10_max\n value: 47.70102699133149\n - type: nauc_ndcg_at_10_std\n value: 70.26909560828646\n - type: nauc_ndcg_at_1_diff1\n value: -1.2250530785563207\n - type: nauc_ndcg_at_1_max\n value: 46.60509554140131\n - type: nauc_ndcg_at_1_std\n value: 62.63906581740976\n - type: nauc_ndcg_at_20_diff1\n value: -22.44286466550908\n - type: nauc_ndcg_at_20_max\n value: 55.40492058090103\n - type: nauc_ndcg_at_20_std\n value: 72.11813912145738\n - type: nauc_ndcg_at_3_diff1\n value: -14.8152721896563\n - type: nauc_ndcg_at_3_max\n value: 38.952259383027595\n - type: nauc_ndcg_at_3_std\n value: 59.819750166537766\n - type: nauc_ndcg_at_5_diff1\n value: -19.150105688904375\n - type: nauc_ndcg_at_5_max\n value: 42.311180547775315\n - type: nauc_ndcg_at_5_std\n value: 66.6632229321094\n - type: nauc_precision_at_1000_diff1\n value: -11.555591477978941\n - type: nauc_precision_at_1000_max\n value: 43.7311644834851\n - type: nauc_precision_at_1000_std\n 
value: 52.10644767999648\n - type: nauc_precision_at_100_diff1\n value: -16.94803099801117\n - type: nauc_precision_at_100_max\n value: 54.08281631067633\n - type: nauc_precision_at_100_std\n value: 82.77237347891331\n - type: nauc_precision_at_10_diff1\n value: -27.351332814863355\n - type: nauc_precision_at_10_max\n value: 48.08237549065846\n - type: nauc_precision_at_10_std\n value: 69.37250843534329\n - type: nauc_precision_at_1_diff1\n value: 12.099350148694809\n - type: nauc_precision_at_1_max\n value: 53.75041304108387\n - type: nauc_precision_at_1_std\n value: 68.84018063663402\n - type: nauc_precision_at_20_diff1\n value: -18.2422222283388\n - type: nauc_precision_at_20_max\n value: 59.517328129343696\n - type: nauc_precision_at_20_std\n value: 72.05149307342747\n - type: nauc_precision_at_3_diff1\n value: -10.226547543075897\n - type: nauc_precision_at_3_max\n value: 43.14684818832875\n - type: nauc_precision_at_3_std\n value: 57.31936467418288\n - type: nauc_precision_at_5_diff1\n value: -14.28521589468673\n - type: nauc_precision_at_5_max\n value: 41.633426753962596\n - type: nauc_precision_at_5_std\n value: 64.94400576804541\n - type: nauc_recall_at_1000_diff1\n value: -0.9648831207497152\n - type: nauc_recall_at_1000_max\n value: 31.70832946085005\n - type: nauc_recall_at_1000_std\n value: 63.21471613968869\n - type: nauc_recall_at_100_diff1\n value: -1.360254380933586\n - type: nauc_recall_at_100_max\n value: 25.960597782099605\n - type: nauc_recall_at_100_std\n value: 51.52757589609674\n - type: nauc_recall_at_10_diff1\n value: -0.3899439424189566\n - type: nauc_recall_at_10_max\n value: 5.094341897886072\n - type: nauc_recall_at_10_std\n value: 11.266045616925698\n - type: nauc_recall_at_1_diff1\n value: 8.269590874255034\n - type: nauc_recall_at_1_max\n value: 3.482498491294516\n - type: nauc_recall_at_1_std\n value: 8.985226819412189\n - type: nauc_recall_at_20_diff1\n value: 6.4797098359254175\n - type: nauc_recall_at_20_max\n value: 
15.663700985336124\n - type: nauc_recall_at_20_std\n value: 17.154099587904913\n - type: nauc_recall_at_3_diff1\n value: 3.7245972450393507\n - type: nauc_recall_at_3_max\n value: 0.4063857187240345\n - type: nauc_recall_at_3_std\n value: 6.641948062821941\n - type: nauc_recall_at_5_diff1\n value: 4.013879477591466\n - type: nauc_recall_at_5_max\n value: -1.4266586618013566\n - type: nauc_recall_at_5_std\n value: 7.311601874411205\n - type: ndcg_at_1\n value: 75.0\n - type: ndcg_at_10\n value: 72.18900000000001\n - type: ndcg_at_100\n value: 54.022999999999996\n - type: ndcg_at_1000\n value: 49.492000000000004\n - type: ndcg_at_20\n value: 68.51\n - type: ndcg_at_3\n value: 73.184\n - type: ndcg_at_5\n value: 72.811\n - type: precision_at_1\n value: 82.0\n - type: precision_at_10\n value: 77.4\n - type: precision_at_100\n value: 55.24\n - type: precision_at_1000\n value: 21.822\n - type: precision_at_20\n value: 73.0\n - type: precision_at_3\n value: 79.333\n - type: precision_at_5\n value: 79.2\n - type: recall_at_1\n value: 0.214\n - type: recall_at_10\n value: 1.9980000000000002\n - type: recall_at_100\n value: 13.328999999999999\n - type: recall_at_1000\n value: 47.204\n - type: recall_at_20\n value: 3.7310000000000003\n - type: recall_at_3\n value: 0.628\n - type: recall_at_5\n value: 1.049\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB CEDRClassification (default)\n revision: c0ba03d058e3e1b2f3fd20518875a4563dd12db4\n split: test\n type: ai-forever/cedr-classification\n metrics:\n - type: accuracy\n value: 47.30605738575983\n - type: f1\n value: 41.26091043925065\n - type: lrap\n value: 72.89452709883206\n - type: main_score\n value: 47.30605738575983\n task:\n type: MultilabelClassification\n - dataset:\n config: ru\n name: MTEB MIRACLReranking (ru)\n revision: 6d1962c527217f8927fca80f890f14f36b2802af\n split: dev\n type: miracl/mmteb-miracl-reranking\n metrics:\n - type: MAP@1(MIRACL)\n value: 20.721999999999998\n - type: 
MAP@10(MIRACL)\n value: 33.900999999999996\n - type: MAP@100(MIRACL)\n value: 36.813\n - type: MAP@1000(MIRACL)\n value: 36.813\n - type: MAP@20(MIRACL)\n value: 35.684\n - type: MAP@3(MIRACL)\n value: 28.141\n - type: MAP@5(MIRACL)\n value: 31.075000000000003\n - type: NDCG@1(MIRACL)\n value: 32.799\n - type: NDCG@10(MIRACL)\n value: 42.065000000000005\n - type: NDCG@100(MIRACL)\n value: 49.730999999999995\n - type: NDCG@1000(MIRACL)\n value: 49.730999999999995\n - type: NDCG@20(MIRACL)\n value: 46.0\n - type: NDCG@3(MIRACL)\n value: 34.481\n - type: NDCG@5(MIRACL)\n value: 37.452999999999996\n - type: P@1(MIRACL)\n value: 32.799\n - type: P@10(MIRACL)\n value: 11.668000000000001\n - type: P@100(MIRACL)\n value: 1.9529999999999998\n - type: P@1000(MIRACL)\n value: 0.19499999999999998\n - type: P@20(MIRACL)\n value: 7.51\n - type: P@3(MIRACL)\n value: 20.823\n - type: P@5(MIRACL)\n value: 16.728\n - type: Recall@1(MIRACL)\n value: 20.721999999999998\n - type: Recall@10(MIRACL)\n value: 54.762\n - type: Recall@100(MIRACL)\n value: 79.952\n - type: Recall@1000(MIRACL)\n value: 79.952\n - type: Recall@20(MIRACL)\n value: 66.26100000000001\n - type: Recall@3(MIRACL)\n value: 34.410000000000004\n - type: Recall@5(MIRACL)\n value: 42.659000000000006\n - type: main_score\n value: 42.065000000000005\n - type: nAUC_MAP@1000_diff1(MIRACL)\n value: 14.33534992502818\n - type: nAUC_MAP@1000_max(MIRACL)\n value: 12.367998764646115\n - type: nAUC_MAP@1000_std(MIRACL)\n value: 4.569686002935006\n - type: nAUC_MAP@100_diff1(MIRACL)\n value: 14.33534992502818\n - type: nAUC_MAP@100_max(MIRACL)\n value: 12.367998764646115\n - type: nAUC_MAP@100_std(MIRACL)\n value: 4.569686002935006\n - type: nAUC_MAP@10_diff1(MIRACL)\n value: 16.920323975680027\n - type: nAUC_MAP@10_max(MIRACL)\n value: 9.327171297204082\n - type: nAUC_MAP@10_std(MIRACL)\n value: 3.2039133783079015\n - type: nAUC_MAP@1_diff1(MIRACL)\n value: 28.698973487482206\n - type: nAUC_MAP@1_max(MIRACL)\n value: 
2.9217687660885034\n - type: nAUC_MAP@1_std(MIRACL)\n value: -1.1247408800976524\n - type: nAUC_MAP@20_diff1(MIRACL)\n value: 15.359083081640476\n - type: nAUC_MAP@20_max(MIRACL)\n value: 11.310494233946345\n - type: nAUC_MAP@20_std(MIRACL)\n value: 4.4171898386022885\n - type: nAUC_MAP@3_diff1(MIRACL)\n value: 22.27430591851617\n - type: nAUC_MAP@3_max(MIRACL)\n value: 6.407438291284658\n - type: nAUC_MAP@3_std(MIRACL)\n value: 0.9799184530397409\n - type: nAUC_MAP@5_diff1(MIRACL)\n value: 19.20571689941054\n - type: nAUC_MAP@5_max(MIRACL)\n value: 7.987468654026893\n - type: nAUC_MAP@5_std(MIRACL)\n value: 1.8324246565938962\n - type: nAUC_NDCG@1000_diff1(MIRACL)\n value: 3.7537669018914768\n - type: nAUC_NDCG@1000_max(MIRACL)\n value: 20.7944707840533\n - type: nAUC_NDCG@1000_std(MIRACL)\n value: 8.444837055303063\n - type: nAUC_NDCG@100_diff1(MIRACL)\n value: 3.7537669018914768\n - type: nAUC_NDCG@100_max(MIRACL)\n value: 20.7944707840533\n - type: nAUC_NDCG@100_std(MIRACL)\n value: 8.444837055303063\n - type: nAUC_NDCG@10_diff1(MIRACL)\n value: 10.829575656103888\n - type: nAUC_NDCG@10_max(MIRACL)\n value: 13.0445496498929\n - type: nAUC_NDCG@10_std(MIRACL)\n value: 6.050412212625362\n - type: nAUC_NDCG@1_diff1(MIRACL)\n value: 19.1388712233292\n - type: nAUC_NDCG@1_max(MIRACL)\n value: 10.871900994781642\n - type: nAUC_NDCG@1_std(MIRACL)\n value: 3.218568248751811\n - type: nAUC_NDCG@20_diff1(MIRACL)\n value: 7.093172181746442\n - type: nAUC_NDCG@20_max(MIRACL)\n value: 16.955238078958836\n - type: nAUC_NDCG@20_std(MIRACL)\n value: 8.325656379573035\n - type: nAUC_NDCG@3_diff1(MIRACL)\n value: 17.134437303330802\n - type: nAUC_NDCG@3_max(MIRACL)\n value: 10.235328822955793\n - type: nAUC_NDCG@3_std(MIRACL)\n value: 3.2341358691084814\n - type: nAUC_NDCG@5_diff1(MIRACL)\n value: 14.733664618337636\n - type: nAUC_NDCG@5_max(MIRACL)\n value: 11.181897412035282\n - type: nAUC_NDCG@5_std(MIRACL)\n value: 3.642277088791985\n - type: nAUC_P@1000_diff1(MIRACL)\n 
value: -26.330038284867573\n - type: nAUC_P@1000_max(MIRACL)\n value: 28.450694137240458\n - type: nAUC_P@1000_std(MIRACL)\n value: 9.892993775474912\n - type: nAUC_P@100_diff1(MIRACL)\n value: -26.330038284867552\n - type: nAUC_P@100_max(MIRACL)\n value: 28.45069413724051\n - type: nAUC_P@100_std(MIRACL)\n value: 9.892993775474928\n - type: nAUC_P@10_diff1(MIRACL)\n value: -17.436937353231112\n - type: nAUC_P@10_max(MIRACL)\n value: 24.327018012947857\n - type: nAUC_P@10_std(MIRACL)\n value: 11.78803527706634\n - type: nAUC_P@1_diff1(MIRACL)\n value: 19.1388712233292\n - type: nAUC_P@1_max(MIRACL)\n value: 10.871900994781642\n - type: nAUC_P@1_std(MIRACL)\n value: 3.218568248751811\n - type: nAUC_P@20_diff1(MIRACL)\n value: -22.947528755272426\n - type: nAUC_P@20_max(MIRACL)\n value: 27.773093471902538\n - type: nAUC_P@20_std(MIRACL)\n value: 14.898619107087221\n - type: nAUC_P@3_diff1(MIRACL)\n value: 1.4100426412400944\n - type: nAUC_P@3_max(MIRACL)\n value: 17.397472872058845\n - type: nAUC_P@3_std(MIRACL)\n value: 8.240008229861875\n - type: nAUC_P@5_diff1(MIRACL)\n value: -7.971349332207021\n - type: nAUC_P@5_max(MIRACL)\n value: 22.198441167940963\n - type: nAUC_P@5_std(MIRACL)\n value: 9.00265164460082\n - type: nAUC_Recall@1000_diff1(MIRACL)\n value: -38.69835271863148\n - type: nAUC_Recall@1000_max(MIRACL)\n value: 50.9545152809108\n - type: nAUC_Recall@1000_std(MIRACL)\n value: 20.44270887092116\n - type: nAUC_Recall@100_diff1(MIRACL)\n value: -38.69835271863148\n - type: nAUC_Recall@100_max(MIRACL)\n value: 50.9545152809108\n - type: nAUC_Recall@100_std(MIRACL)\n value: 20.44270887092116\n - type: nAUC_Recall@10_diff1(MIRACL)\n value: -0.08109036309433801\n - type: nAUC_Recall@10_max(MIRACL)\n value: 12.696619907773568\n - type: nAUC_Recall@10_std(MIRACL)\n value: 8.791982704261589\n - type: nAUC_Recall@1_diff1(MIRACL)\n value: 28.698973487482206\n - type: nAUC_Recall@1_max(MIRACL)\n value: 2.9217687660885034\n - type: nAUC_Recall@1_std(MIRACL)\n value: 
-1.1247408800976524\n - type: nAUC_Recall@20_diff1(MIRACL)\n value: -13.312171017942623\n - type: nAUC_Recall@20_max(MIRACL)\n value: 24.19847346821666\n - type: nAUC_Recall@20_std(MIRACL)\n value: 15.8157702609797\n - type: nAUC_Recall@3_diff1(MIRACL)\n value: 16.909128321353343\n - type: nAUC_Recall@3_max(MIRACL)\n value: 6.552122731902991\n - type: nAUC_Recall@3_std(MIRACL)\n value: 1.9963898223457228\n - type: nAUC_Recall@5_diff1(MIRACL)\n value: 9.990292655247721\n - type: nAUC_Recall@5_max(MIRACL)\n value: 9.361722273507574\n - type: nAUC_Recall@5_std(MIRACL)\n value: 3.270918827854495\n task:\n type: Reranking\n \n - dataset:\n config: default\n name: MTEB SensitiveTopicsClassification (default)\n revision: 416b34a802308eac30e4192afc0ff99bb8dcc7f2\n split: test\n type: ai-forever/sensitive-topics-classification\n metrics:\n - type: accuracy\n value: 30.634765625\n - type: f1\n value: 32.647559808678665\n - type: lrap\n value: 45.94319661458259\n - type: main_score\n value: 30.634765625\n task:\n type: MultilabelClassification\n - dataset:\n config: default\n name: MTEB ATEC (default)\n revision: 0f319b1142f28d00e055a6770f3f726ae9b7d865\n split: test\n type: C-MTEB/ATEC\n metrics:\n - type: cosine_pearson\n value: 47.541497334563296\n - type: cosine_spearman\n value: 49.06268944206629\n - type: euclidean_pearson\n value: 51.838926748581635\n - type: euclidean_spearman\n value: 48.930697157135356\n - type: main_score\n value: 49.06268944206629\n - type: manhattan_pearson\n value: 51.835306769406365\n - type: manhattan_spearman\n value: 48.86135493444834\n - type: pearson\n value: 47.541497334563296\n - type: spearman\n value: 49.06268944206629\n task:\n type: STS\n - dataset:\n config: default\n name: MTEB AllegroReviews (default)\n revision: b89853e6de927b0e3bfa8ecc0e56fe4e02ceafc6\n split: test\n type: PL-MTEB/allegro-reviews\n metrics:\n - type: accuracy\n value: 49.51292246520874\n - type: f1\n value: 44.14350234332397\n - type: f1_weighted\n value: 
51.65508998354552\n - type: main_score\n value: 49.51292246520874\n task:\n type: Classification\n - dataset:\n config: default\n name: MTEB AlloProfClusteringP2P (default)\n revision: 392ba3f5bcc8c51f578786c1fc3dae648662cb9b\n split: test\n type: lyon-nlp/alloprof\n metrics:\n - type: main_score\n value: 63.883383458621665\n - type: v_measure\n value: 63.883383458621665\n - type: v_measure_std\n value: 2.693666879958465\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB 8TagsClustering\n revision: None\n split: test\n type: PL-MTEB/8tags-clustering\n metrics:\n - type: v_measure\n value: 43.657212124525546\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB AlloProfClusteringS2S (default)\n revision: 392ba3f5bcc8c51f578786c1fc3dae648662cb9b\n split: test\n type: lyon-nlp/alloprof\n metrics:\n - type: main_score\n value: 46.85924588755251\n - type: v_measure\n value: 46.85924588755251\n - type: v_measure_std\n value: 2.1918258880872377\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB AlloprofReranking (default)\n revision: e40c8a63ce02da43200eccb5b0846fcaa888f562\n split: test\n type: lyon-nlp/mteb-fr-reranking-alloprof-s2p\n metrics:\n - type: map\n value: 66.39013753839347\n - type: mrr\n value: 67.68045617786551\n - type: main_score\n value: 66.39013753839347\n task:\n type: Reranking\n - dataset:\n config: default\n name: MTEB AlloprofRetrieval (default)\n revision: fcf295ea64c750f41fadbaa37b9b861558e1bfbd\n split: test\n type: lyon-nlp/alloprof\n metrics:\n - type: main_score\n value: 54.284\n - type: map_at_1\n value: 37.047000000000004\n - type: map_at_10\n value: 48.53\n - type: map_at_100\n value: 49.357\n - type: map_at_1000\n value: 49.39\n - type: map_at_20\n value: 49.064\n - type: map_at_3\n value: 45.675\n - type: map_at_5\n value: 47.441\n - type: mrr_at_1\n value: 37.04663212435233\n - type: mrr_at_10\n value: 48.5300326232969\n - type: mrr_at_100\n value: 49.35708199037581\n - type: 
mrr_at_1000\n value: 49.39005824603193\n - type: mrr_at_20\n value: 49.06417416464799\n - type: mrr_at_3\n value: 45.67501439263105\n - type: mrr_at_5\n value: 47.44099021301103\n - type: nauc_map_at_1000_diff1\n value: 43.32474221868009\n - type: nauc_map_at_1000_max\n value: 39.407334029058575\n - type: nauc_map_at_1000_std\n value: -2.3728154448932606\n - type: nauc_map_at_100_diff1\n value: 43.32336300929909\n - type: nauc_map_at_100_max\n value: 39.432174777554835\n - type: nauc_map_at_100_std\n value: -2.356396922384349\n - type: nauc_map_at_10_diff1\n value: 43.1606520154482\n - type: nauc_map_at_10_max\n value: 39.33734650558226\n - type: nauc_map_at_10_std\n value: -2.5156222475075256\n - type: nauc_map_at_1_diff1\n value: 46.2178975214499\n - type: nauc_map_at_1_max\n value: 36.26173199049361\n - type: nauc_map_at_1_std\n value: -3.0897555582816443\n - type: nauc_map_at_20_diff1\n value: 43.272980702916456\n - type: nauc_map_at_20_max\n value: 39.4896977052276\n - type: nauc_map_at_20_std\n value: -2.3305501742917043\n - type: nauc_map_at_3_diff1\n value: 43.49525042967079\n - type: nauc_map_at_3_max\n value: 38.66352501824728\n - type: nauc_map_at_3_std\n value: -3.202794391620473\n - type: nauc_map_at_5_diff1\n value: 43.2266692546611\n - type: nauc_map_at_5_max\n value: 38.77368661115743\n - type: nauc_map_at_5_std\n value: -3.0897532130127954\n - type: nauc_mrr_at_1000_diff1\n value: 43.32474221868009\n - type: nauc_mrr_at_1000_max\n value: 39.407334029058575\n - type: nauc_mrr_at_1000_std\n value: -2.3728154448932606\n - type: nauc_mrr_at_100_diff1\n value: 43.32336300929909\n - type: nauc_mrr_at_100_max\n value: 39.432174777554835\n - type: nauc_mrr_at_100_std\n value: -2.356396922384349\n - type: nauc_mrr_at_10_diff1\n value: 43.1606520154482\n - type: nauc_mrr_at_10_max\n value: 39.33734650558226\n - type: nauc_mrr_at_10_std\n value: -2.5156222475075256\n - type: nauc_mrr_at_1_diff1\n value: 46.2178975214499\n - type: nauc_mrr_at_1_max\n value: 
36.26173199049361\n - type: nauc_mrr_at_1_std\n value: -3.0897555582816443\n - type: nauc_mrr_at_20_diff1\n value: 43.272980702916456\n - type: nauc_mrr_at_20_max\n value: 39.4896977052276\n - type: nauc_mrr_at_20_std\n value: -2.3305501742917043\n - type: nauc_mrr_at_3_diff1\n value: 43.49525042967079\n - type: nauc_mrr_at_3_max\n value: 38.66352501824728\n - type: nauc_mrr_at_3_std\n value: -3.202794391620473\n - type: nauc_mrr_at_5_diff1\n value: 43.2266692546611\n - type: nauc_mrr_at_5_max\n value: 38.77368661115743\n - type: nauc_mrr_at_5_std\n value: -3.0897532130127954\n - type: nauc_ndcg_at_1000_diff1\n value: 43.01903168202974\n - type: nauc_ndcg_at_1000_max\n value: 40.75496622942232\n - type: nauc_ndcg_at_1000_std\n value: -1.3150412981845496\n - type: nauc_ndcg_at_100_diff1\n value: 42.98016493758145\n - type: nauc_ndcg_at_100_max\n value: 41.55869635162325\n - type: nauc_ndcg_at_100_std\n value: -0.5355252976886055\n - type: nauc_ndcg_at_10_diff1\n value: 42.218755211347506\n - type: nauc_ndcg_at_10_max\n value: 41.305042275175765\n - type: nauc_ndcg_at_10_std\n value: -1.4034484444573714\n - type: nauc_ndcg_at_1_diff1\n value: 46.2178975214499\n - type: nauc_ndcg_at_1_max\n value: 36.26173199049361\n - type: nauc_ndcg_at_1_std\n value: -3.0897555582816443\n - type: nauc_ndcg_at_20_diff1\n value: 42.66574440095576\n - type: nauc_ndcg_at_20_max\n value: 42.014620115124515\n - type: nauc_ndcg_at_20_std\n value: -0.5176162553751498\n - type: nauc_ndcg_at_3_diff1\n value: 42.837450505106055\n - type: nauc_ndcg_at_3_max\n value: 39.525369733082414\n - type: nauc_ndcg_at_3_std\n value: -3.1605948245795155\n - type: nauc_ndcg_at_5_diff1\n value: 42.37951815451173\n - type: nauc_ndcg_at_5_max\n value: 39.78840132935179\n - type: nauc_ndcg_at_5_std\n value: -2.936898430768135\n - type: nauc_precision_at_1000_diff1\n value: 49.69224988612385\n - type: nauc_precision_at_1000_max\n value: 79.57897547128005\n - type: nauc_precision_at_1000_std\n value: 
45.040371354764645\n - type: nauc_precision_at_100_diff1\n value: 42.70597486048422\n - type: nauc_precision_at_100_max\n value: 65.74628759606188\n - type: nauc_precision_at_100_std\n value: 25.49157745244855\n - type: nauc_precision_at_10_diff1\n value: 38.565609931689345\n - type: nauc_precision_at_10_max\n value: 50.0239696180852\n - type: nauc_precision_at_10_std\n value: 3.976354829503967\n - type: nauc_precision_at_1_diff1\n value: 46.2178975214499\n - type: nauc_precision_at_1_max\n value: 36.26173199049361\n - type: nauc_precision_at_1_std\n value: -3.0897555582816443\n - type: nauc_precision_at_20_diff1\n value: 40.4134718566864\n - type: nauc_precision_at_20_max\n value: 57.121778108665374\n - type: nauc_precision_at_20_std\n value: 11.46021975428544\n - type: nauc_precision_at_3_diff1\n value: 40.90538379461529\n - type: nauc_precision_at_3_max\n value: 42.18393248057992\n - type: nauc_precision_at_3_std\n value: -3.005249943837297\n - type: nauc_precision_at_5_diff1\n value: 39.60162965860782\n - type: nauc_precision_at_5_max\n value: 43.28317158174058\n - type: nauc_precision_at_5_std\n value: -2.3469094487738054\n - type: nauc_recall_at_1000_diff1\n value: 49.69224988612252\n - type: nauc_recall_at_1000_max\n value: 79.57897547127862\n - type: nauc_recall_at_1000_std\n value: 45.04037135476256\n - type: nauc_recall_at_100_diff1\n value: 42.70597486048432\n - type: nauc_recall_at_100_max\n value: 65.74628759606213\n - type: nauc_recall_at_100_std\n value: 25.491577452448727\n - type: nauc_recall_at_10_diff1\n value: 38.56560993168935\n - type: nauc_recall_at_10_max\n value: 50.02396961808522\n - type: nauc_recall_at_10_std\n value: 3.9763548295040314\n - type: nauc_recall_at_1_diff1\n value: 46.2178975214499\n - type: nauc_recall_at_1_max\n value: 36.26173199049361\n - type: nauc_recall_at_1_std\n value: -3.0897555582816443\n - type: nauc_recall_at_20_diff1\n value: 40.41347185668637\n - type: nauc_recall_at_20_max\n value: 57.12177810866533\n - type: 
nauc_recall_at_20_std\n value: 11.460219754285431\n - type: nauc_recall_at_3_diff1\n value: 40.90538379461527\n - type: nauc_recall_at_3_max\n value: 42.18393248057989\n - type: nauc_recall_at_3_std\n value: -3.005249943837297\n - type: nauc_recall_at_5_diff1\n value: 39.601629658607784\n - type: nauc_recall_at_5_max\n value: 43.28317158174053\n - type: nauc_recall_at_5_std\n value: -2.3469094487738054\n - type: ndcg_at_1\n value: 37.047000000000004\n - type: ndcg_at_10\n value: 54.284\n - type: ndcg_at_100\n value: 58.34\n - type: ndcg_at_1000\n value: 59.303\n - type: ndcg_at_20\n value: 56.235\n - type: ndcg_at_3\n value: 48.503\n - type: ndcg_at_5\n value: 51.686\n - type: precision_at_1\n value: 37.047000000000004\n - type: precision_at_10\n value: 7.237\n - type: precision_at_100\n value: 0.914\n - type: precision_at_1000\n value: 0.099\n - type: precision_at_20\n value: 4.005\n - type: precision_at_3\n value: 18.898\n - type: precision_at_5\n value: 12.884\n - type: recall_at_1\n value: 37.047000000000004\n - type: recall_at_10\n value: 72.366\n - type: recall_at_100\n value: 91.408\n - type: recall_at_1000\n value: 99.136\n - type: recall_at_20\n value: 80.095\n - type: recall_at_3\n value: 56.693000000000005\n - type: recall_at_5\n value: 64.42099999999999\n task:\n type: Retrieval\n - dataset:\n config: en\n name: MTEB AmazonCounterfactualClassification (en)\n revision: e8379541af4e31359cca9fbcf4b00f2671dba205\n split: test\n type: mteb/amazon_counterfactual\n metrics:\n - type: accuracy\n value: 89.49253731343283\n - type: ap\n value: 61.88098616359918\n - type: ap_weighted\n value: 61.88098616359918\n - type: f1\n value: 84.76516623679144\n - type: f1_weighted\n value: 89.92745276292968\n - type: main_score\n value: 89.49253731343283\n task:\n type: Classification\n - dataset:\n config: de\n name: MTEB AmazonCounterfactualClassification (de)\n revision: e8379541af4e31359cca9fbcf4b00f2671dba205\n split: test\n type: mteb/amazon_counterfactual\n 
metrics:\n - type: accuracy\n value: 89.61456102783727\n - type: ap\n value: 93.11816566733742\n - type: ap_weighted\n value: 93.11816566733742\n - type: f1\n value: 88.27635757733722\n - type: f1_weighted\n value: 89.82581568285453\n - type: main_score\n value: 89.61456102783727\n task:\n type: Classification\n - dataset:\n config: default\n name: MTEB AmazonPolarityClassification (default)\n revision: e2d317d38cd51312af73b3d32a06d1a08b442046\n split: test\n type: mteb/amazon_polarity\n metrics:\n - type: accuracy\n value: 95.3825\n - type: ap\n value: 93.393033869502\n - type: ap_weighted\n value: 93.393033869502\n - type: f1\n value: 95.38109007966307\n - type: f1_weighted\n value: 95.38109007966305\n - type: main_score\n value: 95.3825\n task:\n type: Classification\n - dataset:\n config: en\n name: MTEB AmazonReviewsClassification (en)\n revision: 1399c76144fd37290681b995c656ef9b2e06e26d\n split: test\n type: mteb/amazon_reviews_multi\n metrics:\n - type: accuracy\n value: 49.768\n - type: f1\n value: 48.95084821944411\n - type: f1_weighted\n value: 48.9508482194441\n - type: main_score\n value: 49.768\n task:\n type: Classification\n - dataset:\n config: de\n name: MTEB AmazonReviewsClassification (de)\n revision: 1399c76144fd37290681b995c656ef9b2e06e26d\n split: test\n type: mteb/amazon_reviews_multi\n metrics:\n - type: accuracy\n value: 48.071999999999996\n - type: f1\n value: 47.24171107487612\n - type: f1_weighted\n value: 47.24171107487612\n - type: main_score\n value: 48.071999999999996\n task:\n type: Classification\n - dataset:\n config: es\n name: MTEB AmazonReviewsClassification (es)\n revision: 1399c76144fd37290681b995c656ef9b2e06e26d\n split: test\n type: mteb/amazon_reviews_multi\n metrics:\n - type: accuracy\n value: 48.102000000000004\n - type: f1\n value: 47.27193805278696\n - type: f1_weighted\n value: 47.27193805278696\n - type: main_score\n value: 48.102000000000004\n task:\n type: Classification\n - dataset:\n config: fr\n name: MTEB 
AmazonReviewsClassification (fr)\n revision: 1399c76144fd37290681b995c656ef9b2e06e26d\n split: test\n type: mteb/amazon_reviews_multi\n metrics:\n - type: accuracy\n value: 47.30800000000001\n - type: f1\n value: 46.41683358017851\n - type: f1_weighted\n value: 46.41683358017851\n - type: main_score\n value: 47.30800000000001\n task:\n type: Classification\n - dataset:\n config: zh\n name: MTEB AmazonReviewsClassification (zh)\n revision: 1399c76144fd37290681b995c656ef9b2e06e26d\n split: test\n type: mteb/amazon_reviews_multi\n metrics:\n - type: accuracy\n value: 44.944\n - type: f1\n value: 44.223824487744395\n - type: f1_weighted\n value: 44.22382448774439\n - type: main_score\n value: 44.944\n task:\n type: Classification\n - dataset:\n config: default\n name: MTEB ArguAna (default)\n revision: c22ab2a51041ffd869aaddef7af8d8215647e41a\n split: test\n type: mteb/arguana\n metrics:\n - type: map_at_1\n value: 29.232000000000003\n - type: map_at_10\n value: 45.117000000000004\n - type: map_at_100\n value: 45.977000000000004\n - type: map_at_1000\n value: 45.98\n - type: map_at_20\n value: 45.815\n - type: map_at_3\n value: 39.912\n - type: map_at_5\n value: 42.693\n - type: mrr_at_1\n value: 29.659000000000002\n - type: mrr_at_10\n value: 45.253\n - type: mrr_at_100\n value: 46.125\n - type: mrr_at_1000\n value: 46.129\n - type: mrr_at_20\n value: 45.964\n - type: mrr_at_3\n value: 40.043\n - type: mrr_at_5\n value: 42.870000000000005\n - type: ndcg_at_1\n value: 29.232000000000003\n - type: ndcg_at_10\n value: 54.327999999999996\n - type: ndcg_at_100\n value: 57.86\n - type: ndcg_at_1000\n value: 57.935\n - type: ndcg_at_20\n value: 56.794\n - type: ndcg_at_3\n value: 43.516\n - type: ndcg_at_5\n value: 48.512\n - type: precision_at_1\n value: 29.232000000000003\n - type: precision_at_10\n value: 8.393\n - type: precision_at_100\n value: 0.991\n - type: precision_at_1000\n value: 0.1\n - type: precision_at_20\n value: 4.676\n - type: precision_at_3\n value: 
17.994\n - type: precision_at_5\n value: 13.215\n - type: recall_at_1\n value: 29.232000000000003\n - type: recall_at_10\n value: 83.926\n - type: recall_at_100\n value: 99.075\n - type: recall_at_1000\n value: 99.644\n - type: recall_at_20\n value: 93.528\n - type: recall_at_3\n value: 53.983000000000004\n - type: recall_at_5\n value: 66.074\n - type: main_score\n value: 54.327999999999996\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB ArxivClusteringP2P (default)\n revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d\n split: test\n type: mteb/arxiv-clustering-p2p\n metrics:\n - type: main_score\n value: 46.6636824632419\n - type: v_measure\n value: 46.6636824632419\n - type: v_measure_std\n value: 13.817129140714963\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB ArxivClusteringS2S (default)\n revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53\n split: test\n type: mteb/arxiv-clustering-s2s\n metrics:\n - type: main_score\n value: 39.271141892800024\n - type: v_measure\n value: 39.271141892800024\n - type: v_measure_std\n value: 14.276782483454827\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB AskUbuntuDupQuestions (default)\n revision: 2000358ca161889fa9c082cb41daa8dcfb161a54\n split: test\n type: mteb/askubuntudupquestions-reranking\n metrics:\n - type: map\n value: 65.04363277324629\n - type: mrr\n value: 78.2372598162072\n - type: main_score\n value: 65.04363277324629\n task:\n type: Reranking\n - dataset:\n config: default\n name: MTEB MindSmallReranking (default)\n revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69\n split: test\n type: mteb/mind_small\n metrics:\n - type: map\n value: 30.83\n - type: main_score\n value: 30.83\n task:\n type: Reranking\n - dataset:\n config: default\n name: MTEB BIOSSES (default)\n revision: d3fb88f8f02e40887cd149695127462bbcf29b4a\n split: test\n type: mteb/biosses-sts\n metrics:\n - type: cosine_pearson\n value: 88.80382082011027\n - type: 
cosine_spearman\n value: 88.68876782169106\n - type: euclidean_pearson\n value: 87.00802890147176\n - type: euclidean_spearman\n value: 87.43211268192712\n - type: main_score\n value: 88.68876782169106\n - type: manhattan_pearson\n value: 87.14062537179474\n - type: manhattan_spearman\n value: 87.59115245033443\n - type: pearson\n value: 88.80382082011027\n - type: spearman\n value: 88.68876782169106\n task:\n type: STS\n - dataset:\n config: default\n name: MTEB BQ (default)\n revision: e3dda5e115e487b39ec7e618c0c6a29137052a55\n split: test\n type: C-MTEB/BQ\n metrics:\n - type: cosine_pearson\n value: 61.588006604878196\n - type: cosine_spearman\n value: 63.20615427154465\n - type: euclidean_pearson\n value: 61.818547092516496\n - type: euclidean_spearman\n value: 63.21558009151778\n - type: main_score\n value: 63.20615427154465\n - type: manhattan_pearson\n value: 61.665588158487616\n - type: manhattan_spearman\n value: 63.051544488238584\n - type: pearson\n value: 61.588006604878196\n - type: spearman\n value: 63.20615427154465\n task:\n type: STS\n - dataset:\n config: default\n name: MTEB BSARDRetrieval (default)\n revision: 5effa1b9b5fa3b0f9e12523e6e43e5f86a6e6d59\n split: test\n type: maastrichtlawtech/bsard\n metrics:\n - type: main_score\n value: 64.414\n - type: map_at_1\n value: 14.865\n - type: map_at_10\n value: 21.605\n - type: map_at_100\n value: 22.762\n - type: map_at_1000\n value: 22.854\n - type: map_at_20\n value: 22.259999999999998\n - type: map_at_3\n value: 20.119999999999997\n - type: map_at_5\n value: 20.931\n - type: mrr_at_1\n value: 14.864864864864865\n - type: mrr_at_10\n value: 21.605176605176606\n - type: mrr_at_100\n value: 22.7622306460065\n - type: mrr_at_1000\n value: 22.85383406410312\n - type: mrr_at_20\n value: 22.259528463088845\n - type: mrr_at_3\n value: 20.12012012012012\n - type: mrr_at_5\n value: 20.930930930930934\n - type: nauc_map_at_1000_diff1\n value: 17.486265968689338\n - type: nauc_map_at_1000_max\n value: 
22.736799291688836\n - type: nauc_map_at_1000_std\n value: 9.831687441977147\n - type: nauc_map_at_100_diff1\n value: 17.50754492049086\n - type: nauc_map_at_100_max\n value: 22.77693662806787\n - type: nauc_map_at_100_std\n value: 9.853899509675395\n - type: nauc_map_at_10_diff1\n value: 17.42133968580952\n - type: nauc_map_at_10_max\n value: 22.45861793882279\n - type: nauc_map_at_10_std\n value: 8.964888472915938\n - type: nauc_map_at_1_diff1\n value: 19.433947086968093\n - type: nauc_map_at_1_max\n value: 24.75657047550517\n - type: nauc_map_at_1_std\n value: 15.122329157218505\n - type: nauc_map_at_20_diff1\n value: 17.429856756008785\n - type: nauc_map_at_20_max\n value: 22.438850987431017\n - type: nauc_map_at_20_std\n value: 9.172746012213558\n - type: nauc_map_at_3_diff1\n value: 18.218182689678475\n - type: nauc_map_at_3_max\n value: 23.57169444088667\n - type: nauc_map_at_3_std\n value: 10.464473559366356\n - type: nauc_map_at_5_diff1\n value: 18.6075342519133\n - type: nauc_map_at_5_max\n value: 23.308845973576673\n - type: nauc_map_at_5_std\n value: 9.364009996445652\n - type: nauc_mrr_at_1000_diff1\n value: 17.486265968689338\n - type: nauc_mrr_at_1000_max\n value: 22.736799291688836\n - type: nauc_mrr_at_1000_std\n value: 9.831687441977147\n - type: nauc_mrr_at_100_diff1\n value: 17.50754492049086\n - type: nauc_mrr_at_100_max\n value: 22.77693662806787\n - type: nauc_mrr_at_100_std\n value: 9.853899509675395\n - type: nauc_mrr_at_10_diff1\n value: 17.42133968580952\n - type: nauc_mrr_at_10_max\n value: 22.45861793882279\n - type: nauc_mrr_at_10_std\n value: 8.964888472915938\n - type: nauc_mrr_at_1_diff1\n value: 19.433947086968093\n - type: nauc_mrr_at_1_max\n value: 24.75657047550517\n - type: nauc_mrr_at_1_std\n value: 15.122329157218505\n - type: nauc_mrr_at_20_diff1\n value: 17.429856756008785\n - type: nauc_mrr_at_20_max\n value: 22.438850987431017\n - type: nauc_mrr_at_20_std\n value: 9.172746012213558\n - type: nauc_mrr_at_3_diff1\n value: 
18.218182689678475\n - type: nauc_mrr_at_3_max\n value: 23.57169444088667\n - type: nauc_mrr_at_3_std\n value: 10.464473559366356\n - type: nauc_mrr_at_5_diff1\n value: 18.6075342519133\n - type: nauc_mrr_at_5_max\n value: 23.308845973576673\n - type: nauc_mrr_at_5_std\n value: 9.364009996445652\n - type: nauc_ndcg_at_1000_diff1\n value: 16.327871824135745\n - type: nauc_ndcg_at_1000_max\n value: 23.308241052911495\n - type: nauc_ndcg_at_1000_std\n value: 11.50905911184097\n - type: nauc_ndcg_at_100_diff1\n value: 16.676226744692773\n - type: nauc_ndcg_at_100_max\n value: 24.323253721240974\n - type: nauc_ndcg_at_100_std\n value: 11.952612443651557\n - type: nauc_ndcg_at_10_diff1\n value: 16.030325121764594\n - type: nauc_ndcg_at_10_max\n value: 21.306799242079542\n - type: nauc_ndcg_at_10_std\n value: 6.63359364302513\n - type: nauc_ndcg_at_1_diff1\n value: 19.433947086968093\n - type: nauc_ndcg_at_1_max\n value: 24.75657047550517\n - type: nauc_ndcg_at_1_std\n value: 15.122329157218505\n - type: nauc_ndcg_at_20_diff1\n value: 16.013173605999857\n - type: nauc_ndcg_at_20_max\n value: 21.607217260736576\n - type: nauc_ndcg_at_20_std\n value: 7.319482417138996\n - type: nauc_ndcg_at_3_diff1\n value: 17.97958548328493\n - type: nauc_ndcg_at_3_max\n value: 23.58346522810145\n - type: nauc_ndcg_at_3_std\n value: 9.392582854708314\n - type: nauc_ndcg_at_5_diff1\n value: 18.734733324685287\n - type: nauc_ndcg_at_5_max\n value: 23.273244317623742\n - type: nauc_ndcg_at_5_std\n value: 7.638611545253834\n - type: nauc_precision_at_1000_diff1\n value: 7.919843339380295\n - type: nauc_precision_at_1000_max\n value: 31.575386234270486\n - type: nauc_precision_at_1000_std\n value: 39.332224386769404\n - type: nauc_precision_at_100_diff1\n value: 15.018050960000052\n - type: nauc_precision_at_100_max\n value: 34.98209513759861\n - type: nauc_precision_at_100_std\n value: 26.970034484359022\n - type: nauc_precision_at_10_diff1\n value: 12.102191084210922\n - type: 
nauc_precision_at_10_max\n value: 18.112541150340675\n - type: nauc_precision_at_10_std\n value: 0.7358784689406018\n - type: nauc_precision_at_1_diff1\n value: 19.433947086968093\n - type: nauc_precision_at_1_max\n value: 24.75657047550517\n - type: nauc_precision_at_1_std\n value: 15.122329157218505\n - type: nauc_precision_at_20_diff1\n value: 12.018814361204328\n - type: nauc_precision_at_20_max\n value: 19.75123746049928\n - type: nauc_precision_at_20_std\n value: 3.012204650582264\n - type: nauc_precision_at_3_diff1\n value: 17.41375604940955\n - type: nauc_precision_at_3_max\n value: 23.699834627021037\n - type: nauc_precision_at_3_std\n value: 6.793486779050103\n - type: nauc_precision_at_5_diff1\n value: 19.194631963780257\n - type: nauc_precision_at_5_max\n value: 23.31708702442155\n - type: nauc_precision_at_5_std\n value: 3.4591358279667332\n - type: nauc_recall_at_1000_diff1\n value: 7.919843339380378\n - type: nauc_recall_at_1000_max\n value: 31.57538623427063\n - type: nauc_recall_at_1000_std\n value: 39.332224386769546\n - type: nauc_recall_at_100_diff1\n value: 15.018050960000085\n - type: nauc_recall_at_100_max\n value: 34.9820951375986\n - type: nauc_recall_at_100_std\n value: 26.97003448435901\n - type: nauc_recall_at_10_diff1\n value: 12.102191084210837\n - type: nauc_recall_at_10_max\n value: 18.112541150340594\n - type: nauc_recall_at_10_std\n value: 0.7358784689405188\n - type: nauc_recall_at_1_diff1\n value: 19.433947086968093\n - type: nauc_recall_at_1_max\n value: 24.75657047550517\n - type: nauc_recall_at_1_std\n value: 15.122329157218505\n - type: nauc_recall_at_20_diff1\n value: 12.01881436120429\n - type: nauc_recall_at_20_max\n value: 19.751237460499222\n - type: nauc_recall_at_20_std\n value: 3.0122046505822135\n - type: nauc_recall_at_3_diff1\n value: 17.413756049409503\n - type: nauc_recall_at_3_max\n value: 23.699834627020998\n - type: nauc_recall_at_3_std\n value: 6.793486779050083\n - type: nauc_recall_at_5_diff1\n value: 
19.194631963780203\n - type: nauc_recall_at_5_max\n value: 23.3170870244215\n - type: nauc_recall_at_5_std\n value: 3.459135827966664\n - type: ndcg_at_1\n value: 14.865\n - type: ndcg_at_10\n value: 24.764\n - type: ndcg_at_100\n value: 30.861\n - type: ndcg_at_1000\n value: 33.628\n - type: ndcg_at_20\n value: 27.078000000000003\n - type: ndcg_at_3\n value: 21.675\n - type: ndcg_at_5\n value: 23.148\n - type: precision_at_1\n value: 14.865\n - type: precision_at_10\n value: 3.4680000000000004\n - type: precision_at_100\n value: 0.644\n - type: precision_at_1000\n value: 0.087\n - type: precision_at_20\n value: 2.185\n - type: precision_at_3\n value: 8.709\n - type: precision_at_5\n value: 5.946\n - type: recall_at_1\n value: 14.865\n - type: recall_at_10\n value: 34.685\n - type: recall_at_100\n value: 64.414\n - type: recall_at_1000\n value: 86.937\n - type: recall_at_20\n value: 43.694\n - type: recall_at_3\n value: 26.125999999999998\n - type: recall_at_5\n value: 29.73\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB Banking77Classification (default)\n revision: 0fd18e25b25c072e09e0d92ab615fda904d66300\n split: test\n type: mteb/banking77\n metrics:\n - type: accuracy\n value: 84.08116883116882\n - type: f1\n value: 84.05587055990273\n - type: f1_weighted\n value: 84.05587055990274\n - type: main_score\n value: 84.08116883116882\n task:\n type: Classification\n - dataset:\n config: default\n name: MTEB BiorxivClusteringP2P (default)\n revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40\n split: test\n type: mteb/biorxiv-clustering-p2p\n metrics:\n - type: main_score\n value: 38.1941007822277\n - type: v_measure\n value: 38.1941007822277\n - type: v_measure_std\n value: 0.7502113547288178\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB BiorxivClusteringS2S (default)\n revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908\n split: test\n type: mteb/biorxiv-clustering-s2s\n metrics:\n - type: main_score\n value: 
34.42075599178318\n - type: v_measure\n value: 34.42075599178318\n - type: v_measure_std\n value: 0.600256720497283\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB BlurbsClusteringP2P (default)\n revision: a2dd5b02a77de3466a3eaa98ae586b5610314496\n split: test\n type: slvnwhrl/blurbs-clustering-p2p\n metrics:\n - type: main_score\n value: 41.634627363047265\n - type: v_measure\n value: 41.634627363047265\n - type: v_measure_std\n value: 9.726923191225307\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB BlurbsClusteringS2S (default)\n revision: 22793b6a6465bf00120ad525e38c51210858132c\n split: test\n type: slvnwhrl/blurbs-clustering-s2s\n metrics:\n - type: main_score\n value: 20.996468295584197\n - type: v_measure\n value: 20.996468295584197\n - type: v_measure_std\n value: 9.225766688272197\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB CBD (default)\n revision: 36ddb419bcffe6a5374c3891957912892916f28d\n split: test\n type: PL-MTEB/cbd\n metrics:\n - type: accuracy\n value: 69.99\n - type: ap\n value: 22.57826353116948\n - type: ap_weighted\n value: 22.57826353116948\n - type: f1\n value: 59.04574955548393\n - type: f1_weighted\n value: 74.36235022309789\n - type: main_score\n value: 69.99\n task:\n type: Classification\n - dataset:\n config: default\n name: MTEB CDSC-E (default)\n revision: 0a3d4aa409b22f80eb22cbf59b492637637b536d\n split: test\n type: PL-MTEB/cdsce-pairclassification\n metrics:\n - type: cosine_accuracy\n value: 88.7\n - type: cosine_accuracy_threshold\n value: 97.37848043441772\n - type: cosine_ap\n value: 73.0405088928302\n - type: cosine_f1\n value: 63.52201257861635\n - type: cosine_f1_threshold\n value: 96.98888063430786\n - type: cosine_precision\n value: 78.90625\n - type: cosine_recall\n value: 53.1578947368421\n - type: dot_accuracy\n value: 84.89999999999999\n - type: dot_accuracy_threshold\n value: 43603.09753417969\n - type: dot_ap\n value: 56.98157569085279\n - 
type: dot_f1\n value: 57.606490872210955\n - type: dot_f1_threshold\n value: 40406.23779296875\n - type: dot_precision\n value: 46.864686468646866\n - type: dot_recall\n value: 74.73684210526315\n - type: euclidean_accuracy\n value: 88.5\n - type: euclidean_accuracy_threshold\n value: 498.0483055114746\n - type: euclidean_ap\n value: 72.97328234816734\n - type: euclidean_f1\n value: 63.722397476340696\n - type: euclidean_f1_threshold\n value: 508.6186408996582\n - type: euclidean_precision\n value: 79.52755905511812\n - type: euclidean_recall\n value: 53.1578947368421\n - type: main_score\n value: 73.0405088928302\n - type: manhattan_accuracy\n value: 88.6\n - type: manhattan_accuracy_threshold\n value: 12233.079528808594\n - type: manhattan_ap\n value: 72.92148503992615\n - type: manhattan_f1\n value: 63.69426751592356\n - type: manhattan_f1_threshold\n value: 12392.754364013672\n - type: manhattan_precision\n value: 80.64516129032258\n - type: manhattan_recall\n value: 52.63157894736842\n - type: max_accuracy\n value: 88.7\n - type: max_ap\n value: 73.0405088928302\n - type: max_f1\n value: 63.722397476340696\n - type: max_precision\n value: 80.64516129032258\n - type: max_recall\n value: 74.73684210526315\n - type: similarity_accuracy\n value: 88.7\n - type: similarity_accuracy_threshold\n value: 97.37848043441772\n - type: similarity_ap\n value: 73.0405088928302\n - type: similarity_f1\n value: 63.52201257861635\n - type: similarity_f1_threshold\n value: 96.98888063430786\n - type: similarity_precision\n value: 78.90625\n - type: similarity_recall\n value: 53.1578947368421\n task:\n type: PairClassification\n - dataset:\n config: default\n name: MTEB CDSC-R (default)\n revision: 1cd6abbb00df7d14be3dbd76a7dcc64b3a79a7cd\n split: test\n type: PL-MTEB/cdscr-sts\n metrics:\n - type: cosine_pearson\n value: 92.97492495289738\n - type: cosine_spearman\n value: 92.63248098608472\n - type: euclidean_pearson\n value: 92.04712487782031\n - type: euclidean_spearman\n 
value: 92.19679486755008\n - type: main_score\n value: 92.63248098608472\n - type: manhattan_pearson\n value: 92.0101187740438\n - type: manhattan_spearman\n value: 92.20926859332754\n - type: pearson\n value: 92.97492495289738\n - type: spearman\n value: 92.63248098608472\n task:\n type: STS\n - dataset:\n config: default\n name: MTEB CLSClusteringP2P (default)\n revision: 4b6227591c6c1a73bc76b1055f3b7f3588e72476\n split: test\n type: C-MTEB/CLSClusteringP2P\n metrics:\n - type: main_score\n value: 39.96377851800628\n - type: v_measure\n value: 39.96377851800628\n - type: v_measure_std\n value: 0.9793033243093288\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB CLSClusteringS2S (default)\n revision: e458b3f5414b62b7f9f83499ac1f5497ae2e869f\n split: test\n type: C-MTEB/CLSClusteringS2S\n metrics:\n - type: main_score\n value: 38.788850224595784\n - type: v_measure\n value: 38.788850224595784\n - type: v_measure_std\n value: 1.0712604145916924\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB CMedQAv1\n revision: 8d7f1e942507dac42dc58017c1a001c3717da7df\n split: test\n type: C-MTEB/CMedQAv1-reranking\n metrics:\n - type: map\n value: 77.95952507806115\n - type: mrr\n value: 80.8643253968254\n - type: main_score\n value: 77.95952507806115\n task:\n type: Reranking\n - dataset:\n config: default\n name: MTEB CMedQAv2\n revision: 23d186750531a14a0357ca22cd92d712fd512ea0\n split: test\n type: C-MTEB/CMedQAv2-reranking\n metrics:\n - type: map\n value: 78.21522500165045\n - type: mrr\n value: 81.28194444444443\n - type: main_score\n value: 78.21522500165045\n task:\n type: Reranking\n - dataset:\n config: default\n name: MTEB CQADupstackAndroidRetrieval (default)\n revision: f46a197baaae43b4f621051089b82a364682dfeb\n split: test\n type: mteb/cqadupstack-android\n metrics:\n - type: map_at_1\n value: 33.377\n - type: map_at_10\n value: 46.371\n - type: map_at_100\n value: 47.829\n - type: map_at_1000\n value: 47.94\n - type: 
map_at_20\n value: 47.205000000000005\n - type: map_at_3\n value: 42.782\n - type: map_at_5\n value: 44.86\n - type: mrr_at_1\n value: 41.345\n - type: mrr_at_10\n value: 52.187\n - type: mrr_at_100\n value: 52.893\n - type: mrr_at_1000\n value: 52.929\n - type: mrr_at_20\n value: 52.637\n - type: mrr_at_3\n value: 49.714000000000006\n - type: mrr_at_5\n value: 51.373000000000005\n - type: ndcg_at_1\n value: 41.345\n - type: ndcg_at_10\n value: 52.946000000000005\n - type: ndcg_at_100\n value: 57.92699999999999\n - type: ndcg_at_1000\n value: 59.609\n - type: ndcg_at_20\n value: 54.900999999999996\n - type: ndcg_at_3\n value: 48.357\n - type: ndcg_at_5\n value: 50.739000000000004\n - type: precision_at_1\n value: 41.345\n - type: precision_at_10\n value: 10.186\n - type: precision_at_100\n value: 1.554\n - type: precision_at_1000\n value: 0.2\n - type: precision_at_20\n value: 5.959\n - type: precision_at_3\n value: 23.796\n - type: precision_at_5\n value: 17.024\n - type: recall_at_1\n value: 33.377\n - type: recall_at_10\n value: 65.067\n - type: recall_at_100\n value: 86.04899999999999\n - type: recall_at_1000\n value: 96.54899999999999\n - type: recall_at_20\n value: 72.071\n - type: recall_at_3\n value: 51.349999999999994\n - type: recall_at_5\n value: 58.41\n - type: main_score\n value: 52.946000000000005\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB CQADupstackEnglishRetrieval (default)\n revision: ad9991cb51e31e31e430383c75ffb2885547b5f0\n split: test\n type: mteb/cqadupstack-english\n metrics:\n - type: map_at_1\n value: 31.097\n - type: map_at_10\n value: 42.183\n - type: map_at_100\n value: 43.580999999999996\n - type: map_at_1000\n value: 43.718\n - type: map_at_20\n value: 42.921\n - type: map_at_3\n value: 38.963\n - type: map_at_5\n value: 40.815\n - type: mrr_at_1\n value: 39.745000000000005\n - type: mrr_at_10\n value: 48.736000000000004\n - type: mrr_at_100\n value: 49.405\n - type: mrr_at_1000\n value: 49.452\n - type: 
mrr_at_20\n value: 49.118\n - type: mrr_at_3\n value: 46.497\n - type: mrr_at_5\n value: 47.827999999999996\n - type: ndcg_at_1\n value: 39.745000000000005\n - type: ndcg_at_10\n value: 48.248000000000005\n - type: ndcg_at_100\n value: 52.956\n - type: ndcg_at_1000\n value: 54.99699999999999\n - type: ndcg_at_20\n value: 50.01\n - type: ndcg_at_3\n value: 43.946000000000005\n - type: ndcg_at_5\n value: 46.038000000000004\n - type: precision_at_1\n value: 39.745000000000005\n - type: precision_at_10\n value: 9.229\n - type: precision_at_100\n value: 1.5070000000000001\n - type: precision_at_1000\n value: 0.199\n - type: precision_at_20\n value: 5.489999999999999\n - type: precision_at_3\n value: 21.38\n - type: precision_at_5\n value: 15.274\n - type: recall_at_1\n value: 31.097\n - type: recall_at_10\n value: 58.617\n - type: recall_at_100\n value: 78.55199999999999\n - type: recall_at_1000\n value: 91.13900000000001\n - type: recall_at_20\n value: 64.92\n - type: recall_at_3\n value: 45.672000000000004\n - type: recall_at_5\n value: 51.669\n - type: main_score\n value: 48.248000000000005\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB CQADupstackGamingRetrieval (default)\n revision: 4885aa143210c98657558c04aaf3dc47cfb54340\n split: test\n type: mteb/cqadupstack-gaming\n metrics:\n - type: map_at_1\n value: 39.745000000000005\n - type: map_at_10\n value: 52.063\n - type: map_at_100\n value: 53.077\n - type: map_at_1000\n value: 53.13\n - type: map_at_20\n value: 52.66\n - type: map_at_3\n value: 48.662\n - type: map_at_5\n value: 50.507000000000005\n - type: mrr_at_1\n value: 45.391999999999996\n - type: mrr_at_10\n value: 55.528\n - type: mrr_at_100\n value: 56.16100000000001\n - type: mrr_at_1000\n value: 56.192\n - type: mrr_at_20\n value: 55.923\n - type: mrr_at_3\n value: 52.93600000000001\n - type: mrr_at_5\n value: 54.435\n - type: ndcg_at_1\n value: 45.391999999999996\n - type: ndcg_at_10\n value: 58.019\n - type: ndcg_at_100\n value: 
61.936\n - type: ndcg_at_1000\n value: 63.015\n - type: ndcg_at_20\n value: 59.691\n - type: ndcg_at_3\n value: 52.294\n - type: ndcg_at_5\n value: 55.017\n - type: precision_at_1\n value: 45.391999999999996\n - type: precision_at_10\n value: 9.386\n - type: precision_at_100\n value: 1.232\n - type: precision_at_1000\n value: 0.136\n - type: precision_at_20\n value: 5.223\n - type: precision_at_3\n value: 23.177\n - type: precision_at_5\n value: 15.9\n - type: recall_at_1\n value: 39.745000000000005\n - type: recall_at_10\n value: 72.08099999999999\n - type: recall_at_100\n value: 88.85300000000001\n - type: recall_at_1000\n value: 96.569\n - type: recall_at_20\n value: 78.203\n - type: recall_at_3\n value: 56.957\n - type: recall_at_5\n value: 63.63100000000001\n - type: main_score\n value: 58.019\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB CQADupstackGisRetrieval (default)\n revision: 5003b3064772da1887988e05400cf3806fe491f2\n split: test\n type: mteb/cqadupstack-gis\n metrics:\n - type: map_at_1\n value: 26.651999999999997\n - type: map_at_10\n value: 35.799\n - type: map_at_100\n value: 36.846000000000004\n - type: map_at_1000\n value: 36.931000000000004\n - type: map_at_20\n value: 36.341\n - type: map_at_3\n value: 32.999\n - type: map_at_5\n value: 34.597\n - type: mrr_at_1\n value: 28.814\n - type: mrr_at_10\n value: 37.869\n - type: mrr_at_100\n value: 38.728\n - type: mrr_at_1000\n value: 38.795\n - type: mrr_at_20\n value: 38.317\n - type: mrr_at_3\n value: 35.235\n - type: mrr_at_5\n value: 36.738\n - type: ndcg_at_1\n value: 28.814\n - type: ndcg_at_10\n value: 41.028\n - type: ndcg_at_100\n value: 46.162\n - type: ndcg_at_1000\n value: 48.15\n - type: ndcg_at_20\n value: 42.824\n - type: ndcg_at_3\n value: 35.621\n - type: ndcg_at_5\n value: 38.277\n - type: precision_at_1\n value: 28.814\n - type: precision_at_10\n value: 6.361999999999999\n - type: precision_at_100\n value: 0.9450000000000001\n - type: precision_at_1000\n 
value: 0.11399999999999999\n - type: precision_at_20\n value: 3.6159999999999997\n - type: precision_at_3\n value: 15.140999999999998\n - type: precision_at_5\n value: 10.712000000000002\n - type: recall_at_1\n value: 26.651999999999997\n - type: recall_at_10\n value: 55.038\n - type: recall_at_100\n value: 78.806\n - type: recall_at_1000\n value: 93.485\n - type: recall_at_20\n value: 61.742\n - type: recall_at_3\n value: 40.682\n - type: recall_at_5\n value: 46.855000000000004\n - type: main_score\n value: 41.028\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB CQADupstackMathematicaRetrieval (default)\n revision: 90fceea13679c63fe563ded68f3b6f06e50061de\n split: test\n type: mteb/cqadupstack-mathematica\n metrics:\n - type: map_at_1\n value: 17.627000000000002\n - type: map_at_10\n value: 26.436999999999998\n - type: map_at_100\n value: 27.85\n - type: map_at_1000\n value: 27.955999999999996\n - type: map_at_20\n value: 27.233\n - type: map_at_3\n value: 23.777\n - type: map_at_5\n value: 25.122\n - type: mrr_at_1\n value: 22.387999999999998\n - type: mrr_at_10\n value: 31.589\n - type: mrr_at_100\n value: 32.641999999999996\n - type: mrr_at_1000\n value: 32.696999999999996\n - type: mrr_at_20\n value: 32.201\n - type: mrr_at_3\n value: 28.98\n - type: mrr_at_5\n value: 30.342000000000002\n - type: ndcg_at_1\n value: 22.387999999999998\n - type: ndcg_at_10\n value: 32.129999999999995\n - type: ndcg_at_100\n value: 38.562999999999995\n - type: ndcg_at_1000\n value: 40.903\n - type: ndcg_at_20\n value: 34.652\n - type: ndcg_at_3\n value: 27.26\n - type: ndcg_at_5\n value: 29.235\n - type: precision_at_1\n value: 22.387999999999998\n - type: precision_at_10\n value: 5.970000000000001\n - type: precision_at_100\n value: 1.068\n - type: precision_at_1000\n value: 0.13899999999999998\n - type: precision_at_20\n value: 3.6999999999999997\n - type: precision_at_3\n value: 13.267000000000001\n - type: precision_at_5\n value: 9.403\n - type: 
recall_at_1\n value: 17.627000000000002\n - type: recall_at_10\n value: 44.71\n - type: recall_at_100\n value: 72.426\n - type: recall_at_1000\n value: 88.64699999999999\n - type: recall_at_20\n value: 53.65\n - type: recall_at_3\n value: 30.989\n - type: recall_at_5\n value: 36.237\n - type: main_score\n value: 32.129999999999995\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB CQADupstackPhysicsRetrieval (default)\n revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4\n split: test\n type: mteb/cqadupstack-physics\n metrics:\n - type: map_at_1\n value: 30.891000000000002\n - type: map_at_10\n value: 41.519\n - type: map_at_100\n value: 42.896\n - type: map_at_1000\n value: 42.992999999999995\n - type: map_at_20\n value: 42.287\n - type: map_at_3\n value: 37.822\n - type: map_at_5\n value: 39.976\n - type: mrr_at_1\n value: 37.921\n - type: mrr_at_10\n value: 47.260999999999996\n - type: mrr_at_100\n value: 48.044\n - type: mrr_at_1000\n value: 48.08\n - type: mrr_at_20\n value: 47.699999999999996\n - type: mrr_at_3\n value: 44.513999999999996\n - type: mrr_at_5\n value: 46.064\n - type: ndcg_at_1\n value: 37.921\n - type: ndcg_at_10\n value: 47.806\n - type: ndcg_at_100\n value: 53.274\n - type: ndcg_at_1000\n value: 55.021\n - type: ndcg_at_20\n value: 49.973\n - type: ndcg_at_3\n value: 42.046\n - type: ndcg_at_5\n value: 44.835\n - type: precision_at_1\n value: 37.921\n - type: precision_at_10\n value: 8.767999999999999\n - type: precision_at_100\n value: 1.353\n - type: precision_at_1000\n value: 0.168\n - type: precision_at_20\n value: 5.135\n - type: precision_at_3\n value: 20.051\n - type: precision_at_5\n value: 14.398\n - type: recall_at_1\n value: 30.891000000000002\n - type: recall_at_10\n value: 60.897999999999996\n - type: recall_at_100\n value: 83.541\n - type: recall_at_1000\n value: 94.825\n - type: recall_at_20\n value: 68.356\n - type: recall_at_3\n value: 44.65\n - type: recall_at_5\n value: 51.919000000000004\n - type: 
main_score\n value: 47.806\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB CQADupstackProgrammersRetrieval (default)\n revision: 6184bc1440d2dbc7612be22b50686b8826d22b32\n split: test\n type: mteb/cqadupstack-programmers\n metrics:\n - type: map_at_1\n value: 27.654\n - type: map_at_10\n value: 38.025999999999996\n - type: map_at_100\n value: 39.425\n - type: map_at_1000\n value: 39.528\n - type: map_at_20\n value: 38.838\n - type: map_at_3\n value: 34.745\n - type: map_at_5\n value: 36.537\n - type: mrr_at_1\n value: 34.018\n - type: mrr_at_10\n value: 43.314\n - type: mrr_at_100\n value: 44.283\n - type: mrr_at_1000\n value: 44.327\n - type: mrr_at_20\n value: 43.929\n - type: mrr_at_3\n value: 40.868\n - type: mrr_at_5\n value: 42.317\n - type: ndcg_at_1\n value: 34.018\n - type: ndcg_at_10\n value: 43.887\n - type: ndcg_at_100\n value: 49.791000000000004\n - type: ndcg_at_1000\n value: 51.834\n - type: ndcg_at_20\n value: 46.376\n - type: ndcg_at_3\n value: 38.769999999999996\n - type: ndcg_at_5\n value: 41.144\n - type: precision_at_1\n value: 34.018\n - type: precision_at_10\n value: 8.001999999999999\n - type: precision_at_100\n value: 1.2630000000000001\n - type: precision_at_1000\n value: 0.16\n - type: precision_at_20\n value: 4.737\n - type: precision_at_3\n value: 18.417\n - type: precision_at_5\n value: 13.150999999999998\n - type: recall_at_1\n value: 27.654\n - type: recall_at_10\n value: 56.111\n - type: recall_at_100\n value: 81.136\n - type: recall_at_1000\n value: 94.788\n - type: recall_at_20\n value: 65.068\n - type: recall_at_3\n value: 41.713\n - type: recall_at_5\n value: 48.106\n - type: main_score\n value: 43.887\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB CQADupstackRetrieval (default)\n revision: CQADupstackRetrieval_is_a_combined_dataset\n split: test\n type: CQADupstackRetrieval_is_a_combined_dataset\n metrics:\n - type: main_score\n value: 42.58858333333333\n - type: ndcg_at_10\n value: 
42.58858333333333\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB CQADupstackStatsRetrieval (default)\n revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a\n split: test\n type: mteb/cqadupstack-stats\n metrics:\n - type: map_at_1\n value: 24.501\n - type: map_at_10\n value: 32.814\n - type: map_at_100\n value: 33.754\n - type: map_at_1000\n value: 33.859\n - type: map_at_20\n value: 33.324\n - type: map_at_3\n value: 30.758000000000003\n - type: map_at_5\n value: 31.936999999999998\n - type: mrr_at_1\n value: 27.761000000000003\n - type: mrr_at_10\n value: 35.662\n - type: mrr_at_100\n value: 36.443999999999996\n - type: mrr_at_1000\n value: 36.516999999999996\n - type: mrr_at_20\n value: 36.085\n - type: mrr_at_3\n value: 33.742\n - type: mrr_at_5\n value: 34.931\n - type: ndcg_at_1\n value: 27.761000000000003\n - type: ndcg_at_10\n value: 37.208000000000006\n - type: ndcg_at_100\n value: 41.839\n - type: ndcg_at_1000\n value: 44.421\n - type: ndcg_at_20\n value: 38.917\n - type: ndcg_at_3\n value: 33.544000000000004\n - type: ndcg_at_5\n value: 35.374\n - type: precision_at_1\n value: 27.761000000000003\n - type: precision_at_10\n value: 5.92\n - type: precision_at_100\n value: 0.899\n - type: precision_at_1000\n value: 0.12\n - type: precision_at_20\n value: 3.4130000000000003\n - type: precision_at_3\n value: 15.031\n - type: precision_at_5\n value: 10.306999999999999\n - type: recall_at_1\n value: 24.501\n - type: recall_at_10\n value: 47.579\n - type: recall_at_100\n value: 69.045\n - type: recall_at_1000\n value: 88.032\n - type: recall_at_20\n value: 54.125\n - type: recall_at_3\n value: 37.202\n - type: recall_at_5\n value: 41.927\n - type: main_score\n value: 37.208000000000006\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB CQADupstackTexRetrieval (default)\n revision: 46989137a86843e03a6195de44b09deda022eec7\n split: test\n type: mteb/cqadupstack-tex\n metrics:\n - type: map_at_1\n value: 18.29\n - type: 
map_at_10\n value: 26.183\n - type: map_at_100\n value: 27.351999999999997\n - type: map_at_1000\n value: 27.483999999999998\n - type: map_at_20\n value: 26.798\n - type: map_at_3\n value: 23.629\n - type: map_at_5\n value: 24.937\n - type: mrr_at_1\n value: 22.299\n - type: mrr_at_10\n value: 30.189\n - type: mrr_at_100\n value: 31.098\n - type: mrr_at_1000\n value: 31.177\n - type: mrr_at_20\n value: 30.697000000000003\n - type: mrr_at_3\n value: 27.862\n - type: mrr_at_5\n value: 29.066\n - type: ndcg_at_1\n value: 22.299\n - type: ndcg_at_10\n value: 31.202\n - type: ndcg_at_100\n value: 36.617\n - type: ndcg_at_1000\n value: 39.544000000000004\n - type: ndcg_at_20\n value: 33.177\n - type: ndcg_at_3\n value: 26.639000000000003\n - type: ndcg_at_5\n value: 28.526\n - type: precision_at_1\n value: 22.299\n - type: precision_at_10\n value: 5.8020000000000005\n - type: precision_at_100\n value: 1.0070000000000001\n - type: precision_at_1000\n value: 0.14400000000000002\n - type: precision_at_20\n value: 3.505\n - type: precision_at_3\n value: 12.698\n - type: precision_at_5\n value: 9.174\n - type: recall_at_1\n value: 18.29\n - type: recall_at_10\n value: 42.254999999999995\n - type: recall_at_100\n value: 66.60000000000001\n - type: recall_at_1000\n value: 87.31400000000001\n - type: recall_at_20\n value: 49.572\n - type: recall_at_3\n value: 29.342000000000002\n - type: recall_at_5\n value: 34.221000000000004\n - type: main_score\n value: 31.202\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB CQADupstackUnixRetrieval (default)\n revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53\n split: test\n type: mteb/cqadupstack-unix\n metrics:\n - type: map_at_1\n value: 27.722\n - type: map_at_10\n value: 37.698\n - type: map_at_100\n value: 38.899\n - type: map_at_1000\n value: 38.998\n - type: map_at_20\n value: 38.381\n - type: map_at_3\n value: 34.244\n - type: map_at_5\n value: 36.295\n - type: mrr_at_1\n value: 32.183\n - type: mrr_at_10\n 
value: 41.429\n - type: mrr_at_100\n value: 42.308\n - type: mrr_at_1000\n value: 42.358000000000004\n - type: mrr_at_20\n value: 41.957\n - type: mrr_at_3\n value: 38.401999999999994\n - type: mrr_at_5\n value: 40.294999999999995\n - type: ndcg_at_1\n value: 32.183\n - type: ndcg_at_10\n value: 43.519000000000005\n - type: ndcg_at_100\n value: 48.786\n - type: ndcg_at_1000\n value: 50.861999999999995\n - type: ndcg_at_20\n value: 45.654\n - type: ndcg_at_3\n value: 37.521\n - type: ndcg_at_5\n value: 40.615\n - type: precision_at_1\n value: 32.183\n - type: precision_at_10\n value: 7.603\n - type: precision_at_100\n value: 1.135\n - type: precision_at_1000\n value: 0.14200000000000002\n - type: precision_at_20\n value: 4.408\n - type: precision_at_3\n value: 17.071\n - type: precision_at_5\n value: 12.668\n - type: recall_at_1\n value: 27.722\n - type: recall_at_10\n value: 57.230000000000004\n - type: recall_at_100\n value: 79.97999999999999\n - type: recall_at_1000\n value: 94.217\n - type: recall_at_20\n value: 64.864\n - type: recall_at_3\n value: 41.215\n - type: recall_at_5\n value: 48.774\n - type: main_score\n value: 43.519000000000005\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB CQADupstackWebmastersRetrieval (default)\n revision: 160c094312a0e1facb97e55eeddb698c0abe3571\n split: test\n type: mteb/cqadupstack-webmasters\n metrics:\n - type: map_at_1\n value: 25.852999999999998\n - type: map_at_10\n value: 35.394999999999996\n - type: map_at_100\n value: 37.291999999999994\n - type: map_at_1000\n value: 37.495\n - type: map_at_20\n value: 36.372\n - type: map_at_3\n value: 32.336\n - type: map_at_5\n value: 34.159\n - type: mrr_at_1\n value: 31.818\n - type: mrr_at_10\n value: 40.677\n - type: mrr_at_100\n value: 41.728\n - type: mrr_at_1000\n value: 41.778\n - type: mrr_at_20\n value: 41.301\n - type: mrr_at_3\n value: 38.208\n - type: mrr_at_5\n value: 39.592\n - type: ndcg_at_1\n value: 31.818\n - type: ndcg_at_10\n value: 
41.559000000000005\n - type: ndcg_at_100\n value: 48.012\n - type: ndcg_at_1000\n value: 50.234\n - type: ndcg_at_20\n value: 44.15\n - type: ndcg_at_3\n value: 36.918\n - type: ndcg_at_5\n value: 39.227000000000004\n - type: precision_at_1\n value: 31.818\n - type: precision_at_10\n value: 8.043\n - type: precision_at_100\n value: 1.625\n - type: precision_at_1000\n value: 0.245\n - type: precision_at_20\n value: 5.2170000000000005\n - type: precision_at_3\n value: 17.655\n - type: precision_at_5\n value: 12.845999999999998\n - type: recall_at_1\n value: 25.852999999999998\n - type: recall_at_10\n value: 53.093\n - type: recall_at_100\n value: 81.05799999999999\n - type: recall_at_1000\n value: 94.657\n - type: recall_at_20\n value: 62.748000000000005\n - type: recall_at_3\n value: 39.300000000000004\n - type: recall_at_5\n value: 45.754\n - type: main_score\n value: 41.559000000000005\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB CQADupstackWordpressRetrieval (default)\n revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4\n split: test\n type: mteb/cqadupstack-wordpress\n metrics:\n - type: map_at_1\n value: 19.23\n - type: map_at_10\n value: 28.128999999999998\n - type: map_at_100\n value: 29.195\n - type: map_at_1000\n value: 29.310000000000002\n - type: map_at_20\n value: 28.713\n - type: map_at_3\n value: 25.191000000000003\n - type: map_at_5\n value: 26.69\n - type: mrr_at_1\n value: 21.257\n - type: mrr_at_10\n value: 30.253999999999998\n - type: mrr_at_100\n value: 31.195\n - type: mrr_at_1000\n value: 31.270999999999997\n - type: mrr_at_20\n value: 30.747999999999998\n - type: mrr_at_3\n value: 27.633999999999997\n - type: mrr_at_5\n value: 28.937\n - type: ndcg_at_1\n value: 21.257\n - type: ndcg_at_10\n value: 33.511\n - type: ndcg_at_100\n value: 38.733000000000004\n - type: ndcg_at_1000\n value: 41.489\n - type: ndcg_at_20\n value: 35.476\n - type: ndcg_at_3\n value: 27.845\n - type: ndcg_at_5\n value: 30.264999999999997\n - 
type: precision_at_1\n value: 21.257\n - type: precision_at_10\n value: 5.619\n - type: precision_at_100\n value: 0.893\n - type: precision_at_1000\n value: 0.124\n - type: precision_at_20\n value: 3.29\n - type: precision_at_3\n value: 12.508\n - type: precision_at_5\n value: 8.946\n - type: recall_at_1\n value: 19.23\n - type: recall_at_10\n value: 48.185\n - type: recall_at_100\n value: 71.932\n - type: recall_at_1000\n value: 92.587\n - type: recall_at_20\n value: 55.533\n - type: recall_at_3\n value: 32.865\n - type: recall_at_5\n value: 38.577\n - type: main_score\n value: 33.511\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB ClimateFEVER (default)\n revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380\n split: test\n type: mteb/climate-fever\n metrics:\n - type: map_at_1\n value: 19.594\n - type: map_at_10\n value: 32.519\n - type: map_at_100\n value: 34.1\n - type: map_at_1000\n value: 34.263\n - type: map_at_20\n value: 33.353\n - type: map_at_3\n value: 27.898\n - type: map_at_5\n value: 30.524\n - type: mrr_at_1\n value: 46.515\n - type: mrr_at_10\n value: 56.958\n - type: mrr_at_100\n value: 57.54899999999999\n - type: mrr_at_1000\n value: 57.574999999999996\n - type: mrr_at_20\n value: 57.315000000000005\n - type: mrr_at_3\n value: 54.852999999999994\n - type: mrr_at_5\n value: 56.153\n - type: ndcg_at_1\n value: 46.515\n - type: ndcg_at_10\n value: 42.363\n - type: ndcg_at_100\n value: 48.233\n - type: ndcg_at_1000\n value: 50.993\n - type: ndcg_at_20\n value: 44.533\n - type: ndcg_at_3\n value: 37.297000000000004\n - type: ndcg_at_5\n value: 38.911\n - type: precision_at_1\n value: 46.515\n - type: precision_at_10\n value: 12.520999999999999\n - type: precision_at_100\n value: 1.8980000000000001\n - type: precision_at_1000\n value: 0.242\n - type: precision_at_20\n value: 7.212000000000001\n - type: precision_at_3\n value: 27.752\n - type: precision_at_5\n value: 20.391000000000002\n - type: recall_at_1\n value: 19.594\n - type: 
recall_at_10\n value: 46.539\n - type: recall_at_100\n value: 66.782\n - type: recall_at_1000\n value: 82.049\n - type: recall_at_20\n value: 52.611\n - type: recall_at_3\n value: 32.528\n - type: recall_at_5\n value: 38.933\n - type: main_score\n value: 42.363\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB CmedqaRetrieval (default)\n revision: cd540c506dae1cf9e9a59c3e06f42030d54e7301\n split: dev\n type: C-MTEB/CmedqaRetrieval\n metrics:\n - type: main_score\n value: 35.927\n - type: map_at_1\n value: 20.144000000000002\n - type: map_at_10\n value: 29.94\n - type: map_at_100\n value: 31.630000000000003\n - type: map_at_1000\n value: 31.778000000000002\n - type: map_at_20\n value: 30.798\n - type: map_at_3\n value: 26.534999999999997\n - type: map_at_5\n value: 28.33\n - type: mrr_at_1\n value: 31.23280820205051\n - type: mrr_at_10\n value: 38.66781179421835\n - type: mrr_at_100\n value: 39.656936166081785\n - type: mrr_at_1000\n value: 39.724602893117414\n - type: mrr_at_20\n value: 39.21272461558451\n - type: mrr_at_3\n value: 36.30907726931729\n - type: mrr_at_5\n value: 37.59814953738436\n - type: nauc_map_at_1000_diff1\n value: 44.5755334437146\n - type: nauc_map_at_1000_max\n value: 40.726916781400746\n - type: nauc_map_at_1000_std\n value: -19.591835061497367\n - type: nauc_map_at_100_diff1\n value: 44.54542899921038\n - type: nauc_map_at_100_max\n value: 40.68305902532837\n - type: nauc_map_at_100_std\n value: -19.658902089283487\n - type: nauc_map_at_10_diff1\n value: 44.56110529630953\n - type: nauc_map_at_10_max\n value: 39.89826167846008\n - type: nauc_map_at_10_std\n value: -20.62910633667902\n - type: nauc_map_at_1_diff1\n value: 50.82120107004449\n - type: nauc_map_at_1_max\n value: 33.208851367861584\n - type: nauc_map_at_1_std\n value: -20.29409730258174\n - type: nauc_map_at_20_diff1\n value: 44.51171242433788\n - type: nauc_map_at_20_max\n value: 40.30431132782945\n - type: nauc_map_at_20_std\n value: -20.290524142792417\n 
- type: nauc_map_at_3_diff1\n value: 45.80394138665133\n - type: nauc_map_at_3_max\n value: 37.766191281426956\n - type: nauc_map_at_3_std\n value: -21.223601997333876\n - type: nauc_map_at_5_diff1\n value: 45.00457218474283\n - type: nauc_map_at_5_max\n value: 38.901044576388365\n - type: nauc_map_at_5_std\n value: -20.893069613941634\n - type: nauc_mrr_at_1000_diff1\n value: 50.09855359231429\n - type: nauc_mrr_at_1000_max\n value: 46.481000170008826\n - type: nauc_mrr_at_1000_std\n value: -16.053461377096102\n - type: nauc_mrr_at_100_diff1\n value: 50.08205026347746\n - type: nauc_mrr_at_100_max\n value: 46.47262126963331\n - type: nauc_mrr_at_100_std\n value: -16.049112778748693\n - type: nauc_mrr_at_10_diff1\n value: 50.02363239081706\n - type: nauc_mrr_at_10_max\n value: 46.39287859062042\n - type: nauc_mrr_at_10_std\n value: -16.280866744769657\n - type: nauc_mrr_at_1_diff1\n value: 55.692503735317445\n - type: nauc_mrr_at_1_max\n value: 47.334834529801014\n - type: nauc_mrr_at_1_std\n value: -16.985483585693512\n - type: nauc_mrr_at_20_diff1\n value: 50.07725225722074\n - type: nauc_mrr_at_20_max\n value: 46.47279295070193\n - type: nauc_mrr_at_20_std\n value: -16.15168364678318\n - type: nauc_mrr_at_3_diff1\n value: 51.18685337274134\n - type: nauc_mrr_at_3_max\n value: 46.7286365021621\n - type: nauc_mrr_at_3_std\n value: -16.708451287313718\n - type: nauc_mrr_at_5_diff1\n value: 50.46777237893576\n - type: nauc_mrr_at_5_max\n value: 46.5352076502249\n - type: nauc_mrr_at_5_std\n value: -16.557413659905034\n - type: nauc_ndcg_at_1000_diff1\n value: 43.974299434438066\n - type: nauc_ndcg_at_1000_max\n value: 43.44628675071857\n - type: nauc_ndcg_at_1000_std\n value: -15.3495102005021\n - type: nauc_ndcg_at_100_diff1\n value: 43.336365081508504\n - type: nauc_ndcg_at_100_max\n value: 43.11345604460776\n - type: nauc_ndcg_at_100_std\n value: -15.571128070860615\n - type: nauc_ndcg_at_10_diff1\n value: 43.41266214720136\n - type: nauc_ndcg_at_10_max\n value: 
41.519676787851914\n - type: nauc_ndcg_at_10_std\n value: -19.217175017223568\n - type: nauc_ndcg_at_1_diff1\n value: 55.692503735317445\n - type: nauc_ndcg_at_1_max\n value: 47.334834529801014\n - type: nauc_ndcg_at_1_std\n value: -16.985483585693512\n - type: nauc_ndcg_at_20_diff1\n value: 43.351653862834496\n - type: nauc_ndcg_at_20_max\n value: 42.11608469750499\n - type: nauc_ndcg_at_20_std\n value: -18.485363540641664\n - type: nauc_ndcg_at_3_diff1\n value: 45.64193888236677\n - type: nauc_ndcg_at_3_max\n value: 42.497135099009995\n - type: nauc_ndcg_at_3_std\n value: -18.764012041130094\n - type: nauc_ndcg_at_5_diff1\n value: 44.523392133895186\n - type: nauc_ndcg_at_5_max\n value: 41.564242030096345\n - type: nauc_ndcg_at_5_std\n value: -19.31080790984941\n - type: nauc_precision_at_1000_diff1\n value: 6.383464615714393\n - type: nauc_precision_at_1000_max\n value: 27.439930931284657\n - type: nauc_precision_at_1000_std\n value: 19.070716188143034\n - type: nauc_precision_at_100_diff1\n value: 12.599136754501284\n - type: nauc_precision_at_100_max\n value: 35.886310962337795\n - type: nauc_precision_at_100_std\n value: 14.06587592659196\n - type: nauc_precision_at_10_diff1\n value: 25.388891173150206\n - type: nauc_precision_at_10_max\n value: 46.10269270777384\n - type: nauc_precision_at_10_std\n value: -5.993803607158499\n - type: nauc_precision_at_1_diff1\n value: 55.692503735317445\n - type: nauc_precision_at_1_max\n value: 47.334834529801014\n - type: nauc_precision_at_1_std\n value: -16.985483585693512\n - type: nauc_precision_at_20_diff1\n value: 20.984013463099707\n - type: nauc_precision_at_20_max\n value: 42.9471854616888\n - type: nauc_precision_at_20_std\n value: -0.8045549929346024\n - type: nauc_precision_at_3_diff1\n value: 36.191850547148356\n - type: nauc_precision_at_3_max\n value: 48.09923832376049\n - type: nauc_precision_at_3_std\n value: -13.159407051271321\n - type: nauc_precision_at_5_diff1\n value: 31.04967966700407\n - type: 
nauc_precision_at_5_max\n value: 47.62867673349624\n - type: nauc_precision_at_5_std\n value: -10.345790325137353\n - type: nauc_recall_at_1000_diff1\n value: 11.03436839065707\n - type: nauc_recall_at_1000_max\n value: 42.32265076651575\n - type: nauc_recall_at_1000_std\n value: 30.478521053399206\n - type: nauc_recall_at_100_diff1\n value: 24.788349084510806\n - type: nauc_recall_at_100_max\n value: 36.72097184821956\n - type: nauc_recall_at_100_std\n value: -0.2241144179522076\n - type: nauc_recall_at_10_diff1\n value: 31.613053567704885\n - type: nauc_recall_at_10_max\n value: 34.4597322828833\n - type: nauc_recall_at_10_std\n value: -18.00022912690819\n - type: nauc_recall_at_1_diff1\n value: 50.82120107004449\n - type: nauc_recall_at_1_max\n value: 33.208851367861584\n - type: nauc_recall_at_1_std\n value: -20.29409730258174\n - type: nauc_recall_at_20_diff1\n value: 30.277002670708384\n - type: nauc_recall_at_20_max\n value: 35.212475675060375\n - type: nauc_recall_at_20_std\n value: -15.822788854733687\n - type: nauc_recall_at_3_diff1\n value: 38.87844958322257\n - type: nauc_recall_at_3_max\n value: 34.66914910044104\n - type: nauc_recall_at_3_std\n value: -20.234707300209127\n - type: nauc_recall_at_5_diff1\n value: 35.551139991687776\n - type: nauc_recall_at_5_max\n value: 34.61009958820695\n - type: nauc_recall_at_5_std\n value: -19.519180149293444\n - type: ndcg_at_1\n value: 31.233\n - type: ndcg_at_10\n value: 35.927\n - type: ndcg_at_100\n value: 43.037\n - type: ndcg_at_1000\n value: 45.900999999999996\n - type: ndcg_at_20\n value: 38.39\n - type: ndcg_at_3\n value: 31.366\n - type: ndcg_at_5\n value: 33.108\n - type: precision_at_1\n value: 31.233\n - type: precision_at_10\n value: 8.15\n - type: precision_at_100\n value: 1.402\n - type: precision_at_1000\n value: 0.17700000000000002\n - type: precision_at_20\n value: 4.91\n - type: precision_at_3\n value: 17.871000000000002\n - type: precision_at_5\n value: 12.948\n - type: recall_at_1\n value: 
20.144000000000002\n - type: recall_at_10\n value: 44.985\n - type: recall_at_100\n value: 74.866\n - type: recall_at_1000\n value: 94.477\n - type: recall_at_20\n value: 53.37\n - type: recall_at_3\n value: 31.141000000000002\n - type: recall_at_5\n value: 36.721\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB Cmnli (default)\n revision: None\n split: validation\n type: C-MTEB/CMNLI\n metrics:\n - type: cos_sim_accuracy\n value: 71.25676488274203\n - type: cos_sim_accuracy_threshold\n value: 78.11152935028076\n - type: cos_sim_ap\n value: 79.10444825556077\n - type: cos_sim_f1\n value: 74.10750923266312\n - type: cos_sim_f1_threshold\n value: 75.2312421798706\n - type: cos_sim_precision\n value: 66.02083714129044\n - type: cos_sim_recall\n value: 84.45171849427169\n - type: dot_accuracy\n value: 68.11785929043896\n - type: dot_accuracy_threshold\n value: 34783.23974609375\n - type: dot_ap\n value: 75.80201827987712\n - type: dot_f1\n value: 72.31670990679349\n - type: dot_f1_threshold\n value: 31978.036499023438\n - type: dot_precision\n value: 61.386623164763456\n - type: dot_recall\n value: 87.98223053542202\n - type: euclidean_accuracy\n value: 71.41310883944678\n - type: euclidean_accuracy_threshold\n value: 1374.9353408813477\n - type: euclidean_ap\n value: 79.23359768836457\n - type: euclidean_f1\n value: 74.38512297540491\n - type: euclidean_f1_threshold\n value: 1512.6035690307617\n - type: euclidean_precision\n value: 64.97816593886463\n - type: euclidean_recall\n value: 86.97685293429974\n - type: manhattan_accuracy\n value: 71.32892363199038\n - type: manhattan_accuracy_threshold\n value: 33340.49072265625\n - type: manhattan_ap\n value: 79.11973684118587\n - type: manhattan_f1\n value: 74.29401993355481\n - type: manhattan_f1_threshold\n value: 36012.52746582031\n - type: manhattan_precision\n value: 66.81605975723622\n - type: manhattan_recall\n value: 83.65676876315175\n - type: max_accuracy\n value: 71.41310883944678\n - type: 
max_ap\n value: 79.23359768836457\n - type: max_f1\n value: 74.38512297540491\n task:\n type: PairClassification\n - dataset:\n config: default\n name: MTEB CovidRetrieval (default)\n revision: 1271c7809071a13532e05f25fb53511ffce77117\n split: dev\n type: C-MTEB/CovidRetrieval\n metrics:\n - type: main_score\n value: 78.917\n - type: map_at_1\n value: 67.281\n - type: map_at_10\n value: 75.262\n - type: map_at_100\n value: 75.60900000000001\n - type: map_at_1000\n value: 75.618\n - type: map_at_20\n value: 75.50200000000001\n - type: map_at_3\n value: 73.455\n - type: map_at_5\n value: 74.657\n - type: mrr_at_1\n value: 67.43940990516333\n - type: mrr_at_10\n value: 75.27367989696756\n - type: mrr_at_100\n value: 75.62029353306437\n - type: mrr_at_1000\n value: 75.62934741874726\n - type: mrr_at_20\n value: 75.51356607409173\n - type: mrr_at_3\n value: 73.5159817351598\n - type: mrr_at_5\n value: 74.73832103969093\n - type: nauc_map_at_1000_diff1\n value: 77.26666391867634\n - type: nauc_map_at_1000_max\n value: 49.928541012203496\n - type: nauc_map_at_1000_std\n value: -40.494469470474456\n - type: nauc_map_at_100_diff1\n value: 77.26087423162396\n - type: nauc_map_at_100_max\n value: 49.944275615664424\n - type: nauc_map_at_100_std\n value: -40.48299992715398\n - type: nauc_map_at_10_diff1\n value: 76.97400113500906\n - type: nauc_map_at_10_max\n value: 49.84177029115674\n - type: nauc_map_at_10_std\n value: -40.829250876511445\n - type: nauc_map_at_1_diff1\n value: 81.44050620630395\n - type: nauc_map_at_1_max\n value: 48.97711944070578\n - type: nauc_map_at_1_std\n value: -38.963689457570254\n - type: nauc_map_at_20_diff1\n value: 77.21791353089375\n - type: nauc_map_at_20_max\n value: 49.958206759079424\n - type: nauc_map_at_20_std\n value: -40.53067571658996\n - type: nauc_map_at_3_diff1\n value: 77.3555925208868\n - type: nauc_map_at_3_max\n value: 49.32158146451256\n - type: nauc_map_at_3_std\n value: -41.93552426981978\n - type: nauc_map_at_5_diff1\n 
value: 77.07099950431504\n - type: nauc_map_at_5_max\n value: 49.54190504495002\n - type: nauc_map_at_5_std\n value: -41.814968130918096\n - type: nauc_mrr_at_1000_diff1\n value: 77.31388774540477\n - type: nauc_mrr_at_1000_max\n value: 49.96779699175759\n - type: nauc_mrr_at_1000_std\n value: -40.43739645160277\n - type: nauc_mrr_at_100_diff1\n value: 77.30817786449413\n - type: nauc_mrr_at_100_max\n value: 49.982514428937655\n - type: nauc_mrr_at_100_std\n value: -40.42876582797744\n - type: nauc_mrr_at_10_diff1\n value: 77.02048060465756\n - type: nauc_mrr_at_10_max\n value: 49.87937207270602\n - type: nauc_mrr_at_10_std\n value: -40.77596560333177\n - type: nauc_mrr_at_1_diff1\n value: 81.27219599516599\n - type: nauc_mrr_at_1_max\n value: 49.3083394026327\n - type: nauc_mrr_at_1_std\n value: -38.31023037552026\n - type: nauc_mrr_at_20_diff1\n value: 77.26497089316055\n - type: nauc_mrr_at_20_max\n value: 49.996257597621415\n - type: nauc_mrr_at_20_std\n value: -40.476723608868014\n - type: nauc_mrr_at_3_diff1\n value: 77.38971294099257\n - type: nauc_mrr_at_3_max\n value: 49.38110328987404\n - type: nauc_mrr_at_3_std\n value: -41.7118646715979\n - type: nauc_mrr_at_5_diff1\n value: 77.08286142519952\n - type: nauc_mrr_at_5_max\n value: 49.655249374588685\n - type: nauc_mrr_at_5_std\n value: -41.48173039989406\n - type: nauc_ndcg_at_1000_diff1\n value: 76.47399204021758\n - type: nauc_ndcg_at_1000_max\n value: 50.55770139961048\n - type: nauc_ndcg_at_1000_std\n value: -39.55650430279072\n - type: nauc_ndcg_at_100_diff1\n value: 76.29355616618253\n - type: nauc_ndcg_at_100_max\n value: 51.003608112592936\n - type: nauc_ndcg_at_100_std\n value: -39.24769744605206\n - type: nauc_ndcg_at_10_diff1\n value: 74.88697528447634\n - type: nauc_ndcg_at_10_max\n value: 50.398416372815234\n - type: nauc_ndcg_at_10_std\n value: -40.76526585772833\n - type: nauc_ndcg_at_1_diff1\n value: 81.27219599516599\n - type: nauc_ndcg_at_1_max\n value: 49.3083394026327\n - type: 
nauc_ndcg_at_1_std\n value: -38.31023037552026\n - type: nauc_ndcg_at_20_diff1\n value: 75.85463512091866\n - type: nauc_ndcg_at_20_max\n value: 50.97338683654334\n - type: nauc_ndcg_at_20_std\n value: -39.353128774903404\n - type: nauc_ndcg_at_3_diff1\n value: 75.94015726123543\n - type: nauc_ndcg_at_3_max\n value: 49.22194251063148\n - type: nauc_ndcg_at_3_std\n value: -43.040457030630435\n - type: nauc_ndcg_at_5_diff1\n value: 75.19166189770303\n - type: nauc_ndcg_at_5_max\n value: 49.65696229797189\n - type: nauc_ndcg_at_5_std\n value: -42.81534909184424\n - type: nauc_precision_at_1000_diff1\n value: -14.830901395815788\n - type: nauc_precision_at_1000_max\n value: 19.686297136854623\n - type: nauc_precision_at_1000_std\n value: 61.19310360166978\n - type: nauc_precision_at_100_diff1\n value: 20.55469986751769\n - type: nauc_precision_at_100_max\n value: 50.78431835075583\n - type: nauc_precision_at_100_std\n value: 31.54986568374813\n - type: nauc_precision_at_10_diff1\n value: 45.991938532558656\n - type: nauc_precision_at_10_max\n value: 46.386318595630385\n - type: nauc_precision_at_10_std\n value: -23.463011435224608\n - type: nauc_precision_at_1_diff1\n value: 81.27219599516599\n - type: nauc_precision_at_1_max\n value: 49.3083394026327\n - type: nauc_precision_at_1_std\n value: -38.31023037552026\n - type: nauc_precision_at_20_diff1\n value: 41.53180472410822\n - type: nauc_precision_at_20_max\n value: 49.89800247204318\n - type: nauc_precision_at_20_std\n value: -2.4192847331537095\n - type: nauc_precision_at_3_diff1\n value: 67.37504651209993\n - type: nauc_precision_at_3_max\n value: 47.893537208629496\n - type: nauc_precision_at_3_std\n value: -43.2362212382819\n - type: nauc_precision_at_5_diff1\n value: 60.03438883791718\n - type: nauc_precision_at_5_max\n value: 48.29770502354206\n - type: nauc_precision_at_5_std\n value: -40.39588448271546\n - type: nauc_recall_at_1000_diff1\n value: 71.04741174480844\n - type: nauc_recall_at_1000_max\n value: 
93.19056506596002\n - type: nauc_recall_at_1000_std\n value: 62.96994797650912\n - type: nauc_recall_at_100_diff1\n value: 65.00418176852641\n - type: nauc_recall_at_100_max\n value: 85.27352708427193\n - type: nauc_recall_at_100_std\n value: 2.8812005546518886\n - type: nauc_recall_at_10_diff1\n value: 61.263254794998865\n - type: nauc_recall_at_10_max\n value: 54.17618329507141\n - type: nauc_recall_at_10_std\n value: -39.80603966142593\n - type: nauc_recall_at_1_diff1\n value: 81.44050620630395\n - type: nauc_recall_at_1_max\n value: 48.97711944070578\n - type: nauc_recall_at_1_std\n value: -38.963689457570254\n - type: nauc_recall_at_20_diff1\n value: 64.42106091745396\n - type: nauc_recall_at_20_max\n value: 63.10796640821887\n - type: nauc_recall_at_20_std\n value: -22.60117424572222\n - type: nauc_recall_at_3_diff1\n value: 70.66311436592945\n - type: nauc_recall_at_3_max\n value: 48.69498944323469\n - type: nauc_recall_at_3_std\n value: -47.37847524874532\n - type: nauc_recall_at_5_diff1\n value: 66.12701111728848\n - type: nauc_recall_at_5_max\n value: 49.91763957934711\n - type: nauc_recall_at_5_std\n value: -48.173252920584126\n - type: ndcg_at_1\n value: 67.43900000000001\n - type: ndcg_at_10\n value: 78.917\n - type: ndcg_at_100\n value: 80.53399999999999\n - type: ndcg_at_1000\n value: 80.768\n - type: ndcg_at_20\n value: 79.813\n - type: ndcg_at_3\n value: 75.37\n - type: ndcg_at_5\n value: 77.551\n - type: precision_at_1\n value: 67.43900000000001\n - type: precision_at_10\n value: 9.115\n - type: precision_at_100\n value: 0.985\n - type: precision_at_1000\n value: 0.1\n - type: precision_at_20\n value: 4.737\n - type: precision_at_3\n value: 27.081\n - type: precision_at_5\n value: 17.345\n - type: recall_at_1\n value: 67.281\n - type: recall_at_10\n value: 90.2\n - type: recall_at_100\n value: 97.576\n - type: recall_at_1000\n value: 99.368\n - type: recall_at_20\n value: 93.783\n - type: recall_at_3\n value: 80.822\n - type: recall_at_5\n value: 
86.091\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB DBPedia (default)\n revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659\n split: test\n type: mteb/dbpedia\n metrics:\n - type: map_at_1\n value: 9.041\n - type: map_at_10\n value: 18.662\n - type: map_at_100\n value: 26.054\n - type: map_at_1000\n value: 27.769\n - type: map_at_20\n value: 21.499\n - type: map_at_3\n value: 13.628000000000002\n - type: map_at_5\n value: 15.617\n - type: mrr_at_1\n value: 67.25\n - type: mrr_at_10\n value: 74.673\n - type: mrr_at_100\n value: 75.022\n - type: mrr_at_1000\n value: 75.031\n - type: mrr_at_20\n value: 74.895\n - type: mrr_at_3\n value: 73.042\n - type: mrr_at_5\n value: 74.179\n - type: ndcg_at_1\n value: 55.75\n - type: ndcg_at_10\n value: 41.004000000000005\n - type: ndcg_at_100\n value: 44.912\n - type: ndcg_at_1000\n value: 51.946000000000005\n - type: ndcg_at_20\n value: 40.195\n - type: ndcg_at_3\n value: 45.803\n - type: ndcg_at_5\n value: 42.976\n - type: precision_at_1\n value: 67.25\n - type: precision_at_10\n value: 31.874999999999996\n - type: precision_at_100\n value: 10.37\n - type: precision_at_1000\n value: 2.1430000000000002\n - type: precision_at_20\n value: 24.275\n - type: precision_at_3\n value: 48.417\n - type: precision_at_5\n value: 40.2\n - type: recall_at_1\n value: 9.041\n - type: recall_at_10\n value: 23.592\n - type: recall_at_100\n value: 49.476\n - type: recall_at_1000\n value: 71.677\n - type: recall_at_20\n value: 30.153000000000002\n - type: recall_at_3\n value: 14.777000000000001\n - type: recall_at_5\n value: 17.829\n - type: main_score\n value: 41.004000000000005\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB DuRetrieval (default)\n revision: a1a333e290fe30b10f3f56498e3a0d911a693ced\n split: dev\n type: C-MTEB/DuRetrieval\n metrics:\n - type: main_score\n value: 83.134\n - type: map_at_1\n value: 23.907999999999998\n - type: map_at_10\n value: 74.566\n - type: map_at_100\n value: 
77.706\n - type: map_at_1000\n value: 77.762\n - type: map_at_20\n value: 76.943\n - type: map_at_3\n value: 50.971999999999994\n - type: map_at_5\n value: 64.429\n - type: mrr_at_1\n value: 84.8\n - type: mrr_at_10\n value: 89.73218253968246\n - type: mrr_at_100\n value: 89.82853630655774\n - type: mrr_at_1000\n value: 89.83170411703153\n - type: mrr_at_20\n value: 89.79582030091501\n - type: mrr_at_3\n value: 89.32499999999992\n - type: mrr_at_5\n value: 89.58749999999992\n - type: nauc_map_at_1000_diff1\n value: -2.2736020650163717\n - type: nauc_map_at_1000_max\n value: 45.3937519555142\n - type: nauc_map_at_1000_std\n value: 10.824778228268581\n - type: nauc_map_at_100_diff1\n value: -2.2662939752750066\n - type: nauc_map_at_100_max\n value: 45.423960626031366\n - type: nauc_map_at_100_std\n value: 10.804239351738717\n - type: nauc_map_at_10_diff1\n value: 0.9395752585654343\n - type: nauc_map_at_10_max\n value: 42.53814836940551\n - type: nauc_map_at_10_std\n value: 0.7199313235265218\n - type: nauc_map_at_1_diff1\n value: 45.19415865267676\n - type: nauc_map_at_1_max\n value: -1.7261947382471912\n - type: nauc_map_at_1_std\n value: -32.16144291613605\n - type: nauc_map_at_20_diff1\n value: -1.884514152147472\n - type: nauc_map_at_20_max\n value: 44.830401115927174\n - type: nauc_map_at_20_std\n value: 8.118530414377219\n - type: nauc_map_at_3_diff1\n value: 25.678881127059967\n - type: nauc_map_at_3_max\n value: 12.191400431839758\n - type: nauc_map_at_3_std\n value: -27.201740587642327\n - type: nauc_map_at_5_diff1\n value: 13.227128780829572\n - type: nauc_map_at_5_max\n value: 26.978282739708977\n - type: nauc_map_at_5_std\n value: -17.555610348070584\n - type: nauc_mrr_at_1000_diff1\n value: 21.073512437502178\n - type: nauc_mrr_at_1000_max\n value: 64.9680257861005\n - type: nauc_mrr_at_1000_std\n value: 19.626288754404293\n - type: nauc_mrr_at_100_diff1\n value: 21.074637426957732\n - type: nauc_mrr_at_100_max\n value: 64.97612675661915\n - type: 
nauc_mrr_at_100_std\n value: 19.649504127800878\n - type: nauc_mrr_at_10_diff1\n value: 21.12003267626651\n - type: nauc_mrr_at_10_max\n value: 65.24362289059766\n - type: nauc_mrr_at_10_std\n value: 19.92351276180984\n - type: nauc_mrr_at_1_diff1\n value: 22.711430629147635\n - type: nauc_mrr_at_1_max\n value: 58.4059429497403\n - type: nauc_mrr_at_1_std\n value: 11.967886722567973\n - type: nauc_mrr_at_20_diff1\n value: 20.98220830510272\n - type: nauc_mrr_at_20_max\n value: 65.05737535197835\n - type: nauc_mrr_at_20_std\n value: 19.66672900782771\n - type: nauc_mrr_at_3_diff1\n value: 20.924796220048528\n - type: nauc_mrr_at_3_max\n value: 65.71388669932584\n - type: nauc_mrr_at_3_std\n value: 20.05912197134477\n - type: nauc_mrr_at_5_diff1\n value: 20.61978649468208\n - type: nauc_mrr_at_5_max\n value: 65.50709154526211\n - type: nauc_mrr_at_5_std\n value: 20.241434276181838\n - type: nauc_ndcg_at_1000_diff1\n value: 0.25363171946133656\n - type: nauc_ndcg_at_1000_max\n value: 54.12840465309885\n - type: nauc_ndcg_at_1000_std\n value: 20.749184325412546\n - type: nauc_ndcg_at_100_diff1\n value: 0.15649430250272792\n - type: nauc_ndcg_at_100_max\n value: 54.47995322413234\n - type: nauc_ndcg_at_100_std\n value: 21.266786634233267\n - type: nauc_ndcg_at_10_diff1\n value: 0.14579250840386346\n - type: nauc_ndcg_at_10_max\n value: 49.8643037948353\n - type: nauc_ndcg_at_10_std\n value: 12.960701643914216\n - type: nauc_ndcg_at_1_diff1\n value: 22.711430629147635\n - type: nauc_ndcg_at_1_max\n value: 58.4059429497403\n - type: nauc_ndcg_at_1_std\n value: 11.967886722567973\n - type: nauc_ndcg_at_20_diff1\n value: -0.6701559981776763\n - type: nauc_ndcg_at_20_max\n value: 52.95443437012488\n - type: nauc_ndcg_at_20_std\n value: 16.708883972005758\n - type: nauc_ndcg_at_3_diff1\n value: -0.19084922341962388\n - type: nauc_ndcg_at_3_max\n value: 46.2110230886874\n - type: nauc_ndcg_at_3_std\n value: 13.363250229683038\n - type: nauc_ndcg_at_5_diff1\n value: 
0.9840019268192548\n - type: nauc_ndcg_at_5_max\n value: 43.56594891798146\n - type: nauc_ndcg_at_5_std\n value: 8.577017104088146\n - type: nauc_precision_at_1000_diff1\n value: -30.779179091501145\n - type: nauc_precision_at_1000_max\n value: 16.056094258615673\n - type: nauc_precision_at_1000_std\n value: 49.96303902363283\n - type: nauc_precision_at_100_diff1\n value: -31.583236638899585\n - type: nauc_precision_at_100_max\n value: 19.16571713603373\n - type: nauc_precision_at_100_std\n value: 51.870647903980036\n - type: nauc_precision_at_10_diff1\n value: -35.62134572732597\n - type: nauc_precision_at_10_max\n value: 31.6935186494612\n - type: nauc_precision_at_10_std\n value: 46.68659723766723\n - type: nauc_precision_at_1_diff1\n value: 22.711430629147635\n - type: nauc_precision_at_1_max\n value: 58.4059429497403\n - type: nauc_precision_at_1_std\n value: 11.967886722567973\n - type: nauc_precision_at_20_diff1\n value: -33.875460046920495\n - type: nauc_precision_at_20_max\n value: 24.188420133566442\n - type: nauc_precision_at_20_std\n value: 50.02387762958483\n - type: nauc_precision_at_3_diff1\n value: -28.875998450906827\n - type: nauc_precision_at_3_max\n value: 44.77058831167941\n - type: nauc_precision_at_3_std\n value: 31.77993710437207\n - type: nauc_precision_at_5_diff1\n value: -34.92525440306491\n - type: nauc_precision_at_5_max\n value: 39.855219917077086\n - type: nauc_precision_at_5_std\n value: 37.95432046169299\n - type: nauc_recall_at_1000_diff1\n value: -14.293309371874733\n - type: nauc_recall_at_1000_max\n value: 59.06948692482579\n - type: nauc_recall_at_1000_std\n value: 62.586254868312686\n - type: nauc_recall_at_100_diff1\n value: -4.344100947212704\n - type: nauc_recall_at_100_max\n value: 58.42120421043602\n - type: nauc_recall_at_100_std\n value: 46.48562009316997\n - type: nauc_recall_at_10_diff1\n value: 0.04948662912161709\n - type: nauc_recall_at_10_max\n value: 42.42809687119093\n - type: nauc_recall_at_10_std\n value: 
0.6892504250411409\n - type: nauc_recall_at_1_diff1\n value: 45.19415865267676\n - type: nauc_recall_at_1_max\n value: -1.7261947382471912\n - type: nauc_recall_at_1_std\n value: -32.16144291613605\n - type: nauc_recall_at_20_diff1\n value: -7.634587864605111\n - type: nauc_recall_at_20_max\n value: 49.21327187174134\n - type: nauc_recall_at_20_std\n value: 16.408481068336346\n - type: nauc_recall_at_3_diff1\n value: 24.72546591038644\n - type: nauc_recall_at_3_max\n value: 6.620763400972902\n - type: nauc_recall_at_3_std\n value: -29.994703323331684\n - type: nauc_recall_at_5_diff1\n value: 12.65527364845842\n - type: nauc_recall_at_5_max\n value: 20.400121385794694\n - type: nauc_recall_at_5_std\n value: -22.34284568447213\n - type: ndcg_at_1\n value: 84.8\n - type: ndcg_at_10\n value: 83.134\n - type: ndcg_at_100\n value: 86.628\n - type: ndcg_at_1000\n value: 87.151\n - type: ndcg_at_20\n value: 85.092\n - type: ndcg_at_3\n value: 81.228\n - type: ndcg_at_5\n value: 80.2\n - type: precision_at_1\n value: 84.8\n - type: precision_at_10\n value: 40.394999999999996\n - type: precision_at_100\n value: 4.745\n - type: precision_at_1000\n value: 0.488\n - type: precision_at_20\n value: 22.245\n - type: precision_at_3\n value: 73.25\n - type: precision_at_5\n value: 61.86000000000001\n - type: recall_at_1\n value: 23.907999999999998\n - type: recall_at_10\n value: 85.346\n - type: recall_at_100\n value: 96.515\n - type: recall_at_1000\n value: 99.156\n - type: recall_at_20\n value: 91.377\n - type: recall_at_3\n value: 54.135\n - type: recall_at_5\n value: 70.488\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB EcomRetrieval (default)\n revision: 687de13dc7294d6fd9be10c6945f9e8fec8166b9\n split: dev\n type: C-MTEB/EcomRetrieval\n metrics:\n - type: main_score\n value: 60.887\n - type: map_at_1\n value: 46.6\n - type: map_at_10\n value: 56.035000000000004\n - type: map_at_100\n value: 56.741\n - type: map_at_1000\n value: 56.764\n - type: 
map_at_20\n value: 56.513999999999996\n - type: map_at_3\n value: 53.733\n - type: map_at_5\n value: 54.913000000000004\n - type: mrr_at_1\n value: 46.6\n - type: mrr_at_10\n value: 56.034523809523776\n - type: mrr_at_100\n value: 56.74056360434383\n - type: mrr_at_1000\n value: 56.76373487222486\n - type: mrr_at_20\n value: 56.51374873879128\n - type: mrr_at_3\n value: 53.73333333333328\n - type: mrr_at_5\n value: 54.91333333333327\n - type: nauc_map_at_1000_diff1\n value: 65.13546939953387\n - type: nauc_map_at_1000_max\n value: 43.358890946774494\n - type: nauc_map_at_1000_std\n value: -9.973282105235036\n - type: nauc_map_at_100_diff1\n value: 65.12449309472493\n - type: nauc_map_at_100_max\n value: 43.377100882923145\n - type: nauc_map_at_100_std\n value: -9.971781228240555\n - type: nauc_map_at_10_diff1\n value: 64.83020018537475\n - type: nauc_map_at_10_max\n value: 43.25969482323034\n - type: nauc_map_at_10_std\n value: -10.120272176001547\n - type: nauc_map_at_1_diff1\n value: 69.58727592100516\n - type: nauc_map_at_1_max\n value: 38.236494689522026\n - type: nauc_map_at_1_std\n value: -14.833390831689597\n - type: nauc_map_at_20_diff1\n value: 65.01159809914586\n - type: nauc_map_at_20_max\n value: 43.33440319829618\n - type: nauc_map_at_20_std\n value: -10.039958228659726\n - type: nauc_map_at_3_diff1\n value: 65.2396323885909\n - type: nauc_map_at_3_max\n value: 42.26904017378952\n - type: nauc_map_at_3_std\n value: -11.793017036934044\n - type: nauc_map_at_5_diff1\n value: 64.96397227898036\n - type: nauc_map_at_5_max\n value: 43.231333789145424\n - type: nauc_map_at_5_std\n value: -10.349933732151372\n - type: nauc_mrr_at_1000_diff1\n value: 65.13546939953387\n - type: nauc_mrr_at_1000_max\n value: 43.358890946774494\n - type: nauc_mrr_at_1000_std\n value: -9.973282105235036\n - type: nauc_mrr_at_100_diff1\n value: 65.12449309472493\n - type: nauc_mrr_at_100_max\n value: 43.377100882923145\n - type: nauc_mrr_at_100_std\n value: -9.971781228240555\n - 
type: nauc_mrr_at_10_diff1\n value: 64.83020018537475\n - type: nauc_mrr_at_10_max\n value: 43.25969482323034\n - type: nauc_mrr_at_10_std\n value: -10.120272176001547\n - type: nauc_mrr_at_1_diff1\n value: 69.58727592100516\n - type: nauc_mrr_at_1_max\n value: 38.236494689522026\n - type: nauc_mrr_at_1_std\n value: -14.833390831689597\n - type: nauc_mrr_at_20_diff1\n value: 65.01159809914586\n - type: nauc_mrr_at_20_max\n value: 43.33440319829618\n - type: nauc_mrr_at_20_std\n value: -10.039958228659726\n - type: nauc_mrr_at_3_diff1\n value: 65.2396323885909\n - type: nauc_mrr_at_3_max\n value: 42.26904017378952\n - type: nauc_mrr_at_3_std\n value: -11.793017036934044\n - type: nauc_mrr_at_5_diff1\n value: 64.96397227898036\n - type: nauc_mrr_at_5_max\n value: 43.231333789145424\n - type: nauc_mrr_at_5_std\n value: -10.349933732151372\n - type: nauc_ndcg_at_1000_diff1\n value: 64.26802655199876\n - type: nauc_ndcg_at_1000_max\n value: 45.854310744745185\n - type: nauc_ndcg_at_1000_std\n value: -6.184417305204082\n - type: nauc_ndcg_at_100_diff1\n value: 63.99268329609827\n - type: nauc_ndcg_at_100_max\n value: 46.31270128748375\n - type: nauc_ndcg_at_100_std\n value: -6.1393433180558965\n - type: nauc_ndcg_at_10_diff1\n value: 62.6735104141137\n - type: nauc_ndcg_at_10_max\n value: 45.54954799462398\n - type: nauc_ndcg_at_10_std\n value: -7.348851199024871\n - type: nauc_ndcg_at_1_diff1\n value: 69.58727592100516\n - type: nauc_ndcg_at_1_max\n value: 38.236494689522026\n - type: nauc_ndcg_at_1_std\n value: -14.833390831689597\n - type: nauc_ndcg_at_20_diff1\n value: 63.25899651677274\n - type: nauc_ndcg_at_20_max\n value: 45.952196968886014\n - type: nauc_ndcg_at_20_std\n value: -6.807607465125713\n - type: nauc_ndcg_at_3_diff1\n value: 63.65618337476822\n - type: nauc_ndcg_at_3_max\n value: 43.507890965228945\n - type: nauc_ndcg_at_3_std\n value: -10.73845622217601\n - type: nauc_ndcg_at_5_diff1\n value: 63.079162432921855\n - type: nauc_ndcg_at_5_max\n value: 
45.38303443868148\n - type: nauc_ndcg_at_5_std\n value: -8.063657824835534\n - type: nauc_precision_at_1000_diff1\n value: 63.01459977930557\n - type: nauc_precision_at_1000_max\n value: 92.4253034547151\n - type: nauc_precision_at_1000_std\n value: 84.4845513963158\n - type: nauc_precision_at_100_diff1\n value: 57.17217119405878\n - type: nauc_precision_at_100_max\n value: 80.70049725316484\n - type: nauc_precision_at_100_std\n value: 41.78392287147403\n - type: nauc_precision_at_10_diff1\n value: 53.115665404390725\n - type: nauc_precision_at_10_max\n value: 55.73825657341263\n - type: nauc_precision_at_10_std\n value: 5.406226305013257\n - type: nauc_precision_at_1_diff1\n value: 69.58727592100516\n - type: nauc_precision_at_1_max\n value: 38.236494689522026\n - type: nauc_precision_at_1_std\n value: -14.833390831689597\n - type: nauc_precision_at_20_diff1\n value: 53.77730697622828\n - type: nauc_precision_at_20_max\n value: 61.88170819253054\n - type: nauc_precision_at_20_std\n value: 13.678730470003856\n - type: nauc_precision_at_3_diff1\n value: 58.580196992291455\n - type: nauc_precision_at_3_max\n value: 47.404834585376626\n - type: nauc_precision_at_3_std\n value: -7.374978769024051\n - type: nauc_precision_at_5_diff1\n value: 56.44564652606437\n - type: nauc_precision_at_5_max\n value: 53.08973975162324\n - type: nauc_precision_at_5_std\n value: 0.22762700141423803\n - type: nauc_recall_at_1000_diff1\n value: 63.01459977930565\n - type: nauc_recall_at_1000_max\n value: 92.42530345471532\n - type: nauc_recall_at_1000_std\n value: 84.48455139631602\n - type: nauc_recall_at_100_diff1\n value: 57.17217119405904\n - type: nauc_recall_at_100_max\n value: 80.70049725316468\n - type: nauc_recall_at_100_std\n value: 41.783922871474275\n - type: nauc_recall_at_10_diff1\n value: 53.11566540439087\n - type: nauc_recall_at_10_max\n value: 55.738256573412656\n - type: nauc_recall_at_10_std\n value: 5.406226305013377\n - type: nauc_recall_at_1_diff1\n value: 
69.58727592100516\n - type: nauc_recall_at_1_max\n value: 38.236494689522026\n - type: nauc_recall_at_1_std\n value: -14.833390831689597\n - type: nauc_recall_at_20_diff1\n value: 53.77730697622846\n - type: nauc_recall_at_20_max\n value: 61.881708192530525\n - type: nauc_recall_at_20_std\n value: 13.678730470003947\n - type: nauc_recall_at_3_diff1\n value: 58.5801969922914\n - type: nauc_recall_at_3_max\n value: 47.40483458537654\n - type: nauc_recall_at_3_std\n value: -7.37497876902413\n - type: nauc_recall_at_5_diff1\n value: 56.445646526064394\n - type: nauc_recall_at_5_max\n value: 53.08973975162332\n - type: nauc_recall_at_5_std\n value: 0.22762700141428024\n - type: ndcg_at_1\n value: 46.6\n - type: ndcg_at_10\n value: 60.887\n - type: ndcg_at_100\n value: 64.18199999999999\n - type: ndcg_at_1000\n value: 64.726\n - type: ndcg_at_20\n value: 62.614999999999995\n - type: ndcg_at_3\n value: 56.038\n - type: ndcg_at_5\n value: 58.150999999999996\n - type: precision_at_1\n value: 46.6\n - type: precision_at_10\n value: 7.630000000000001\n - type: precision_at_100\n value: 0.914\n - type: precision_at_1000\n value: 0.096\n - type: precision_at_20\n value: 4.154999999999999\n - type: precision_at_3\n value: 20.9\n - type: precision_at_5\n value: 13.56\n - type: recall_at_1\n value: 46.6\n - type: recall_at_10\n value: 76.3\n - type: recall_at_100\n value: 91.4\n - type: recall_at_1000\n value: 95.6\n - type: recall_at_20\n value: 83.1\n - type: recall_at_3\n value: 62.7\n - type: recall_at_5\n value: 67.80000000000001\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB EmotionClassification (default)\n revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37\n split: test\n type: mteb/emotion\n metrics:\n - type: accuracy\n value: 73.29999999999998\n - type: f1\n value: 67.71473706580302\n - type: f1_weighted\n value: 74.83537255312045\n - type: main_score\n value: 73.29999999999998\n task:\n type: Classification\n - dataset:\n config: default\n name: 
MTEB FEVER (default)\n revision: bea83ef9e8fb933d90a2f1d5515737465d613e12\n split: test\n type: mteb/fever\n metrics:\n - type: map_at_1\n value: 78.371\n - type: map_at_10\n value: 85.762\n - type: map_at_100\n value: 85.954\n - type: map_at_1000\n value: 85.966\n - type: map_at_20\n value: 85.887\n - type: map_at_3\n value: 84.854\n - type: map_at_5\n value: 85.408\n - type: mrr_at_1\n value: 84.443\n - type: mrr_at_10\n value: 90.432\n - type: mrr_at_100\n value: 90.483\n - type: mrr_at_1000\n value: 90.484\n - type: mrr_at_20\n value: 90.473\n - type: mrr_at_3\n value: 89.89399999999999\n - type: mrr_at_5\n value: 90.244\n - type: ndcg_at_1\n value: 84.443\n - type: ndcg_at_10\n value: 89.05499999999999\n - type: ndcg_at_100\n value: 89.68\n - type: ndcg_at_1000\n value: 89.87899999999999\n - type: ndcg_at_20\n value: 89.381\n - type: ndcg_at_3\n value: 87.73100000000001\n - type: ndcg_at_5\n value: 88.425\n - type: precision_at_1\n value: 84.443\n - type: precision_at_10\n value: 10.520999999999999\n - type: precision_at_100\n value: 1.103\n - type: precision_at_1000\n value: 0.11399999999999999\n - type: precision_at_20\n value: 5.362\n - type: precision_at_3\n value: 33.198\n - type: precision_at_5\n value: 20.441000000000003\n - type: recall_at_1\n value: 78.371\n - type: recall_at_10\n value: 94.594\n - type: recall_at_100\n value: 96.97099999999999\n - type: recall_at_1000\n value: 98.18\n - type: recall_at_20\n value: 95.707\n - type: recall_at_3\n value: 90.853\n - type: recall_at_5\n value: 92.74799999999999\n - type: main_score\n value: 89.05499999999999\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB FiQA2018 (default)\n revision: 27a168819829fe9bcd655c2df245fb19452e8e06\n split: test\n type: mteb/fiqa\n metrics:\n - type: map_at_1\n value: 23.810000000000002\n - type: map_at_10\n value: 39.051\n - type: map_at_100\n value: 41.231\n - type: map_at_1000\n value: 41.376000000000005\n - type: map_at_20\n value: 40.227000000000004\n 
- type: map_at_3\n value: 33.915\n - type: map_at_5\n value: 36.459\n - type: mrr_at_1\n value: 48.148\n - type: mrr_at_10\n value: 55.765\n - type: mrr_at_100\n value: 56.495\n - type: mrr_at_1000\n value: 56.525999999999996\n - type: mrr_at_20\n value: 56.213\n - type: mrr_at_3\n value: 53.086\n - type: mrr_at_5\n value: 54.513999999999996\n - type: ndcg_at_1\n value: 48.148\n - type: ndcg_at_10\n value: 47.349999999999994\n - type: ndcg_at_100\n value: 54.61899999999999\n - type: ndcg_at_1000\n value: 56.830000000000005\n - type: ndcg_at_20\n value: 50.143\n - type: ndcg_at_3\n value: 43.108000000000004\n - type: ndcg_at_5\n value: 44.023\n - type: precision_at_1\n value: 48.148\n - type: precision_at_10\n value: 13.441\n - type: precision_at_100\n value: 2.085\n - type: precision_at_1000\n value: 0.248\n - type: precision_at_20\n value: 7.870000000000001\n - type: precision_at_3\n value: 28.909000000000002\n - type: precision_at_5\n value: 20.957\n - type: recall_at_1\n value: 23.810000000000002\n - type: recall_at_10\n value: 54.303000000000004\n - type: recall_at_100\n value: 81.363\n - type: recall_at_1000\n value: 94.391\n - type: recall_at_20\n value: 63.056999999999995\n - type: recall_at_3\n value: 38.098\n - type: recall_at_5\n value: 44.414\n - type: main_score\n value: 47.349999999999994\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB GeoreviewClassification (default)\n revision: 3765c0d1de6b7d264bc459433c45e5a75513839c\n split: test\n type: ai-forever/georeview-classification\n metrics:\n - type: accuracy\n value: 48.0126953125\n - type: f1\n value: 47.65764016160488\n - type: f1_weighted\n value: 47.65701659482088\n - type: main_score\n value: 48.0126953125\n task:\n type: Classification\n - dataset:\n config: default\n name: MTEB GeoreviewClusteringP2P (default)\n revision: 97a313c8fc85b47f13f33e7e9a95c1ad888c7fec\n split: test\n type: ai-forever/georeview-clustering-p2p\n metrics:\n - type: main_score\n value: 
73.62357853672266\n - type: v_measure\n value: 73.62357853672266\n - type: v_measure_std\n value: 0.5942247545535766\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB GerDaLIR (default)\n revision: 0bb47f1d73827e96964edb84dfe552f62f4fd5eb\n split: test\n type: jinaai/ger_da_lir\n metrics:\n - type: main_score\n value: 16.227\n - type: map_at_1\n value: 8.082\n - type: map_at_10\n value: 12.959999999999999\n - type: map_at_100\n value: 13.923\n - type: map_at_1000\n value: 14.030999999999999\n - type: map_at_20\n value: 13.453000000000001\n - type: map_at_3\n value: 11.018\n - type: map_at_5\n value: 12.056000000000001\n - type: mrr_at_1\n value: 8.993332249146203\n - type: mrr_at_10\n value: 13.994013092850247\n - type: mrr_at_100\n value: 14.913737673149308\n - type: mrr_at_1000\n value: 15.00843809934407\n - type: mrr_at_20\n value: 14.470268462334007\n - type: mrr_at_3\n value: 12.000596302921846\n - type: mrr_at_5\n value: 13.070689000921561\n - type: nauc_map_at_1000_diff1\n value: 28.559639584013286\n - type: nauc_map_at_1000_max\n value: 25.533800126086714\n - type: nauc_map_at_1000_std\n value: 9.826551026628666\n - type: nauc_map_at_100_diff1\n value: 28.544724499331696\n - type: nauc_map_at_100_max\n value: 25.46734324526386\n - type: nauc_map_at_100_std\n value: 9.739314481785591\n - type: nauc_map_at_10_diff1\n value: 28.77447517718118\n - type: nauc_map_at_10_max\n value: 24.7431615237795\n - type: nauc_map_at_10_std\n value: 8.349878188033646\n - type: nauc_map_at_1_diff1\n value: 37.405452629895514\n - type: nauc_map_at_1_max\n value: 24.444208978394023\n - type: nauc_map_at_1_std\n value: 4.043820373810528\n - type: nauc_map_at_20_diff1\n value: 28.69764217789062\n - type: nauc_map_at_20_max\n value: 25.111848355996496\n - type: nauc_map_at_20_std\n value: 9.034829905305918\n - type: nauc_map_at_3_diff1\n value: 30.89053285076882\n - type: nauc_map_at_3_max\n value: 24.862886115911152\n - type: nauc_map_at_3_std\n value: 
6.654260832396586\n - type: nauc_map_at_5_diff1\n value: 29.230629676604263\n - type: nauc_map_at_5_max\n value: 24.374302288018583\n - type: nauc_map_at_5_std\n value: 7.341846952319046\n - type: nauc_mrr_at_1000_diff1\n value: 28.086147932781426\n - type: nauc_mrr_at_1000_max\n value: 25.98698528264653\n - type: nauc_mrr_at_1000_std\n value: 9.917554348624545\n - type: nauc_mrr_at_100_diff1\n value: 28.069163279791336\n - type: nauc_mrr_at_100_max\n value: 25.949440010886804\n - type: nauc_mrr_at_100_std\n value: 9.874340979732578\n - type: nauc_mrr_at_10_diff1\n value: 28.239920869530046\n - type: nauc_mrr_at_10_max\n value: 25.351271409498576\n - type: nauc_mrr_at_10_std\n value: 8.669862759875162\n - type: nauc_mrr_at_1_diff1\n value: 35.96543040207856\n - type: nauc_mrr_at_1_max\n value: 25.488936487231967\n - type: nauc_mrr_at_1_std\n value: 4.76439131038345\n - type: nauc_mrr_at_20_diff1\n value: 28.18865871284607\n - type: nauc_mrr_at_20_max\n value: 25.67121763344746\n - type: nauc_mrr_at_20_std\n value: 9.297910707519472\n - type: nauc_mrr_at_3_diff1\n value: 30.166714199740717\n - type: nauc_mrr_at_3_max\n value: 25.541792491964877\n - type: nauc_mrr_at_3_std\n value: 7.083090296398472\n - type: nauc_mrr_at_5_diff1\n value: 28.68475284656478\n - type: nauc_mrr_at_5_max\n value: 24.994071363482835\n - type: nauc_mrr_at_5_std\n value: 7.687507254902365\n - type: nauc_ndcg_at_1000_diff1\n value: 25.292792613586467\n - type: nauc_ndcg_at_1000_max\n value: 29.211905289377178\n - type: nauc_ndcg_at_1000_std\n value: 18.088867467320355\n - type: nauc_ndcg_at_100_diff1\n value: 25.026905011089152\n - type: nauc_ndcg_at_100_max\n value: 27.98822281254431\n - type: nauc_ndcg_at_100_std\n value: 16.69456904301902\n - type: nauc_ndcg_at_10_diff1\n value: 25.972279051109503\n - type: nauc_ndcg_at_10_max\n value: 24.86486482734957\n - type: nauc_ndcg_at_10_std\n value: 10.398605822106353\n - type: nauc_ndcg_at_1_diff1\n value: 36.134710485184826\n - type: 
nauc_ndcg_at_1_max\n value: 25.384572790326025\n - type: nauc_ndcg_at_1_std\n value: 4.591863033771824\n - type: nauc_ndcg_at_20_diff1\n value: 25.850033660205536\n - type: nauc_ndcg_at_20_max\n value: 25.944243193140515\n - type: nauc_ndcg_at_20_std\n value: 12.392409721204892\n - type: nauc_ndcg_at_3_diff1\n value: 29.1966056380018\n - type: nauc_ndcg_at_3_max\n value: 24.978843156259913\n - type: nauc_ndcg_at_3_std\n value: 7.353914459205087\n - type: nauc_ndcg_at_5_diff1\n value: 26.795315295756282\n - type: nauc_ndcg_at_5_max\n value: 24.1196789150412\n - type: nauc_ndcg_at_5_std\n value: 8.311970988265172\n - type: nauc_precision_at_1000_diff1\n value: 9.128270550217984\n - type: nauc_precision_at_1000_max\n value: 35.79286915973607\n - type: nauc_precision_at_1000_std\n value: 39.15669472887154\n - type: nauc_precision_at_100_diff1\n value: 14.770289799034384\n - type: nauc_precision_at_100_max\n value: 34.58262232264337\n - type: nauc_precision_at_100_std\n value: 34.101148102981384\n - type: nauc_precision_at_10_diff1\n value: 19.899104673118178\n - type: nauc_precision_at_10_max\n value: 26.636940338985625\n - type: nauc_precision_at_10_std\n value: 15.73871357255849\n - type: nauc_precision_at_1_diff1\n value: 36.134710485184826\n - type: nauc_precision_at_1_max\n value: 25.384572790326025\n - type: nauc_precision_at_1_std\n value: 4.591863033771824\n - type: nauc_precision_at_20_diff1\n value: 19.423457975148942\n - type: nauc_precision_at_20_max\n value: 29.58123490878582\n - type: nauc_precision_at_20_std\n value: 20.847850110821618\n - type: nauc_precision_at_3_diff1\n value: 24.986416623492918\n - type: nauc_precision_at_3_max\n value: 25.973548400472975\n - type: nauc_precision_at_3_std\n value: 9.486410455972823\n - type: nauc_precision_at_5_diff1\n value: 21.237741424923332\n - type: nauc_precision_at_5_max\n value: 24.647141028200164\n - type: nauc_precision_at_5_std\n value: 11.102785032334147\n - type: nauc_recall_at_1000_diff1\n value: 
15.999714888817829\n - type: nauc_recall_at_1000_max\n value: 44.34701908906545\n - type: nauc_recall_at_1000_std\n value: 51.13471291594717\n - type: nauc_recall_at_100_diff1\n value: 17.401714890483706\n - type: nauc_recall_at_100_max\n value: 33.39042631654808\n - type: nauc_recall_at_100_std\n value: 33.944446168451584\n - type: nauc_recall_at_10_diff1\n value: 20.30036232399894\n - type: nauc_recall_at_10_max\n value: 24.006718284396786\n - type: nauc_recall_at_10_std\n value: 14.049375108518669\n - type: nauc_recall_at_1_diff1\n value: 37.405452629895514\n - type: nauc_recall_at_1_max\n value: 24.444208978394023\n - type: nauc_recall_at_1_std\n value: 4.043820373810528\n - type: nauc_recall_at_20_diff1\n value: 20.23582802609045\n - type: nauc_recall_at_20_max\n value: 26.408063410785243\n - type: nauc_recall_at_20_std\n value: 18.617479515468112\n - type: nauc_recall_at_3_diff1\n value: 25.53221830103098\n - type: nauc_recall_at_3_max\n value: 24.283712329152678\n - type: nauc_recall_at_3_std\n value: 8.428947805841867\n - type: nauc_recall_at_5_diff1\n value: 21.741499601020823\n - type: nauc_recall_at_5_max\n value: 22.754924586295296\n - type: nauc_recall_at_5_std\n value: 9.966736688169814\n - type: ndcg_at_1\n value: 8.977\n - type: ndcg_at_10\n value: 16.227\n - type: ndcg_at_100\n value: 21.417\n - type: ndcg_at_1000\n value: 24.451\n - type: ndcg_at_20\n value: 17.982\n - type: ndcg_at_3\n value: 12.206999999999999\n - type: ndcg_at_5\n value: 14.059\n - type: precision_at_1\n value: 8.977\n - type: precision_at_10\n value: 2.933\n - type: precision_at_100\n value: 0.59\n - type: precision_at_1000\n value: 0.087\n - type: precision_at_20\n value: 1.8599999999999999\n - type: precision_at_3\n value: 5.550999999999999\n - type: precision_at_5\n value: 4.340999999999999\n - type: recall_at_1\n value: 8.082\n - type: recall_at_10\n value: 25.52\n - type: recall_at_100\n value: 50.32\n - type: recall_at_1000\n value: 74.021\n - type: recall_at_20\n value: 
32.229\n - type: recall_at_3\n value: 14.66\n - type: recall_at_5\n value: 19.062\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB GermanDPR (default)\n revision: 5129d02422a66be600ac89cd3e8531b4f97d347d\n split: test\n type: deepset/germandpr\n metrics:\n - type: main_score\n value: 82.422\n - type: map_at_1\n value: 64.39\n - type: map_at_10\n value: 77.273\n - type: map_at_100\n value: 77.375\n - type: map_at_1000\n value: 77.376\n - type: map_at_20\n value: 77.351\n - type: map_at_3\n value: 75.46300000000001\n - type: map_at_5\n value: 76.878\n - type: mrr_at_1\n value: 64.19512195121952\n - type: mrr_at_10\n value: 77.15842044134736\n - type: mrr_at_100\n value: 77.2604854308704\n - type: mrr_at_1000\n value: 77.26087882190109\n - type: mrr_at_20\n value: 77.23572154560611\n - type: mrr_at_3\n value: 75.34959349593504\n - type: mrr_at_5\n value: 76.76422764227652\n - type: nauc_map_at_1000_diff1\n value: 49.73135253389972\n - type: nauc_map_at_1000_max\n value: 8.665570717396145\n - type: nauc_map_at_1000_std\n value: -25.920927572114522\n - type: nauc_map_at_100_diff1\n value: 49.729170775336605\n - type: nauc_map_at_100_max\n value: 8.66717979705074\n - type: nauc_map_at_100_std\n value: -25.918338868918596\n - type: nauc_map_at_10_diff1\n value: 49.708681691445925\n - type: nauc_map_at_10_max\n value: 8.830640635692113\n - type: nauc_map_at_10_std\n value: -25.843238986304858\n - type: nauc_map_at_1_diff1\n value: 51.750022350988914\n - type: nauc_map_at_1_max\n value: 3.599863010364626\n - type: nauc_map_at_1_std\n value: -27.670122127567314\n - type: nauc_map_at_20_diff1\n value: 49.72609185887161\n - type: nauc_map_at_20_max\n value: 8.766556053409218\n - type: nauc_map_at_20_std\n value: -25.85975887517904\n - type: nauc_map_at_3_diff1\n value: 49.328512536255595\n - type: nauc_map_at_3_max\n value: 9.475682028996795\n - type: nauc_map_at_3_std\n value: -26.277349632171017\n - type: nauc_map_at_5_diff1\n value: 49.42801822186142\n 
- type: nauc_map_at_5_max\n value: 8.788822474357252\n - type: nauc_map_at_5_std\n value: -25.959260882028573\n - type: nauc_mrr_at_1000_diff1\n value: 50.13038598302397\n - type: nauc_mrr_at_1000_max\n value: 8.734338637484832\n - type: nauc_mrr_at_1000_std\n value: -26.653343549855908\n - type: nauc_mrr_at_100_diff1\n value: 50.12820392111392\n - type: nauc_mrr_at_100_max\n value: 8.735940503917966\n - type: nauc_mrr_at_100_std\n value: -26.65074918231251\n - type: nauc_mrr_at_10_diff1\n value: 50.10567888458267\n - type: nauc_mrr_at_10_max\n value: 8.898451291748575\n - type: nauc_mrr_at_10_std\n value: -26.572046921975655\n - type: nauc_mrr_at_1_diff1\n value: 52.22769994409465\n - type: nauc_mrr_at_1_max\n value: 3.6490820146062015\n - type: nauc_mrr_at_1_std\n value: -28.535100562320498\n - type: nauc_mrr_at_20_diff1\n value: 50.12462222100699\n - type: nauc_mrr_at_20_max\n value: 8.83487018268756\n - type: nauc_mrr_at_20_std\n value: -26.591437036958332\n - type: nauc_mrr_at_3_diff1\n value: 49.6987353700016\n - type: nauc_mrr_at_3_max\n value: 9.531003760756258\n - type: nauc_mrr_at_3_std\n value: -26.949799063124818\n - type: nauc_mrr_at_5_diff1\n value: 49.823881656376585\n - type: nauc_mrr_at_5_max\n value: 8.850404667985085\n - type: nauc_mrr_at_5_std\n value: -26.680008966088582\n - type: nauc_ndcg_at_1000_diff1\n value: 49.41721203361181\n - type: nauc_ndcg_at_1000_max\n value: 9.41093067609825\n - type: nauc_ndcg_at_1000_std\n value: -25.499543637737567\n - type: nauc_ndcg_at_100_diff1\n value: 49.32810419509252\n - type: nauc_ndcg_at_100_max\n value: 9.476216458766897\n - type: nauc_ndcg_at_100_std\n value: -25.393856250990414\n - type: nauc_ndcg_at_10_diff1\n value: 49.181984436623694\n - type: nauc_ndcg_at_10_max\n value: 10.65234732763274\n - type: nauc_ndcg_at_10_std\n value: -24.737669349012297\n - type: nauc_ndcg_at_1_diff1\n value: 51.750022350988914\n - type: nauc_ndcg_at_1_max\n value: 3.599863010364626\n - type: nauc_ndcg_at_1_std\n value: 
-27.670122127567314\n - type: nauc_ndcg_at_20_diff1\n value: 49.275394594995056\n - type: nauc_ndcg_at_20_max\n value: 10.402059796651923\n - type: nauc_ndcg_at_20_std\n value: -24.82329915806705\n - type: nauc_ndcg_at_3_diff1\n value: 48.22614352152889\n - type: nauc_ndcg_at_3_max\n value: 11.67464280791404\n - type: nauc_ndcg_at_3_std\n value: -25.867824868234095\n - type: nauc_ndcg_at_5_diff1\n value: 48.35583502987241\n - type: nauc_ndcg_at_5_max\n value: 10.494278750448451\n - type: nauc_ndcg_at_5_std\n value: -25.11599634172764\n - type: nauc_precision_at_1000_diff1\n value: .nan\n - type: nauc_precision_at_1000_max\n value: .nan\n - type: nauc_precision_at_1000_std\n value: .nan\n - type: nauc_precision_at_100_diff1\n value: -56.39478136433852\n - type: nauc_precision_at_100_max\n value: 86.93518577529493\n - type: nauc_precision_at_100_std\n value: 100.0\n - type: nauc_precision_at_10_diff1\n value: 38.662829729133094\n - type: nauc_precision_at_10_max\n value: 56.38018435740605\n - type: nauc_precision_at_10_std\n value: 6.288091897081105\n - type: nauc_precision_at_1_diff1\n value: 51.750022350988914\n - type: nauc_precision_at_1_max\n value: 3.599863010364626\n - type: nauc_precision_at_1_std\n value: -27.670122127567314\n - type: nauc_precision_at_20_diff1\n value: 34.739153182429085\n - type: nauc_precision_at_20_max\n value: 84.86908403000989\n - type: nauc_precision_at_20_std\n value: 29.156199421219455\n - type: nauc_precision_at_3_diff1\n value: 42.09287362529135\n - type: nauc_precision_at_3_max\n value: 23.629152759287074\n - type: nauc_precision_at_3_std\n value: -23.721376911302492\n - type: nauc_precision_at_5_diff1\n value: 36.03866171924644\n - type: nauc_precision_at_5_max\n value: 29.166173558775327\n - type: nauc_precision_at_5_std\n value: -15.096374563068448\n - type: nauc_recall_at_1000_diff1\n value: .nan\n - type: nauc_recall_at_1000_max\n value: .nan\n - type: nauc_recall_at_1000_std\n value: .nan\n - type: 
nauc_recall_at_100_diff1\n value: -56.39478136433541\n - type: nauc_recall_at_100_max\n value: 86.93518577528111\n - type: nauc_recall_at_100_std\n value: 100.0\n - type: nauc_recall_at_10_diff1\n value: 38.66282972913384\n - type: nauc_recall_at_10_max\n value: 56.3801843574071\n - type: nauc_recall_at_10_std\n value: 6.288091897082639\n - type: nauc_recall_at_1_diff1\n value: 51.750022350988914\n - type: nauc_recall_at_1_max\n value: 3.599863010364626\n - type: nauc_recall_at_1_std\n value: -27.670122127567314\n - type: nauc_recall_at_20_diff1\n value: 34.7391531824321\n - type: nauc_recall_at_20_max\n value: 84.86908403001016\n - type: nauc_recall_at_20_std\n value: 29.156199421220748\n - type: nauc_recall_at_3_diff1\n value: 42.09287362529107\n - type: nauc_recall_at_3_max\n value: 23.629152759286946\n - type: nauc_recall_at_3_std\n value: -23.72137691130291\n - type: nauc_recall_at_5_diff1\n value: 36.0386617192469\n - type: nauc_recall_at_5_max\n value: 29.1661735587759\n - type: nauc_recall_at_5_std\n value: -15.09637456306774\n - type: ndcg_at_1\n value: 64.39\n - type: ndcg_at_10\n value: 82.422\n - type: ndcg_at_100\n value: 82.86099999999999\n - type: ndcg_at_1000\n value: 82.87299999999999\n - type: ndcg_at_20\n value: 82.67999999999999\n - type: ndcg_at_3\n value: 78.967\n - type: ndcg_at_5\n value: 81.50699999999999\n - type: precision_at_1\n value: 64.39\n - type: precision_at_10\n value: 9.795\n - type: precision_at_100\n value: 0.9990000000000001\n - type: precision_at_1000\n value: 0.1\n - type: precision_at_20\n value: 4.946\n - type: precision_at_3\n value: 29.691000000000003\n - type: precision_at_5\n value: 19.044\n - type: recall_at_1\n value: 64.39\n - type: recall_at_10\n value: 97.951\n - type: recall_at_100\n value: 99.902\n - type: recall_at_1000\n value: 100.0\n - type: recall_at_20\n value: 98.92699999999999\n - type: recall_at_3\n value: 89.07300000000001\n - type: recall_at_5\n value: 95.22\n task:\n type: Retrieval\n - dataset:\n 
config: default\n name: MTEB GermanQuAD-Retrieval (default)\n revision: f5c87ae5a2e7a5106606314eef45255f03151bb3\n split: test\n type: mteb/germanquad-retrieval\n metrics:\n - type: main_score\n value: 94.15532365396247\n - type: map_at_1\n value: 90.789\n - type: map_at_10\n value: 94.24\n - type: map_at_100\n value: 94.283\n - type: map_at_1000\n value: 94.284\n - type: map_at_20\n value: 94.272\n - type: map_at_3\n value: 93.913\n - type: map_at_5\n value: 94.155\n - type: mrr_at_1\n value: 90.78947368421053\n - type: mrr_at_10\n value: 94.23987411056376\n - type: mrr_at_100\n value: 94.28320936825\n - type: mrr_at_1000\n value: 94.28350209115848\n - type: mrr_at_20\n value: 94.271919092559\n - type: mrr_at_3\n value: 93.91258318209313\n - type: mrr_at_5\n value: 94.15532365396247\n - type: nauc_map_at_1000_diff1\n value: 89.29089310650436\n - type: nauc_map_at_1000_max\n value: 73.83868784032414\n - type: nauc_map_at_1000_std\n value: -11.635778561889989\n - type: nauc_map_at_100_diff1\n value: 89.29077225707755\n - type: nauc_map_at_100_max\n value: 73.84002740580378\n - type: nauc_map_at_100_std\n value: -11.644096256165092\n - type: nauc_map_at_10_diff1\n value: 89.29117612292366\n - type: nauc_map_at_10_max\n value: 73.97487984981221\n - type: nauc_map_at_10_std\n value: -11.35191794373827\n - type: nauc_map_at_1_diff1\n value: 89.35436544117584\n - type: nauc_map_at_1_max\n value: 70.35936815057701\n - type: nauc_map_at_1_std\n value: -13.598996360976903\n - type: nauc_map_at_20_diff1\n value: 89.2530394052653\n - type: nauc_map_at_20_max\n value: 73.83537529419839\n - type: nauc_map_at_20_std\n value: -11.628272822028478\n - type: nauc_map_at_3_diff1\n value: 89.375111893546\n - type: nauc_map_at_3_max\n value: 74.78900366026112\n - type: nauc_map_at_3_std\n value: -12.720905253503274\n - type: nauc_map_at_5_diff1\n value: 89.35358300820893\n - type: nauc_map_at_5_max\n value: 74.31996219723239\n - type: nauc_map_at_5_std\n value: -10.768642638210867\n - 
type: nauc_mrr_at_1000_diff1\n value: 89.29089310650436\n - type: nauc_mrr_at_1000_max\n value: 73.83868784032414\n - type: nauc_mrr_at_1000_std\n value: -11.635778561889989\n - type: nauc_mrr_at_100_diff1\n value: 89.29077225707755\n - type: nauc_mrr_at_100_max\n value: 73.84002740580378\n - type: nauc_mrr_at_100_std\n value: -11.644096256165092\n - type: nauc_mrr_at_10_diff1\n value: 89.29117612292366\n - type: nauc_mrr_at_10_max\n value: 73.97487984981221\n - type: nauc_mrr_at_10_std\n value: -11.35191794373827\n - type: nauc_mrr_at_1_diff1\n value: 89.35436544117584\n - type: nauc_mrr_at_1_max\n value: 70.35936815057701\n - type: nauc_mrr_at_1_std\n value: -13.598996360976903\n - type: nauc_mrr_at_20_diff1\n value: 89.2530394052653\n - type: nauc_mrr_at_20_max\n value: 73.83537529419839\n - type: nauc_mrr_at_20_std\n value: -11.628272822028478\n - type: nauc_mrr_at_3_diff1\n value: 89.375111893546\n - type: nauc_mrr_at_3_max\n value: 74.78900366026112\n - type: nauc_mrr_at_3_std\n value: -12.720905253503274\n - type: nauc_mrr_at_5_diff1\n value: 89.35358300820893\n - type: nauc_mrr_at_5_max\n value: 74.31996219723239\n - type: nauc_mrr_at_5_std\n value: -10.768642638210867\n - type: nauc_ndcg_at_1000_diff1\n value: 89.27620775856863\n - type: nauc_ndcg_at_1000_max\n value: 74.2985757362615\n - type: nauc_ndcg_at_1000_std\n value: -11.236142819703023\n - type: nauc_ndcg_at_100_diff1\n value: 89.27284787540731\n - type: nauc_ndcg_at_100_max\n value: 74.33539303365968\n - type: nauc_ndcg_at_100_std\n value: -11.469413615851936\n - type: nauc_ndcg_at_10_diff1\n value: 89.21496710661724\n - type: nauc_ndcg_at_10_max\n value: 75.02035398490516\n - type: nauc_ndcg_at_10_std\n value: -9.903255803665814\n - type: nauc_ndcg_at_1_diff1\n value: 89.35436544117584\n - type: nauc_ndcg_at_1_max\n value: 70.35936815057701\n - type: nauc_ndcg_at_1_std\n value: -13.598996360976903\n - type: nauc_ndcg_at_20_diff1\n value: 89.03561289544179\n - type: nauc_ndcg_at_20_max\n value: 
74.4006766600049\n - type: nauc_ndcg_at_20_std\n value: -11.129237862587743\n - type: nauc_ndcg_at_3_diff1\n value: 89.46540193201693\n - type: nauc_ndcg_at_3_max\n value: 76.87093548368378\n - type: nauc_ndcg_at_3_std\n value: -12.484902872086767\n - type: nauc_ndcg_at_5_diff1\n value: 89.39924941584766\n - type: nauc_ndcg_at_5_max\n value: 75.96975269092722\n - type: nauc_ndcg_at_5_std\n value: -8.180295581144833\n - type: nauc_precision_at_1000_diff1\n value: 100.0\n - type: nauc_precision_at_1000_max\n value: 100.0\n - type: nauc_precision_at_1000_std\n value: 100.0\n - type: nauc_precision_at_100_diff1\n value: 86.93074003795302\n - type: nauc_precision_at_100_max\n value: 100.0\n - type: nauc_precision_at_100_std\n value: -174.07785375176616\n - type: nauc_precision_at_10_diff1\n value: 87.43064119412082\n - type: nauc_precision_at_10_max\n value: 90.60785783417448\n - type: nauc_precision_at_10_std\n value: 15.378710059645906\n - type: nauc_precision_at_1_diff1\n value: 89.35436544117584\n - type: nauc_precision_at_1_max\n value: 70.35936815057701\n - type: nauc_precision_at_1_std\n value: -13.598996360976903\n - type: nauc_precision_at_20_diff1\n value: 78.78206037685919\n - type: nauc_precision_at_20_max\n value: 82.52264166455923\n - type: nauc_precision_at_20_std\n value: -5.95806599216658\n - type: nauc_precision_at_3_diff1\n value: 90.12709256456401\n - type: nauc_precision_at_3_max\n value: 90.72678805838154\n - type: nauc_precision_at_3_std\n value: -11.047599315631993\n - type: nauc_precision_at_5_diff1\n value: 89.9066873566561\n - type: nauc_precision_at_5_max\n value: 93.51571626543664\n - type: nauc_precision_at_5_std\n value: 22.632403279126162\n - type: nauc_recall_at_1000_diff1\n value: .nan\n - type: nauc_recall_at_1000_max\n value: .nan\n - type: nauc_recall_at_1000_std\n value: .nan\n - type: nauc_recall_at_100_diff1\n value: 86.93074003793416\n - type: nauc_recall_at_100_max\n value: 100.0\n - type: nauc_recall_at_100_std\n value: 
-174.07785375175723\n - type: nauc_recall_at_10_diff1\n value: 87.43064119411991\n - type: nauc_recall_at_10_max\n value: 90.60785783417579\n - type: nauc_recall_at_10_std\n value: 15.378710059643607\n - type: nauc_recall_at_1_diff1\n value: 89.35436544117584\n - type: nauc_recall_at_1_max\n value: 70.35936815057701\n - type: nauc_recall_at_1_std\n value: -13.598996360976903\n - type: nauc_recall_at_20_diff1\n value: 78.78206037685645\n - type: nauc_recall_at_20_max\n value: 82.52264166455791\n - type: nauc_recall_at_20_std\n value: -5.958065992168697\n - type: nauc_recall_at_3_diff1\n value: 90.12709256456463\n - type: nauc_recall_at_3_max\n value: 90.7267880583832\n - type: nauc_recall_at_3_std\n value: -11.047599315631881\n - type: nauc_recall_at_5_diff1\n value: 89.90668735665676\n - type: nauc_recall_at_5_max\n value: 93.51571626543753\n - type: nauc_recall_at_5_std\n value: 22.632403279126112\n - type: ndcg_at_1\n value: 90.789\n - type: ndcg_at_10\n value: 95.46\n - type: ndcg_at_100\n value: 95.652\n - type: ndcg_at_1000\n value: 95.659\n - type: ndcg_at_20\n value: 95.575\n - type: ndcg_at_3\n value: 94.82000000000001\n - type: ndcg_at_5\n value: 95.26400000000001\n - type: precision_at_1\n value: 90.789\n - type: precision_at_10\n value: 9.908999999999999\n - type: precision_at_100\n value: 1.0\n - type: precision_at_1000\n value: 0.1\n - type: precision_at_20\n value: 4.977\n - type: precision_at_3\n value: 32.471\n - type: precision_at_5\n value: 19.701\n - type: recall_at_1\n value: 90.789\n - type: recall_at_10\n value: 99.093\n - type: recall_at_100\n value: 99.955\n - type: recall_at_1000\n value: 100.0\n - type: recall_at_20\n value: 99.546\n - type: recall_at_3\n value: 97.414\n - type: recall_at_5\n value: 98.503\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB GermanSTSBenchmark (default)\n revision: e36907544d44c3a247898ed81540310442329e20\n split: test\n type: jinaai/german-STSbenchmark\n metrics:\n - type: 
cosine_pearson\n value: 86.55319003300265\n - type: cosine_spearman\n value: 87.50267373081324\n - type: euclidean_pearson\n value: 87.41630636501863\n - type: euclidean_spearman\n value: 88.02170803409365\n - type: main_score\n value: 87.50267373081324\n - type: manhattan_pearson\n value: 87.33703179056744\n - type: manhattan_spearman\n value: 87.99192826922514\n - type: pearson\n value: 86.55319003300265\n - type: spearman\n value: 87.50267373081324\n task:\n type: STS\n - dataset:\n config: default\n name: MTEB HALClusteringS2S (default)\n revision: e06ebbbb123f8144bef1a5d18796f3dec9ae2915\n split: test\n type: lyon-nlp/clustering-hal-s2s\n metrics:\n - type: main_score\n value: 27.477557517301303\n - type: v_measure\n value: 27.477557517301303\n - type: v_measure_std\n value: 3.3525736581861336\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB HeadlineClassification (default)\n revision: 2fe05ee6b5832cda29f2ef7aaad7b7fe6a3609eb\n split: test\n type: ai-forever/headline-classification\n metrics:\n - type: accuracy\n value: 75.0830078125\n - type: f1\n value: 75.08863209267814\n - type: f1_weighted\n value: 75.08895979060917\n - type: main_score\n value: 75.0830078125\n task:\n type: Classification\n - dataset:\n config: default\n name: MTEB HotpotQA (default)\n revision: ab518f4d6fcca38d87c25209f94beba119d02014\n split: test\n type: mteb/hotpotqa\n metrics:\n - type: map_at_1\n value: 38.143\n - type: map_at_10\n value: 55.916999999999994\n - type: map_at_100\n value: 56.706\n - type: map_at_1000\n value: 56.77100000000001\n - type: map_at_20\n value: 56.367\n - type: map_at_3\n value: 53.111\n - type: map_at_5\n value: 54.839000000000006\n - type: mrr_at_1\n value: 76.286\n - type: mrr_at_10\n value: 81.879\n - type: mrr_at_100\n value: 82.09100000000001\n - type: mrr_at_1000\n value: 82.101\n - type: mrr_at_20\n value: 82.01\n - type: mrr_at_3\n value: 80.972\n - type: mrr_at_5\n value: 81.537\n - type: ndcg_at_1\n value: 76.286\n - type: 
ndcg_at_10\n value: 64.673\n - type: ndcg_at_100\n value: 67.527\n - type: ndcg_at_1000\n value: 68.857\n - type: ndcg_at_20\n value: 65.822\n - type: ndcg_at_3\n value: 60.616\n - type: ndcg_at_5\n value: 62.827999999999996\n - type: precision_at_1\n value: 76.286\n - type: precision_at_10\n value: 13.196\n - type: precision_at_100\n value: 1.544\n - type: precision_at_1000\n value: 0.172\n - type: precision_at_20\n value: 6.968000000000001\n - type: precision_at_3\n value: 37.992\n - type: precision_at_5\n value: 24.54\n - type: recall_at_1\n value: 38.143\n - type: recall_at_10\n value: 65.982\n - type: recall_at_100\n value: 77.225\n - type: recall_at_1000\n value: 86.077\n - type: recall_at_20\n value: 69.68299999999999\n - type: recall_at_3\n value: 56.989000000000004\n - type: recall_at_5\n value: 61.35\n - type: main_score\n value: 64.673\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB IFlyTek (default)\n revision: 421605374b29664c5fc098418fe20ada9bd55f8a\n split: validation\n type: C-MTEB/IFlyTek-classification\n metrics:\n - type: accuracy\n value: 41.67756829549827\n - type: f1\n value: 33.929325579581636\n - type: f1_weighted\n value: 43.03952025643197\n - type: main_score\n value: 41.67756829549827\n task:\n type: Classification\n - dataset:\n config: default\n name: MTEB ImdbClassification (default)\n revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7\n split: test\n type: mteb/imdb\n metrics:\n - type: accuracy\n value: 91.90440000000001\n - type: ap\n value: 88.78663714603425\n - type: ap_weighted\n value: 88.78663714603425\n - type: f1\n value: 91.89564361975891\n - type: f1_weighted\n value: 91.89564361975891\n - type: main_score\n value: 91.90440000000001\n task:\n type: Classification\n - dataset:\n config: default\n name: MTEB InappropriatenessClassification (default)\n revision: 601651fdc45ef243751676e62dd7a19f491c0285\n split: test\n type: ai-forever/inappropriateness-classification\n metrics:\n - type: accuracy\n value: 
61.0498046875\n - type: ap\n value: 57.04240566648215\n - type: ap_weighted\n value: 57.04240566648215\n - type: f1\n value: 60.867630038606954\n - type: f1_weighted\n value: 60.867630038606954\n - type: main_score\n value: 61.0498046875\n task:\n type: Classification\n - dataset:\n config: default\n name: MTEB JDReview (default)\n revision: b7c64bd89eb87f8ded463478346f76731f07bf8b\n split: test\n type: C-MTEB/JDReview-classification\n metrics:\n - type: accuracy\n value: 83.50844277673546\n - type: ap\n value: 48.46732380712268\n - type: ap_weighted\n value: 48.46732380712268\n - type: f1\n value: 77.43967451387445\n - type: f1_weighted\n value: 84.78462929014114\n - type: main_score\n value: 83.50844277673546\n task:\n type: Classification\n - dataset:\n config: default\n name: MTEB KinopoiskClassification (default)\n revision: 5911f26666ac11af46cb9c6849d0dc80a378af24\n split: test\n type: ai-forever/kinopoisk-sentiment-classification\n metrics:\n - type: accuracy\n value: 62.393333333333324\n - type: f1\n value: 61.35940129568015\n - type: f1_weighted\n value: 61.35940129568015\n - type: main_score\n value: 62.393333333333324\n task:\n type: Classification\n - dataset:\n config: default\n name: MTEB LCQMC (default)\n revision: 17f9b096f80380fce5ed12a9be8be7784b337daf\n split: test\n type: C-MTEB/LCQMC\n metrics:\n - type: cosine_pearson\n value: 67.74375505907872\n - type: cosine_spearman\n value: 75.94582231399434\n - type: euclidean_pearson\n value: 74.52501692443582\n - type: euclidean_spearman\n value: 75.88428434746646\n - type: main_score\n value: 75.94582231399434\n - type: manhattan_pearson\n value: 74.55015441749529\n - type: manhattan_spearman\n value: 75.83288262176175\n - type: pearson\n value: 67.74375505907872\n - type: spearman\n value: 75.94582231399434\n task:\n type: STS\n - dataset:\n config: default\n name: MTEB LEMBNarrativeQARetrieval (default)\n revision: 6e346642246bfb4928c560ee08640dc84d074e8c\n split: test\n type: dwzhu/LongEmbed\n 
metrics:\n - type: map_at_1\n value: 23.093\n - type: map_at_10\n value: 30.227999999999998\n - type: map_at_100\n value: 31.423000000000002\n - type: map_at_1000\n value: 31.533\n - type: map_at_20\n value: 30.835\n - type: map_at_3\n value: 27.983999999999998\n - type: map_at_5\n value: 29.253\n - type: mrr_at_1\n value: 23.093\n - type: mrr_at_10\n value: 30.227999999999998\n - type: mrr_at_100\n value: 31.423000000000002\n - type: mrr_at_1000\n value: 31.533\n - type: mrr_at_20\n value: 30.835\n - type: mrr_at_3\n value: 27.983999999999998\n - type: mrr_at_5\n value: 29.253\n - type: ndcg_at_1\n value: 23.093\n - type: ndcg_at_10\n value: 34.297\n - type: ndcg_at_100\n value: 41.049\n - type: ndcg_at_1000\n value: 43.566\n - type: ndcg_at_20\n value: 36.52\n - type: ndcg_at_3\n value: 29.629\n - type: ndcg_at_5\n value: 31.926\n - type: precision_at_1\n value: 23.093\n - type: precision_at_10\n value: 4.735\n - type: precision_at_100\n value: 0.8109999999999999\n - type: precision_at_1000\n value: 0.1\n - type: precision_at_20\n value: 2.8080000000000003\n - type: precision_at_3\n value: 11.468\n - type: precision_at_5\n value: 8.001\n - type: recall_at_1\n value: 23.093\n - type: recall_at_10\n value: 47.354\n - type: recall_at_100\n value: 81.147\n - type: recall_at_1000\n value: 100.0\n - type: recall_at_20\n value: 56.16799999999999\n - type: recall_at_3\n value: 34.405\n - type: recall_at_5\n value: 40.004\n - type: main_score\n value: 34.297\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB LEMBNeedleRetrieval (default)\n revision: 6e346642246bfb4928c560ee08640dc84d074e8c\n split: test_256\n type: dwzhu/LongEmbed\n metrics:\n - type: map_at_1\n value: 64.0\n - type: map_at_10\n value: 77.083\n - type: map_at_100\n value: 77.265\n - type: map_at_1000\n value: 77.265\n - type: map_at_20\n value: 77.265\n - type: map_at_3\n value: 76.333\n - type: map_at_5\n value: 76.833\n - type: mrr_at_1\n value: 64.0\n - type: mrr_at_10\n value: 
77.083\n - type: mrr_at_100\n value: 77.265\n - type: mrr_at_1000\n value: 77.265\n - type: mrr_at_20\n value: 77.265\n - type: mrr_at_3\n value: 76.333\n - type: mrr_at_5\n value: 76.833\n - type: ndcg_at_1\n value: 64.0\n - type: ndcg_at_10\n value: 82.325\n - type: ndcg_at_100\n value: 82.883\n - type: ndcg_at_1000\n value: 82.883\n - type: ndcg_at_20\n value: 82.883\n - type: ndcg_at_3\n value: 80.833\n - type: ndcg_at_5\n value: 81.694\n - type: precision_at_1\n value: 64.0\n - type: precision_at_10\n value: 9.8\n - type: precision_at_100\n value: 1.0\n - type: precision_at_1000\n value: 0.1\n - type: precision_at_20\n value: 5.0\n - type: precision_at_3\n value: 31.333\n - type: precision_at_5\n value: 19.2\n - type: recall_at_1\n value: 64.0\n - type: recall_at_10\n value: 98.0\n - type: recall_at_100\n value: 100.0\n - type: recall_at_1000\n value: 100.0\n - type: recall_at_20\n value: 100.0\n - type: recall_at_3\n value: 94.0\n - type: recall_at_5\n value: 96.0\n - type: main_score\n value: 64.0\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB LEMBPasskeyRetrieval (default)\n revision: 6e346642246bfb4928c560ee08640dc84d074e8c\n split: test_256\n type: dwzhu/LongEmbed\n metrics:\n - type: map_at_1\n value: 100.0\n - type: map_at_10\n value: 100.0\n - type: map_at_100\n value: 100.0\n - type: map_at_1000\n value: 100.0\n - type: map_at_20\n value: 100.0\n - type: map_at_3\n value: 100.0\n - type: map_at_5\n value: 100.0\n - type: mrr_at_1\n value: 100.0\n - type: mrr_at_10\n value: 100.0\n - type: mrr_at_100\n value: 100.0\n - type: mrr_at_1000\n value: 100.0\n - type: mrr_at_20\n value: 100.0\n - type: mrr_at_3\n value: 100.0\n - type: mrr_at_5\n value: 100.0\n - type: ndcg_at_1\n value: 100.0\n - type: ndcg_at_10\n value: 100.0\n - type: ndcg_at_100\n value: 100.0\n - type: ndcg_at_1000\n value: 100.0\n - type: ndcg_at_20\n value: 100.0\n - type: ndcg_at_3\n value: 100.0\n - type: ndcg_at_5\n value: 100.0\n - type: precision_at_1\n 
value: 100.0\n - type: precision_at_10\n value: 10.0\n - type: precision_at_100\n value: 1.0\n - type: precision_at_1000\n value: 0.1\n - type: precision_at_20\n value: 5.0\n - type: precision_at_3\n value: 33.333\n - type: precision_at_5\n value: 20.0\n - type: recall_at_1\n value: 100.0\n - type: recall_at_10\n value: 100.0\n - type: recall_at_100\n value: 100.0\n - type: recall_at_1000\n value: 100.0\n - type: recall_at_20\n value: 100.0\n - type: recall_at_3\n value: 100.0\n - type: recall_at_5\n value: 100.0\n - type: main_score\n value: 100.0\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB LEMBQMSumRetrieval (default)\n revision: 6e346642246bfb4928c560ee08640dc84d074e8c\n split: test\n type: dwzhu/LongEmbed\n metrics:\n - type: map_at_1\n value: 24.361\n - type: map_at_10\n value: 33.641\n - type: map_at_100\n value: 35.104\n - type: map_at_1000\n value: 35.127\n - type: map_at_20\n value: 34.388999999999996\n - type: map_at_3\n value: 30.255\n - type: map_at_5\n value: 32.079\n - type: mrr_at_1\n value: 24.361\n - type: mrr_at_10\n value: 33.641\n - type: mrr_at_100\n value: 35.104\n - type: mrr_at_1000\n value: 35.127\n - type: mrr_at_20\n value: 34.388999999999996\n - type: mrr_at_3\n value: 30.255\n - type: mrr_at_5\n value: 32.079\n - type: ndcg_at_1\n value: 24.361\n - type: ndcg_at_10\n value: 39.337\n - type: ndcg_at_100\n value: 47.384\n - type: ndcg_at_1000\n value: 47.75\n - type: ndcg_at_20\n value: 42.077999999999996\n - type: ndcg_at_3\n value: 32.235\n - type: ndcg_at_5\n value: 35.524\n - type: precision_at_1\n value: 24.361\n - type: precision_at_10\n value: 5.783\n - type: precision_at_100\n value: 0.975\n - type: precision_at_1000\n value: 0.1\n - type: precision_at_20\n value: 3.435\n - type: precision_at_3\n value: 12.661\n - type: precision_at_5\n value: 9.193999999999999\n - type: recall_at_1\n value: 24.361\n - type: recall_at_10\n value: 57.826\n - type: recall_at_100\n value: 97.51100000000001\n - type: 
recall_at_1000\n value: 100.0\n - type: recall_at_20\n value: 68.697\n - type: recall_at_3\n value: 37.983\n - type: recall_at_5\n value: 45.972\n - type: main_score\n value: 39.337\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB LEMBSummScreenFDRetrieval (default)\n revision: 6e346642246bfb4928c560ee08640dc84d074e8c\n split: validation\n type: dwzhu/LongEmbed\n metrics:\n - type: map_at_1\n value: 84.821\n - type: map_at_10\n value: 90.11200000000001\n - type: map_at_100\n value: 90.158\n - type: map_at_1000\n value: 90.158\n - type: map_at_20\n value: 90.137\n - type: map_at_3\n value: 89.385\n - type: map_at_5\n value: 89.876\n - type: mrr_at_1\n value: 84.821\n - type: mrr_at_10\n value: 90.11200000000001\n - type: mrr_at_100\n value: 90.158\n - type: mrr_at_1000\n value: 90.158\n - type: mrr_at_20\n value: 90.137\n - type: mrr_at_3\n value: 89.385\n - type: mrr_at_5\n value: 89.876\n - type: ndcg_at_1\n value: 84.821\n - type: ndcg_at_10\n value: 92.334\n - type: ndcg_at_100\n value: 92.535\n - type: ndcg_at_1000\n value: 92.535\n - type: ndcg_at_20\n value: 92.414\n - type: ndcg_at_3\n value: 90.887\n - type: ndcg_at_5\n value: 91.758\n - type: precision_at_1\n value: 84.821\n - type: precision_at_10\n value: 9.911\n - type: precision_at_100\n value: 1.0\n - type: precision_at_1000\n value: 0.1\n - type: precision_at_20\n value: 4.97\n - type: precision_at_3\n value: 31.746000000000002\n - type: precision_at_5\n value: 19.464000000000002\n - type: recall_at_1\n value: 84.821\n - type: recall_at_10\n value: 99.107\n - type: recall_at_100\n value: 100.0\n - type: recall_at_1000\n value: 100.0\n - type: recall_at_20\n value: 99.405\n - type: recall_at_3\n value: 95.238\n - type: recall_at_5\n value: 97.321\n - type: main_score\n value: 92.334\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB LEMBWikimQARetrieval (default)\n revision: 6e346642246bfb4928c560ee08640dc84d074e8c\n split: test\n type: dwzhu/LongEmbed\n 
metrics:\n - type: map_at_1\n value: 53.667\n - type: map_at_10\n value: 61.719\n - type: map_at_100\n value: 62.471\n - type: map_at_1000\n value: 62.492000000000004\n - type: map_at_20\n value: 62.153000000000006\n - type: map_at_3\n value: 59.167\n - type: map_at_5\n value: 60.95\n - type: mrr_at_1\n value: 53.667\n - type: mrr_at_10\n value: 61.719\n - type: mrr_at_100\n value: 62.471\n - type: mrr_at_1000\n value: 62.492000000000004\n - type: mrr_at_20\n value: 62.153000000000006\n - type: mrr_at_3\n value: 59.167\n - type: mrr_at_5\n value: 60.95\n - type: ndcg_at_1\n value: 53.667\n - type: ndcg_at_10\n value: 66.018\n - type: ndcg_at_100\n value: 69.726\n - type: ndcg_at_1000\n value: 70.143\n - type: ndcg_at_20\n value: 67.61399999999999\n - type: ndcg_at_3\n value: 60.924\n - type: ndcg_at_5\n value: 64.10900000000001\n - type: precision_at_1\n value: 53.667\n - type: precision_at_10\n value: 7.9670000000000005\n - type: precision_at_100\n value: 0.97\n - type: precision_at_1000\n value: 0.1\n - type: precision_at_20\n value: 4.3\n - type: precision_at_3\n value: 22.0\n - type: precision_at_5\n value: 14.732999999999999\n - type: recall_at_1\n value: 53.667\n - type: recall_at_10\n value: 79.667\n - type: recall_at_100\n value: 97.0\n - type: recall_at_1000\n value: 100.0\n - type: recall_at_20\n value: 86.0\n - type: recall_at_3\n value: 66.0\n - type: recall_at_5\n value: 73.667\n - type: main_score\n value: 66.018\n task:\n type: Retrieval\n - dataset:\n config: deu-deu\n name: MTEB MLQARetrieval (deu-deu)\n revision: 397ed406c1a7902140303e7faf60fff35b58d285\n split: test\n type: facebook/mlqa\n metrics:\n - type: main_score\n value: 67.548\n - type: map_at_1\n value: 56.559000000000005\n - type: map_at_10\n value: 63.867\n - type: map_at_100\n value: 64.429\n - type: map_at_1000\n value: 64.457\n - type: map_at_20\n value: 64.215\n - type: map_at_3\n value: 62.109\n - type: map_at_5\n value: 63.101\n - type: mrr_at_1\n value: 56.56990915134057\n - 
type: mrr_at_10\n value: 63.86820789324668\n - type: mrr_at_100\n value: 64.42973602152581\n - type: mrr_at_1000\n value: 64.45818598090155\n - type: mrr_at_20\n value: 64.2163052263868\n - type: mrr_at_3\n value: 62.10946155550634\n - type: mrr_at_5\n value: 63.10104143585199\n - type: nauc_map_at_1000_diff1\n value: 73.78440163370111\n - type: nauc_map_at_1000_max\n value: 66.37875518052162\n - type: nauc_map_at_1000_std\n value: -17.063915098135396\n - type: nauc_map_at_100_diff1\n value: 73.77180802985815\n - type: nauc_map_at_100_max\n value: 66.38365998362033\n - type: nauc_map_at_100_std\n value: -17.053345109661972\n - type: nauc_map_at_10_diff1\n value: 73.70041876696037\n - type: nauc_map_at_10_max\n value: 66.33213342705997\n - type: nauc_map_at_10_std\n value: -17.40657791273925\n - type: nauc_map_at_1_diff1\n value: 76.8784374396948\n - type: nauc_map_at_1_max\n value: 64.07170606935357\n - type: nauc_map_at_1_std\n value: -18.464213686790654\n - type: nauc_map_at_20_diff1\n value: 73.72371377231813\n - type: nauc_map_at_20_max\n value: 66.42108121059451\n - type: nauc_map_at_20_std\n value: -17.05384923889036\n - type: nauc_map_at_3_diff1\n value: 74.08287018839246\n - type: nauc_map_at_3_max\n value: 66.42422337760333\n - type: nauc_map_at_3_std\n value: -17.79503404131652\n - type: nauc_map_at_5_diff1\n value: 73.9294779027339\n - type: nauc_map_at_5_max\n value: 66.51752041065726\n - type: nauc_map_at_5_std\n value: -17.67309805113804\n - type: nauc_mrr_at_1000_diff1\n value: 73.78389736923545\n - type: nauc_mrr_at_1000_max\n value: 66.37929720858341\n - type: nauc_mrr_at_1000_std\n value: -17.058591711291278\n - type: nauc_mrr_at_100_diff1\n value: 73.77126451253136\n - type: nauc_mrr_at_100_max\n value: 66.38405917246607\n - type: nauc_mrr_at_100_std\n value: -17.047251035212863\n - type: nauc_mrr_at_10_diff1\n value: 73.69960470665124\n - type: nauc_mrr_at_10_max\n value: 66.33265194210313\n - type: nauc_mrr_at_10_std\n value: 
-17.399659076827998\n - type: nauc_mrr_at_1_diff1\n value: 76.8689850260726\n - type: nauc_mrr_at_1_max\n value: 64.09858188287487\n - type: nauc_mrr_at_1_std\n value: -18.46064784201847\n - type: nauc_mrr_at_20_diff1\n value: 73.72312682063128\n - type: nauc_mrr_at_20_max\n value: 66.42181932858745\n - type: nauc_mrr_at_20_std\n value: -17.04690257511092\n - type: nauc_mrr_at_3_diff1\n value: 74.08287018839246\n - type: nauc_mrr_at_3_max\n value: 66.42422337760333\n - type: nauc_mrr_at_3_std\n value: -17.79503404131652\n - type: nauc_mrr_at_5_diff1\n value: 73.9294779027339\n - type: nauc_mrr_at_5_max\n value: 66.51752041065726\n - type: nauc_mrr_at_5_std\n value: -17.67309805113804\n - type: nauc_ndcg_at_1000_diff1\n value: 72.97825548342801\n - type: nauc_ndcg_at_1000_max\n value: 66.96275437178257\n - type: nauc_ndcg_at_1000_std\n value: -15.611902299641587\n - type: nauc_ndcg_at_100_diff1\n value: 72.58724738936613\n - type: nauc_ndcg_at_100_max\n value: 67.16774012704182\n - type: nauc_ndcg_at_100_std\n value: -14.945088654796812\n - type: nauc_ndcg_at_10_diff1\n value: 72.16253640477947\n - type: nauc_ndcg_at_10_max\n value: 67.01746849484621\n - type: nauc_ndcg_at_10_std\n value: -16.46102507270809\n - type: nauc_ndcg_at_1_diff1\n value: 76.8689850260726\n - type: nauc_ndcg_at_1_max\n value: 64.09858188287487\n - type: nauc_ndcg_at_1_std\n value: -18.46064784201847\n - type: nauc_ndcg_at_20_diff1\n value: 72.19995325129975\n - type: nauc_ndcg_at_20_max\n value: 67.39639713797962\n - type: nauc_ndcg_at_20_std\n value: -15.091689370748531\n - type: nauc_ndcg_at_3_diff1\n value: 73.13123604206514\n - type: nauc_ndcg_at_3_max\n value: 67.23123167871547\n - type: nauc_ndcg_at_3_std\n value: -17.492755234009156\n - type: nauc_ndcg_at_5_diff1\n value: 72.8154718929895\n - type: nauc_ndcg_at_5_max\n value: 67.44578008373777\n - type: nauc_ndcg_at_5_std\n value: -17.251840358751362\n - type: nauc_precision_at_1000_diff1\n value: 47.89748325983604\n - type: 
nauc_precision_at_1000_max\n value: 70.47466197804906\n - type: nauc_precision_at_1000_std\n value: 72.66193512114775\n - type: nauc_precision_at_100_diff1\n value: 59.493743734005356\n - type: nauc_precision_at_100_max\n value: 74.02140147220713\n - type: nauc_precision_at_100_std\n value: 17.26664098026236\n - type: nauc_precision_at_10_diff1\n value: 64.94415011040277\n - type: nauc_precision_at_10_max\n value: 69.6963814950747\n - type: nauc_precision_at_10_std\n value: -11.663043657012954\n - type: nauc_precision_at_1_diff1\n value: 76.8689850260726\n - type: nauc_precision_at_1_max\n value: 64.09858188287487\n - type: nauc_precision_at_1_std\n value: -18.46064784201847\n - type: nauc_precision_at_20_diff1\n value: 63.145886909986416\n - type: nauc_precision_at_20_max\n value: 72.95708033630744\n - type: nauc_precision_at_20_std\n value: -1.5039593629280323\n - type: nauc_precision_at_3_diff1\n value: 69.88902201644449\n - type: nauc_precision_at_3_max\n value: 69.80499971089935\n - type: nauc_precision_at_3_std\n value: -16.444680766676647\n - type: nauc_precision_at_5_diff1\n value: 68.60869967062919\n - type: nauc_precision_at_5_max\n value: 70.75998207564281\n - type: nauc_precision_at_5_std\n value: -15.62613396998262\n - type: nauc_recall_at_1000_diff1\n value: 62.6646436338833\n - type: nauc_recall_at_1000_max\n value: 86.17801636476078\n - type: nauc_recall_at_1000_std\n value: 71.84718775540334\n - type: nauc_recall_at_100_diff1\n value: 61.110492191439505\n - type: nauc_recall_at_100_max\n value: 75.45730686603042\n - type: nauc_recall_at_100_std\n value: 16.202465011589428\n - type: nauc_recall_at_10_diff1\n value: 65.1522196516815\n - type: nauc_recall_at_10_max\n value: 69.7626435962161\n - type: nauc_recall_at_10_std\n value: -11.801178474770449\n - type: nauc_recall_at_1_diff1\n value: 76.8784374396948\n - type: nauc_recall_at_1_max\n value: 64.07170606935357\n - type: nauc_recall_at_1_std\n value: -18.464213686790654\n - type: 
nauc_recall_at_20_diff1\n value: 63.40332739504143\n - type: nauc_recall_at_20_max\n value: 73.04113661090965\n - type: nauc_recall_at_20_std\n value: -1.6609741140266947\n - type: nauc_recall_at_3_diff1\n value: 70.03728086098866\n - type: nauc_recall_at_3_max\n value: 69.85953774320521\n - type: nauc_recall_at_3_std\n value: -16.482993123411706\n - type: nauc_recall_at_5_diff1\n value: 68.77396121765933\n - type: nauc_recall_at_5_max\n value: 70.8231205493519\n - type: nauc_recall_at_5_std\n value: -15.668037770700863\n - type: ndcg_at_1\n value: 56.57\n - type: ndcg_at_10\n value: 67.548\n - type: ndcg_at_100\n value: 70.421\n - type: ndcg_at_1000\n value: 71.198\n - type: ndcg_at_20\n value: 68.829\n - type: ndcg_at_3\n value: 63.88700000000001\n - type: ndcg_at_5\n value: 65.689\n - type: precision_at_1\n value: 56.57\n - type: precision_at_10\n value: 7.922\n - type: precision_at_100\n value: 0.9299999999999999\n - type: precision_at_1000\n value: 0.099\n - type: precision_at_20\n value: 4.216\n - type: precision_at_3\n value: 23.015\n - type: precision_at_5\n value: 14.691\n - type: recall_at_1\n value: 56.559000000000005\n - type: recall_at_10\n value: 79.182\n - type: recall_at_100\n value: 92.946\n - type: recall_at_1000\n value: 99.092\n - type: recall_at_20\n value: 84.27900000000001\n - type: recall_at_3\n value: 69.023\n - type: recall_at_5\n value: 73.432\n task:\n type: Retrieval\n - dataset:\n config: deu-spa\n name: MTEB MLQARetrieval (deu-spa)\n revision: 397ed406c1a7902140303e7faf60fff35b58d285\n split: test\n type: facebook/mlqa\n metrics:\n - type: main_score\n value: 70.645\n - type: map_at_1\n value: 58.423\n - type: map_at_10\n value: 66.613\n - type: map_at_100\n value: 67.14099999999999\n - type: map_at_1000\n value: 67.161\n - type: map_at_20\n value: 66.965\n - type: map_at_3\n value: 64.714\n - type: map_at_5\n value: 65.835\n - type: mrr_at_1\n value: 58.4225352112676\n - type: mrr_at_10\n value: 66.61321260898735\n - type: 
mrr_at_100\n value: 67.13991570812132\n - type: mrr_at_1000\n value: 67.1598532168174\n - type: mrr_at_20\n value: 66.96384710024888\n - type: mrr_at_3\n value: 64.71361502347425\n - type: mrr_at_5\n value: 65.83474178403769\n - type: nauc_map_at_1000_diff1\n value: 73.9485117118935\n - type: nauc_map_at_1000_max\n value: 65.74479869396299\n - type: nauc_map_at_1000_std\n value: -20.300269749495563\n - type: nauc_map_at_100_diff1\n value: 73.93900406302829\n - type: nauc_map_at_100_max\n value: 65.75508449194885\n - type: nauc_map_at_100_std\n value: -20.265330791570175\n - type: nauc_map_at_10_diff1\n value: 73.84863233472605\n - type: nauc_map_at_10_max\n value: 65.89377317378211\n - type: nauc_map_at_10_std\n value: -20.404123131964695\n - type: nauc_map_at_1_diff1\n value: 76.73627284218519\n - type: nauc_map_at_1_max\n value: 62.94957512510876\n - type: nauc_map_at_1_std\n value: -20.99649749330682\n - type: nauc_map_at_20_diff1\n value: 73.88712006109598\n - type: nauc_map_at_20_max\n value: 65.82057018162664\n - type: nauc_map_at_20_std\n value: -20.269476512431915\n - type: nauc_map_at_3_diff1\n value: 74.21419190161502\n - type: nauc_map_at_3_max\n value: 65.64993368062119\n - type: nauc_map_at_3_std\n value: -21.34641749007071\n - type: nauc_map_at_5_diff1\n value: 74.0119419385777\n - type: nauc_map_at_5_max\n value: 65.69809416369732\n - type: nauc_map_at_5_std\n value: -21.16901556082261\n - type: nauc_mrr_at_1000_diff1\n value: 73.94915184134923\n - type: nauc_mrr_at_1000_max\n value: 65.74522469633418\n - type: nauc_mrr_at_1000_std\n value: -20.303028367132246\n - type: nauc_mrr_at_100_diff1\n value: 73.93964394728808\n - type: nauc_mrr_at_100_max\n value: 65.75550992323707\n - type: nauc_mrr_at_100_std\n value: -20.26808820438918\n - type: nauc_mrr_at_10_diff1\n value: 73.84863233472605\n - type: nauc_mrr_at_10_max\n value: 65.89377317378211\n - type: nauc_mrr_at_10_std\n value: -20.404123131964695\n - type: nauc_mrr_at_1_diff1\n value: 
76.73627284218519\n - type: nauc_mrr_at_1_max\n value: 62.94957512510876\n - type: nauc_mrr_at_1_std\n value: -20.99649749330682\n - type: nauc_mrr_at_20_diff1\n value: 73.88775721128745\n - type: nauc_mrr_at_20_max\n value: 65.820991355628\n - type: nauc_mrr_at_20_std\n value: -20.272216587019734\n - type: nauc_mrr_at_3_diff1\n value: 74.21419190161502\n - type: nauc_mrr_at_3_max\n value: 65.64993368062119\n - type: nauc_mrr_at_3_std\n value: -21.34641749007071\n - type: nauc_mrr_at_5_diff1\n value: 74.0119419385777\n - type: nauc_mrr_at_5_max\n value: 65.69809416369732\n - type: nauc_mrr_at_5_std\n value: -21.16901556082261\n - type: nauc_ndcg_at_1000_diff1\n value: 73.29396365944277\n - type: nauc_ndcg_at_1000_max\n value: 66.44879592109541\n - type: nauc_ndcg_at_1000_std\n value: -19.285991058788195\n - type: nauc_ndcg_at_100_diff1\n value: 73.0159172721162\n - type: nauc_ndcg_at_100_max\n value: 66.76216389231388\n - type: nauc_ndcg_at_100_std\n value: -18.27931368094887\n - type: nauc_ndcg_at_10_diff1\n value: 72.42096650774693\n - type: nauc_ndcg_at_10_max\n value: 67.48592688463306\n - type: nauc_ndcg_at_10_std\n value: -18.91453756077581\n - type: nauc_ndcg_at_1_diff1\n value: 76.73627284218519\n - type: nauc_ndcg_at_1_max\n value: 62.94957512510876\n - type: nauc_ndcg_at_1_std\n value: -20.99649749330682\n - type: nauc_ndcg_at_20_diff1\n value: 72.53699362385684\n - type: nauc_ndcg_at_20_max\n value: 67.22763976357872\n - type: nauc_ndcg_at_20_std\n value: -18.299910635008338\n - type: nauc_ndcg_at_3_diff1\n value: 73.3698453761989\n - type: nauc_ndcg_at_3_max\n value: 66.71056987289383\n - type: nauc_ndcg_at_3_std\n value: -21.405154376652803\n - type: nauc_ndcg_at_5_diff1\n value: 72.9491030712935\n - type: nauc_ndcg_at_5_max\n value: 66.85786103137077\n - type: nauc_ndcg_at_5_std\n value: -21.04005053344073\n - type: nauc_precision_at_1000_diff1\n value: 17.02462370967451\n - type: nauc_precision_at_1000_max\n value: 48.03260752496052\n - type: 
nauc_precision_at_1000_std\n value: 87.56077915079334\n - type: nauc_precision_at_100_diff1\n value: 58.590352501194985\n - type: nauc_precision_at_100_max\n value: 78.2649015433222\n - type: nauc_precision_at_100_std\n value: 28.05030453158992\n - type: nauc_precision_at_10_diff1\n value: 64.89497928764766\n - type: nauc_precision_at_10_max\n value: 75.93257124951242\n - type: nauc_precision_at_10_std\n value: -9.825306994117462\n - type: nauc_precision_at_1_diff1\n value: 76.73627284218519\n - type: nauc_precision_at_1_max\n value: 62.94957512510876\n - type: nauc_precision_at_1_std\n value: -20.99649749330682\n - type: nauc_precision_at_20_diff1\n value: 62.11366204321558\n - type: nauc_precision_at_20_max\n value: 75.9571427846493\n - type: nauc_precision_at_20_std\n value: -0.94585212808191\n - type: nauc_precision_at_3_diff1\n value: 70.52940972112398\n - type: nauc_precision_at_3_max\n value: 70.3402053170779\n - type: nauc_precision_at_3_std\n value: -21.579778424241304\n - type: nauc_precision_at_5_diff1\n value: 68.78962580223575\n - type: nauc_precision_at_5_max\n value: 71.41410894398376\n - type: nauc_precision_at_5_std\n value: -20.415603405161956\n - type: nauc_recall_at_1000_diff1\n value: 55.88625447348128\n - type: nauc_recall_at_1000_max\n value: 100.0\n - type: nauc_recall_at_1000_std\n value: 100.0\n - type: nauc_recall_at_100_diff1\n value: 61.17942268389525\n - type: nauc_recall_at_100_max\n value: 81.12207841563487\n - type: nauc_recall_at_100_std\n value: 27.141215257528113\n - type: nauc_recall_at_10_diff1\n value: 64.8949792876478\n - type: nauc_recall_at_10_max\n value: 75.93257124951249\n - type: nauc_recall_at_10_std\n value: -9.825306994117323\n - type: nauc_recall_at_1_diff1\n value: 76.73627284218519\n - type: nauc_recall_at_1_max\n value: 62.94957512510876\n - type: nauc_recall_at_1_std\n value: -20.99649749330682\n - type: nauc_recall_at_20_diff1\n value: 63.07808719241162\n - type: nauc_recall_at_20_max\n value: 
76.96808746317542\n - type: nauc_recall_at_20_std\n value: -1.5235053258631275\n - type: nauc_recall_at_3_diff1\n value: 70.52940972112405\n - type: nauc_recall_at_3_max\n value: 70.3402053170779\n - type: nauc_recall_at_3_std\n value: -21.57977842424124\n - type: nauc_recall_at_5_diff1\n value: 68.78962580223575\n - type: nauc_recall_at_5_max\n value: 71.41410894398392\n - type: nauc_recall_at_5_std\n value: -20.415603405161793\n - type: ndcg_at_1\n value: 58.423\n - type: ndcg_at_10\n value: 70.645\n - type: ndcg_at_100\n value: 73.277\n - type: ndcg_at_1000\n value: 73.785\n - type: ndcg_at_20\n value: 71.918\n - type: ndcg_at_3\n value: 66.679\n - type: ndcg_at_5\n value: 68.72200000000001\n - type: precision_at_1\n value: 58.423\n - type: precision_at_10\n value: 8.338\n - type: precision_at_100\n value: 0.959\n - type: precision_at_1000\n value: 0.1\n - type: precision_at_20\n value: 4.423\n - type: precision_at_3\n value: 24.113\n - type: precision_at_5\n value: 15.47\n - type: recall_at_1\n value: 58.423\n - type: recall_at_10\n value: 83.38\n - type: recall_at_100\n value: 95.887\n - type: recall_at_1000\n value: 99.831\n - type: recall_at_20\n value: 88.39399999999999\n - type: recall_at_3\n value: 72.33800000000001\n - type: recall_at_5\n value: 77.352\n task:\n type: Retrieval\n - dataset:\n config: deu-eng\n name: MTEB MLQARetrieval (deu-eng)\n revision: 397ed406c1a7902140303e7faf60fff35b58d285\n split: test\n type: facebook/mlqa\n metrics:\n - type: main_score\n value: 67.067\n - type: map_at_1\n value: 55.861000000000004\n - type: map_at_10\n value: 63.42100000000001\n - type: map_at_100\n value: 64.03\n - type: map_at_1000\n value: 64.05999999999999\n - type: map_at_20\n value: 63.819\n - type: map_at_3\n value: 61.773\n - type: map_at_5\n value: 62.736999999999995\n - type: mrr_at_1\n value: 55.88300465322402\n - type: mrr_at_10\n value: 63.43111082973707\n - type: mrr_at_100\n value: 64.03962373590272\n - type: mrr_at_1000\n value: 
64.0698259866376\n - type: mrr_at_20\n value: 63.82871766489112\n - type: mrr_at_3\n value: 61.78447448112865\n - type: mrr_at_5\n value: 62.74835659945346\n - type: nauc_map_at_1000_diff1\n value: 74.58505763417352\n - type: nauc_map_at_1000_max\n value: 66.26060764852198\n - type: nauc_map_at_1000_std\n value: -16.896178230873897\n - type: nauc_map_at_100_diff1\n value: 74.57057487892857\n - type: nauc_map_at_100_max\n value: 66.26600433283826\n - type: nauc_map_at_100_std\n value: -16.87596113104189\n - type: nauc_map_at_10_diff1\n value: 74.53453636322749\n - type: nauc_map_at_10_max\n value: 66.27501737773804\n - type: nauc_map_at_10_std\n value: -17.178743257781775\n - type: nauc_map_at_1_diff1\n value: 77.63067209375254\n - type: nauc_map_at_1_max\n value: 64.17718675702672\n - type: nauc_map_at_1_std\n value: -17.639521106853717\n - type: nauc_map_at_20_diff1\n value: 74.52007402431164\n - type: nauc_map_at_20_max\n value: 66.28276291359268\n - type: nauc_map_at_20_std\n value: -16.939292897754758\n - type: nauc_map_at_3_diff1\n value: 74.79187974631951\n - type: nauc_map_at_3_max\n value: 66.23256568210611\n - type: nauc_map_at_3_std\n value: -17.894889918934112\n - type: nauc_map_at_5_diff1\n value: 74.63011328882517\n - type: nauc_map_at_5_max\n value: 66.35411054978499\n - type: nauc_map_at_5_std\n value: -17.50140342194211\n - type: nauc_mrr_at_1000_diff1\n value: 74.57520089771667\n - type: nauc_mrr_at_1000_max\n value: 66.27270912845914\n - type: nauc_mrr_at_1000_std\n value: -16.84012675362397\n - type: nauc_mrr_at_100_diff1\n value: 74.56070964572156\n - type: nauc_mrr_at_100_max\n value: 66.2780701126926\n - type: nauc_mrr_at_100_std\n value: -16.820035083069865\n - type: nauc_mrr_at_10_diff1\n value: 74.52455978435117\n - type: nauc_mrr_at_10_max\n value: 66.28697244023137\n - type: nauc_mrr_at_10_std\n value: -17.122477723330523\n - type: nauc_mrr_at_1_diff1\n value: 77.60643512422061\n - type: nauc_mrr_at_1_max\n value: 64.21736966061896\n - 
type: nauc_mrr_at_1_std\n value: -17.56627338275146\n - type: nauc_mrr_at_20_diff1\n value: 74.5099814266373\n - type: nauc_mrr_at_20_max\n value: 66.29485560556576\n - type: nauc_mrr_at_20_std\n value: -16.882350027335306\n - type: nauc_mrr_at_3_diff1\n value: 74.78132817375507\n - type: nauc_mrr_at_3_max\n value: 66.24761860047623\n - type: nauc_mrr_at_3_std\n value: -17.833128575678998\n - type: nauc_mrr_at_5_diff1\n value: 74.6193031207433\n - type: nauc_mrr_at_5_max\n value: 66.36951764432901\n - type: nauc_mrr_at_5_std\n value: -17.438203106324227\n - type: nauc_ndcg_at_1000_diff1\n value: 73.79386161629151\n - type: nauc_ndcg_at_1000_max\n value: 66.84013038018082\n - type: nauc_ndcg_at_1000_std\n value: -15.387358822700667\n - type: nauc_ndcg_at_100_diff1\n value: 73.36132885277745\n - type: nauc_ndcg_at_100_max\n value: 67.04416926901568\n - type: nauc_ndcg_at_100_std\n value: -14.503256942521972\n - type: nauc_ndcg_at_10_diff1\n value: 73.11847332785027\n - type: nauc_ndcg_at_10_max\n value: 67.02149621303091\n - type: nauc_ndcg_at_10_std\n value: -16.142234662067782\n - type: nauc_ndcg_at_1_diff1\n value: 77.60643512422061\n - type: nauc_ndcg_at_1_max\n value: 64.21736966061896\n - type: nauc_ndcg_at_1_std\n value: -17.56627338275146\n - type: nauc_ndcg_at_20_diff1\n value: 72.97961452569768\n - type: nauc_ndcg_at_20_max\n value: 67.12369127081152\n - type: nauc_ndcg_at_20_std\n value: -15.11921773223936\n - type: nauc_ndcg_at_3_diff1\n value: 73.77769312598772\n - type: nauc_ndcg_at_3_max\n value: 66.94438755852309\n - type: nauc_ndcg_at_3_std\n value: -17.75960443830741\n - type: nauc_ndcg_at_5_diff1\n value: 73.43991209562891\n - type: nauc_ndcg_at_5_max\n value: 67.21682951737418\n - type: nauc_ndcg_at_5_std\n value: -17.013510008231805\n - type: nauc_precision_at_1000_diff1\n value: 51.30633281948362\n - type: nauc_precision_at_1000_max\n value: 76.78675288883846\n - type: nauc_precision_at_1000_std\n value: 71.70041985304397\n - type: 
nauc_precision_at_100_diff1\n value: 59.86656455853326\n - type: nauc_precision_at_100_max\n value: 74.41958422732161\n - type: nauc_precision_at_100_std\n value: 22.098920296069124\n - type: nauc_precision_at_10_diff1\n value: 66.4696166928741\n - type: nauc_precision_at_10_max\n value: 69.88463108697104\n - type: nauc_precision_at_10_std\n value: -10.707950954702742\n - type: nauc_precision_at_1_diff1\n value: 77.60643512422061\n - type: nauc_precision_at_1_max\n value: 64.21736966061896\n - type: nauc_precision_at_1_std\n value: -17.56627338275146\n - type: nauc_precision_at_20_diff1\n value: 63.45094585276983\n - type: nauc_precision_at_20_max\n value: 71.57741245347195\n - type: nauc_precision_at_20_std\n value: -2.2211545419051744\n - type: nauc_precision_at_3_diff1\n value: 70.28060818081384\n - type: nauc_precision_at_3_max\n value: 69.22652927816439\n - type: nauc_precision_at_3_std\n value: -17.158576243559434\n - type: nauc_precision_at_5_diff1\n value: 68.90765418427162\n - type: nauc_precision_at_5_max\n value: 70.32585273389111\n - type: nauc_precision_at_5_std\n value: -14.950363729664524\n - type: nauc_recall_at_1000_diff1\n value: 65.11255117927331\n - type: nauc_recall_at_1000_max\n value: 88.35641213283338\n - type: nauc_recall_at_1000_std\n value: 69.89792573640547\n - type: nauc_recall_at_100_diff1\n value: 61.46376457272238\n - type: nauc_recall_at_100_max\n value: 75.48265142243015\n - type: nauc_recall_at_100_std\n value: 21.223182712042178\n - type: nauc_recall_at_10_diff1\n value: 66.89353375308997\n - type: nauc_recall_at_10_max\n value: 70.06655416883785\n - type: nauc_recall_at_10_std\n value: -11.100871879439435\n - type: nauc_recall_at_1_diff1\n value: 77.63067209375254\n - type: nauc_recall_at_1_max\n value: 64.17718675702672\n - type: nauc_recall_at_1_std\n value: -17.639521106853717\n - type: nauc_recall_at_20_diff1\n value: 63.98532276331878\n - type: nauc_recall_at_20_max\n value: 71.81562599791899\n - type: 
nauc_recall_at_20_std\n value: -2.696537977147695\n - type: nauc_recall_at_3_diff1\n value: 70.4507655865698\n - type: nauc_recall_at_3_max\n value: 69.25705030141037\n - type: nauc_recall_at_3_std\n value: -17.299948348202836\n - type: nauc_recall_at_5_diff1\n value: 69.09152857901888\n - type: nauc_recall_at_5_max\n value: 70.35609636026405\n - type: nauc_recall_at_5_std\n value: -15.105012139255896\n - type: ndcg_at_1\n value: 55.883\n - type: ndcg_at_10\n value: 67.067\n - type: ndcg_at_100\n value: 70.07\n - type: ndcg_at_1000\n value: 70.875\n - type: ndcg_at_20\n value: 68.498\n - type: ndcg_at_3\n value: 63.666\n - type: ndcg_at_5\n value: 65.40599999999999\n - type: precision_at_1\n value: 55.883\n - type: precision_at_10\n value: 7.8549999999999995\n - type: precision_at_100\n value: 0.928\n - type: precision_at_1000\n value: 0.099\n - type: precision_at_20\n value: 4.2090000000000005\n - type: precision_at_3\n value: 23.052\n - type: precision_at_5\n value: 14.677999999999999\n - type: recall_at_1\n value: 55.861000000000004\n - type: recall_at_10\n value: 78.495\n - type: recall_at_100\n value: 92.688\n - type: recall_at_1000\n value: 99.02499999999999\n - type: recall_at_20\n value: 84.124\n - type: recall_at_3\n value: 69.123\n - type: recall_at_5\n value: 73.355\n task:\n type: Retrieval\n - dataset:\n config: spa-deu\n name: MTEB MLQARetrieval (spa-deu)\n revision: 397ed406c1a7902140303e7faf60fff35b58d285\n split: test\n type: facebook/mlqa\n metrics:\n - type: main_score\n value: 73.90299999999999\n - type: map_at_1\n value: 61.236000000000004\n - type: map_at_10\n value: 69.88799999999999\n - type: map_at_100\n value: 70.319\n - type: map_at_1000\n value: 70.341\n - type: map_at_20\n value: 70.16799999999999\n - type: map_at_3\n value: 68.104\n - type: map_at_5\n value: 69.164\n - type: mrr_at_1\n value: 61.2739571589628\n - type: mrr_at_10\n value: 69.92589162684993\n - type: mrr_at_100\n value: 70.35245455509234\n - type: mrr_at_1000\n value: 
70.37438351396742\n - type: mrr_at_20\n value: 70.20247469915404\n - type: mrr_at_3\n value: 68.14167606163099\n - type: mrr_at_5\n value: 69.20142803457354\n - type: nauc_map_at_1000_diff1\n value: 74.70416754842327\n - type: nauc_map_at_1000_max\n value: 65.86915994583384\n - type: nauc_map_at_1000_std\n value: -19.04437483534443\n - type: nauc_map_at_100_diff1\n value: 74.70011798058674\n - type: nauc_map_at_100_max\n value: 65.88507779167188\n - type: nauc_map_at_100_std\n value: -19.018670970643786\n - type: nauc_map_at_10_diff1\n value: 74.6362126804427\n - type: nauc_map_at_10_max\n value: 66.05733054427198\n - type: nauc_map_at_10_std\n value: -19.034317737897354\n - type: nauc_map_at_1_diff1\n value: 77.24970536833601\n - type: nauc_map_at_1_max\n value: 62.07820573048406\n - type: nauc_map_at_1_std\n value: -20.917086586335078\n - type: nauc_map_at_20_diff1\n value: 74.64113920401083\n - type: nauc_map_at_20_max\n value: 65.89991740166793\n - type: nauc_map_at_20_std\n value: -19.09987515041243\n - type: nauc_map_at_3_diff1\n value: 74.6518162332119\n - type: nauc_map_at_3_max\n value: 66.10312348194024\n - type: nauc_map_at_3_std\n value: -18.95881457716116\n - type: nauc_map_at_5_diff1\n value: 74.55141020670321\n - type: nauc_map_at_5_max\n value: 65.94345752979342\n - type: nauc_map_at_5_std\n value: -19.453976877992304\n - type: nauc_mrr_at_1000_diff1\n value: 74.64458488344088\n - type: nauc_mrr_at_1000_max\n value: 65.84575328456057\n - type: nauc_mrr_at_1000_std\n value: -18.901614615119904\n - type: nauc_mrr_at_100_diff1\n value: 74.64058497924627\n - type: nauc_mrr_at_100_max\n value: 65.86170461767928\n - type: nauc_mrr_at_100_std\n value: -18.87601697091505\n - type: nauc_mrr_at_10_diff1\n value: 74.57266634464752\n - type: nauc_mrr_at_10_max\n value: 66.03331587645152\n - type: nauc_mrr_at_10_std\n value: -18.87888060105393\n - type: nauc_mrr_at_1_diff1\n value: 77.19578272647183\n - type: nauc_mrr_at_1_max\n value: 62.05252035478773\n - 
type: nauc_mrr_at_1_std\n value: -20.790530940625267\n - type: nauc_mrr_at_20_diff1\n value: 74.5808171250021\n - type: nauc_mrr_at_20_max\n value: 65.87643606587798\n - type: nauc_mrr_at_20_std\n value: -18.95476583474199\n - type: nauc_mrr_at_3_diff1\n value: 74.5917053289191\n - type: nauc_mrr_at_3_max\n value: 66.08044079438714\n - type: nauc_mrr_at_3_std\n value: -18.81168463163586\n - type: nauc_mrr_at_5_diff1\n value: 74.48934579694608\n - type: nauc_mrr_at_5_max\n value: 65.91993162383771\n - type: nauc_mrr_at_5_std\n value: -19.302710791338797\n - type: nauc_ndcg_at_1000_diff1\n value: 74.20191283992186\n - type: nauc_ndcg_at_1000_max\n value: 66.60831175771229\n - type: nauc_ndcg_at_1000_std\n value: -18.175208725175484\n - type: nauc_ndcg_at_100_diff1\n value: 74.07713451642955\n - type: nauc_ndcg_at_100_max\n value: 67.02028626335476\n - type: nauc_ndcg_at_100_std\n value: -17.36560972181693\n - type: nauc_ndcg_at_10_diff1\n value: 73.63235521598476\n - type: nauc_ndcg_at_10_max\n value: 67.8118473312638\n - type: nauc_ndcg_at_10_std\n value: -17.647560577355915\n - type: nauc_ndcg_at_1_diff1\n value: 77.19578272647183\n - type: nauc_ndcg_at_1_max\n value: 62.05252035478773\n - type: nauc_ndcg_at_1_std\n value: -20.790530940625267\n - type: nauc_ndcg_at_20_diff1\n value: 73.65300308228291\n - type: nauc_ndcg_at_20_max\n value: 67.18353402731985\n - type: nauc_ndcg_at_20_std\n value: -17.9240756389792\n - type: nauc_ndcg_at_3_diff1\n value: 73.73764900202292\n - type: nauc_ndcg_at_3_max\n value: 67.60840957876889\n - type: nauc_ndcg_at_3_std\n value: -17.962667543518933\n - type: nauc_ndcg_at_5_diff1\n value: 73.49040500302092\n - type: nauc_ndcg_at_5_max\n value: 67.41251918514402\n - type: nauc_ndcg_at_5_std\n value: -18.851877225955523\n - type: nauc_precision_at_1000_diff1\n value: -18.652906102973922\n - type: nauc_precision_at_1000_max\n value: 2.1701672475574885\n - type: nauc_precision_at_1000_std\n value: 61.713411950188835\n - type: 
nauc_precision_at_100_diff1\n value: 62.37565302288498\n - type: nauc_precision_at_100_max\n value: 76.96921843049006\n - type: nauc_precision_at_100_std\n value: 19.152009040219678\n - type: nauc_precision_at_10_diff1\n value: 68.14047344105212\n - type: nauc_precision_at_10_max\n value: 77.7177273849099\n - type: nauc_precision_at_10_std\n value: -9.124325941493698\n - type: nauc_precision_at_1_diff1\n value: 77.19578272647183\n - type: nauc_precision_at_1_max\n value: 62.05252035478773\n - type: nauc_precision_at_1_std\n value: -20.790530940625267\n - type: nauc_precision_at_20_diff1\n value: 65.38487456362745\n - type: nauc_precision_at_20_max\n value: 74.61122933443669\n - type: nauc_precision_at_20_std\n value: -8.129775929648341\n - type: nauc_precision_at_3_diff1\n value: 70.45937744142297\n - type: nauc_precision_at_3_max\n value: 73.03004233073901\n - type: nauc_precision_at_3_std\n value: -14.246554579025158\n - type: nauc_precision_at_5_diff1\n value: 69.02821772428955\n - type: nauc_precision_at_5_max\n value: 73.52949774726446\n - type: nauc_precision_at_5_std\n value: -16.355747231517757\n - type: nauc_recall_at_1000_diff1\n value: 35.804192824985755\n - type: nauc_recall_at_1000_max\n value: 61.367785756485894\n - type: nauc_recall_at_1000_std\n value: 54.01380822466869\n - type: nauc_recall_at_100_diff1\n value: 67.96210883597479\n - type: nauc_recall_at_100_max\n value: 82.38124823732169\n - type: nauc_recall_at_100_std\n value: 16.814922595309966\n - type: nauc_recall_at_10_diff1\n value: 68.21964459634341\n - type: nauc_recall_at_10_max\n value: 77.68301934858845\n - type: nauc_recall_at_10_std\n value: -9.430792913885066\n - type: nauc_recall_at_1_diff1\n value: 77.24970536833601\n - type: nauc_recall_at_1_max\n value: 62.07820573048406\n - type: nauc_recall_at_1_std\n value: -20.917086586335078\n - type: nauc_recall_at_20_diff1\n value: 66.60569906579487\n - type: nauc_recall_at_20_max\n value: 75.66163186604354\n - type: 
nauc_recall_at_20_std\n value: -9.09826205489828\n - type: nauc_recall_at_3_diff1\n value: 70.52323701841641\n - type: nauc_recall_at_3_max\n value: 73.03478107411232\n - type: nauc_recall_at_3_std\n value: -14.432325989967962\n - type: nauc_recall_at_5_diff1\n value: 69.08521261524373\n - type: nauc_recall_at_5_max\n value: 73.51150270382094\n - type: nauc_recall_at_5_std\n value: -16.569387503524368\n - type: ndcg_at_1\n value: 61.273999999999994\n - type: ndcg_at_10\n value: 73.90299999999999\n - type: ndcg_at_100\n value: 75.983\n - type: ndcg_at_1000\n value: 76.488\n - type: ndcg_at_20\n value: 74.921\n - type: ndcg_at_3\n value: 70.277\n - type: ndcg_at_5\n value: 72.172\n - type: precision_at_1\n value: 61.273999999999994\n - type: precision_at_10\n value: 8.641\n - type: precision_at_100\n value: 0.962\n - type: precision_at_1000\n value: 0.1\n - type: precision_at_20\n value: 4.524\n - type: precision_at_3\n value: 25.517\n - type: precision_at_5\n value: 16.223000000000003\n - type: recall_at_1\n value: 61.236000000000004\n - type: recall_at_10\n value: 86.37700000000001\n - type: recall_at_100\n value: 96.054\n - type: recall_at_1000\n value: 99.887\n - type: recall_at_20\n value: 90.398\n - type: recall_at_3\n value: 76.51299999999999\n - type: recall_at_5\n value: 81.07900000000001\n task:\n type: Retrieval\n - dataset:\n config: spa-spa\n name: MTEB MLQARetrieval (spa-spa)\n revision: 397ed406c1a7902140303e7faf60fff35b58d285\n split: test\n type: facebook/mlqa\n metrics:\n - type: main_score\n value: 68.632\n - type: map_at_1\n value: 57.046\n - type: map_at_10\n value: 64.869\n - type: map_at_100\n value: 65.384\n - type: map_at_1000\n value: 65.413\n - type: map_at_20\n value: 65.185\n - type: map_at_3\n value: 63.178\n - type: map_at_5\n value: 64.12\n - type: mrr_at_1\n value: 57.05579889544848\n - type: mrr_at_10\n value: 64.8806425382317\n - type: mrr_at_100\n value: 65.39469233244084\n - type: mrr_at_1000\n value: 65.42342199403159\n - type: 
mrr_at_20\n value: 65.19634815919534\n - type: mrr_at_3\n value: 63.18796419729591\n - type: mrr_at_5\n value: 64.13159398209874\n - type: nauc_map_at_1000_diff1\n value: 73.23803038674018\n - type: nauc_map_at_1000_max\n value: 67.44156201421714\n - type: nauc_map_at_1000_std\n value: -8.60143026450049\n - type: nauc_map_at_100_diff1\n value: 73.22575613034235\n - type: nauc_map_at_100_max\n value: 67.44735143420195\n - type: nauc_map_at_100_std\n value: -8.576905069492895\n - type: nauc_map_at_10_diff1\n value: 73.11950129610865\n - type: nauc_map_at_10_max\n value: 67.45107232305055\n - type: nauc_map_at_10_std\n value: -8.799837857015392\n - type: nauc_map_at_1_diff1\n value: 76.18354072047988\n - type: nauc_map_at_1_max\n value: 65.03342186728786\n - type: nauc_map_at_1_std\n value: -10.867650288695796\n - type: nauc_map_at_20_diff1\n value: 73.21570748770948\n - type: nauc_map_at_20_max\n value: 67.50340321088724\n - type: nauc_map_at_20_std\n value: -8.594057184944676\n - type: nauc_map_at_3_diff1\n value: 73.17239276163892\n - type: nauc_map_at_3_max\n value: 67.06319504819103\n - type: nauc_map_at_3_std\n value: -9.883216310270528\n - type: nauc_map_at_5_diff1\n value: 73.11913507367727\n - type: nauc_map_at_5_max\n value: 67.27497019567078\n - type: nauc_map_at_5_std\n value: -9.497714822103118\n - type: nauc_mrr_at_1000_diff1\n value: 73.22971233311306\n - type: nauc_mrr_at_1000_max\n value: 67.42977229057223\n - type: nauc_mrr_at_1000_std\n value: -8.550068702273297\n - type: nauc_mrr_at_100_diff1\n value: 73.21744467317815\n - type: nauc_mrr_at_100_max\n value: 67.43557491068093\n - type: nauc_mrr_at_100_std\n value: -8.52559275190607\n - type: nauc_mrr_at_10_diff1\n value: 73.11075619726137\n - type: nauc_mrr_at_10_max\n value: 67.43889760205286\n - type: nauc_mrr_at_10_std\n value: -8.74617232559183\n - type: nauc_mrr_at_1_diff1\n value: 76.17529975949547\n - type: nauc_mrr_at_1_max\n value: 65.02401127001608\n - type: nauc_mrr_at_1_std\n value: 
-10.817814457633952\n - type: nauc_mrr_at_20_diff1\n value: 73.20689275225138\n - type: nauc_mrr_at_20_max\n value: 67.49111752272192\n - type: nauc_mrr_at_20_std\n value: -8.539827528410353\n - type: nauc_mrr_at_3_diff1\n value: 73.16291729623958\n - type: nauc_mrr_at_3_max\n value: 67.05300993427998\n - type: nauc_mrr_at_3_std\n value: -9.827915885680811\n - type: nauc_mrr_at_5_diff1\n value: 73.11055686484109\n - type: nauc_mrr_at_5_max\n value: 67.26299851089122\n - type: nauc_mrr_at_5_std\n value: -9.445190276650903\n - type: nauc_ndcg_at_1000_diff1\n value: 72.58833638407177\n - type: nauc_ndcg_at_1000_max\n value: 68.10447506371374\n - type: nauc_ndcg_at_1000_std\n value: -6.910306241546282\n - type: nauc_ndcg_at_100_diff1\n value: 72.24524849631476\n - type: nauc_ndcg_at_100_max\n value: 68.30659210081238\n - type: nauc_ndcg_at_100_std\n value: -6.04305364268931\n - type: nauc_ndcg_at_10_diff1\n value: 71.87363502582961\n - type: nauc_ndcg_at_10_max\n value: 68.5010009653693\n - type: nauc_ndcg_at_10_std\n value: -7.021281296450588\n - type: nauc_ndcg_at_1_diff1\n value: 76.17529975949547\n - type: nauc_ndcg_at_1_max\n value: 65.02401127001608\n - type: nauc_ndcg_at_1_std\n value: -10.817814457633952\n - type: nauc_ndcg_at_20_diff1\n value: 72.21241010439327\n - type: nauc_ndcg_at_20_max\n value: 68.71743274030551\n - type: nauc_ndcg_at_20_std\n value: -6.186629577195946\n - type: nauc_ndcg_at_3_diff1\n value: 72.08204674794459\n - type: nauc_ndcg_at_3_max\n value: 67.5958365046156\n - type: nauc_ndcg_at_3_std\n value: -9.576418336610345\n - type: nauc_ndcg_at_5_diff1\n value: 71.93179095844508\n - type: nauc_ndcg_at_5_max\n value: 68.01914639754217\n - type: nauc_ndcg_at_5_std\n value: -8.833768332910777\n - type: nauc_precision_at_1000_diff1\n value: 63.0051360227489\n - type: nauc_precision_at_1000_max\n value: 79.93532442313229\n - type: nauc_precision_at_1000_std\n value: 52.869517607133254\n - type: nauc_precision_at_100_diff1\n value: 
62.43301501857154\n - type: nauc_precision_at_100_max\n value: 75.57280416668183\n - type: nauc_precision_at_100_std\n value: 26.758300486132747\n - type: nauc_precision_at_10_diff1\n value: 66.29806375971134\n - type: nauc_precision_at_10_max\n value: 73.40301413754797\n - type: nauc_precision_at_10_std\n value: 1.9858547295235462\n - type: nauc_precision_at_1_diff1\n value: 76.17529975949547\n - type: nauc_precision_at_1_max\n value: 65.02401127001608\n - type: nauc_precision_at_1_std\n value: -10.817814457633952\n - type: nauc_precision_at_20_diff1\n value: 67.05111836051105\n - type: nauc_precision_at_20_max\n value: 76.09783190824155\n - type: nauc_precision_at_20_std\n value: 9.906010659515564\n - type: nauc_precision_at_3_diff1\n value: 68.44186679250453\n - type: nauc_precision_at_3_max\n value: 69.30301351119388\n - type: nauc_precision_at_3_std\n value: -8.566522518882348\n - type: nauc_precision_at_5_diff1\n value: 67.51737199297388\n - type: nauc_precision_at_5_max\n value: 70.75887601590472\n - type: nauc_precision_at_5_std\n value: -6.278983102710238\n - type: nauc_recall_at_1000_diff1\n value: 65.12360093170948\n - type: nauc_recall_at_1000_max\n value: 82.60209843191132\n - type: nauc_recall_at_1000_std\n value: 51.740179583368636\n - type: nauc_recall_at_100_diff1\n value: 62.82007697326819\n - type: nauc_recall_at_100_max\n value: 76.04844844677562\n - type: nauc_recall_at_100_std\n value: 26.4678415019248\n - type: nauc_recall_at_10_diff1\n value: 66.28557566848767\n - type: nauc_recall_at_10_max\n value: 73.40302709828738\n - type: nauc_recall_at_10_std\n value: 1.9224272854613582\n - type: nauc_recall_at_1_diff1\n value: 76.18354072047988\n - type: nauc_recall_at_1_max\n value: 65.03342186728786\n - type: nauc_recall_at_1_std\n value: -10.867650288695796\n - type: nauc_recall_at_20_diff1\n value: 67.03430451094992\n - type: nauc_recall_at_20_max\n value: 76.09474005171319\n - type: nauc_recall_at_20_std\n value: 9.815888637851074\n - type: 
nauc_recall_at_3_diff1\n value: 68.44411411344718\n - type: nauc_recall_at_3_max\n value: 69.30502737137265\n - type: nauc_recall_at_3_std\n value: -8.629526329714132\n - type: nauc_recall_at_5_diff1\n value: 67.51469265953514\n - type: nauc_recall_at_5_max\n value: 70.76969893818111\n - type: nauc_recall_at_5_std\n value: -6.325600167105444\n - type: ndcg_at_1\n value: 57.056\n - type: ndcg_at_10\n value: 68.632\n - type: ndcg_at_100\n value: 71.202\n - type: ndcg_at_1000\n value: 71.97099999999999\n - type: ndcg_at_20\n value: 69.785\n - type: ndcg_at_3\n value: 65.131\n - type: ndcg_at_5\n value: 66.834\n - type: precision_at_1\n value: 57.056\n - type: precision_at_10\n value: 8.044\n - type: precision_at_100\n value: 0.9259999999999999\n - type: precision_at_1000\n value: 0.099\n - type: precision_at_20\n value: 4.251\n - type: precision_at_3\n value: 23.589\n - type: precision_at_5\n value: 14.984\n - type: recall_at_1\n value: 57.046\n - type: recall_at_10\n value: 80.423\n - type: recall_at_100\n value: 92.582\n - type: recall_at_1000\n value: 98.638\n - type: recall_at_20\n value: 84.993\n - type: recall_at_3\n value: 70.758\n - type: recall_at_5\n value: 74.9\n task:\n type: Retrieval\n - dataset:\n config: spa-eng\n name: MTEB MLQARetrieval (spa-eng)\n revision: 397ed406c1a7902140303e7faf60fff35b58d285\n split: test\n type: facebook/mlqa\n metrics:\n - type: main_score\n value: 68.765\n - type: map_at_1\n value: 56.538999999999994\n - type: map_at_10\n value: 64.816\n - type: map_at_100\n value: 65.325\n - type: map_at_1000\n value: 65.352\n - type: map_at_20\n value: 65.113\n - type: map_at_3\n value: 62.934999999999995\n - type: map_at_5\n value: 64.063\n - type: mrr_at_1\n value: 56.539120502569965\n - type: mrr_at_10\n value: 64.81561556661505\n - type: mrr_at_100\n value: 65.32464238613954\n - type: mrr_at_1000\n value: 65.35206516602133\n - type: mrr_at_20\n value: 65.11270445292227\n - type: mrr_at_3\n value: 62.935465448315384\n - type: 
mrr_at_5\n value: 64.06339234723022\n - type: nauc_map_at_1000_diff1\n value: 73.20701050428072\n - type: nauc_map_at_1000_max\n value: 67.32797480614404\n - type: nauc_map_at_1000_std\n value: -6.211540626528362\n - type: nauc_map_at_100_diff1\n value: 73.19497683923063\n - type: nauc_map_at_100_max\n value: 67.33392646467817\n - type: nauc_map_at_100_std\n value: -6.196671563900051\n - type: nauc_map_at_10_diff1\n value: 73.16010547612956\n - type: nauc_map_at_10_max\n value: 67.37793741307372\n - type: nauc_map_at_10_std\n value: -6.3443240322521675\n - type: nauc_map_at_1_diff1\n value: 76.63696578575964\n - type: nauc_map_at_1_max\n value: 65.08189618178105\n - type: nauc_map_at_1_std\n value: -8.594195451782733\n - type: nauc_map_at_20_diff1\n value: 73.15233479381568\n - type: nauc_map_at_20_max\n value: 67.3679607256072\n - type: nauc_map_at_20_std\n value: -6.175928265286352\n - type: nauc_map_at_3_diff1\n value: 73.14853380980746\n - type: nauc_map_at_3_max\n value: 67.10354198073468\n - type: nauc_map_at_3_std\n value: -7.409679815529866\n - type: nauc_map_at_5_diff1\n value: 73.13425961877715\n - type: nauc_map_at_5_max\n value: 67.22452899371224\n - type: nauc_map_at_5_std\n value: -6.895257774506354\n - type: nauc_mrr_at_1000_diff1\n value: 73.20701050428072\n - type: nauc_mrr_at_1000_max\n value: 67.32797480614404\n - type: nauc_mrr_at_1000_std\n value: -6.211540626528362\n - type: nauc_mrr_at_100_diff1\n value: 73.19497683923063\n - type: nauc_mrr_at_100_max\n value: 67.33392646467817\n - type: nauc_mrr_at_100_std\n value: -6.196671563900051\n - type: nauc_mrr_at_10_diff1\n value: 73.16010547612956\n - type: nauc_mrr_at_10_max\n value: 67.37793741307372\n - type: nauc_mrr_at_10_std\n value: -6.3443240322521675\n - type: nauc_mrr_at_1_diff1\n value: 76.63696578575964\n - type: nauc_mrr_at_1_max\n value: 65.08189618178105\n - type: nauc_mrr_at_1_std\n value: -8.594195451782733\n - type: nauc_mrr_at_20_diff1\n value: 73.15233479381568\n - type: 
nauc_mrr_at_20_max\n value: 67.3679607256072\n - type: nauc_mrr_at_20_std\n value: -6.175928265286352\n - type: nauc_mrr_at_3_diff1\n value: 73.14853380980746\n - type: nauc_mrr_at_3_max\n value: 67.10354198073468\n - type: nauc_mrr_at_3_std\n value: -7.409679815529866\n - type: nauc_mrr_at_5_diff1\n value: 73.13425961877715\n - type: nauc_mrr_at_5_max\n value: 67.22452899371224\n - type: nauc_mrr_at_5_std\n value: -6.895257774506354\n - type: nauc_ndcg_at_1000_diff1\n value: 72.44364625096874\n - type: nauc_ndcg_at_1000_max\n value: 67.93635761141552\n - type: nauc_ndcg_at_1000_std\n value: -4.616429464350954\n - type: nauc_ndcg_at_100_diff1\n value: 72.11352383758482\n - type: nauc_ndcg_at_100_max\n value: 68.1627312575955\n - type: nauc_ndcg_at_100_std\n value: -3.894213672131282\n - type: nauc_ndcg_at_10_diff1\n value: 71.8526850770812\n - type: nauc_ndcg_at_10_max\n value: 68.41366561888562\n - type: nauc_ndcg_at_10_std\n value: -4.472146861145989\n - type: nauc_ndcg_at_1_diff1\n value: 76.63696578575964\n - type: nauc_ndcg_at_1_max\n value: 65.08189618178105\n - type: nauc_ndcg_at_1_std\n value: -8.594195451782733\n - type: nauc_ndcg_at_20_diff1\n value: 71.76464418138866\n - type: nauc_ndcg_at_20_max\n value: 68.41174963313698\n - type: nauc_ndcg_at_20_std\n value: -3.7449762037540157\n - type: nauc_ndcg_at_3_diff1\n value: 71.93808990683131\n - type: nauc_ndcg_at_3_max\n value: 67.7010029507334\n - type: nauc_ndcg_at_3_std\n value: -6.971858419379321\n - type: nauc_ndcg_at_5_diff1\n value: 71.8505224811326\n - type: nauc_ndcg_at_5_max\n value: 67.97139549500251\n - type: nauc_ndcg_at_5_std\n value: -5.958491308070017\n - type: nauc_precision_at_1000_diff1\n value: 62.20956180320043\n - type: nauc_precision_at_1000_max\n value: 82.53412670611299\n - type: nauc_precision_at_1000_std\n value: 55.57278124999575\n - type: nauc_precision_at_100_diff1\n value: 62.03792857023201\n - type: nauc_precision_at_100_max\n value: 76.77130713424538\n - type: 
nauc_precision_at_100_std\n value: 26.674102719959564\n - type: nauc_precision_at_10_diff1\n value: 65.89798055049931\n - type: nauc_precision_at_10_max\n value: 73.41908620140674\n - type: nauc_precision_at_10_std\n value: 5.21818573283179\n - type: nauc_precision_at_1_diff1\n value: 76.63696578575964\n - type: nauc_precision_at_1_max\n value: 65.08189618178105\n - type: nauc_precision_at_1_std\n value: -8.594195451782733\n - type: nauc_precision_at_20_diff1\n value: 63.734308542647355\n - type: nauc_precision_at_20_max\n value: 74.69578825096144\n - type: nauc_precision_at_20_std\n value: 12.627842502659162\n - type: nauc_precision_at_3_diff1\n value: 67.91189666671904\n - type: nauc_precision_at_3_max\n value: 69.64986036783209\n - type: nauc_precision_at_3_std\n value: -5.505669087429055\n - type: nauc_precision_at_5_diff1\n value: 67.01880006360248\n - type: nauc_precision_at_5_max\n value: 70.78916423358686\n - type: nauc_precision_at_5_std\n value: -2.2273742736401045\n - type: nauc_recall_at_1000_diff1\n value: 62.20956180319936\n - type: nauc_recall_at_1000_max\n value: 82.53412670611287\n - type: nauc_recall_at_1000_std\n value: 55.57278124999549\n - type: nauc_recall_at_100_diff1\n value: 62.03792857023208\n - type: nauc_recall_at_100_max\n value: 76.77130713424577\n - type: nauc_recall_at_100_std\n value: 26.67410271995973\n - type: nauc_recall_at_10_diff1\n value: 65.8979805504994\n - type: nauc_recall_at_10_max\n value: 73.41908620140678\n - type: nauc_recall_at_10_std\n value: 5.2181857328318655\n - type: nauc_recall_at_1_diff1\n value: 76.63696578575964\n - type: nauc_recall_at_1_max\n value: 65.08189618178105\n - type: nauc_recall_at_1_std\n value: -8.594195451782733\n - type: nauc_recall_at_20_diff1\n value: 63.734308542647334\n - type: nauc_recall_at_20_max\n value: 74.69578825096123\n - type: nauc_recall_at_20_std\n value: 12.627842502658982\n - type: nauc_recall_at_3_diff1\n value: 67.91189666671897\n - type: nauc_recall_at_3_max\n value: 
69.64986036783203\n - type: nauc_recall_at_3_std\n value: -5.505669087428989\n - type: nauc_recall_at_5_diff1\n value: 67.01880006360243\n - type: nauc_recall_at_5_max\n value: 70.78916423358686\n - type: nauc_recall_at_5_std\n value: -2.227374273640135\n - type: ndcg_at_1\n value: 56.538999999999994\n - type: ndcg_at_10\n value: 68.765\n - type: ndcg_at_100\n value: 71.314\n - type: ndcg_at_1000\n value: 72.038\n - type: ndcg_at_20\n value: 69.828\n - type: ndcg_at_3\n value: 64.937\n - type: ndcg_at_5\n value: 66.956\n - type: precision_at_1\n value: 56.538999999999994\n - type: precision_at_10\n value: 8.113\n - type: precision_at_100\n value: 0.932\n - type: precision_at_1000\n value: 0.099\n - type: precision_at_20\n value: 4.265\n - type: precision_at_3\n value: 23.567\n - type: precision_at_5\n value: 15.115\n - type: recall_at_1\n value: 56.538999999999994\n - type: recall_at_10\n value: 81.135\n - type: recall_at_100\n value: 93.223\n - type: recall_at_1000\n value: 98.896\n - type: recall_at_20\n value: 85.304\n - type: recall_at_3\n value: 70.702\n - type: recall_at_5\n value: 75.576\n task:\n type: Retrieval\n - dataset:\n config: eng-deu\n name: MTEB MLQARetrieval (eng-deu)\n revision: 397ed406c1a7902140303e7faf60fff35b58d285\n split: test\n type: facebook/mlqa\n metrics:\n - type: main_score\n value: 69.298\n - type: map_at_1\n value: 58.553\n - type: map_at_10\n value: 65.769\n - type: map_at_100\n value: 66.298\n - type: map_at_1000\n value: 66.328\n - type: map_at_20\n value: 66.101\n - type: map_at_3\n value: 64.048\n - type: map_at_5\n value: 65.09\n - type: mrr_at_1\n value: 58.564148016840235\n - type: mrr_at_10\n value: 65.7685997066675\n - type: mrr_at_100\n value: 66.29874034432214\n - type: mrr_at_1000\n value: 66.32844979939088\n - type: mrr_at_20\n value: 66.10120513957821\n - type: mrr_at_3\n value: 64.04830489696437\n - type: mrr_at_5\n value: 65.08974074894746\n - type: nauc_map_at_1000_diff1\n value: 76.8409650183994\n - type: 
nauc_map_at_1000_max\n value: 71.86367015521367\n - type: nauc_map_at_1000_std\n value: -14.464881539957256\n - type: nauc_map_at_100_diff1\n value: 76.82536521842064\n - type: nauc_map_at_100_max\n value: 71.86811127965429\n - type: nauc_map_at_100_std\n value: -14.441105539722244\n - type: nauc_map_at_10_diff1\n value: 76.75522453447859\n - type: nauc_map_at_10_max\n value: 71.87677500176706\n - type: nauc_map_at_10_std\n value: -14.741331625103559\n - type: nauc_map_at_1_diff1\n value: 79.64060747740989\n - type: nauc_map_at_1_max\n value: 69.84278563569617\n - type: nauc_map_at_1_std\n value: -15.936904929655832\n - type: nauc_map_at_20_diff1\n value: 76.78894776059715\n - type: nauc_map_at_20_max\n value: 71.89637938044827\n - type: nauc_map_at_20_std\n value: -14.500564106990769\n - type: nauc_map_at_3_diff1\n value: 77.20562577450342\n - type: nauc_map_at_3_max\n value: 71.80578229361525\n - type: nauc_map_at_3_std\n value: -15.344134588512201\n - type: nauc_map_at_5_diff1\n value: 77.00480147367867\n - type: nauc_map_at_5_max\n value: 71.98335924076163\n - type: nauc_map_at_5_std\n value: -15.16537653041026\n - type: nauc_mrr_at_1000_diff1\n value: 76.84165367691193\n - type: nauc_mrr_at_1000_max\n value: 71.8642679499795\n - type: nauc_mrr_at_1000_std\n value: -14.461717954593158\n - type: nauc_mrr_at_100_diff1\n value: 76.8263363557998\n - type: nauc_mrr_at_100_max\n value: 71.86874522368626\n - type: nauc_mrr_at_100_std\n value: -14.437105168707426\n - type: nauc_mrr_at_10_diff1\n value: 76.75522453447859\n - type: nauc_mrr_at_10_max\n value: 71.87677500176706\n - type: nauc_mrr_at_10_std\n value: -14.741331625103559\n - type: nauc_mrr_at_1_diff1\n value: 79.65642669321981\n - type: nauc_mrr_at_1_max\n value: 69.89135358784799\n - type: nauc_mrr_at_1_std\n value: -15.919357002229589\n - type: nauc_mrr_at_20_diff1\n value: 76.78883171270601\n - type: nauc_mrr_at_20_max\n value: 71.89806887245291\n - type: nauc_mrr_at_20_std\n value: -14.497139746907905\n 
- type: nauc_mrr_at_3_diff1\n value: 77.20562577450342\n - type: nauc_mrr_at_3_max\n value: 71.80578229361525\n - type: nauc_mrr_at_3_std\n value: -15.344134588512201\n - type: nauc_mrr_at_5_diff1\n value: 77.00480147367867\n - type: nauc_mrr_at_5_max\n value: 71.98335924076163\n - type: nauc_mrr_at_5_std\n value: -15.16537653041026\n - type: nauc_ndcg_at_1000_diff1\n value: 76.07802417817047\n - type: nauc_ndcg_at_1000_max\n value: 72.31792804426776\n - type: nauc_ndcg_at_1000_std\n value: -13.049160715132244\n - type: nauc_ndcg_at_100_diff1\n value: 75.63343849116544\n - type: nauc_ndcg_at_100_max\n value: 72.48362076101817\n - type: nauc_ndcg_at_100_std\n value: -12.089600993516777\n - type: nauc_ndcg_at_10_diff1\n value: 75.23387929929208\n - type: nauc_ndcg_at_10_max\n value: 72.51436288271807\n - type: nauc_ndcg_at_10_std\n value: -13.624132103038104\n - type: nauc_ndcg_at_1_diff1\n value: 79.65642669321981\n - type: nauc_ndcg_at_1_max\n value: 69.89135358784799\n - type: nauc_ndcg_at_1_std\n value: -15.919357002229589\n - type: nauc_ndcg_at_20_diff1\n value: 75.32926047656296\n - type: nauc_ndcg_at_20_max\n value: 72.61254165918145\n - type: nauc_ndcg_at_20_std\n value: -12.683157599238701\n - type: nauc_ndcg_at_3_diff1\n value: 76.3089337665469\n - type: nauc_ndcg_at_3_max\n value: 72.40014674426054\n - type: nauc_ndcg_at_3_std\n value: -15.08624226353458\n - type: nauc_ndcg_at_5_diff1\n value: 75.88857331641834\n - type: nauc_ndcg_at_5_max\n value: 72.7719386827224\n - type: nauc_ndcg_at_5_std\n value: -14.70546521089236\n - type: nauc_precision_at_1000_diff1\n value: 59.66563879069911\n - type: nauc_precision_at_1000_max\n value: 74.57123562956772\n - type: nauc_precision_at_1000_std\n value: 58.61396866718965\n - type: nauc_precision_at_100_diff1\n value: 62.8695896550042\n - type: nauc_precision_at_100_max\n value: 77.81408796785\n - type: nauc_precision_at_100_std\n value: 23.819735672317826\n - type: nauc_precision_at_10_diff1\n value: 
68.08051625224569\n - type: nauc_precision_at_10_max\n value: 75.14432336036869\n - type: nauc_precision_at_10_std\n value: -7.97602345252735\n - type: nauc_precision_at_1_diff1\n value: 79.65642669321981\n - type: nauc_precision_at_1_max\n value: 69.89135358784799\n - type: nauc_precision_at_1_std\n value: -15.919357002229589\n - type: nauc_precision_at_20_diff1\n value: 66.7168005185165\n - type: nauc_precision_at_20_max\n value: 76.58522761697147\n - type: nauc_precision_at_20_std\n value: -0.17923428317323292\n - type: nauc_precision_at_3_diff1\n value: 73.23394851561207\n - type: nauc_precision_at_3_max\n value: 74.32517846819215\n - type: nauc_precision_at_3_std\n value: -14.142301336188348\n - type: nauc_precision_at_5_diff1\n value: 71.5666882547012\n - type: nauc_precision_at_5_max\n value: 75.71098205440033\n - type: nauc_precision_at_5_std\n value: -12.808362513638052\n - type: nauc_recall_at_1000_diff1\n value: 71.73736112325805\n - type: nauc_recall_at_1000_max\n value: 86.70743436225898\n - type: nauc_recall_at_1000_std\n value: 54.45802578371167\n - type: nauc_recall_at_100_diff1\n value: 64.07053861428128\n - type: nauc_recall_at_100_max\n value: 78.8348308099261\n - type: nauc_recall_at_100_std\n value: 22.72263677785103\n - type: nauc_recall_at_10_diff1\n value: 68.20272901407903\n - type: nauc_recall_at_10_max\n value: 75.16315335381938\n - type: nauc_recall_at_10_std\n value: -8.060716748913386\n - type: nauc_recall_at_1_diff1\n value: 79.64060747740989\n - type: nauc_recall_at_1_max\n value: 69.84278563569617\n - type: nauc_recall_at_1_std\n value: -15.936904929655832\n - type: nauc_recall_at_20_diff1\n value: 66.88206981973654\n - type: nauc_recall_at_20_max\n value: 76.54824917595687\n - type: nauc_recall_at_20_std\n value: -0.40294589316962287\n - type: nauc_recall_at_3_diff1\n value: 73.33076087258938\n - type: nauc_recall_at_3_max\n value: 74.33763112508771\n - type: nauc_recall_at_3_std\n value: -14.213355414905399\n - type: 
nauc_recall_at_5_diff1\n value: 71.67487623469464\n - type: nauc_recall_at_5_max\n value: 75.72770292516316\n - type: nauc_recall_at_5_std\n value: -12.887572274644818\n - type: ndcg_at_1\n value: 58.56400000000001\n - type: ndcg_at_10\n value: 69.298\n - type: ndcg_at_100\n value: 71.95899999999999\n - type: ndcg_at_1000\n value: 72.735\n - type: ndcg_at_20\n value: 70.50699999999999\n - type: ndcg_at_3\n value: 65.81700000000001\n - type: ndcg_at_5\n value: 67.681\n - type: precision_at_1\n value: 58.56400000000001\n - type: precision_at_10\n value: 8.039\n - type: precision_at_100\n value: 0.931\n - type: precision_at_1000\n value: 0.099\n - type: precision_at_20\n value: 4.259\n - type: precision_at_3\n value: 23.65\n - type: precision_at_5\n value: 15.09\n - type: recall_at_1\n value: 58.553\n - type: recall_at_10\n value: 80.368\n - type: recall_at_100\n value: 93.013\n - type: recall_at_1000\n value: 99.092\n - type: recall_at_20\n value: 85.143\n - type: recall_at_3\n value: 70.928\n - type: recall_at_5\n value: 75.42699999999999\n task:\n type: Retrieval\n - dataset:\n config: eng-spa\n name: MTEB MLQARetrieval (eng-spa)\n revision: 397ed406c1a7902140303e7faf60fff35b58d285\n split: test\n type: facebook/mlqa\n metrics:\n - type: main_score\n value: 66.374\n - type: map_at_1\n value: 55.494\n - type: map_at_10\n value: 62.763999999999996\n - type: map_at_100\n value: 63.33\n - type: map_at_1000\n value: 63.36000000000001\n - type: map_at_20\n value: 63.104000000000006\n - type: map_at_3\n value: 61.065000000000005\n - type: map_at_5\n value: 62.053000000000004\n - type: mrr_at_1\n value: 55.49419158255571\n - type: mrr_at_10\n value: 62.765195140457095\n - type: mrr_at_100\n value: 63.33083349354529\n - type: mrr_at_1000\n value: 63.3611897014839\n - type: mrr_at_20\n value: 63.10543590095977\n - type: mrr_at_3\n value: 61.06455913159412\n - type: mrr_at_5\n value: 62.052942296705474\n - type: nauc_map_at_1000_diff1\n value: 75.04200018088618\n - type: 
nauc_map_at_1000_max\n value: 70.49937782771909\n - type: nauc_map_at_1000_std\n value: -5.257206317083184\n - type: nauc_map_at_100_diff1\n value: 75.02786834256312\n - type: nauc_map_at_100_max\n value: 70.5016476500189\n - type: nauc_map_at_100_std\n value: -5.228770832077681\n - type: nauc_map_at_10_diff1\n value: 74.9626552701647\n - type: nauc_map_at_10_max\n value: 70.56253732243214\n - type: nauc_map_at_10_std\n value: -5.359037281768563\n - type: nauc_map_at_1_diff1\n value: 78.46858307815857\n - type: nauc_map_at_1_max\n value: 69.03908373759435\n - type: nauc_map_at_1_std\n value: -7.479412070736642\n - type: nauc_map_at_20_diff1\n value: 74.98121458084796\n - type: nauc_map_at_20_max\n value: 70.51885366822565\n - type: nauc_map_at_20_std\n value: -5.286051287133815\n - type: nauc_map_at_3_diff1\n value: 75.36078454383373\n - type: nauc_map_at_3_max\n value: 70.34997144546014\n - type: nauc_map_at_3_std\n value: -6.663517224039184\n - type: nauc_map_at_5_diff1\n value: 75.0274512828238\n - type: nauc_map_at_5_max\n value: 70.45292551591874\n - type: nauc_map_at_5_std\n value: -6.029224488640147\n - type: nauc_mrr_at_1000_diff1\n value: 75.04018768469983\n - type: nauc_mrr_at_1000_max\n value: 70.49855509132635\n - type: nauc_mrr_at_1000_std\n value: -5.258929961409948\n - type: nauc_mrr_at_100_diff1\n value: 75.02605732810112\n - type: nauc_mrr_at_100_max\n value: 70.50082584929103\n - type: nauc_mrr_at_100_std\n value: -5.2304917988542154\n - type: nauc_mrr_at_10_diff1\n value: 74.96079080525713\n - type: nauc_mrr_at_10_max\n value: 70.56167294920391\n - type: nauc_mrr_at_10_std\n value: -5.360650630655072\n - type: nauc_mrr_at_1_diff1\n value: 78.46858307815857\n - type: nauc_mrr_at_1_max\n value: 69.03908373759435\n - type: nauc_mrr_at_1_std\n value: -7.479412070736642\n - type: nauc_mrr_at_20_diff1\n value: 74.97939804960517\n - type: nauc_mrr_at_20_max\n value: 70.51804078965411\n - type: nauc_mrr_at_20_std\n value: -5.287681954889177\n - type: 
nauc_mrr_at_3_diff1\n value: 75.36078454383373\n - type: nauc_mrr_at_3_max\n value: 70.34997144546014\n - type: nauc_mrr_at_3_std\n value: -6.663517224039184\n - type: nauc_mrr_at_5_diff1\n value: 75.0274512828238\n - type: nauc_mrr_at_5_max\n value: 70.45292551591874\n - type: nauc_mrr_at_5_std\n value: -6.029224488640147\n - type: nauc_ndcg_at_1000_diff1\n value: 74.22106834748942\n - type: nauc_ndcg_at_1000_max\n value: 70.93625922934912\n - type: nauc_ndcg_at_1000_std\n value: -3.4878399005946017\n - type: nauc_ndcg_at_100_diff1\n value: 73.74068883646733\n - type: nauc_ndcg_at_100_max\n value: 71.02357018347472\n - type: nauc_ndcg_at_100_std\n value: -2.462293184201324\n - type: nauc_ndcg_at_10_diff1\n value: 73.40967965536565\n - type: nauc_ndcg_at_10_max\n value: 71.29379828672067\n - type: nauc_ndcg_at_10_std\n value: -3.295547756383108\n - type: nauc_ndcg_at_1_diff1\n value: 78.46858307815857\n - type: nauc_ndcg_at_1_max\n value: 69.03908373759435\n - type: nauc_ndcg_at_1_std\n value: -7.479412070736642\n - type: nauc_ndcg_at_20_diff1\n value: 73.45790057693699\n - type: nauc_ndcg_at_20_max\n value: 71.16598432419126\n - type: nauc_ndcg_at_20_std\n value: -2.962877157646097\n - type: nauc_ndcg_at_3_diff1\n value: 74.30696173964847\n - type: nauc_ndcg_at_3_max\n value: 70.79878978459556\n - type: nauc_ndcg_at_3_std\n value: -6.297286578628299\n - type: nauc_ndcg_at_5_diff1\n value: 73.65858211199816\n - type: nauc_ndcg_at_5_max\n value: 71.01122417463776\n - type: nauc_ndcg_at_5_std\n value: -5.075990882646765\n - type: nauc_precision_at_1000_diff1\n value: 68.71065091972568\n - type: nauc_precision_at_1000_max\n value: 81.38173585624777\n - type: nauc_precision_at_1000_std\n value: 58.035497889797895\n - type: nauc_precision_at_100_diff1\n value: 61.93634256957017\n - type: nauc_precision_at_100_max\n value: 74.84191770203093\n - type: nauc_precision_at_100_std\n value: 31.3325983123831\n - type: nauc_precision_at_10_diff1\n value: 66.68247010944937\n - 
type: nauc_precision_at_10_max\n value: 74.48773524654571\n - type: nauc_precision_at_10_std\n value: 6.560421880785153\n - type: nauc_precision_at_1_diff1\n value: 78.46858307815857\n - type: nauc_precision_at_1_max\n value: 69.03908373759435\n - type: nauc_precision_at_1_std\n value: -7.479412070736642\n - type: nauc_precision_at_20_diff1\n value: 65.51592872758067\n - type: nauc_precision_at_20_max\n value: 74.50684066823096\n - type: nauc_precision_at_20_std\n value: 10.830479877698208\n - type: nauc_precision_at_3_diff1\n value: 70.89587884861588\n - type: nauc_precision_at_3_max\n value: 72.25310558370424\n - type: nauc_precision_at_3_std\n value: -5.0796100900749765\n - type: nauc_precision_at_5_diff1\n value: 68.71885719845497\n - type: nauc_precision_at_5_max\n value: 73.02601751485672\n - type: nauc_precision_at_5_std\n value: -1.4382681421626857\n - type: nauc_recall_at_1000_diff1\n value: 71.95510299834734\n - type: nauc_recall_at_1000_max\n value: 84.03647166092985\n - type: nauc_recall_at_1000_std\n value: 56.87490604776847\n - type: nauc_recall_at_100_diff1\n value: 62.446624924715955\n - type: nauc_recall_at_100_max\n value: 75.25666892464507\n - type: nauc_recall_at_100_std\n value: 31.068789794554686\n - type: nauc_recall_at_10_diff1\n value: 66.70676336328988\n - type: nauc_recall_at_10_max\n value: 74.4963699656397\n - type: nauc_recall_at_10_std\n value: 6.57498399706916\n - type: nauc_recall_at_1_diff1\n value: 78.46858307815857\n - type: nauc_recall_at_1_max\n value: 69.03908373759435\n - type: nauc_recall_at_1_std\n value: -7.479412070736642\n - type: nauc_recall_at_20_diff1\n value: 65.54082767974772\n - type: nauc_recall_at_20_max\n value: 74.5111529838772\n - type: nauc_recall_at_20_std\n value: 10.84574829707354\n - type: nauc_recall_at_3_diff1\n value: 70.89587884861584\n - type: nauc_recall_at_3_max\n value: 72.25310558370421\n - type: nauc_recall_at_3_std\n value: -5.07961009007491\n - type: nauc_recall_at_5_diff1\n value: 
68.71885719845501\n - type: nauc_recall_at_5_max\n value: 73.02601751485666\n - type: nauc_recall_at_5_std\n value: -1.4382681421626995\n - type: ndcg_at_1\n value: 55.494\n - type: ndcg_at_10\n value: 66.374\n - type: ndcg_at_100\n value: 69.254\n - type: ndcg_at_1000\n value: 70.136\n - type: ndcg_at_20\n value: 67.599\n - type: ndcg_at_3\n value: 62.863\n - type: ndcg_at_5\n value: 64.644\n - type: precision_at_1\n value: 55.494\n - type: precision_at_10\n value: 7.776\n - type: precision_at_100\n value: 0.9159999999999999\n - type: precision_at_1000\n value: 0.099\n - type: precision_at_20\n value: 4.1290000000000004\n - type: precision_at_3\n value: 22.688\n - type: precision_at_5\n value: 14.477\n - type: recall_at_1\n value: 55.494\n - type: recall_at_10\n value: 77.747\n - type: recall_at_100\n value: 91.535\n - type: recall_at_1000\n value: 98.619\n - type: recall_at_20\n value: 82.565\n - type: recall_at_3\n value: 68.063\n - type: recall_at_5\n value: 72.386\n task:\n type: Retrieval\n - dataset:\n config: eng-eng\n name: MTEB MLQARetrieval (eng-eng)\n revision: 397ed406c1a7902140303e7faf60fff35b58d285\n split: test\n type: facebook/mlqa\n metrics:\n - type: main_score\n value: 64.723\n - type: map_at_1\n value: 54.308\n - type: map_at_10\n value: 61.26200000000001\n - type: map_at_100\n value: 61.82299999999999\n - type: map_at_1000\n value: 61.856\n - type: map_at_20\n value: 61.575\n - type: map_at_3\n value: 59.565\n - type: map_at_5\n value: 60.561\n - type: mrr_at_1\n value: 54.31704368848212\n - type: mrr_at_10\n value: 61.26520216098834\n - type: mrr_at_100\n value: 61.82588321127103\n - type: mrr_at_1000\n value: 61.859333030574334\n - type: mrr_at_20\n value: 61.57780339921337\n - type: mrr_at_3\n value: 59.569446842801646\n - type: mrr_at_5\n value: 60.56323029989004\n - type: nauc_map_at_1000_diff1\n value: 74.21413722468635\n - type: nauc_map_at_1000_max\n value: 70.41741227882316\n - type: nauc_map_at_1000_std\n value: -2.5438707209848506\n 
- type: nauc_map_at_100_diff1\n value: 74.19812315947975\n - type: nauc_map_at_100_max\n value: 70.41589146728445\n - type: nauc_map_at_100_std\n value: -2.5336117059429553\n - type: nauc_map_at_10_diff1\n value: 74.21810561152937\n - type: nauc_map_at_10_max\n value: 70.48816115200171\n - type: nauc_map_at_10_std\n value: -2.7443834681406734\n - type: nauc_map_at_1_diff1\n value: 77.69378738778958\n - type: nauc_map_at_1_max\n value: 68.64652310701173\n - type: nauc_map_at_1_std\n value: -4.667071946448379\n - type: nauc_map_at_20_diff1\n value: 74.16105697562438\n - type: nauc_map_at_20_max\n value: 70.42491994631179\n - type: nauc_map_at_20_std\n value: -2.6070416022440472\n - type: nauc_map_at_3_diff1\n value: 74.60449392878863\n - type: nauc_map_at_3_max\n value: 70.39888609914269\n - type: nauc_map_at_3_std\n value: -3.5401151125723986\n - type: nauc_map_at_5_diff1\n value: 74.2423420992663\n - type: nauc_map_at_5_max\n value: 70.36574501826757\n - type: nauc_map_at_5_std\n value: -3.2707393116898964\n - type: nauc_mrr_at_1000_diff1\n value: 74.21029843731323\n - type: nauc_mrr_at_1000_max\n value: 70.43020492688913\n - type: nauc_mrr_at_1000_std\n value: -2.526895582202081\n - type: nauc_mrr_at_100_diff1\n value: 74.19440960479243\n - type: nauc_mrr_at_100_max\n value: 70.4288998824232\n - type: nauc_mrr_at_100_std\n value: -2.5160929945118107\n - type: nauc_mrr_at_10_diff1\n value: 74.2141357266166\n - type: nauc_mrr_at_10_max\n value: 70.5005683347807\n - type: nauc_mrr_at_10_std\n value: -2.727154557882168\n - type: nauc_mrr_at_1_diff1\n value: 77.69891248239793\n - type: nauc_mrr_at_1_max\n value: 68.68255231164922\n - type: nauc_mrr_at_1_std\n value: -4.630226727154317\n - type: nauc_mrr_at_20_diff1\n value: 74.15705434409723\n - type: nauc_mrr_at_20_max\n value: 70.43741835972747\n - type: nauc_mrr_at_20_std\n value: -2.5896756472464495\n - type: nauc_mrr_at_3_diff1\n value: 74.5981844349412\n - type: nauc_mrr_at_3_max\n value: 70.41834937080564\n - 
type: nauc_mrr_at_3_std\n value: -3.5161656408031163\n - type: nauc_mrr_at_5_diff1\n value: 74.23847535424844\n - type: nauc_mrr_at_5_max\n value: 70.37763810013656\n - type: nauc_mrr_at_5_std\n value: -3.2560955164581733\n - type: nauc_ndcg_at_1000_diff1\n value: 73.20994496725493\n - type: nauc_ndcg_at_1000_max\n value: 70.8903016277125\n - type: nauc_ndcg_at_1000_std\n value: -0.625772298462309\n - type: nauc_ndcg_at_100_diff1\n value: 72.6847141682645\n - type: nauc_ndcg_at_100_max\n value: 70.86564422034162\n - type: nauc_ndcg_at_100_std\n value: -0.07195786766326141\n - type: nauc_ndcg_at_10_diff1\n value: 72.78806493754281\n - type: nauc_ndcg_at_10_max\n value: 71.21957067926769\n - type: nauc_ndcg_at_10_std\n value: -1.2760418313382227\n - type: nauc_ndcg_at_1_diff1\n value: 77.69891248239793\n - type: nauc_ndcg_at_1_max\n value: 68.68255231164922\n - type: nauc_ndcg_at_1_std\n value: -4.630226727154317\n - type: nauc_ndcg_at_20_diff1\n value: 72.52082440882546\n - type: nauc_ndcg_at_20_max\n value: 70.98185004796734\n - type: nauc_ndcg_at_20_std\n value: -0.6908280874815464\n - type: nauc_ndcg_at_3_diff1\n value: 73.59870660843939\n - type: nauc_ndcg_at_3_max\n value: 70.94391957288654\n - type: nauc_ndcg_at_3_std\n value: -3.147723179140428\n - type: nauc_ndcg_at_5_diff1\n value: 72.90122868193457\n - type: nauc_ndcg_at_5_max\n value: 70.89376368965165\n - type: nauc_ndcg_at_5_std\n value: -2.6451807385626744\n - type: nauc_precision_at_1000_diff1\n value: 58.14737201864067\n - type: nauc_precision_at_1000_max\n value: 78.79011251144826\n - type: nauc_precision_at_1000_std\n value: 59.98985420476577\n - type: nauc_precision_at_100_diff1\n value: 59.21069121644552\n - type: nauc_precision_at_100_max\n value: 73.00557835912306\n - type: nauc_precision_at_100_std\n value: 26.85027406282173\n - type: nauc_precision_at_10_diff1\n value: 66.8760831023675\n - type: nauc_precision_at_10_max\n value: 74.21167950452596\n - type: nauc_precision_at_10_std\n value: 
5.453652499335947\n - type: nauc_precision_at_1_diff1\n value: 77.69891248239793\n - type: nauc_precision_at_1_max\n value: 68.68255231164922\n - type: nauc_precision_at_1_std\n value: -4.630226727154317\n - type: nauc_precision_at_20_diff1\n value: 64.3118559132602\n - type: nauc_precision_at_20_max\n value: 73.33078184673825\n - type: nauc_precision_at_20_std\n value: 9.993299523049402\n - type: nauc_precision_at_3_diff1\n value: 70.38667185155593\n - type: nauc_precision_at_3_max\n value: 72.66495006030951\n - type: nauc_precision_at_3_std\n value: -1.8532839591326276\n - type: nauc_precision_at_5_diff1\n value: 68.12161337583686\n - type: nauc_precision_at_5_max\n value: 72.65644960375046\n - type: nauc_precision_at_5_std\n value: -0.33317164167012875\n - type: nauc_recall_at_1000_diff1\n value: 61.63204394739985\n - type: nauc_recall_at_1000_max\n value: 81.77241537319897\n - type: nauc_recall_at_1000_std\n value: 58.44841544062308\n - type: nauc_recall_at_100_diff1\n value: 59.72072697224705\n - type: nauc_recall_at_100_max\n value: 73.28519507061553\n - type: nauc_recall_at_100_std\n value: 26.27318390763456\n - type: nauc_recall_at_10_diff1\n value: 66.9757135465418\n - type: nauc_recall_at_10_max\n value: 74.21919493374149\n - type: nauc_recall_at_10_std\n value: 5.323369605377166\n - type: nauc_recall_at_1_diff1\n value: 77.69378738778958\n - type: nauc_recall_at_1_max\n value: 68.64652310701173\n - type: nauc_recall_at_1_std\n value: -4.667071946448379\n - type: nauc_recall_at_20_diff1\n value: 64.42290081731899\n - type: nauc_recall_at_20_max\n value: 73.3358289439033\n - type: nauc_recall_at_20_std\n value: 9.846598361586073\n - type: nauc_recall_at_3_diff1\n value: 70.41211290964785\n - type: nauc_recall_at_3_max\n value: 72.64451776775402\n - type: nauc_recall_at_3_std\n value: -1.916280959835826\n - type: nauc_recall_at_5_diff1\n value: 68.20695272727916\n - type: nauc_recall_at_5_max\n value: 72.66404224006101\n - type: nauc_recall_at_5_std\n 
value: -0.431125323007886\n - type: ndcg_at_1\n value: 54.31700000000001\n - type: ndcg_at_10\n value: 64.723\n - type: ndcg_at_100\n value: 67.648\n - type: ndcg_at_1000\n value: 68.619\n - type: ndcg_at_20\n value: 65.85499999999999\n - type: ndcg_at_3\n value: 61.244\n - type: ndcg_at_5\n value: 63.038000000000004\n - type: precision_at_1\n value: 54.31700000000001\n - type: precision_at_10\n value: 7.564\n - type: precision_at_100\n value: 0.898\n - type: precision_at_1000\n value: 0.098\n - type: precision_at_20\n value: 4.005\n - type: precision_at_3\n value: 22.034000000000002\n - type: precision_at_5\n value: 14.093\n - type: recall_at_1\n value: 54.308\n - type: recall_at_10\n value: 75.622\n - type: recall_at_100\n value: 89.744\n - type: recall_at_1000\n value: 97.539\n - type: recall_at_20\n value: 80.085\n - type: recall_at_3\n value: 66.09\n - type: recall_at_5\n value: 70.446\n task:\n type: Retrieval\n - dataset:\n config: de\n name: MTEB MLSUMClusteringP2P (de)\n revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7\n split: test\n type: reciTAL/mlsum\n metrics:\n - type: main_score\n value: 41.267647761702854\n - type: v_measure\n value: 41.267647761702854\n - type: v_measure_std\n value: 10.93390895077248\n task:\n type: Clustering\n - dataset:\n config: fr\n name: MTEB MLSUMClusteringP2P (fr)\n revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7\n split: test\n type: reciTAL/mlsum\n metrics:\n - type: main_score\n value: 44.68714862333979\n - type: v_measure\n value: 44.68714862333979\n - type: v_measure_std\n value: 1.811036989797814\n task:\n type: Clustering\n - dataset:\n config: ru\n name: MTEB MLSUMClusteringP2P (ru)\n revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7\n split: test\n type: reciTAL/mlsum\n metrics:\n - type: main_score\n value: 41.92518785753813\n - type: v_measure\n value: 41.92518785753813\n - type: v_measure_std\n value: 5.9356661900220775\n task:\n type: Clustering\n - dataset:\n config: es\n name: MTEB MLSUMClusteringP2P 
(es)\n revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7\n split: test\n type: reciTAL/mlsum\n metrics:\n - type: main_score\n value: 48.69875719812033\n - type: v_measure\n value: 48.69875719812033\n - type: v_measure_std\n value: 1.204253881950113\n task:\n type: Clustering\n - dataset:\n config: de\n name: MTEB MLSUMClusteringS2S (de)\n revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7\n split: test\n type: reciTAL/mlsum\n metrics:\n - type: main_score\n value: 40.07927325071353\n - type: v_measure\n value: 40.07927325071353\n - type: v_measure_std\n value: 9.296680835266145\n task:\n type: Clustering\n - dataset:\n config: fr\n name: MTEB MLSUMClusteringS2S (fr)\n revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7\n split: test\n type: reciTAL/mlsum\n metrics:\n - type: main_score\n value: 44.88484854069901\n - type: v_measure\n value: 44.88484854069901\n - type: v_measure_std\n value: 2.3704247819781843\n task:\n type: Clustering\n - dataset:\n config: ru\n name: MTEB MLSUMClusteringS2S (ru)\n revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7\n split: test\n type: reciTAL/mlsum\n metrics:\n - type: main_score\n value: 43.97657450929179\n - type: v_measure\n value: 43.97657450929179\n - type: v_measure_std\n value: 6.087547931333613\n task:\n type: Clustering\n - dataset:\n config: es\n name: MTEB MLSUMClusteringS2S (es)\n revision: b5d54f8f3b61ae17845046286940f03c6bc79bc7\n split: test\n type: reciTAL/mlsum\n metrics:\n - type: main_score\n value: 48.41108671948728\n - type: v_measure\n value: 48.41108671948728\n - type: v_measure_std\n value: 1.3848320630151243\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB MMarcoReranking (default)\n revision: 8e0c766dbe9e16e1d221116a3f36795fbade07f6\n split: dev\n type: C-MTEB/Mmarco-reranking\n metrics:\n - type: map\n value: 21.050447576170395\n - type: mrr\n value: 20.201984126984126\n - type: main_score\n value: 21.050447576170395\n task:\n type: Reranking\n - dataset:\n config: default\n 
name: MTEB MMarcoRetrieval (default)\n revision: 539bbde593d947e2a124ba72651aafc09eb33fc2\n split: dev\n type: C-MTEB/MMarcoRetrieval\n metrics:\n - type: main_score\n value: 79.687\n - type: map_at_1\n value: 66.872\n - type: map_at_10\n value: 75.949\n - type: map_at_100\n value: 76.25\n - type: map_at_1000\n value: 76.259\n - type: map_at_20\n value: 76.145\n - type: map_at_3\n value: 74.01299999999999\n - type: map_at_5\n value: 75.232\n - type: mrr_at_1\n value: 69.18338108882521\n - type: mrr_at_10\n value: 76.5424227952881\n - type: mrr_at_100\n value: 76.8019342792628\n - type: mrr_at_1000\n value: 76.81002278342808\n - type: mrr_at_20\n value: 76.7115234815896\n - type: mrr_at_3\n value: 74.83046800382044\n - type: mrr_at_5\n value: 75.88490926456515\n - type: nauc_map_at_1000_diff1\n value: 78.06933310424179\n - type: nauc_map_at_1000_max\n value: 49.392948209665896\n - type: nauc_map_at_1000_std\n value: -15.126109322591166\n - type: nauc_map_at_100_diff1\n value: 78.06612779298378\n - type: nauc_map_at_100_max\n value: 49.40761618630397\n - type: nauc_map_at_100_std\n value: -15.099282408159349\n - type: nauc_map_at_10_diff1\n value: 77.94565685470538\n - type: nauc_map_at_10_max\n value: 49.50559610363201\n - type: nauc_map_at_10_std\n value: -15.182130695916355\n - type: nauc_map_at_1_diff1\n value: 79.84814509858211\n - type: nauc_map_at_1_max\n value: 40.78978466656547\n - type: nauc_map_at_1_std\n value: -19.96189264026715\n - type: nauc_map_at_20_diff1\n value: 78.03597839981245\n - type: nauc_map_at_20_max\n value: 49.49477427223376\n - type: nauc_map_at_20_std\n value: -15.084990000838378\n - type: nauc_map_at_3_diff1\n value: 78.0637014655507\n - type: nauc_map_at_3_max\n value: 48.63214001973341\n - type: nauc_map_at_3_std\n value: -17.093950563306596\n - type: nauc_map_at_5_diff1\n value: 77.94068229240348\n - type: nauc_map_at_5_max\n value: 49.38930719689204\n - type: nauc_map_at_5_std\n value: -15.9919454201954\n - type: 
nauc_mrr_at_1000_diff1\n value: 78.34582398092816\n - type: nauc_mrr_at_1000_max\n value: 49.623566992784156\n - type: nauc_mrr_at_1000_std\n value: -14.381347765493265\n - type: nauc_mrr_at_100_diff1\n value: 78.3429966714221\n - type: nauc_mrr_at_100_max\n value: 49.63684922240546\n - type: nauc_mrr_at_100_std\n value: -14.354914066301236\n - type: nauc_mrr_at_10_diff1\n value: 78.2208070219624\n - type: nauc_mrr_at_10_max\n value: 49.77720536573364\n - type: nauc_mrr_at_10_std\n value: -14.316233764741812\n - type: nauc_mrr_at_1_diff1\n value: 80.22305496572142\n - type: nauc_mrr_at_1_max\n value: 44.30231210192536\n - type: nauc_mrr_at_1_std\n value: -18.942549914934492\n - type: nauc_mrr_at_20_diff1\n value: 78.31006724240147\n - type: nauc_mrr_at_20_max\n value: 49.72338465276142\n - type: nauc_mrr_at_20_std\n value: -14.30722621948953\n - type: nauc_mrr_at_3_diff1\n value: 78.39832634634523\n - type: nauc_mrr_at_3_max\n value: 49.24985961036677\n - type: nauc_mrr_at_3_std\n value: -15.966286866763191\n - type: nauc_mrr_at_5_diff1\n value: 78.2406507247798\n - type: nauc_mrr_at_5_max\n value: 49.71276359754787\n - type: nauc_mrr_at_5_std\n value: -14.979526226149698\n - type: nauc_ndcg_at_1000_diff1\n value: 77.74892471071016\n - type: nauc_ndcg_at_1000_max\n value: 51.11543344053061\n - type: nauc_ndcg_at_1000_std\n value: -12.208878737005096\n - type: nauc_ndcg_at_100_diff1\n value: 77.67462502211228\n - type: nauc_ndcg_at_100_max\n value: 51.593977338939034\n - type: nauc_ndcg_at_100_std\n value: -11.312126179513802\n - type: nauc_ndcg_at_10_diff1\n value: 77.0571291760012\n - type: nauc_ndcg_at_10_max\n value: 52.35435572808972\n - type: nauc_ndcg_at_10_std\n value: -11.33242546164059\n - type: nauc_ndcg_at_1_diff1\n value: 80.22305496572142\n - type: nauc_ndcg_at_1_max\n value: 44.30231210192536\n - type: nauc_ndcg_at_1_std\n value: -18.942549914934492\n - type: nauc_ndcg_at_20_diff1\n value: 77.4141216117471\n - type: nauc_ndcg_at_20_max\n value: 
52.340600871365375\n - type: nauc_ndcg_at_20_std\n value: -10.989010161550912\n - type: nauc_ndcg_at_3_diff1\n value: 77.43971989259062\n - type: nauc_ndcg_at_3_max\n value: 50.59251358320663\n - type: nauc_ndcg_at_3_std\n value: -15.59337960636058\n - type: nauc_ndcg_at_5_diff1\n value: 77.12174287031847\n - type: nauc_ndcg_at_5_max\n value: 51.97108510288907\n - type: nauc_ndcg_at_5_std\n value: -13.474902612427167\n - type: nauc_precision_at_1000_diff1\n value: -19.36793534929367\n - type: nauc_precision_at_1000_max\n value: 11.803383262344036\n - type: nauc_precision_at_1000_std\n value: 24.304436015177046\n - type: nauc_precision_at_100_diff1\n value: -6.273790806909921\n - type: nauc_precision_at_100_max\n value: 23.372606271300747\n - type: nauc_precision_at_100_std\n value: 29.085768971612342\n - type: nauc_precision_at_10_diff1\n value: 21.67045907336595\n - type: nauc_precision_at_10_max\n value: 41.68948432407223\n - type: nauc_precision_at_10_std\n value: 17.837055074458092\n - type: nauc_precision_at_1_diff1\n value: 80.22305496572142\n - type: nauc_precision_at_1_max\n value: 44.30231210192536\n - type: nauc_precision_at_1_std\n value: -18.942549914934492\n - type: nauc_precision_at_20_diff1\n value: 12.577671896684803\n - type: nauc_precision_at_20_max\n value: 37.44944702246691\n - type: nauc_precision_at_20_std\n value: 23.635897665206087\n - type: nauc_precision_at_3_diff1\n value: 47.165335112814056\n - type: nauc_precision_at_3_max\n value: 47.0458691263379\n - type: nauc_precision_at_3_std\n value: -3.3181861146890217\n - type: nauc_precision_at_5_diff1\n value: 35.406205343514806\n - type: nauc_precision_at_5_max\n value: 45.56549449285401\n - type: nauc_precision_at_5_std\n value: 5.612378074562386\n - type: nauc_recall_at_1000_diff1\n value: 72.32762520815842\n - type: nauc_recall_at_1000_max\n value: 85.64979256307343\n - type: nauc_recall_at_1000_std\n value: 73.61925297037476\n - type: nauc_recall_at_100_diff1\n value: 72.31946328709962\n 
- type: nauc_recall_at_100_max\n value: 83.76576070068353\n - type: nauc_recall_at_100_std\n value: 57.39376538662535\n - type: nauc_recall_at_10_diff1\n value: 69.51307788072499\n - type: nauc_recall_at_10_max\n value: 69.60124733654142\n - type: nauc_recall_at_10_std\n value: 13.483540424716892\n - type: nauc_recall_at_1_diff1\n value: 79.84814509858211\n - type: nauc_recall_at_1_max\n value: 40.78978466656547\n - type: nauc_recall_at_1_std\n value: -19.96189264026715\n - type: nauc_recall_at_20_diff1\n value: 70.92168324710599\n - type: nauc_recall_at_20_max\n value: 76.09106252420084\n - type: nauc_recall_at_20_std\n value: 25.406842300761447\n - type: nauc_recall_at_3_diff1\n value: 74.1212680517145\n - type: nauc_recall_at_3_max\n value: 56.24921832879403\n - type: nauc_recall_at_3_std\n value: -11.55542913578436\n - type: nauc_recall_at_5_diff1\n value: 72.31262959872993\n - type: nauc_recall_at_5_max\n value: 62.761214896697915\n - type: nauc_recall_at_5_std\n value: -3.280167584070396\n - type: ndcg_at_1\n value: 69.18299999999999\n - type: ndcg_at_10\n value: 79.687\n - type: ndcg_at_100\n value: 81.062\n - type: ndcg_at_1000\n value: 81.312\n - type: ndcg_at_20\n value: 80.34599999999999\n - type: ndcg_at_3\n value: 75.98700000000001\n - type: ndcg_at_5\n value: 78.039\n - type: precision_at_1\n value: 69.18299999999999\n - type: precision_at_10\n value: 9.636\n - type: precision_at_100\n value: 1.0330000000000001\n - type: precision_at_1000\n value: 0.105\n - type: precision_at_20\n value: 4.958\n - type: precision_at_3\n value: 28.515\n - type: precision_at_5\n value: 18.201\n - type: recall_at_1\n value: 66.872\n - type: recall_at_10\n value: 90.688\n - type: recall_at_100\n value: 96.99\n - type: recall_at_1000\n value: 98.958\n - type: recall_at_20\n value: 93.21199999999999\n - type: recall_at_3\n value: 80.84599999999999\n - type: recall_at_5\n value: 85.732\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB MSMARCO (default)\n 
revision: c5a29a104738b98a9e76336939199e264163d4a0\n split: dev\n type: mteb/msmarco\n metrics:\n - type: map_at_1\n value: 21.861\n - type: map_at_10\n value: 34.008\n - type: map_at_100\n value: 35.174\n - type: map_at_1000\n value: 35.224\n - type: map_at_20\n value: 34.705999999999996\n - type: map_at_3\n value: 30.209000000000003\n - type: map_at_5\n value: 32.351\n - type: mrr_at_1\n value: 22.493\n - type: mrr_at_10\n value: 34.583999999999996\n - type: mrr_at_100\n value: 35.691\n - type: mrr_at_1000\n value: 35.736000000000004\n - type: mrr_at_20\n value: 35.257\n - type: mrr_at_3\n value: 30.85\n - type: mrr_at_5\n value: 32.962\n - type: ndcg_at_1\n value: 22.493\n - type: ndcg_at_10\n value: 40.815\n - type: ndcg_at_100\n value: 46.483999999999995\n - type: ndcg_at_1000\n value: 47.73\n - type: ndcg_at_20\n value: 43.302\n - type: ndcg_at_3\n value: 33.056000000000004\n - type: ndcg_at_5\n value: 36.879\n - type: precision_at_1\n value: 22.493\n - type: precision_at_10\n value: 6.465999999999999\n - type: precision_at_100\n value: 0.932\n - type: precision_at_1000\n value: 0.104\n - type: precision_at_20\n value: 3.752\n - type: precision_at_3\n value: 14.069\n - type: precision_at_5\n value: 10.384\n - type: recall_at_1\n value: 21.861\n - type: recall_at_10\n value: 61.781\n - type: recall_at_100\n value: 88.095\n - type: recall_at_1000\n value: 97.625\n - type: recall_at_20\n value: 71.44500000000001\n - type: recall_at_3\n value: 40.653\n - type: recall_at_5\n value: 49.841\n - type: main_score\n value: 40.815\n task:\n type: Retrieval\n - dataset:\n config: en\n name: MTEB MTOPDomainClassification (en)\n revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf\n split: test\n type: mteb/mtop_domain\n metrics:\n - type: accuracy\n value: 97.4874601003192\n - type: f1\n value: 97.19067544931094\n - type: f1_weighted\n value: 97.49331776181019\n - type: main_score\n value: 97.4874601003192\n task:\n type: Classification\n - dataset:\n config: de\n name: 
MTEB MTOPDomainClassification (de)\n revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf\n split: test\n type: mteb/mtop_domain\n metrics:\n - type: accuracy\n value: 96.89489997182305\n - type: f1\n value: 96.51138586512977\n - type: f1_weighted\n value: 96.89723065967186\n - type: main_score\n value: 96.89489997182305\n task:\n type: Classification\n - dataset:\n config: es\n name: MTEB MTOPDomainClassification (es)\n revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf\n split: test\n type: mteb/mtop_domain\n metrics:\n - type: accuracy\n value: 97.17144763175452\n - type: f1\n value: 96.81785681878274\n - type: f1_weighted\n value: 97.1778974586874\n - type: main_score\n value: 97.17144763175452\n task:\n type: Classification\n - dataset:\n config: fr\n name: MTEB MTOPDomainClassification (fr)\n revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf\n split: test\n type: mteb/mtop_domain\n metrics:\n - type: accuracy\n value: 96.30128405887879\n - type: f1\n value: 95.94555923088487\n - type: f1_weighted\n value: 96.30399416794926\n - type: main_score\n value: 96.30128405887879\n task:\n type: Classification\n - dataset:\n config: en\n name: MTEB MTOPIntentClassification (en)\n revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba\n split: test\n type: mteb/mtop_intent\n metrics:\n - type: accuracy\n value: 84.53488372093022\n - type: f1\n value: 61.77995074251401\n - type: f1_weighted\n value: 86.8005170485101\n - type: main_score\n value: 84.53488372093022\n task:\n type: Classification\n - dataset:\n config: de\n name: MTEB MTOPIntentClassification (de)\n revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba\n split: test\n type: mteb/mtop_intent\n metrics:\n - type: accuracy\n value: 80.79459002535924\n - type: f1\n value: 56.08938302001448\n - type: f1_weighted\n value: 83.66582131948252\n - type: main_score\n value: 80.79459002535924\n task:\n type: Classification\n - dataset:\n config: es\n name: MTEB MTOPIntentClassification (es)\n revision: 
ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba\n split: test\n type: mteb/mtop_intent\n metrics:\n - type: accuracy\n value: 84.7765176784523\n - type: f1\n value: 61.39860057885528\n - type: f1_weighted\n value: 86.94881745670745\n - type: main_score\n value: 84.7765176784523\n task:\n type: Classification\n - dataset:\n config: fr\n name: MTEB MTOPIntentClassification (fr)\n revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba\n split: test\n type: mteb/mtop_intent\n metrics:\n - type: accuracy\n value: 82.2079549013467\n - type: f1\n value: 59.90260478749016\n - type: f1_weighted\n value: 84.36861708593257\n - type: main_score\n value: 82.2079549013467\n task:\n type: Classification\n - dataset:\n config: eng\n name: MTEB MasakhaNEWSClassification (eng)\n revision: 18193f187b92da67168c655c9973a165ed9593dd\n split: test\n type: mteb/masakhanews\n metrics:\n - type: accuracy\n value: 74.98945147679325\n - type: f1\n value: 74.3157483560261\n - type: f1_weighted\n value: 75.01179008904884\n - type: main_score\n value: 74.98945147679325\n task:\n type: Classification\n - dataset:\n config: fra\n name: MTEB MasakhaNEWSClassification (fra)\n revision: 18193f187b92da67168c655c9973a165ed9593dd\n split: test\n type: mteb/masakhanews\n metrics:\n - type: accuracy\n value: 74.02843601895735\n - type: f1\n value: 70.40326349620732\n - type: f1_weighted\n value: 74.6596277063484\n - type: main_score\n value: 74.02843601895735\n task:\n type: Classification\n - dataset:\n config: amh\n name: MTEB MasakhaNEWSClusteringP2P (amh)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 69.45780291725053\n - type: v_measure\n value: 69.45780291725053\n - type: v_measure_std\n value: 36.54340055904091\n task:\n type: Clustering\n - dataset:\n config: eng\n name: MTEB MasakhaNEWSClusteringP2P (eng)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - 
type: main_score\n value: 64.88996119332239\n - type: v_measure\n value: 64.88996119332239\n - type: v_measure_std\n value: 30.017223408197268\n task:\n type: Clustering\n - dataset:\n config: fra\n name: MTEB MasakhaNEWSClusteringP2P (fra)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 42.362383958691666\n - type: v_measure\n value: 42.362383958691666\n - type: v_measure_std\n value: 37.61076788039063\n task:\n type: Clustering\n - dataset:\n config: hau\n name: MTEB MasakhaNEWSClusteringP2P (hau)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 43.29201252405562\n - type: v_measure\n value: 43.29201252405562\n - type: v_measure_std\n value: 34.31987945146255\n task:\n type: Clustering\n - dataset:\n config: ibo\n name: MTEB MasakhaNEWSClusteringP2P (ibo)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 33.59926542995238\n - type: v_measure\n value: 33.59926542995238\n - type: v_measure_std\n value: 35.70048601084112\n task:\n type: Clustering\n - dataset:\n config: lin\n name: MTEB MasakhaNEWSClusteringP2P (lin)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 67.58487601893106\n - type: v_measure\n value: 67.58487601893106\n - type: v_measure_std\n value: 35.16784970777931\n task:\n type: Clustering\n - dataset:\n config: lug\n name: MTEB MasakhaNEWSClusteringP2P (lug)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 50.01220872023533\n - type: v_measure\n value: 50.01220872023533\n - type: v_measure_std\n value: 41.87411574676182\n task:\n type: Clustering\n - dataset:\n config: orm\n name: MTEB MasakhaNEWSClusteringP2P 
(orm)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 29.007847502598317\n - type: v_measure\n value: 29.007847502598317\n - type: v_measure_std\n value: 38.374997395079994\n task:\n type: Clustering\n - dataset:\n config: pcm\n name: MTEB MasakhaNEWSClusteringP2P (pcm)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 79.13520228554611\n - type: v_measure\n value: 79.13520228554611\n - type: v_measure_std\n value: 18.501843848275183\n task:\n type: Clustering\n - dataset:\n config: run\n name: MTEB MasakhaNEWSClusteringP2P (run)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 60.317213909746656\n - type: v_measure\n value: 60.317213909746656\n - type: v_measure_std\n value: 36.500281823747386\n task:\n type: Clustering\n - dataset:\n config: sna\n name: MTEB MasakhaNEWSClusteringP2P (sna)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 59.395277358240946\n - type: v_measure\n value: 59.395277358240946\n - type: v_measure_std\n value: 37.500916816164654\n task:\n type: Clustering\n - dataset:\n config: som\n name: MTEB MasakhaNEWSClusteringP2P (som)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 38.18638688704302\n - type: v_measure\n value: 38.18638688704302\n - type: v_measure_std\n value: 35.453681137564466\n task:\n type: Clustering\n - dataset:\n config: swa\n name: MTEB MasakhaNEWSClusteringP2P (swa)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 29.49230755729658\n - type: v_measure\n value: 29.49230755729658\n - type: 
v_measure_std\n value: 28.284313285264645\n task:\n type: Clustering\n - dataset:\n config: tir\n name: MTEB MasakhaNEWSClusteringP2P (tir)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 60.632258622750115\n - type: v_measure\n value: 60.632258622750115\n - type: v_measure_std\n value: 34.429711214740564\n task:\n type: Clustering\n - dataset:\n config: xho\n name: MTEB MasakhaNEWSClusteringP2P (xho)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 41.76322918806381\n - type: v_measure\n value: 41.76322918806381\n - type: v_measure_std\n value: 36.43245296200775\n task:\n type: Clustering\n - dataset:\n config: yor\n name: MTEB MasakhaNEWSClusteringP2P (yor)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 33.17083910808645\n - type: v_measure\n value: 33.17083910808645\n - type: v_measure_std\n value: 34.87547994284835\n task:\n type: Clustering\n - dataset:\n config: amh\n name: MTEB MasakhaNEWSClusteringS2S (amh)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 60.95132147787602\n - type: v_measure\n value: 60.95132147787602\n - type: v_measure_std\n value: 37.330148394033365\n task:\n type: Clustering\n - dataset:\n config: eng\n name: MTEB MasakhaNEWSClusteringS2S (eng)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 60.974810831426595\n - type: v_measure\n value: 60.974810831426595\n - type: v_measure_std\n value: 24.934675467507827\n task:\n type: Clustering\n - dataset:\n config: fra\n name: MTEB MasakhaNEWSClusteringS2S (fra)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: 
masakhane/masakhanews\n metrics:\n - type: main_score\n value: 44.479206673553335\n - type: v_measure\n value: 44.479206673553335\n - type: v_measure_std\n value: 32.58254804499339\n task:\n type: Clustering\n - dataset:\n config: hau\n name: MTEB MasakhaNEWSClusteringS2S (hau)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 26.4742082741682\n - type: v_measure\n value: 26.4742082741682\n - type: v_measure_std\n value: 22.344929192323097\n task:\n type: Clustering\n - dataset:\n config: ibo\n name: MTEB MasakhaNEWSClusteringS2S (ibo)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 38.906129911741985\n - type: v_measure\n value: 38.906129911741985\n - type: v_measure_std\n value: 34.785601792668444\n task:\n type: Clustering\n - dataset:\n config: lin\n name: MTEB MasakhaNEWSClusteringS2S (lin)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 62.60982020876592\n - type: v_measure\n value: 62.60982020876592\n - type: v_measure_std\n value: 40.7368955715045\n task:\n type: Clustering\n - dataset:\n config: lug\n name: MTEB MasakhaNEWSClusteringS2S (lug)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 42.70424106365967\n - type: v_measure\n value: 42.70424106365967\n - type: v_measure_std\n value: 46.80946241135087\n task:\n type: Clustering\n - dataset:\n config: orm\n name: MTEB MasakhaNEWSClusteringS2S (orm)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 28.609942199922322\n - type: v_measure\n value: 28.609942199922322\n - type: v_measure_std\n value: 38.46685040191088\n task:\n type: Clustering\n - dataset:\n config: pcm\n 
name: MTEB MasakhaNEWSClusteringS2S (pcm)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 76.83901348810822\n - type: v_measure\n value: 76.83901348810822\n - type: v_measure_std\n value: 17.57617141269189\n task:\n type: Clustering\n - dataset:\n config: run\n name: MTEB MasakhaNEWSClusteringS2S (run)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 46.89757547846193\n - type: v_measure\n value: 46.89757547846193\n - type: v_measure_std\n value: 44.58903590203438\n task:\n type: Clustering\n - dataset:\n config: sna\n name: MTEB MasakhaNEWSClusteringS2S (sna)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 55.37185207068829\n - type: v_measure\n value: 55.37185207068829\n - type: v_measure_std\n value: 36.944574863543004\n task:\n type: Clustering\n - dataset:\n config: som\n name: MTEB MasakhaNEWSClusteringS2S (som)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 37.44211021681754\n - type: v_measure\n value: 37.44211021681754\n - type: v_measure_std\n value: 33.41469994463241\n task:\n type: Clustering\n - dataset:\n config: swa\n name: MTEB MasakhaNEWSClusteringS2S (swa)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 26.020680621216062\n - type: v_measure\n value: 26.020680621216062\n - type: v_measure_std\n value: 25.480037522570413\n task:\n type: Clustering\n - dataset:\n config: tir\n name: MTEB MasakhaNEWSClusteringS2S (tir)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 63.74306846771303\n - type: v_measure\n value: 
63.74306846771303\n - type: v_measure_std\n value: 32.19119631078685\n task:\n type: Clustering\n - dataset:\n config: xho\n name: MTEB MasakhaNEWSClusteringS2S (xho)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 24.580890519243777\n - type: v_measure\n value: 24.580890519243777\n - type: v_measure_std\n value: 37.941836363967106\n task:\n type: Clustering\n - dataset:\n config: yor\n name: MTEB MasakhaNEWSClusteringS2S (yor)\n revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60\n split: test\n type: masakhane/masakhanews\n metrics:\n - type: main_score\n value: 43.63458888828314\n - type: v_measure\n value: 43.63458888828314\n - type: v_measure_std\n value: 31.28169350649098\n task:\n type: Clustering\n - dataset:\n config: pl\n name: MTEB MassiveIntentClassification (pl)\n revision: 4672e20407010da34463acc759c162ca9734bca6\n split: test\n type: mteb/amazon_massive_intent\n metrics:\n - type: accuracy\n value: 75.37323470073974\n - type: f1\n value: 71.1836877753734\n - type: f1_weighted\n value: 75.72073213955457\n - type: main_score\n value: 75.37323470073974\n task:\n type: Classification\n - dataset:\n config: de\n name: MTEB MassiveIntentClassification (de)\n revision: 4672e20407010da34463acc759c162ca9734bca6\n split: test\n type: mteb/amazon_massive_intent\n metrics:\n - type: accuracy\n value: 74.83523873570948\n - type: f1\n value: 70.72375821116886\n - type: f1_weighted\n value: 75.20800490010755\n - type: main_score\n value: 74.83523873570948\n task:\n type: Classification\n - dataset:\n config: es\n name: MTEB MassiveIntentClassification (es)\n revision: 4672e20407010da34463acc759c162ca9734bca6\n split: test\n type: mteb/amazon_massive_intent\n metrics:\n - type: accuracy\n value: 75.31607262945528\n - type: f1\n value: 72.06063554897662\n - type: f1_weighted\n value: 75.72438161355252\n - type: main_score\n value: 75.31607262945528\n task:\n type: 
Classification\n - dataset:\n config: ru\n name: MTEB MassiveIntentClassification (ru)\n revision: 4672e20407010da34463acc759c162ca9734bca6\n split: test\n type: mteb/amazon_massive_intent\n metrics:\n - type: accuracy\n value: 76.7955615332885\n - type: f1\n value: 73.08099648499756\n - type: f1_weighted\n value: 77.18482068239668\n - type: main_score\n value: 76.7955615332885\n task:\n type: Classification\n - dataset:\n config: en\n name: MTEB MassiveIntentClassification (en)\n revision: 4672e20407010da34463acc759c162ca9734bca6\n split: test\n type: mteb/amazon_massive_intent\n metrics:\n - type: accuracy\n value: 77.60591795561534\n - type: f1\n value: 74.46676705370395\n - type: f1_weighted\n value: 77.69888062336614\n - type: main_score\n value: 77.60591795561534\n task:\n type: Classification\n - dataset:\n config: fr\n name: MTEB MassiveIntentClassification (fr)\n revision: 4672e20407010da34463acc759c162ca9734bca6\n split: test\n type: mteb/amazon_massive_intent\n metrics:\n - type: accuracy\n value: 76.32145258910558\n - type: f1\n value: 72.89824154178328\n - type: f1_weighted\n value: 76.6539327979472\n - type: main_score\n value: 76.32145258910558\n task:\n type: Classification\n - dataset:\n config: zh-CN\n name: MTEB MassiveIntentClassification (zh-CN)\n revision: 4672e20407010da34463acc759c162ca9734bca6\n split: test\n type: mteb/amazon_massive_intent\n metrics:\n - type: accuracy\n value: 73.21788836583724\n - type: f1\n value: 70.45594512246377\n - type: f1_weighted\n value: 73.67862536499393\n - type: main_score\n value: 73.21788836583724\n task:\n type: Classification\n - dataset:\n config: zh-CN\n name: MTEB MassiveScenarioClassification (zh-CN)\n revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8\n split: test\n type: mteb/amazon_massive_scenario\n metrics:\n - type: accuracy\n value: 80.82044384667114\n - type: f1\n value: 80.53217664465089\n - type: f1_weighted\n value: 80.94535087010512\n - type: main_score\n value: 80.82044384667114\n 
task:\n type: Classification\n - dataset:\n config: pl\n name: MTEB MassiveScenarioClassification (pl)\n revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8\n split: test\n type: mteb/amazon_massive_scenario\n metrics:\n - type: accuracy\n value: 82.1049092131809\n - type: f1\n value: 81.55343463694733\n - type: f1_weighted\n value: 82.33509098770782\n - type: main_score\n value: 82.1049092131809\n task:\n type: Classification\n - dataset:\n config: es\n name: MTEB MassiveScenarioClassification (es)\n revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8\n split: test\n type: mteb/amazon_massive_scenario\n metrics:\n - type: accuracy\n value: 82.58238063214526\n - type: f1\n value: 82.27974449333072\n - type: f1_weighted\n value: 82.81337569618209\n - type: main_score\n value: 82.58238063214526\n task:\n type: Classification\n - dataset:\n config: de\n name: MTEB MassiveScenarioClassification (de)\n revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8\n split: test\n type: mteb/amazon_massive_scenario\n metrics:\n - type: accuracy\n value: 83.97108271687962\n - type: f1\n value: 83.56285606936076\n - type: f1_weighted\n value: 84.10198745390771\n - type: main_score\n value: 83.97108271687962\n task:\n type: Classification\n - dataset:\n config: en\n name: MTEB MassiveScenarioClassification (en)\n revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8\n split: test\n type: mteb/amazon_massive_scenario\n metrics:\n - type: accuracy\n value: 84.71082716879623\n - type: f1\n value: 84.09447062371402\n - type: f1_weighted\n value: 84.73765765551342\n - type: main_score\n value: 84.71082716879623\n task:\n type: Classification\n - dataset:\n config: fr\n name: MTEB MassiveScenarioClassification (fr)\n revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8\n split: test\n type: mteb/amazon_massive_scenario\n metrics:\n - type: accuracy\n value: 83.093476798924\n - type: f1\n value: 82.72656900752943\n - type: f1_weighted\n value: 83.26606516503364\n - type: main_score\n value: 
83.093476798924\n task:\n type: Classification\n - dataset:\n config: ru\n name: MTEB MassiveScenarioClassification (ru)\n revision: fad2c6e8459f9e1c45d9315f4953d921437d70f8\n split: test\n type: mteb/amazon_massive_scenario\n metrics:\n - type: accuracy\n value: 84.05850706119705\n - type: f1\n value: 83.64234048881222\n - type: f1_weighted\n value: 84.17315768381876\n - type: main_score\n value: 84.05850706119705\n task:\n type: Classification\n - dataset:\n config: default\n name: MTEB MedicalRetrieval (default)\n revision: 2039188fb5800a9803ba5048df7b76e6fb151fc6\n split: dev\n type: C-MTEB/MedicalRetrieval\n metrics:\n - type: main_score\n value: 56.635999999999996\n - type: map_at_1\n value: 48.699999999999996\n - type: map_at_10\n value: 53.991\n - type: map_at_100\n value: 54.449999999999996\n - type: map_at_1000\n value: 54.515\n - type: map_at_20\n value: 54.212\n - type: map_at_3\n value: 52.833\n - type: map_at_5\n value: 53.503\n - type: mrr_at_1\n value: 48.699999999999996\n - type: mrr_at_10\n value: 53.991309523809505\n - type: mrr_at_100\n value: 54.45008993448266\n - type: mrr_at_1000\n value: 54.515253990549795\n - type: mrr_at_20\n value: 54.21201762247036\n - type: mrr_at_3\n value: 52.8333333333333\n - type: mrr_at_5\n value: 53.50333333333328\n - type: nauc_map_at_1000_diff1\n value: 79.96867989401643\n - type: nauc_map_at_1000_max\n value: 69.75230895599029\n - type: nauc_map_at_1000_std\n value: 2.6418738289740213\n - type: nauc_map_at_100_diff1\n value: 79.95343709599133\n - type: nauc_map_at_100_max\n value: 69.751282671507\n - type: nauc_map_at_100_std\n value: 2.621719966106279\n - type: nauc_map_at_10_diff1\n value: 80.02875864565634\n - type: nauc_map_at_10_max\n value: 69.80948662290187\n - type: nauc_map_at_10_std\n value: 2.329151604733765\n - type: nauc_map_at_1_diff1\n value: 83.616940281383\n - type: nauc_map_at_1_max\n value: 69.08142651929452\n - type: nauc_map_at_1_std\n value: 1.9687791394035643\n - type: 
nauc_map_at_20_diff1\n value: 79.95555601275339\n - type: nauc_map_at_20_max\n value: 69.76604695002925\n - type: nauc_map_at_20_std\n value: 2.556184141901367\n - type: nauc_map_at_3_diff1\n value: 80.74790131023668\n - type: nauc_map_at_3_max\n value: 70.57797991892402\n - type: nauc_map_at_3_std\n value: 2.7115149849964117\n - type: nauc_map_at_5_diff1\n value: 80.31796539878381\n - type: nauc_map_at_5_max\n value: 69.93573796420061\n - type: nauc_map_at_5_std\n value: 2.0731614029506606\n - type: nauc_mrr_at_1000_diff1\n value: 79.96867999907981\n - type: nauc_mrr_at_1000_max\n value: 69.57395578976896\n - type: nauc_mrr_at_1000_std\n value: 2.46351945887829\n - type: nauc_mrr_at_100_diff1\n value: 79.95343709599133\n - type: nauc_mrr_at_100_max\n value: 69.57322054130803\n - type: nauc_mrr_at_100_std\n value: 2.4436578359073433\n - type: nauc_mrr_at_10_diff1\n value: 80.02875864565634\n - type: nauc_mrr_at_10_max\n value: 69.63292630937411\n - type: nauc_mrr_at_10_std\n value: 2.1525912912060012\n - type: nauc_mrr_at_1_diff1\n value: 83.616940281383\n - type: nauc_mrr_at_1_max\n value: 68.74717310480305\n - type: nauc_mrr_at_1_std\n value: 1.6345257249120868\n - type: nauc_mrr_at_20_diff1\n value: 79.95555601275339\n - type: nauc_mrr_at_20_max\n value: 69.58883608470444\n - type: nauc_mrr_at_20_std\n value: 2.378973276576547\n - type: nauc_mrr_at_3_diff1\n value: 80.74790131023668\n - type: nauc_mrr_at_3_max\n value: 70.40430475488604\n - type: nauc_mrr_at_3_std\n value: 2.5378398209583817\n - type: nauc_mrr_at_5_diff1\n value: 80.31796539878381\n - type: nauc_mrr_at_5_max\n value: 69.7605991748183\n - type: nauc_mrr_at_5_std\n value: 1.898022613568352\n - type: nauc_ndcg_at_1000_diff1\n value: 78.35504059321225\n - type: nauc_ndcg_at_1000_max\n value: 69.06752522437093\n - type: nauc_ndcg_at_1000_std\n value: 3.9624036886099265\n - type: nauc_ndcg_at_100_diff1\n value: 77.79729140249833\n - type: nauc_ndcg_at_100_max\n value: 68.93113791506029\n - type: 
nauc_ndcg_at_100_std\n value: 3.642178826886181\n - type: nauc_ndcg_at_10_diff1\n value: 78.160158293918\n - type: nauc_ndcg_at_10_max\n value: 69.28122202281361\n - type: nauc_ndcg_at_10_std\n value: 2.438976810940962\n - type: nauc_ndcg_at_1_diff1\n value: 83.616940281383\n - type: nauc_ndcg_at_1_max\n value: 69.08142651929452\n - type: nauc_ndcg_at_1_std\n value: 1.9687791394035643\n - type: nauc_ndcg_at_20_diff1\n value: 77.88514432874997\n - type: nauc_ndcg_at_20_max\n value: 69.06148818508873\n - type: nauc_ndcg_at_20_std\n value: 3.1800249272363676\n - type: nauc_ndcg_at_3_diff1\n value: 79.73510384405803\n - type: nauc_ndcg_at_3_max\n value: 70.78000695123832\n - type: nauc_ndcg_at_3_std\n value: 2.9041415468363274\n - type: nauc_ndcg_at_5_diff1\n value: 78.91872808866195\n - type: nauc_ndcg_at_5_max\n value: 69.61478429620091\n - type: nauc_ndcg_at_5_std\n value: 1.734699636301054\n - type: nauc_precision_at_1000_diff1\n value: 66.37858395390673\n - type: nauc_precision_at_1000_max\n value: 60.651659037598534\n - type: nauc_precision_at_1000_std\n value: 27.388353715469798\n - type: nauc_precision_at_100_diff1\n value: 66.34325807776025\n - type: nauc_precision_at_100_max\n value: 63.63855305621111\n - type: nauc_precision_at_100_std\n value: 10.641748149575351\n - type: nauc_precision_at_10_diff1\n value: 71.3784685491089\n - type: nauc_precision_at_10_max\n value: 67.05313695174542\n - type: nauc_precision_at_10_std\n value: 3.000406867930561\n - type: nauc_precision_at_1_diff1\n value: 83.616940281383\n - type: nauc_precision_at_1_max\n value: 69.08142651929452\n - type: nauc_precision_at_1_std\n value: 1.9687791394035643\n - type: nauc_precision_at_20_diff1\n value: 69.73407910977694\n - type: nauc_precision_at_20_max\n value: 65.77426240320742\n - type: nauc_precision_at_20_std\n value: 6.204416838482586\n - type: nauc_precision_at_3_diff1\n value: 76.63737537643107\n - type: nauc_precision_at_3_max\n value: 71.29710200719668\n - type: 
nauc_precision_at_3_std\n value: 3.47180961484546\n - type: nauc_precision_at_5_diff1\n value: 74.36945983536717\n - type: nauc_precision_at_5_max\n value: 68.33292218003061\n - type: nauc_precision_at_5_std\n value: 0.47128762620258075\n - type: nauc_recall_at_1000_diff1\n value: 66.37858395390681\n - type: nauc_recall_at_1000_max\n value: 60.65165903759889\n - type: nauc_recall_at_1000_std\n value: 27.388353715469822\n - type: nauc_recall_at_100_diff1\n value: 66.34325807776025\n - type: nauc_recall_at_100_max\n value: 63.63855305621116\n - type: nauc_recall_at_100_std\n value: 10.641748149575351\n - type: nauc_recall_at_10_diff1\n value: 71.37846854910892\n - type: nauc_recall_at_10_max\n value: 67.05313695174546\n - type: nauc_recall_at_10_std\n value: 3.000406867930663\n - type: nauc_recall_at_1_diff1\n value: 83.616940281383\n - type: nauc_recall_at_1_max\n value: 69.08142651929452\n - type: nauc_recall_at_1_std\n value: 1.9687791394035643\n - type: nauc_recall_at_20_diff1\n value: 69.73407910977691\n - type: nauc_recall_at_20_max\n value: 65.77426240320746\n - type: nauc_recall_at_20_std\n value: 6.204416838482536\n - type: nauc_recall_at_3_diff1\n value: 76.63737537643112\n - type: nauc_recall_at_3_max\n value: 71.29710200719668\n - type: nauc_recall_at_3_std\n value: 3.471809614845442\n - type: nauc_recall_at_5_diff1\n value: 74.36945983536715\n - type: nauc_recall_at_5_max\n value: 68.33292218003065\n - type: nauc_recall_at_5_std\n value: 0.4712876262026442\n - type: ndcg_at_1\n value: 48.699999999999996\n - type: ndcg_at_10\n value: 56.635999999999996\n - type: ndcg_at_100\n value: 59.193\n - type: ndcg_at_1000\n value: 60.97\n - type: ndcg_at_20\n value: 57.426\n - type: ndcg_at_3\n value: 54.186\n - type: ndcg_at_5\n value: 55.407\n - type: precision_at_1\n value: 48.699999999999996\n - type: precision_at_10\n value: 6.5\n - type: precision_at_100\n value: 0.777\n - type: precision_at_1000\n value: 0.092\n - type: precision_at_20\n value: 3.405\n - 
type: precision_at_3\n value: 19.367\n - type: precision_at_5\n value: 12.22\n - type: recall_at_1\n value: 48.699999999999996\n - type: recall_at_10\n value: 65.0\n - type: recall_at_100\n value: 77.7\n - type: recall_at_1000\n value: 91.8\n - type: recall_at_20\n value: 68.10000000000001\n - type: recall_at_3\n value: 58.099999999999994\n - type: recall_at_5\n value: 61.1\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB MedrxivClusteringP2P (default)\n revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73\n split: test\n type: mteb/medrxiv-clustering-p2p\n metrics:\n - type: main_score\n value: 34.80188561439236\n - type: v_measure\n value: 34.80188561439236\n - type: v_measure_std\n value: 1.5703148841573102\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB MedrxivClusteringS2S (default)\n revision: 35191c8c0dca72d8ff3efcd72aa802307d469663\n split: test\n type: mteb/medrxiv-clustering-s2s\n metrics:\n - type: main_score\n value: 32.42285513996236\n - type: v_measure\n value: 32.42285513996236\n - type: v_measure_std\n value: 1.3769867487457566\n task:\n type: Clustering\n - dataset:\n config: de\n name: MTEB MintakaRetrieval (de)\n revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e\n split: test\n type: jinaai/mintakaqa\n metrics:\n - type: main_score\n value: 27.025\n - type: map_at_1\n value: 14.532\n - type: map_at_10\n value: 22.612\n - type: map_at_100\n value: 23.802\n - type: map_at_1000\n value: 23.9\n - type: map_at_20\n value: 23.275000000000002\n - type: map_at_3\n value: 20.226\n - type: map_at_5\n value: 21.490000000000002\n - type: mrr_at_1\n value: 14.532434709351305\n - type: mrr_at_10\n value: 22.612077265615575\n - type: mrr_at_100\n value: 23.801523356874675\n - type: mrr_at_1000\n value: 23.900118499340238\n - type: mrr_at_20\n value: 23.275466430108995\n - type: mrr_at_3\n value: 20.22606009547877\n - type: mrr_at_5\n value: 21.489750070204945\n - type: nauc_map_at_1000_diff1\n value: 
14.148987799763596\n - type: nauc_map_at_1000_max\n value: 44.70338461387784\n - type: nauc_map_at_1000_std\n value: 15.868006767707637\n - type: nauc_map_at_100_diff1\n value: 14.11371769080442\n - type: nauc_map_at_100_max\n value: 44.67995540936296\n - type: nauc_map_at_100_std\n value: 15.890796502029076\n - type: nauc_map_at_10_diff1\n value: 14.29066834165688\n - type: nauc_map_at_10_max\n value: 45.10997111765282\n - type: nauc_map_at_10_std\n value: 15.508568918629864\n - type: nauc_map_at_1_diff1\n value: 23.473291302576396\n - type: nauc_map_at_1_max\n value: 44.68942599764586\n - type: nauc_map_at_1_std\n value: 12.424377262427253\n - type: nauc_map_at_20_diff1\n value: 14.112652046087831\n - type: nauc_map_at_20_max\n value: 44.82014861413682\n - type: nauc_map_at_20_std\n value: 15.739350613646385\n - type: nauc_map_at_3_diff1\n value: 16.119659221396347\n - type: nauc_map_at_3_max\n value: 46.04766378953525\n - type: nauc_map_at_3_std\n value: 13.969878046315925\n - type: nauc_map_at_5_diff1\n value: 15.095453434076184\n - type: nauc_map_at_5_max\n value: 45.802128149314406\n - type: nauc_map_at_5_std\n value: 14.957442173319949\n - type: nauc_mrr_at_1000_diff1\n value: 14.148987799763596\n - type: nauc_mrr_at_1000_max\n value: 44.70338461387784\n - type: nauc_mrr_at_1000_std\n value: 15.868006767707637\n - type: nauc_mrr_at_100_diff1\n value: 14.11371769080442\n - type: nauc_mrr_at_100_max\n value: 44.67995540936296\n - type: nauc_mrr_at_100_std\n value: 15.890796502029076\n - type: nauc_mrr_at_10_diff1\n value: 14.29066834165688\n - type: nauc_mrr_at_10_max\n value: 45.10997111765282\n - type: nauc_mrr_at_10_std\n value: 15.508568918629864\n - type: nauc_mrr_at_1_diff1\n value: 23.473291302576396\n - type: nauc_mrr_at_1_max\n value: 44.68942599764586\n - type: nauc_mrr_at_1_std\n value: 12.424377262427253\n - type: nauc_mrr_at_20_diff1\n value: 14.112652046087831\n - type: nauc_mrr_at_20_max\n value: 44.82014861413682\n - type: nauc_mrr_at_20_std\n 
value: 15.739350613646385\n - type: nauc_mrr_at_3_diff1\n value: 16.119659221396347\n - type: nauc_mrr_at_3_max\n value: 46.04766378953525\n - type: nauc_mrr_at_3_std\n value: 13.969878046315925\n - type: nauc_mrr_at_5_diff1\n value: 15.095453434076184\n - type: nauc_mrr_at_5_max\n value: 45.802128149314406\n - type: nauc_mrr_at_5_std\n value: 14.957442173319949\n - type: nauc_ndcg_at_1000_diff1\n value: 11.626606894574028\n - type: nauc_ndcg_at_1000_max\n value: 43.328592841065536\n - type: nauc_ndcg_at_1000_std\n value: 18.049446272245547\n - type: nauc_ndcg_at_100_diff1\n value: 10.485720606660239\n - type: nauc_ndcg_at_100_max\n value: 42.405317674170966\n - type: nauc_ndcg_at_100_std\n value: 19.107151641936987\n - type: nauc_ndcg_at_10_diff1\n value: 11.029351078162982\n - type: nauc_ndcg_at_10_max\n value: 44.36855031964681\n - type: nauc_ndcg_at_10_std\n value: 17.302796171409305\n - type: nauc_ndcg_at_1_diff1\n value: 23.473291302576396\n - type: nauc_ndcg_at_1_max\n value: 44.68942599764586\n - type: nauc_ndcg_at_1_std\n value: 12.424377262427253\n - type: nauc_ndcg_at_20_diff1\n value: 10.356662718168412\n - type: nauc_ndcg_at_20_max\n value: 43.31602680430083\n - type: nauc_ndcg_at_20_std\n value: 18.162891267850316\n - type: nauc_ndcg_at_3_diff1\n value: 14.42844952297869\n - type: nauc_ndcg_at_3_max\n value: 46.26603339466543\n - type: nauc_ndcg_at_3_std\n value: 14.449362723887857\n - type: nauc_ndcg_at_5_diff1\n value: 12.783416563486396\n - type: nauc_ndcg_at_5_max\n value: 45.852176479124424\n - type: nauc_ndcg_at_5_std\n value: 16.11775016428085\n - type: nauc_precision_at_1000_diff1\n value: -8.045361059399795\n - type: nauc_precision_at_1000_max\n value: 21.970273281738777\n - type: nauc_precision_at_1000_std\n value: 49.564650488193266\n - type: nauc_precision_at_100_diff1\n value: -2.118628861593353\n - type: nauc_precision_at_100_max\n value: 31.32498977104778\n - type: nauc_precision_at_100_std\n value: 32.96087731883451\n - type: 
nauc_precision_at_10_diff1\n value: 3.0335517475367615\n - type: nauc_precision_at_10_max\n value: 42.21620215030219\n - type: nauc_precision_at_10_std\n value: 21.90159732315962\n - type: nauc_precision_at_1_diff1\n value: 23.473291302576396\n - type: nauc_precision_at_1_max\n value: 44.68942599764586\n - type: nauc_precision_at_1_std\n value: 12.424377262427253\n - type: nauc_precision_at_20_diff1\n value: 0.4087201843719047\n - type: nauc_precision_at_20_max\n value: 38.485034773895734\n - type: nauc_precision_at_20_std\n value: 25.077397979916682\n - type: nauc_precision_at_3_diff1\n value: 10.408327736589833\n - type: nauc_precision_at_3_max\n value: 46.757216289175076\n - type: nauc_precision_at_3_std\n value: 15.62594354926867\n - type: nauc_precision_at_5_diff1\n value: 7.326752744229544\n - type: nauc_precision_at_5_max\n value: 45.89190518573553\n - type: nauc_precision_at_5_std\n value: 19.01717163438957\n - type: nauc_recall_at_1000_diff1\n value: -8.045361059400387\n - type: nauc_recall_at_1000_max\n value: 21.97027328173812\n - type: nauc_recall_at_1000_std\n value: 49.56465048819266\n - type: nauc_recall_at_100_diff1\n value: -2.118628861593277\n - type: nauc_recall_at_100_max\n value: 31.324989771047818\n - type: nauc_recall_at_100_std\n value: 32.96087731883457\n - type: nauc_recall_at_10_diff1\n value: 3.0335517475367166\n - type: nauc_recall_at_10_max\n value: 42.21620215030217\n - type: nauc_recall_at_10_std\n value: 21.901597323159606\n - type: nauc_recall_at_1_diff1\n value: 23.473291302576396\n - type: nauc_recall_at_1_max\n value: 44.68942599764586\n - type: nauc_recall_at_1_std\n value: 12.424377262427253\n - type: nauc_recall_at_20_diff1\n value: 0.40872018437190905\n - type: nauc_recall_at_20_max\n value: 38.485034773895734\n - type: nauc_recall_at_20_std\n value: 25.077397979916693\n - type: nauc_recall_at_3_diff1\n value: 10.408327736589843\n - type: nauc_recall_at_3_max\n value: 46.75721628917507\n - type: nauc_recall_at_3_std\n value: 
15.625943549268664\n - type: nauc_recall_at_5_diff1\n value: 7.326752744229548\n - type: nauc_recall_at_5_max\n value: 45.89190518573557\n - type: nauc_recall_at_5_std\n value: 19.01717163438958\n - type: ndcg_at_1\n value: 14.532\n - type: ndcg_at_10\n value: 27.025\n - type: ndcg_at_100\n value: 33.305\n - type: ndcg_at_1000\n value: 36.38\n - type: ndcg_at_20\n value: 29.443\n - type: ndcg_at_3\n value: 22.035\n - type: ndcg_at_5\n value: 24.319\n - type: precision_at_1\n value: 14.532\n - type: precision_at_10\n value: 4.115\n - type: precision_at_100\n value: 0.717\n - type: precision_at_1000\n value: 0.097\n - type: precision_at_20\n value: 2.536\n - type: precision_at_3\n value: 9.085\n - type: precision_at_5\n value: 6.563\n - type: recall_at_1\n value: 14.532\n - type: recall_at_10\n value: 41.154\n - type: recall_at_100\n value: 71.651\n - type: recall_at_1000\n value: 96.841\n - type: recall_at_20\n value: 50.71600000000001\n - type: recall_at_3\n value: 27.254\n - type: recall_at_5\n value: 32.814\n task:\n type: Retrieval\n - dataset:\n config: es\n name: MTEB MintakaRetrieval (es)\n revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e\n split: test\n type: jinaai/mintakaqa\n metrics:\n - type: main_score\n value: 26.912000000000003\n - type: map_at_1\n value: 14.686\n - type: map_at_10\n value: 22.569\n - type: map_at_100\n value: 23.679\n - type: map_at_1000\n value: 23.777\n - type: map_at_20\n value: 23.169\n - type: map_at_3\n value: 20.201\n - type: map_at_5\n value: 21.566\n - type: mrr_at_1\n value: 14.686468646864686\n - type: mrr_at_10\n value: 22.569346220336296\n - type: mrr_at_100\n value: 23.678819125817146\n - type: mrr_at_1000\n value: 23.77713511338264\n - type: mrr_at_20\n value: 23.16850858443442\n - type: mrr_at_3\n value: 20.200770077007665\n - type: mrr_at_5\n value: 21.56628162816276\n - type: nauc_map_at_1000_diff1\n value: 14.129007578838381\n - type: nauc_map_at_1000_max\n value: 44.4255501141499\n - type: nauc_map_at_1000_std\n 
value: 19.95906154868176\n - type: nauc_map_at_100_diff1\n value: 14.09071870575231\n - type: nauc_map_at_100_max\n value: 44.403179928955566\n - type: nauc_map_at_100_std\n value: 20.00413657519976\n - type: nauc_map_at_10_diff1\n value: 14.149535953153688\n - type: nauc_map_at_10_max\n value: 44.66529917634685\n - type: nauc_map_at_10_std\n value: 19.580235989479394\n - type: nauc_map_at_1_diff1\n value: 23.489813522176636\n - type: nauc_map_at_1_max\n value: 46.54578639925787\n - type: nauc_map_at_1_std\n value: 16.39083721709994\n - type: nauc_map_at_20_diff1\n value: 14.021560420656181\n - type: nauc_map_at_20_max\n value: 44.4825455452467\n - type: nauc_map_at_20_std\n value: 19.886927750826878\n - type: nauc_map_at_3_diff1\n value: 16.182977890477723\n - type: nauc_map_at_3_max\n value: 46.1840554029258\n - type: nauc_map_at_3_std\n value: 18.735671900228958\n - type: nauc_map_at_5_diff1\n value: 14.779126395472833\n - type: nauc_map_at_5_max\n value: 45.23237213817556\n - type: nauc_map_at_5_std\n value: 19.348508580412872\n - type: nauc_mrr_at_1000_diff1\n value: 14.129007578838381\n - type: nauc_mrr_at_1000_max\n value: 44.4255501141499\n - type: nauc_mrr_at_1000_std\n value: 19.95906154868176\n - type: nauc_mrr_at_100_diff1\n value: 14.09071870575231\n - type: nauc_mrr_at_100_max\n value: 44.403179928955566\n - type: nauc_mrr_at_100_std\n value: 20.00413657519976\n - type: nauc_mrr_at_10_diff1\n value: 14.149535953153688\n - type: nauc_mrr_at_10_max\n value: 44.66529917634685\n - type: nauc_mrr_at_10_std\n value: 19.580235989479394\n - type: nauc_mrr_at_1_diff1\n value: 23.489813522176636\n - type: nauc_mrr_at_1_max\n value: 46.54578639925787\n - type: nauc_mrr_at_1_std\n value: 16.39083721709994\n - type: nauc_mrr_at_20_diff1\n value: 14.021560420656181\n - type: nauc_mrr_at_20_max\n value: 44.4825455452467\n - type: nauc_mrr_at_20_std\n value: 19.886927750826878\n - type: nauc_mrr_at_3_diff1\n value: 16.182977890477723\n - type: nauc_mrr_at_3_max\n 
value: 46.1840554029258\n - type: nauc_mrr_at_3_std\n value: 18.735671900228958\n - type: nauc_mrr_at_5_diff1\n value: 14.779126395472833\n - type: nauc_mrr_at_5_max\n value: 45.23237213817556\n - type: nauc_mrr_at_5_std\n value: 19.348508580412872\n - type: nauc_ndcg_at_1000_diff1\n value: 11.762470380481101\n - type: nauc_ndcg_at_1000_max\n value: 42.8233203033089\n - type: nauc_ndcg_at_1000_std\n value: 21.78503705117719\n - type: nauc_ndcg_at_100_diff1\n value: 10.45886076220022\n - type: nauc_ndcg_at_100_max\n value: 41.85472899256818\n - type: nauc_ndcg_at_100_std\n value: 23.20955486335138\n - type: nauc_ndcg_at_10_diff1\n value: 10.605912468659469\n - type: nauc_ndcg_at_10_max\n value: 43.150942448104715\n - type: nauc_ndcg_at_10_std\n value: 21.120035764826085\n - type: nauc_ndcg_at_1_diff1\n value: 23.489813522176636\n - type: nauc_ndcg_at_1_max\n value: 46.54578639925787\n - type: nauc_ndcg_at_1_std\n value: 16.39083721709994\n - type: nauc_ndcg_at_20_diff1\n value: 10.11291783888644\n - type: nauc_ndcg_at_20_max\n value: 42.51260678842788\n - type: nauc_ndcg_at_20_std\n value: 22.1744949382252\n - type: nauc_ndcg_at_3_diff1\n value: 14.25625326760802\n - type: nauc_ndcg_at_3_max\n value: 45.96162916377383\n - type: nauc_ndcg_at_3_std\n value: 19.557832728215523\n - type: nauc_ndcg_at_5_diff1\n value: 11.956317653823053\n - type: nauc_ndcg_at_5_max\n value: 44.35971268886807\n - type: nauc_ndcg_at_5_std\n value: 20.581696730374233\n - type: nauc_precision_at_1000_diff1\n value: 5.132291843566577\n - type: nauc_precision_at_1000_max\n value: 25.293354576835263\n - type: nauc_precision_at_1000_std\n value: 40.36005126087624\n - type: nauc_precision_at_100_diff1\n value: -1.5252854375008238\n - type: nauc_precision_at_100_max\n value: 31.007586474495984\n - type: nauc_precision_at_100_std\n value: 37.297552993548386\n - type: nauc_precision_at_10_diff1\n value: 1.9663657370770737\n - type: nauc_precision_at_10_max\n value: 39.194092293625125\n - type: 
nauc_precision_at_10_std\n value: 24.956542621999542\n - type: nauc_precision_at_1_diff1\n value: 23.489813522176636\n - type: nauc_precision_at_1_max\n value: 46.54578639925787\n - type: nauc_precision_at_1_std\n value: 16.39083721709994\n - type: nauc_precision_at_20_diff1\n value: 0.011112090390932373\n - type: nauc_precision_at_20_max\n value: 36.9357074392519\n - type: nauc_precision_at_20_std\n value: 28.611387115093876\n - type: nauc_precision_at_3_diff1\n value: 9.596831091013703\n - type: nauc_precision_at_3_max\n value: 45.3905541893809\n - type: nauc_precision_at_3_std\n value: 21.599314388526945\n - type: nauc_precision_at_5_diff1\n value: 5.175887949900142\n - type: nauc_precision_at_5_max\n value: 42.129467510414464\n - type: nauc_precision_at_5_std\n value: 23.607251548776677\n - type: nauc_recall_at_1000_diff1\n value: 5.132291843566257\n - type: nauc_recall_at_1000_max\n value: 25.29335457683396\n - type: nauc_recall_at_1000_std\n value: 40.36005126087638\n - type: nauc_recall_at_100_diff1\n value: -1.5252854375008988\n - type: nauc_recall_at_100_max\n value: 31.00758647449594\n - type: nauc_recall_at_100_std\n value: 37.29755299354834\n - type: nauc_recall_at_10_diff1\n value: 1.9663657370770793\n - type: nauc_recall_at_10_max\n value: 39.19409229362512\n - type: nauc_recall_at_10_std\n value: 24.956542621999546\n - type: nauc_recall_at_1_diff1\n value: 23.489813522176636\n - type: nauc_recall_at_1_max\n value: 46.54578639925787\n - type: nauc_recall_at_1_std\n value: 16.39083721709994\n - type: nauc_recall_at_20_diff1\n value: 0.011112090390923075\n - type: nauc_recall_at_20_max\n value: 36.93570743925189\n - type: nauc_recall_at_20_std\n value: 28.611387115093883\n - type: nauc_recall_at_3_diff1\n value: 9.596831091013714\n - type: nauc_recall_at_3_max\n value: 45.39055418938087\n - type: nauc_recall_at_3_std\n value: 21.599314388526956\n - type: nauc_recall_at_5_diff1\n value: 5.17588794990012\n - type: nauc_recall_at_5_max\n value: 
42.12946751041448\n - type: nauc_recall_at_5_std\n value: 23.607251548776695\n - type: ndcg_at_1\n value: 14.686\n - type: ndcg_at_10\n value: 26.912000000000003\n - type: ndcg_at_100\n value: 32.919\n - type: ndcg_at_1000\n value: 36.119\n - type: ndcg_at_20\n value: 29.079\n - type: ndcg_at_3\n value: 21.995\n - type: ndcg_at_5\n value: 24.474999999999998\n - type: precision_at_1\n value: 14.686\n - type: precision_at_10\n value: 4.08\n - type: precision_at_100\n value: 0.703\n - type: precision_at_1000\n value: 0.097\n - type: precision_at_20\n value: 2.467\n - type: precision_at_3\n value: 9.062000000000001\n - type: precision_at_5\n value: 6.65\n - type: recall_at_1\n value: 14.686\n - type: recall_at_10\n value: 40.8\n - type: recall_at_100\n value: 70.338\n - type: recall_at_1000\n value: 96.82300000000001\n - type: recall_at_20\n value: 49.34\n - type: recall_at_3\n value: 27.186\n - type: recall_at_5\n value: 33.251\n task:\n type: Retrieval\n - dataset:\n config: fr\n name: MTEB MintakaRetrieval (fr)\n revision: efa78cc2f74bbcd21eff2261f9e13aebe40b814e\n split: test\n type: jinaai/mintakaqa\n metrics:\n - type: main_score\n value: 26.909\n - type: map_at_1\n value: 14.701\n - type: map_at_10\n value: 22.613\n - type: map_at_100\n value: 23.729\n - type: map_at_1000\n value: 23.837\n - type: map_at_20\n value: 23.262\n - type: map_at_3\n value: 20.236\n - type: map_at_5\n value: 21.673000000000002\n - type: mrr_at_1\n value: 14.7010647010647\n - type: mrr_at_10\n value: 22.613165113165113\n - type: mrr_at_100\n value: 23.72877605989423\n - type: mrr_at_1000\n value: 23.837150802746805\n - type: mrr_at_20\n value: 23.261627081110596\n - type: mrr_at_3\n value: 20.2361452361452\n - type: mrr_at_5\n value: 21.673491673491625\n - type: nauc_map_at_1000_diff1\n value: 17.08927788889635\n - type: nauc_map_at_1000_max\n value: 47.240929150603336\n - type: nauc_map_at_1000_std\n value: 20.559244258100275\n - type: nauc_map_at_100_diff1\n value: 
17.029461792796777\n - type: nauc_map_at_100_max\n value: 47.207381115550696\n - type: nauc_map_at_100_std\n value: 20.581498156895265\n - type: nauc_map_at_10_diff1\n value: 17.351456007804536\n - type: nauc_map_at_10_max\n value: 47.815880040221344\n - type: nauc_map_at_10_std\n value: 20.292999107555794\n - type: nauc_map_at_1_diff1\n value: 27.297525357600776\n - type: nauc_map_at_1_max\n value: 47.18835074959486\n - type: nauc_map_at_1_std\n value: 18.304203168281834\n - type: nauc_map_at_20_diff1\n value: 17.157460199542136\n - type: nauc_map_at_20_max\n value: 47.4776610667456\n - type: nauc_map_at_20_std\n value: 20.499186342964478\n - type: nauc_map_at_3_diff1\n value: 19.393119961356277\n - type: nauc_map_at_3_max\n value: 49.02841822452882\n - type: nauc_map_at_3_std\n value: 19.293122796321292\n - type: nauc_map_at_5_diff1\n value: 17.76275044752008\n - type: nauc_map_at_5_max\n value: 48.01292548040298\n - type: nauc_map_at_5_std\n value: 19.928449977400504\n - type: nauc_mrr_at_1000_diff1\n value: 17.08927788889635\n - type: nauc_mrr_at_1000_max\n value: 47.240929150603336\n - type: nauc_mrr_at_1000_std\n value: 20.559244258100275\n - type: nauc_mrr_at_100_diff1\n value: 17.029461792796777\n - type: nauc_mrr_at_100_max\n value: 47.207381115550696\n - type: nauc_mrr_at_100_std\n value: 20.581498156895265\n - type: nauc_mrr_at_10_diff1\n value: 17.351456007804536\n - type: nauc_mrr_at_10_max\n value: 47.815880040221344\n - type: nauc_mrr_at_10_std\n value: 20.292999107555794\n - type: nauc_mrr_at_1_diff1\n value: 27.297525357600776\n - type: nauc_mrr_at_1_max\n value: 47.18835074959486\n - type: nauc_mrr_at_1_std\n value: 18.304203168281834\n - type: nauc_mrr_at_20_diff1\n value: 17.157460199542136\n - type: nauc_mrr_at_20_max\n value: 47.4776610667456\n - type: nauc_mrr_at_20_std\n value: 20.499186342964478\n - type: nauc_mrr_at_3_diff1\n value: 19.393119961356277\n - type: nauc_mrr_at_3_max\n value: 49.02841822452882\n - type: nauc_mrr_at_3_std\n 
value: 19.293122796321292\n - type: nauc_mrr_at_5_diff1\n value: 17.76275044752008\n - type: nauc_mrr_at_5_max\n value: 48.01292548040298\n - type: nauc_mrr_at_5_std\n value: 19.928449977400504\n - type: nauc_ndcg_at_1000_diff1\n value: 13.989496006047975\n - type: nauc_ndcg_at_1000_max\n value: 45.626323944336114\n - type: nauc_ndcg_at_1000_std\n value: 22.125600410796515\n - type: nauc_ndcg_at_100_diff1\n value: 12.302204843705244\n - type: nauc_ndcg_at_100_max\n value: 44.46856314559079\n - type: nauc_ndcg_at_100_std\n value: 23.084984546328677\n - type: nauc_ndcg_at_10_diff1\n value: 14.001226213368275\n - type: nauc_ndcg_at_10_max\n value: 47.37780636546918\n - type: nauc_ndcg_at_10_std\n value: 21.702709032840637\n - type: nauc_ndcg_at_1_diff1\n value: 27.297525357600776\n - type: nauc_ndcg_at_1_max\n value: 47.18835074959486\n - type: nauc_ndcg_at_1_std\n value: 18.304203168281834\n - type: nauc_ndcg_at_20_diff1\n value: 13.317759910171056\n - type: nauc_ndcg_at_20_max\n value: 46.25171251043813\n - type: nauc_ndcg_at_20_std\n value: 22.309331575402595\n - type: nauc_ndcg_at_3_diff1\n value: 17.555381234893872\n - type: nauc_ndcg_at_3_max\n value: 49.48635590260059\n - type: nauc_ndcg_at_3_std\n value: 19.734570962933674\n - type: nauc_ndcg_at_5_diff1\n value: 14.844841165765061\n - type: nauc_ndcg_at_5_max\n value: 47.76437065028708\n - type: nauc_ndcg_at_5_std\n value: 20.816034479453954\n - type: nauc_precision_at_1000_diff1\n value: -15.591898698252546\n - type: nauc_precision_at_1000_max\n value: 20.545984285353892\n - type: nauc_precision_at_1000_std\n value: 38.9013414992826\n - type: nauc_precision_at_100_diff1\n value: -5.290395978742176\n - type: nauc_precision_at_100_max\n value: 31.340480360546845\n - type: nauc_precision_at_100_std\n value: 33.6897935720505\n - type: nauc_precision_at_10_diff1\n value: 5.965001997926562\n - type: nauc_precision_at_10_max\n value: 46.12515296162247\n - type: nauc_precision_at_10_std\n value: 25.409433135253558\n 
- type: nauc_precision_at_1_diff1\n value: 27.297525357600776\n - type: nauc_precision_at_1_max\n value: 47.18835074959486\n - type: nauc_precision_at_1_std\n value: 18.304203168281834\n - type: nauc_precision_at_20_diff1\n value: 3.4438127279827744\n - type: nauc_precision_at_20_max\n value: 42.36095587714494\n - type: nauc_precision_at_20_std\n value: 27.367900512797906\n - type: nauc_precision_at_3_diff1\n value: 13.165017224718916\n - type: nauc_precision_at_3_max\n value: 50.58931825484506\n - type: nauc_precision_at_3_std\n value: 20.852009214609442\n - type: nauc_precision_at_5_diff1\n value: 7.840087177549876\n - type: nauc_precision_at_5_max\n value: 46.99388755575109\n - type: nauc_precision_at_5_std\n value: 23.048702393099834\n - type: nauc_recall_at_1000_diff1\n value: -15.591898698252932\n - type: nauc_recall_at_1000_max\n value: 20.5459842853537\n - type: nauc_recall_at_1000_std\n value: 38.901341499282395\n - type: nauc_recall_at_100_diff1\n value: -5.290395978742165\n - type: nauc_recall_at_100_max\n value: 31.340480360546863\n - type: nauc_recall_at_100_std\n value: 33.68979357205046\n - type: nauc_recall_at_10_diff1\n value: 5.96500199792656\n - type: nauc_recall_at_10_max\n value: 46.1251529616225\n - type: nauc_recall_at_10_std\n value: 25.409433135253543\n - type: nauc_recall_at_1_diff1\n value: 27.297525357600776\n - type: nauc_recall_at_1_max\n value: 47.18835074959486\n - type: nauc_recall_at_1_std\n value: 18.304203168281834\n - type: nauc_recall_at_20_diff1\n value: 3.4438127279827833\n - type: nauc_recall_at_20_max\n value: 42.36095587714498\n - type: nauc_recall_at_20_std\n value: 27.36790051279787\n - type: nauc_recall_at_3_diff1\n value: 13.165017224718916\n - type: nauc_recall_at_3_max\n value: 50.589318254845054\n - type: nauc_recall_at_3_std\n value: 20.852009214609435\n - type: nauc_recall_at_5_diff1\n value: 7.840087177549891\n - type: nauc_recall_at_5_max\n value: 46.99388755575112\n - type: nauc_recall_at_5_std\n value: 
23.048702393099845\n - type: ndcg_at_1\n value: 14.701\n - type: ndcg_at_10\n value: 26.909\n - type: ndcg_at_100\n value: 32.727000000000004\n - type: ndcg_at_1000\n value: 36.086\n - type: ndcg_at_20\n value: 29.236\n - type: ndcg_at_3\n value: 22.004\n - type: ndcg_at_5\n value: 24.615000000000002\n - type: precision_at_1\n value: 14.701\n - type: precision_at_10\n value: 4.062\n - type: precision_at_100\n value: 0.688\n - type: precision_at_1000\n value: 0.096\n - type: precision_at_20\n value: 2.488\n - type: precision_at_3\n value: 9.036\n - type: precision_at_5\n value: 6.699\n - type: recall_at_1\n value: 14.701\n - type: recall_at_10\n value: 40.622\n - type: recall_at_100\n value: 68.796\n - type: recall_at_1000\n value: 96.314\n - type: recall_at_20\n value: 49.754\n - type: recall_at_3\n value: 27.108999999999998\n - type: recall_at_5\n value: 33.497\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB MultilingualSentiment (default)\n revision: 46958b007a63fdbf239b7672c25d0bea67b5ea1a\n split: test\n type: C-MTEB/MultilingualSentiment-classification\n metrics:\n - type: accuracy\n value: 73.20999999999998\n - type: f1\n value: 73.18755986777474\n - type: f1_weighted\n value: 73.18755986777475\n - type: main_score\n value: 73.20999999999998\n task:\n type: Classification\n - dataset:\n config: default\n name: MTEB NFCorpus (default)\n revision: ec0fa4fe99da2ff19ca1214b7966684033a58814\n split: test\n type: mteb/nfcorpus\n metrics:\n - type: map_at_1\n value: 4.822\n - type: map_at_10\n value: 13.144\n - type: map_at_100\n value: 17.254\n - type: map_at_1000\n value: 18.931\n - type: map_at_20\n value: 14.834\n - type: map_at_3\n value: 8.975\n - type: map_at_5\n value: 10.922\n - type: mrr_at_1\n value: 47.059\n - type: mrr_at_10\n value: 55.806999999999995\n - type: mrr_at_100\n value: 56.286\n - type: mrr_at_1000\n value: 56.327000000000005\n - type: mrr_at_20\n value: 56.00000000000001\n - type: mrr_at_3\n value: 54.17999999999999\n 
- type: mrr_at_5\n value: 55.155\n - type: ndcg_at_1\n value: 44.427\n - type: ndcg_at_10\n value: 36.623\n - type: ndcg_at_100\n value: 33.664\n - type: ndcg_at_1000\n value: 42.538\n - type: ndcg_at_20\n value: 34.066\n - type: ndcg_at_3\n value: 41.118\n - type: ndcg_at_5\n value: 39.455\n - type: precision_at_1\n value: 46.44\n - type: precision_at_10\n value: 28.607\n - type: precision_at_100\n value: 9.189\n - type: precision_at_1000\n value: 2.261\n - type: precision_at_20\n value: 21.238\n - type: precision_at_3\n value: 39.628\n - type: precision_at_5\n value: 35.604\n - type: recall_at_1\n value: 4.822\n - type: recall_at_10\n value: 17.488999999999997\n - type: recall_at_100\n value: 35.052\n - type: recall_at_1000\n value: 66.67999999999999\n - type: recall_at_20\n value: 21.343999999999998\n - type: recall_at_3\n value: 10.259\n - type: recall_at_5\n value: 13.406\n - type: main_score\n value: 36.623\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB NQ (default)\n revision: b774495ed302d8c44a3a7ea25c90dbce03968f31\n split: test\n type: mteb/nq\n metrics:\n - type: map_at_1\n value: 41.411\n - type: map_at_10\n value: 57.179\n - type: map_at_100\n value: 57.945\n - type: map_at_1000\n value: 57.967999999999996\n - type: map_at_20\n value: 57.687\n - type: map_at_3\n value: 53.46300000000001\n - type: map_at_5\n value: 55.696999999999996\n - type: mrr_at_1\n value: 46.233999999999995\n - type: mrr_at_10\n value: 59.831999999999994\n - type: mrr_at_100\n value: 60.33500000000001\n - type: mrr_at_1000\n value: 60.348\n - type: mrr_at_20\n value: 60.167\n - type: mrr_at_3\n value: 56.972\n - type: mrr_at_5\n value: 58.74\n - type: ndcg_at_1\n value: 46.205\n - type: ndcg_at_10\n value: 64.23100000000001\n - type: ndcg_at_100\n value: 67.242\n - type: ndcg_at_1000\n value: 67.72500000000001\n - type: ndcg_at_20\n value: 65.77300000000001\n - type: ndcg_at_3\n value: 57.516\n - type: ndcg_at_5\n value: 61.11600000000001\n - type: 
precision_at_1\n value: 46.205\n - type: precision_at_10\n value: 9.873\n - type: precision_at_100\n value: 1.158\n - type: precision_at_1000\n value: 0.12\n - type: precision_at_20\n value: 5.319\n - type: precision_at_3\n value: 25.424999999999997\n - type: precision_at_5\n value: 17.375\n - type: recall_at_1\n value: 41.411\n - type: recall_at_10\n value: 82.761\n - type: recall_at_100\n value: 95.52199999999999\n - type: recall_at_1000\n value: 99.02499999999999\n - type: recall_at_20\n value: 88.34\n - type: recall_at_3\n value: 65.73\n - type: recall_at_5\n value: 73.894\n - type: main_score\n value: 64.23100000000001\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB Ocnli (default)\n revision: 66e76a618a34d6d565d5538088562851e6daa7ec\n split: validation\n type: C-MTEB/OCNLI\n metrics:\n - type: cosine_accuracy\n value: 62.3714131023281\n - type: cosine_accuracy_threshold\n value: 79.70921993255615\n - type: cosine_ap\n value: 66.41380155495659\n - type: cosine_f1\n value: 68.89547185780786\n - type: cosine_f1_threshold\n value: 72.91591167449951\n - type: cosine_precision\n value: 57.485875706214685\n - type: cosine_recall\n value: 85.95564941921859\n - type: dot_accuracy\n value: 60.47644829453167\n - type: dot_accuracy_threshold\n value: 36627.362060546875\n - type: dot_ap\n value: 63.696303449293204\n - type: dot_f1\n value: 68.3986041101202\n - type: dot_f1_threshold\n value: 30452.72216796875\n - type: dot_precision\n value: 54.04411764705882\n - type: dot_recall\n value: 93.13621964097149\n - type: euclidean_accuracy\n value: 63.02111532214402\n - type: euclidean_accuracy_threshold\n value: 1392.76762008667\n - type: euclidean_ap\n value: 66.65907089443218\n - type: euclidean_f1\n value: 69.05036524413688\n - type: euclidean_f1_threshold\n value: 1711.5310668945312\n - type: euclidean_precision\n value: 54.29262394195889\n - type: euclidean_recall\n value: 94.82576557550159\n - type: main_score\n value: 63.02111532214402\n - type: 
manhattan_accuracy\n value: 62.75040606388739\n - type: manhattan_accuracy_threshold\n value: 32475.347900390625\n - type: manhattan_ap\n value: 66.50943585125434\n - type: manhattan_f1\n value: 69.08382066276802\n - type: manhattan_f1_threshold\n value: 41238.470458984375\n - type: manhattan_precision\n value: 54.75896168108776\n - type: manhattan_recall\n value: 93.55860612460401\n - type: max_accuracy\n value: 63.02111532214402\n - type: max_ap\n value: 66.65907089443218\n - type: max_f1\n value: 69.08382066276802\n - type: max_precision\n value: 57.485875706214685\n - type: max_recall\n value: 94.82576557550159\n - type: similarity_accuracy\n value: 62.3714131023281\n - type: similarity_accuracy_threshold\n value: 79.70921993255615\n - type: similarity_ap\n value: 66.41380155495659\n - type: similarity_f1\n value: 68.89547185780786\n - type: similarity_f1_threshold\n value: 72.91591167449951\n - type: similarity_precision\n value: 57.485875706214685\n - type: similarity_recall\n value: 85.95564941921859\n task:\n type: PairClassification\n - dataset:\n config: default\n name: MTEB OnlineShopping (default)\n revision: e610f2ebd179a8fda30ae534c3878750a96db120\n split: test\n type: C-MTEB/OnlineShopping-classification\n metrics:\n - type: accuracy\n value: 91.88000000000001\n - type: ap\n value: 89.52463684448476\n - type: ap_weighted\n value: 89.52463684448476\n - type: f1\n value: 91.86313022306673\n - type: f1_weighted\n value: 91.87806318146912\n - type: main_score\n value: 91.88000000000001\n task:\n type: Classification\n - dataset:\n config: en\n name: MTEB OpusparcusPC (en)\n revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a\n split: test.full\n type: GEM/opusparcus\n metrics:\n - type: cosine_accuracy\n value: 92.65578635014838\n - type: cosine_accuracy_threshold\n value: 74.02530312538147\n - type: cosine_ap\n value: 98.3834226153613\n - type: cosine_f1\n value: 94.92567913890312\n - type: cosine_f1_threshold\n value: 74.02530312538147\n - type: 
cosine_precision\n value: 95.562435500516\n - type: cosine_recall\n value: 94.29735234215886\n - type: dot_accuracy\n value: 91.54302670623146\n - type: dot_accuracy_threshold\n value: 34452.29187011719\n - type: dot_ap\n value: 98.1237257754439\n - type: dot_f1\n value: 94.22400803616273\n - type: dot_f1_threshold\n value: 33670.41931152344\n - type: dot_precision\n value: 92.9633300297324\n - type: dot_recall\n value: 95.5193482688391\n - type: euclidean_accuracy\n value: 92.28486646884274\n - type: euclidean_accuracy_threshold\n value: 1602.8022766113281\n - type: euclidean_ap\n value: 98.3099021504706\n - type: euclidean_f1\n value: 94.75277497477296\n - type: euclidean_f1_threshold\n value: 1604.7462463378906\n - type: euclidean_precision\n value: 93.89999999999999\n - type: euclidean_recall\n value: 95.62118126272912\n - type: main_score\n value: 98.3834226153613\n - type: manhattan_accuracy\n value: 92.2106824925816\n - type: manhattan_accuracy_threshold\n value: 38872.90954589844\n - type: manhattan_ap\n value: 98.28694101230218\n - type: manhattan_f1\n value: 94.67815509376584\n - type: manhattan_f1_threshold\n value: 38872.90954589844\n - type: manhattan_precision\n value: 94.24823410696267\n - type: manhattan_recall\n value: 95.11201629327903\n - type: max_accuracy\n value: 92.65578635014838\n - type: max_ap\n value: 98.3834226153613\n - type: max_f1\n value: 94.92567913890312\n - type: max_precision\n value: 95.562435500516\n - type: max_recall\n value: 95.62118126272912\n - type: similarity_accuracy\n value: 92.65578635014838\n - type: similarity_accuracy_threshold\n value: 74.02530312538147\n - type: similarity_ap\n value: 98.3834226153613\n - type: similarity_f1\n value: 94.92567913890312\n - type: similarity_f1_threshold\n value: 74.02530312538147\n - type: similarity_precision\n value: 95.562435500516\n - type: similarity_recall\n value: 94.29735234215886\n task:\n type: PairClassification\n - dataset:\n config: de\n name: MTEB OpusparcusPC (de)\n 
revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a\n split: test.full\n type: GEM/opusparcus\n metrics:\n - type: cosine_accuracy\n value: 87.72178850248403\n - type: cosine_accuracy_threshold\n value: 73.33863377571106\n - type: cosine_ap\n value: 96.98901408834976\n - type: cosine_f1\n value: 91.89944134078212\n - type: cosine_f1_threshold\n value: 71.45810127258301\n - type: cosine_precision\n value: 89.64577656675749\n - type: cosine_recall\n value: 94.26934097421203\n - type: dot_accuracy\n value: 86.30234208658624\n - type: dot_accuracy_threshold\n value: 32027.130126953125\n - type: dot_ap\n value: 96.12260574893256\n - type: dot_f1\n value: 91.31602506714414\n - type: dot_f1_threshold\n value: 30804.376220703125\n - type: dot_precision\n value: 85.93091828138164\n - type: dot_recall\n value: 97.42120343839542\n - type: euclidean_accuracy\n value: 87.9347054648687\n - type: euclidean_accuracy_threshold\n value: 1609.6670150756836\n - type: euclidean_ap\n value: 97.00238860358252\n - type: euclidean_f1\n value: 92.1089063221043\n - type: euclidean_f1_threshold\n value: 1641.8487548828125\n - type: euclidean_precision\n value: 89.10714285714286\n - type: euclidean_recall\n value: 95.31996179560649\n - type: main_score\n value: 97.00238860358252\n - type: manhattan_accuracy\n value: 87.72178850248403\n - type: manhattan_accuracy_threshold\n value: 40137.060546875\n - type: manhattan_ap\n value: 96.98653728159941\n - type: manhattan_f1\n value: 92.03865623561896\n - type: manhattan_f1_threshold\n value: 40137.060546875\n - type: manhattan_precision\n value: 88.80994671403198\n - type: manhattan_recall\n value: 95.51098376313276\n - type: max_accuracy\n value: 87.9347054648687\n - type: max_ap\n value: 97.00238860358252\n - type: max_f1\n value: 92.1089063221043\n - type: max_precision\n value: 89.64577656675749\n - type: max_recall\n value: 97.42120343839542\n - type: similarity_accuracy\n value: 87.72178850248403\n - type: similarity_accuracy_threshold\n value: 
73.33863377571106\n - type: similarity_ap\n value: 96.98901408834976\n - type: similarity_f1\n value: 91.89944134078212\n - type: similarity_f1_threshold\n value: 71.45810127258301\n - type: similarity_precision\n value: 89.64577656675749\n - type: similarity_recall\n value: 94.26934097421203\n task:\n type: PairClassification\n - dataset:\n config: fr\n name: MTEB OpusparcusPC (fr)\n revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a\n split: test.full\n type: GEM/opusparcus\n metrics:\n - type: cosine_accuracy\n value: 80.92643051771117\n - type: cosine_accuracy_threshold\n value: 76.68856382369995\n - type: cosine_ap\n value: 93.74622381534307\n - type: cosine_f1\n value: 87.12328767123287\n - type: cosine_f1_threshold\n value: 71.64022922515869\n - type: cosine_precision\n value: 80.64243448858834\n - type: cosine_recall\n value: 94.73684210526315\n - type: dot_accuracy\n value: 80.858310626703\n - type: dot_accuracy_threshold\n value: 34028.3935546875\n - type: dot_ap\n value: 91.18448457633308\n - type: dot_f1\n value: 86.82606657290202\n - type: dot_f1_threshold\n value: 34028.3935546875\n - type: dot_precision\n value: 82.2380106571936\n - type: dot_recall\n value: 91.9563058589871\n - type: euclidean_accuracy\n value: 80.858310626703\n - type: euclidean_accuracy_threshold\n value: 1595.7651138305664\n - type: euclidean_ap\n value: 93.8182717829648\n - type: euclidean_f1\n value: 87.04044117647058\n - type: euclidean_f1_threshold\n value: 1609.2475891113281\n - type: euclidean_precision\n value: 81.00940975192472\n - type: euclidean_recall\n value: 94.04170804369414\n - type: main_score\n value: 93.8182717829648\n - type: manhattan_accuracy\n value: 80.99455040871935\n - type: manhattan_accuracy_threshold\n value: 38092.132568359375\n - type: manhattan_ap\n value: 93.77563401151711\n - type: manhattan_f1\n value: 86.91983122362869\n - type: manhattan_f1_threshold\n value: 38092.132568359375\n - type: manhattan_precision\n value: 82.32682060390763\n - type: 
manhattan_recall\n value: 92.05561072492551\n - type: max_accuracy\n value: 80.99455040871935\n - type: max_ap\n value: 93.8182717829648\n - type: max_f1\n value: 87.12328767123287\n - type: max_precision\n value: 82.32682060390763\n - type: max_recall\n value: 94.73684210526315\n - type: similarity_accuracy\n value: 80.92643051771117\n - type: similarity_accuracy_threshold\n value: 76.68856382369995\n - type: similarity_ap\n value: 93.74622381534307\n - type: similarity_f1\n value: 87.12328767123287\n - type: similarity_f1_threshold\n value: 71.64022922515869\n - type: similarity_precision\n value: 80.64243448858834\n - type: similarity_recall\n value: 94.73684210526315\n task:\n type: PairClassification\n - dataset:\n config: ru\n name: MTEB OpusparcusPC (ru)\n revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a\n split: test.full\n type: GEM/opusparcus\n metrics:\n - type: cosine_accuracy\n value: 76.83823529411765\n - type: cosine_accuracy_threshold\n value: 72.70769476890564\n - type: cosine_ap\n value: 89.56692049908222\n - type: cosine_f1\n value: 83.99832003359934\n - type: cosine_f1_threshold\n value: 70.9052324295044\n - type: cosine_precision\n value: 76.16146230007617\n - type: cosine_recall\n value: 93.63295880149812\n - type: dot_accuracy\n value: 76.28676470588235\n - type: dot_accuracy_threshold\n value: 33740.68908691406\n - type: dot_ap\n value: 87.77185177141567\n - type: dot_f1\n value: 83.62251375370292\n - type: dot_f1_threshold\n value: 32726.611328125\n - type: dot_precision\n value: 76.29343629343629\n - type: dot_recall\n value: 92.50936329588015\n - type: euclidean_accuracy\n value: 77.32843137254902\n - type: euclidean_accuracy_threshold\n value: 1566.510009765625\n - type: euclidean_ap\n value: 89.60605626791111\n - type: euclidean_f1\n value: 84.06546080964686\n - type: euclidean_f1_threshold\n value: 1576.4202117919922\n - type: euclidean_precision\n value: 77.83094098883574\n - type: euclidean_recall\n value: 91.38576779026218\n - 
type: main_score\n value: 89.60605626791111\n - type: manhattan_accuracy\n value: 76.89950980392157\n - type: manhattan_accuracy_threshold\n value: 38202.215576171875\n - type: manhattan_ap\n value: 89.55766894104868\n - type: manhattan_f1\n value: 83.80462724935732\n - type: manhattan_f1_threshold\n value: 38934.375\n - type: manhattan_precision\n value: 77.25118483412322\n - type: manhattan_recall\n value: 91.57303370786516\n - type: max_accuracy\n value: 77.32843137254902\n - type: max_ap\n value: 89.60605626791111\n - type: max_f1\n value: 84.06546080964686\n - type: max_precision\n value: 77.83094098883574\n - type: max_recall\n value: 93.63295880149812\n - type: similarity_accuracy\n value: 76.83823529411765\n - type: similarity_accuracy_threshold\n value: 72.70769476890564\n - type: similarity_ap\n value: 89.56692049908222\n - type: similarity_f1\n value: 83.99832003359934\n - type: similarity_f1_threshold\n value: 70.9052324295044\n - type: similarity_precision\n value: 76.16146230007617\n - type: similarity_recall\n value: 93.63295880149812\n task:\n type: PairClassification\n - dataset:\n config: default\n name: MTEB PAC (default)\n revision: fc69d1c153a8ccdcf1eef52f4e2a27f88782f543\n split: test\n type: laugustyniak/abusive-clauses-pl\n metrics:\n - type: accuracy\n value: 68.39559803069794\n - type: ap\n value: 77.68074206719457\n - type: ap_weighted\n value: 77.68074206719457\n - type: f1\n value: 66.23485605467732\n - type: f1_weighted\n value: 69.03201442129347\n - type: main_score\n value: 68.39559803069794\n task:\n type: Classification\n - dataset:\n config: default\n name: MTEB PAWSX (default)\n revision: 9c6a90e430ac22b5779fb019a23e820b11a8b5e1\n split: test\n type: C-MTEB/PAWSX\n metrics:\n - type: cosine_pearson\n value: 13.161523266433587\n - type: cosine_spearman\n value: 15.557333873773386\n - type: euclidean_pearson\n value: 17.147508431907525\n - type: euclidean_spearman\n value: 15.664112857732146\n - type: main_score\n value: 
15.557333873773386\n - type: manhattan_pearson\n value: 17.130875906264386\n - type: manhattan_spearman\n value: 15.624397342229637\n - type: pearson\n value: 13.161523266433587\n - type: spearman\n value: 15.557333873773386\n task:\n type: STS\n - dataset:\n config: default\n name: MTEB PSC (default)\n revision: d05a294af9e1d3ff2bfb6b714e08a24a6cabc669\n split: test\n type: PL-MTEB/psc-pairclassification\n metrics:\n - type: cosine_accuracy\n value: 97.86641929499072\n - type: cosine_accuracy_threshold\n value: 79.0391206741333\n - type: cosine_ap\n value: 99.19403807771533\n - type: cosine_f1\n value: 96.45608628659475\n - type: cosine_f1_threshold\n value: 79.0391206741333\n - type: cosine_precision\n value: 97.50778816199377\n - type: cosine_recall\n value: 95.42682926829268\n - type: dot_accuracy\n value: 98.14471243042672\n - type: dot_accuracy_threshold\n value: 29808.1787109375\n - type: dot_ap\n value: 99.331999859971\n - type: dot_f1\n value: 97.01492537313433\n - type: dot_f1_threshold\n value: 29808.1787109375\n - type: dot_precision\n value: 95.02923976608187\n - type: dot_recall\n value: 99.08536585365853\n - type: euclidean_accuracy\n value: 97.49536178107606\n - type: euclidean_accuracy_threshold\n value: 1276.227855682373\n - type: euclidean_ap\n value: 98.91056467717377\n - type: euclidean_f1\n value: 95.83975346687212\n - type: euclidean_f1_threshold\n value: 1276.227855682373\n - type: euclidean_precision\n value: 96.88473520249221\n - type: euclidean_recall\n value: 94.8170731707317\n - type: main_score\n value: 99.331999859971\n - type: manhattan_accuracy\n value: 97.49536178107606\n - type: manhattan_accuracy_threshold\n value: 31097.674560546875\n - type: manhattan_ap\n value: 98.95694691792707\n - type: manhattan_f1\n value: 95.83975346687212\n - type: manhattan_f1_threshold\n value: 31097.674560546875\n - type: manhattan_precision\n value: 96.88473520249221\n - type: manhattan_recall\n value: 94.8170731707317\n - type: max_accuracy\n 
value: 98.14471243042672\n - type: max_ap\n value: 99.331999859971\n - type: max_f1\n value: 97.01492537313433\n - type: max_precision\n value: 97.50778816199377\n - type: max_recall\n value: 99.08536585365853\n - type: similarity_accuracy\n value: 97.86641929499072\n - type: similarity_accuracy_threshold\n value: 79.0391206741333\n - type: similarity_ap\n value: 99.19403807771533\n - type: similarity_f1\n value: 96.45608628659475\n - type: similarity_f1_threshold\n value: 79.0391206741333\n - type: similarity_precision\n value: 97.50778816199377\n - type: similarity_recall\n value: 95.42682926829268\n task:\n type: PairClassification\n - dataset:\n config: en\n name: MTEB PawsXPairClassification (en)\n revision: 8a04d940a42cd40658986fdd8e3da561533a3646\n split: test\n type: google-research-datasets/paws-x\n metrics:\n - type: cosine_accuracy\n value: 61.8\n - type: cosine_accuracy_threshold\n value: 99.5664119720459\n - type: cosine_ap\n value: 60.679317786040585\n - type: cosine_f1\n value: 63.17354143441101\n - type: cosine_f1_threshold\n value: 97.22164869308472\n - type: cosine_precision\n value: 47.6457399103139\n - type: cosine_recall\n value: 93.71554575523705\n - type: dot_accuracy\n value: 55.7\n - type: dot_accuracy_threshold\n value: 48353.62548828125\n - type: dot_ap\n value: 48.53805970536875\n - type: dot_f1\n value: 62.42214532871972\n - type: dot_f1_threshold\n value: 38215.53955078125\n - type: dot_precision\n value: 45.48663640948058\n - type: dot_recall\n value: 99.44873208379272\n - type: euclidean_accuracy\n value: 61.75000000000001\n - type: euclidean_accuracy_threshold\n value: 189.0761137008667\n - type: euclidean_ap\n value: 60.55517418691518\n - type: euclidean_f1\n value: 63.07977736549165\n - type: euclidean_f1_threshold\n value: 504.3168067932129\n - type: euclidean_precision\n value: 47.53914988814318\n - type: euclidean_recall\n value: 93.71554575523705\n - type: main_score\n value: 60.679317786040585\n - type: manhattan_accuracy\n 
value: 61.9\n - type: manhattan_accuracy_threshold\n value: 4695.778274536133\n - type: manhattan_ap\n value: 60.48686620413608\n - type: manhattan_f1\n value: 62.92880855772778\n - type: manhattan_f1_threshold\n value: 12542.36831665039\n - type: manhattan_precision\n value: 47.28381374722838\n - type: manhattan_recall\n value: 94.04630650496141\n - type: max_accuracy\n value: 61.9\n - type: max_ap\n value: 60.679317786040585\n - type: max_f1\n value: 63.17354143441101\n - type: max_precision\n value: 47.6457399103139\n - type: max_recall\n value: 99.44873208379272\n - type: similarity_accuracy\n value: 61.8\n - type: similarity_accuracy_threshold\n value: 99.5664119720459\n - type: similarity_ap\n value: 60.679317786040585\n - type: similarity_f1\n value: 63.17354143441101\n - type: similarity_f1_threshold\n value: 97.22164869308472\n - type: similarity_precision\n value: 47.6457399103139\n - type: similarity_recall\n value: 93.71554575523705\n task:\n type: PairClassification\n - dataset:\n config: de\n name: MTEB PawsXPairClassification (de)\n revision: 8a04d940a42cd40658986fdd8e3da561533a3646\n split: test\n type: google-research-datasets/paws-x\n metrics:\n - type: cosine_accuracy\n value: 60.25\n - type: cosine_accuracy_threshold\n value: 99.54338073730469\n - type: cosine_ap\n value: 56.7863613689054\n - type: cosine_f1\n value: 62.23499820337766\n - type: cosine_f1_threshold\n value: 89.95014429092407\n - type: cosine_precision\n value: 45.86864406779661\n - type: cosine_recall\n value: 96.75977653631284\n - type: dot_accuracy\n value: 56.8\n - type: dot_accuracy_threshold\n value: 47349.78332519531\n - type: dot_ap\n value: 49.7857806061729\n - type: dot_f1\n value: 62.31225986727209\n - type: dot_f1_threshold\n value: 30143.206787109375\n - type: dot_precision\n value: 45.32520325203252\n - type: dot_recall\n value: 99.66480446927373\n - type: euclidean_accuracy\n value: 60.3\n - type: euclidean_accuracy_threshold\n value: 219.78106498718262\n - type: 
euclidean_ap\n value: 56.731544327179606\n - type: euclidean_f1\n value: 62.19895287958115\n - type: euclidean_f1_threshold\n value: 1792.1623229980469\n - type: euclidean_precision\n value: 45.22842639593909\n - type: euclidean_recall\n value: 99.55307262569832\n - type: main_score\n value: 56.7863613689054\n - type: manhattan_accuracy\n value: 60.150000000000006\n - type: manhattan_accuracy_threshold\n value: 5104.503631591797\n - type: manhattan_ap\n value: 56.70304479768734\n - type: manhattan_f1\n value: 62.22067039106145\n - type: manhattan_f1_threshold\n value: 42839.471435546875\n - type: manhattan_precision\n value: 45.2513966480447\n - type: manhattan_recall\n value: 99.55307262569832\n - type: max_accuracy\n value: 60.3\n - type: max_ap\n value: 56.7863613689054\n - type: max_f1\n value: 62.31225986727209\n - type: max_precision\n value: 45.86864406779661\n - type: max_recall\n value: 99.66480446927373\n - type: similarity_accuracy\n value: 60.25\n - type: similarity_accuracy_threshold\n value: 99.54338073730469\n - type: similarity_ap\n value: 56.7863613689054\n - type: similarity_f1\n value: 62.23499820337766\n - type: similarity_f1_threshold\n value: 89.95014429092407\n - type: similarity_precision\n value: 45.86864406779661\n - type: similarity_recall\n value: 96.75977653631284\n task:\n type: PairClassification\n - dataset:\n config: es\n name: MTEB PawsXPairClassification (es)\n revision: 8a04d940a42cd40658986fdd8e3da561533a3646\n split: test\n type: google-research-datasets/paws-x\n metrics:\n - type: cosine_accuracy\n value: 59.699999999999996\n - type: cosine_accuracy_threshold\n value: 99.55930709838867\n - type: cosine_ap\n value: 57.31662248806265\n - type: cosine_f1\n value: 62.444061962134256\n - type: cosine_f1_threshold\n value: 74.75898265838623\n - type: cosine_precision\n value: 45.3953953953954\n - type: cosine_recall\n value: 100.0\n - type: dot_accuracy\n value: 55.900000000000006\n - type: dot_accuracy_threshold\n value: 
47512.90283203125\n - type: dot_ap\n value: 49.39339147787568\n - type: dot_f1\n value: 62.487082328625554\n - type: dot_f1_threshold\n value: 34989.03503417969\n - type: dot_precision\n value: 45.44088176352705\n - type: dot_recall\n value: 100.0\n - type: euclidean_accuracy\n value: 59.599999999999994\n - type: euclidean_accuracy_threshold\n value: 200.82547664642334\n - type: euclidean_ap\n value: 57.19737488445163\n - type: euclidean_f1\n value: 62.444061962134256\n - type: euclidean_f1_threshold\n value: 1538.8837814331055\n - type: euclidean_precision\n value: 45.3953953953954\n - type: euclidean_recall\n value: 100.0\n - type: main_score\n value: 57.31662248806265\n - type: manhattan_accuracy\n value: 59.550000000000004\n - type: manhattan_accuracy_threshold\n value: 5016.501617431641\n - type: manhattan_ap\n value: 57.089959907945065\n - type: manhattan_f1\n value: 62.444061962134256\n - type: manhattan_f1_threshold\n value: 37523.53515625\n - type: manhattan_precision\n value: 45.3953953953954\n - type: manhattan_recall\n value: 100.0\n - type: max_accuracy\n value: 59.699999999999996\n - type: max_ap\n value: 57.31662248806265\n - type: max_f1\n value: 62.487082328625554\n - type: max_precision\n value: 45.44088176352705\n - type: max_recall\n value: 100.0\n - type: similarity_accuracy\n value: 59.699999999999996\n - type: similarity_accuracy_threshold\n value: 99.55930709838867\n - type: similarity_ap\n value: 57.31662248806265\n - type: similarity_f1\n value: 62.444061962134256\n - type: similarity_f1_threshold\n value: 74.75898265838623\n - type: similarity_precision\n value: 45.3953953953954\n - type: similarity_recall\n value: 100.0\n task:\n type: PairClassification\n - dataset:\n config: fr\n name: MTEB PawsXPairClassification (fr)\n revision: 8a04d940a42cd40658986fdd8e3da561533a3646\n split: test\n type: google-research-datasets/paws-x\n metrics:\n - type: cosine_accuracy\n value: 61.150000000000006\n - type: cosine_accuracy_threshold\n value: 
99.36153888702393\n - type: cosine_ap\n value: 59.43845317938599\n - type: cosine_f1\n value: 62.51298026998961\n - type: cosine_f1_threshold\n value: 76.77866220474243\n - type: cosine_precision\n value: 45.468277945619334\n - type: cosine_recall\n value: 100.0\n - type: dot_accuracy\n value: 55.75\n - type: dot_accuracy_threshold\n value: 48931.55212402344\n - type: dot_ap\n value: 50.15949290538757\n - type: dot_f1\n value: 62.53462603878117\n - type: dot_f1_threshold\n value: 34415.7958984375\n - type: dot_precision\n value: 45.4911838790932\n - type: dot_recall\n value: 100.0\n - type: euclidean_accuracy\n value: 61.050000000000004\n - type: euclidean_accuracy_threshold\n value: 240.8097267150879\n - type: euclidean_ap\n value: 59.367971294226216\n - type: euclidean_f1\n value: 62.51298026998961\n - type: euclidean_f1_threshold\n value: 1444.132423400879\n - type: euclidean_precision\n value: 45.468277945619334\n - type: euclidean_recall\n value: 100.0\n - type: main_score\n value: 59.43845317938599\n - type: manhattan_accuracy\n value: 60.95\n - type: manhattan_accuracy_threshold\n value: 5701.206207275391\n - type: manhattan_ap\n value: 59.30094096378774\n - type: manhattan_f1\n value: 62.53462603878117\n - type: manhattan_f1_threshold\n value: 33445.672607421875\n - type: manhattan_precision\n value: 45.4911838790932\n - type: manhattan_recall\n value: 100.0\n - type: max_accuracy\n value: 61.150000000000006\n - type: max_ap\n value: 59.43845317938599\n - type: max_f1\n value: 62.53462603878117\n - type: max_precision\n value: 45.4911838790932\n - type: max_recall\n value: 100.0\n - type: similarity_accuracy\n value: 61.150000000000006\n - type: similarity_accuracy_threshold\n value: 99.36153888702393\n - type: similarity_ap\n value: 59.43845317938599\n - type: similarity_f1\n value: 62.51298026998961\n - type: similarity_f1_threshold\n value: 76.77866220474243\n - type: similarity_precision\n value: 45.468277945619334\n - type: similarity_recall\n value: 
100.0\n task:\n type: PairClassification\n - dataset:\n config: zh\n name: MTEB PawsXPairClassification (zh)\n revision: 8a04d940a42cd40658986fdd8e3da561533a3646\n split: test\n type: google-research-datasets/paws-x\n metrics:\n - type: cosine_accuracy\n value: 58.85\n - type: cosine_accuracy_threshold\n value: 99.73838329315186\n - type: cosine_ap\n value: 54.66913160570546\n - type: cosine_f1\n value: 62.32136632973162\n - type: cosine_f1_threshold\n value: 76.4499306678772\n - type: cosine_precision\n value: 45.265822784810126\n - type: cosine_recall\n value: 100.0\n - type: dot_accuracy\n value: 56.25\n - type: dot_accuracy_threshold\n value: 47351.9287109375\n - type: dot_ap\n value: 48.5266232989438\n - type: dot_f1\n value: 62.277951933124356\n - type: dot_f1_threshold\n value: 31325.28076171875\n - type: dot_precision\n value: 45.220030349013655\n - type: dot_recall\n value: 100.0\n - type: euclidean_accuracy\n value: 58.9\n - type: euclidean_accuracy_threshold\n value: 144.24468278884888\n - type: euclidean_ap\n value: 54.66981490353506\n - type: euclidean_f1\n value: 62.32136632973162\n - type: euclidean_f1_threshold\n value: 1484.908676147461\n - type: euclidean_precision\n value: 45.265822784810126\n - type: euclidean_recall\n value: 100.0\n - type: main_score\n value: 54.66981490353506\n - type: manhattan_accuracy\n value: 58.9\n - type: manhattan_accuracy_threshold\n value: 3586.785125732422\n - type: manhattan_ap\n value: 54.668355260247736\n - type: manhattan_f1\n value: 62.32136632973162\n - type: manhattan_f1_threshold\n value: 36031.22863769531\n - type: manhattan_precision\n value: 45.265822784810126\n - type: manhattan_recall\n value: 100.0\n - type: max_accuracy\n value: 58.9\n - type: max_ap\n value: 54.66981490353506\n - type: max_f1\n value: 62.32136632973162\n - type: max_precision\n value: 45.265822784810126\n - type: max_recall\n value: 100.0\n - type: similarity_accuracy\n value: 58.85\n - type: similarity_accuracy_threshold\n value: 
99.73838329315186\n - type: similarity_ap\n value: 54.66913160570546\n - type: similarity_f1\n value: 62.32136632973162\n - type: similarity_f1_threshold\n value: 76.4499306678772\n - type: similarity_precision\n value: 45.265822784810126\n - type: similarity_recall\n value: 100.0\n task:\n type: PairClassification\n - dataset:\n config: default\n name: MTEB PolEmo2.0-IN (default)\n revision: d90724373c70959f17d2331ad51fb60c71176b03\n split: test\n type: PL-MTEB/polemo2_in\n metrics:\n - type: accuracy\n value: 83.75346260387812\n - type: f1\n value: 81.98304891214909\n - type: f1_weighted\n value: 84.29623200830078\n - type: main_score\n value: 83.75346260387812\n task:\n type: Classification\n - dataset:\n config: default\n name: MTEB PolEmo2.0-OUT (default)\n revision: 6a21ab8716e255ab1867265f8b396105e8aa63d4\n split: test\n type: PL-MTEB/polemo2_out\n metrics:\n - type: accuracy\n value: 66.53846153846153\n - type: f1\n value: 52.71826064368638\n - type: f1_weighted\n value: 69.10010124630334\n - type: main_score\n value: 66.53846153846153\n task:\n type: Classification\n - dataset:\n config: default\n name: MTEB PPC\n revision: None\n split: test\n type: PL-MTEB/ppc-pairclassification\n metrics:\n - type: cosine_accuracy\n value: 81.8\n - type: cosine_accuracy_threshold\n value: 90.47793745994568\n - type: cosine_ap\n value: 91.42490266080884\n - type: cosine_f1\n value: 85.4632587859425\n - type: cosine_f1_threshold\n value: 90.47793745994568\n - type: cosine_precision\n value: 82.56172839506173\n - type: cosine_recall\n value: 88.57615894039735\n - type: dot_accuracy\n value: 74.6\n - type: dot_accuracy_threshold\n value: 42102.23693847656\n - type: dot_ap\n value: 86.20060009096979\n - type: dot_f1\n value: 80.02842928216063\n - type: dot_f1_threshold\n value: 38970.16906738281\n - type: dot_precision\n value: 70.1120797011208\n - type: dot_recall\n value: 93.21192052980133\n - type: euclidean_accuracy\n value: 81.5\n - type: euclidean_accuracy_threshold\n 
value: 880.433464050293\n - type: euclidean_ap\n value: 91.33143477982087\n - type: euclidean_f1\n value: 85.44600938967135\n - type: euclidean_f1_threshold\n value: 964.0384674072266\n - type: euclidean_precision\n value: 81.00890207715133\n - type: euclidean_recall\n value: 90.39735099337747\n - type: main_score\n value: 91.42490266080884\n - type: manhattan_accuracy\n value: 81.3\n - type: manhattan_accuracy_threshold\n value: 22100.830078125\n - type: manhattan_ap\n value: 91.25996158651282\n - type: manhattan_f1\n value: 85.38102643856921\n - type: manhattan_f1_threshold\n value: 24043.515014648438\n - type: manhattan_precision\n value: 80.49853372434018\n - type: manhattan_recall\n value: 90.89403973509934\n - type: max_accuracy\n value: 81.8\n - type: max_ap\n value: 91.42490266080884\n - type: max_f1\n value: 85.4632587859425\n - type: max_precision\n value: 82.56172839506173\n - type: max_recall\n value: 93.21192052980133\n - type: similarity_accuracy\n value: 81.8\n - type: similarity_accuracy_threshold\n value: 90.47793745994568\n - type: similarity_ap\n value: 91.42490266080884\n - type: similarity_f1\n value: 85.4632587859425\n - type: similarity_f1_threshold\n value: 90.47793745994568\n - type: similarity_precision\n value: 82.56172839506173\n - type: similarity_recall\n value: 88.57615894039735\n task:\n type: PairClassification\n - dataset:\n config: default\n name: MTEB QuoraRetrieval (default)\n revision: e4e08e0b7dbe3c8700f0daef558ff32256715259\n split: test\n type: mteb/quora\n metrics:\n - type: map_at_1\n value: 71.419\n - type: map_at_10\n value: 85.542\n - type: map_at_100\n value: 86.161\n - type: map_at_1000\n value: 86.175\n - type: map_at_20\n value: 85.949\n - type: map_at_3\n value: 82.623\n - type: map_at_5\n value: 84.5\n - type: mrr_at_1\n value: 82.27\n - type: mrr_at_10\n value: 88.21900000000001\n - type: mrr_at_100\n value: 88.313\n - type: mrr_at_1000\n value: 88.31400000000001\n - type: mrr_at_20\n value: 88.286\n - type: 
mrr_at_3\n value: 87.325\n - type: mrr_at_5\n value: 87.97500000000001\n - type: ndcg_at_1\n value: 82.3\n - type: ndcg_at_10\n value: 89.088\n - type: ndcg_at_100\n value: 90.217\n - type: ndcg_at_1000\n value: 90.29700000000001\n - type: ndcg_at_20\n value: 89.697\n - type: ndcg_at_3\n value: 86.435\n - type: ndcg_at_5\n value: 87.966\n - type: precision_at_1\n value: 82.3\n - type: precision_at_10\n value: 13.527000000000001\n - type: precision_at_100\n value: 1.537\n - type: precision_at_1000\n value: 0.157\n - type: precision_at_20\n value: 7.165000000000001\n - type: precision_at_3\n value: 37.92\n - type: precision_at_5\n value: 24.914\n - type: recall_at_1\n value: 71.419\n - type: recall_at_10\n value: 95.831\n - type: recall_at_100\n value: 99.64\n - type: recall_at_1000\n value: 99.988\n - type: recall_at_20\n value: 97.76599999999999\n - type: recall_at_3\n value: 88.081\n - type: recall_at_5\n value: 92.50500000000001\n - type: main_score\n value: 89.088\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB RUParaPhraserSTS (default)\n revision: 43265056790b8f7c59e0139acb4be0a8dad2c8f4\n split: test\n type: merionum/ru_paraphraser\n metrics:\n - type: cosine_pearson\n value: 67.91177744712421\n - type: cosine_spearman\n value: 76.77113726753656\n - type: euclidean_pearson\n value: 73.81454206068638\n - type: euclidean_spearman\n value: 76.92529493599028\n - type: main_score\n value: 76.77113726753656\n - type: manhattan_pearson\n value: 73.81690454439168\n - type: manhattan_spearman\n value: 76.87333776705002\n - type: pearson\n value: 67.91177744712421\n - type: spearman\n value: 76.77113726753656\n task:\n type: STS\n - dataset:\n config: default\n name: MTEB RedditClustering (default)\n revision: 24640382cdbf8abc73003fb0fa6d111a705499eb\n split: test\n type: mteb/reddit-clustering\n metrics:\n - type: main_score\n value: 55.39924225216962\n - type: v_measure\n value: 55.39924225216962\n - type: v_measure_std\n value: 
4.723802279292467\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB RedditClusteringP2P (default)\n revision: 385e3cb46b4cfa89021f56c4380204149d0efe33\n split: test\n type: mteb/reddit-clustering-p2p\n metrics:\n - type: main_score\n value: 62.87465161304012\n - type: v_measure\n value: 62.87465161304012\n - type: v_measure_std\n value: 12.082670914488473\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB RiaNewsRetrieval (default)\n revision: 82374b0bbacda6114f39ff9c5b925fa1512ca5d7\n split: test\n type: ai-forever/ria-news-retrieval\n metrics:\n - type: main_score\n value: 79.209\n - type: map_at_1\n value: 67.33\n - type: map_at_10\n value: 75.633\n - type: map_at_100\n value: 75.897\n - type: map_at_1000\n value: 75.907\n - type: map_at_20\n value: 75.804\n - type: map_at_3\n value: 74.2\n - type: map_at_5\n value: 75.13300000000001\n - type: mrr_at_1\n value: 67.31\n - type: mrr_at_10\n value: 75.62709126984095\n - type: mrr_at_100\n value: 75.89105697041113\n - type: mrr_at_1000\n value: 75.90115653883124\n - type: mrr_at_20\n value: 75.79802332308172\n - type: mrr_at_3\n value: 74.19499999999961\n - type: mrr_at_5\n value: 75.12849999999939\n - type: nauc_map_at_1000_diff1\n value: 74.30304869630591\n - type: nauc_map_at_1000_max\n value: 36.477146725784046\n - type: nauc_map_at_1000_std\n value: -20.862772498461723\n - type: nauc_map_at_100_diff1\n value: 74.29833058090355\n - type: nauc_map_at_100_max\n value: 36.483678619667884\n - type: nauc_map_at_100_std\n value: -20.856274849980135\n - type: nauc_map_at_10_diff1\n value: 74.20729220697967\n - type: nauc_map_at_10_max\n value: 36.56543146170092\n - type: nauc_map_at_10_std\n value: -20.991081015484728\n - type: nauc_map_at_1_diff1\n value: 77.38899022125185\n - type: nauc_map_at_1_max\n value: 32.45918619669731\n - type: nauc_map_at_1_std\n value: -22.149586336167324\n - type: nauc_map_at_20_diff1\n value: 74.2447573558587\n - type: nauc_map_at_20_max\n value: 
36.50383130240387\n - type: nauc_map_at_20_std\n value: -20.87013743041831\n - type: nauc_map_at_3_diff1\n value: 74.3054577294586\n - type: nauc_map_at_3_max\n value: 36.484530586652724\n - type: nauc_map_at_3_std\n value: -21.90543024607988\n - type: nauc_map_at_5_diff1\n value: 74.21062368961503\n - type: nauc_map_at_5_max\n value: 36.55670532498779\n - type: nauc_map_at_5_std\n value: -21.488786900676942\n - type: nauc_mrr_at_1000_diff1\n value: 74.31619177956684\n - type: nauc_mrr_at_1000_max\n value: 36.53498918453189\n - type: nauc_mrr_at_1000_std\n value: -20.75986704931237\n - type: nauc_mrr_at_100_diff1\n value: 74.31146790382356\n - type: nauc_mrr_at_100_max\n value: 36.54149252857106\n - type: nauc_mrr_at_100_std\n value: -20.75341959250079\n - type: nauc_mrr_at_10_diff1\n value: 74.22027806145095\n - type: nauc_mrr_at_10_max\n value: 36.622542969971725\n - type: nauc_mrr_at_10_std\n value: -20.889417384064117\n - type: nauc_mrr_at_1_diff1\n value: 77.4306709551449\n - type: nauc_mrr_at_1_max\n value: 32.57259463438259\n - type: nauc_mrr_at_1_std\n value: -21.964402859613937\n - type: nauc_mrr_at_20_diff1\n value: 74.25784396230718\n - type: nauc_mrr_at_20_max\n value: 36.561412224507336\n - type: nauc_mrr_at_20_std\n value: -20.767665000065723\n - type: nauc_mrr_at_3_diff1\n value: 74.31423253547214\n - type: nauc_mrr_at_3_max\n value: 36.537745749488906\n - type: nauc_mrr_at_3_std\n value: -21.81259529019546\n - type: nauc_mrr_at_5_diff1\n value: 74.22404613312771\n - type: nauc_mrr_at_5_max\n value: 36.60743768455219\n - type: nauc_mrr_at_5_std\n value: -21.39479216331971\n - type: nauc_ndcg_at_1000_diff1\n value: 73.48182819705742\n - type: nauc_ndcg_at_1000_max\n value: 37.86991608461793\n - type: nauc_ndcg_at_1000_std\n value: -19.021499322688904\n - type: nauc_ndcg_at_100_diff1\n value: 73.34941250585759\n - type: nauc_ndcg_at_100_max\n value: 38.11150275625829\n - type: nauc_ndcg_at_100_std\n value: -18.70624087206104\n - type: 
nauc_ndcg_at_10_diff1\n value: 72.82520265115987\n - type: nauc_ndcg_at_10_max\n value: 38.43323357650525\n - type: nauc_ndcg_at_10_std\n value: -19.410953792830878\n - type: nauc_ndcg_at_1_diff1\n value: 77.38899022125185\n - type: nauc_ndcg_at_1_max\n value: 32.45918619669731\n - type: nauc_ndcg_at_1_std\n value: -22.149586336167324\n - type: nauc_ndcg_at_20_diff1\n value: 72.93309285256507\n - type: nauc_ndcg_at_20_max\n value: 38.217372819067755\n - type: nauc_ndcg_at_20_std\n value: -18.864113576359333\n - type: nauc_ndcg_at_3_diff1\n value: 73.18253776744112\n - type: nauc_ndcg_at_3_max\n value: 38.008109328364\n - type: nauc_ndcg_at_3_std\n value: -21.68785687594153\n - type: nauc_ndcg_at_5_diff1\n value: 72.90474739784793\n - type: nauc_ndcg_at_5_max\n value: 38.29483039202184\n - type: nauc_ndcg_at_5_std\n value: -20.833049811453474\n - type: nauc_precision_at_1000_diff1\n value: 59.306217613750334\n - type: nauc_precision_at_1000_max\n value: 72.20747948302262\n - type: nauc_precision_at_1000_std\n value: 45.58837180096227\n - type: nauc_precision_at_100_diff1\n value: 62.87286844562389\n - type: nauc_precision_at_100_max\n value: 61.33108214045868\n - type: nauc_precision_at_100_std\n value: 20.67481963545654\n - type: nauc_precision_at_10_diff1\n value: 64.11222984256685\n - type: nauc_precision_at_10_max\n value: 50.323697746037496\n - type: nauc_precision_at_10_std\n value: -7.9994544634332625\n - type: nauc_precision_at_1_diff1\n value: 77.38899022125185\n - type: nauc_precision_at_1_max\n value: 32.45918619669731\n - type: nauc_precision_at_1_std\n value: -22.149586336167324\n - type: nauc_precision_at_20_diff1\n value: 62.30228127286973\n - type: nauc_precision_at_20_max\n value: 52.02090746208407\n - type: nauc_precision_at_20_std\n value: 0.7629898806370331\n - type: nauc_precision_at_3_diff1\n value: 68.82856645994157\n - type: nauc_precision_at_3_max\n value: 43.94171571306625\n - type: nauc_precision_at_3_std\n value: -20.78595255410148\n - 
type: nauc_precision_at_5_diff1\n value: 66.62157622497887\n - type: nauc_precision_at_5_max\n value: 46.69398173603811\n - type: nauc_precision_at_5_std\n value: -17.412423571163057\n - type: nauc_recall_at_1000_diff1\n value: 59.30621761375148\n - type: nauc_recall_at_1000_max\n value: 72.20747948302191\n - type: nauc_recall_at_1000_std\n value: 45.588371800962655\n - type: nauc_recall_at_100_diff1\n value: 62.872868445623894\n - type: nauc_recall_at_100_max\n value: 61.33108214045813\n - type: nauc_recall_at_100_std\n value: 20.67481963545666\n - type: nauc_recall_at_10_diff1\n value: 64.11222984256698\n - type: nauc_recall_at_10_max\n value: 50.32369774603755\n - type: nauc_recall_at_10_std\n value: -7.999454463433321\n - type: nauc_recall_at_1_diff1\n value: 77.38899022125185\n - type: nauc_recall_at_1_max\n value: 32.45918619669731\n - type: nauc_recall_at_1_std\n value: -22.149586336167324\n - type: nauc_recall_at_20_diff1\n value: 62.3022812728695\n - type: nauc_recall_at_20_max\n value: 52.02090746208397\n - type: nauc_recall_at_20_std\n value: 0.7629898806369458\n - type: nauc_recall_at_3_diff1\n value: 68.82856645994157\n - type: nauc_recall_at_3_max\n value: 43.94171571306612\n - type: nauc_recall_at_3_std\n value: -20.78595255410157\n - type: nauc_recall_at_5_diff1\n value: 66.62157622497897\n - type: nauc_recall_at_5_max\n value: 46.693981736038246\n - type: nauc_recall_at_5_std\n value: -17.412423571162954\n - type: ndcg_at_1\n value: 67.33\n - type: ndcg_at_10\n value: 79.209\n - type: ndcg_at_100\n value: 80.463\n - type: ndcg_at_1000\n value: 80.74799999999999\n - type: ndcg_at_20\n value: 79.81899999999999\n - type: ndcg_at_3\n value: 76.335\n - type: ndcg_at_5\n value: 78.011\n - type: precision_at_1\n value: 67.33\n - type: precision_at_10\n value: 9.020999999999999\n - type: precision_at_100\n value: 0.96\n - type: precision_at_1000\n value: 0.098\n - type: precision_at_20\n value: 4.63\n - type: precision_at_3\n value: 27.493000000000002\n - 
type: precision_at_5\n value: 17.308\n - type: recall_at_1\n value: 67.33\n - type: recall_at_10\n value: 90.21000000000001\n - type: recall_at_100\n value: 96.00999999999999\n - type: recall_at_1000\n value: 98.29\n - type: recall_at_20\n value: 92.60000000000001\n - type: recall_at_3\n value: 82.48\n - type: recall_at_5\n value: 86.53999999999999\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB RuBQReranking (default)\n revision: 2e96b8f098fa4b0950fc58eacadeb31c0d0c7fa2\n split: test\n type: ai-forever/rubq-reranking\n metrics:\n - type: main_score\n value: 65.57453932493252\n - type: map\n value: 65.57453932493252\n - type: mrr\n value: 70.51408205663526\n - type: nAUC_map_diff1\n value: 26.69583260609023\n - type: nAUC_map_max\n value: 12.928262749610663\n - type: nAUC_map_std\n value: 11.702468857903128\n - type: nAUC_mrr_diff1\n value: 28.5206955462174\n - type: nAUC_mrr_max\n value: 14.207162454694227\n - type: nAUC_mrr_std\n value: 10.725721001555296\n task:\n type: Reranking\n - dataset:\n config: default\n name: MTEB RuBQRetrieval (default)\n revision: e19b6ffa60b3bc248e0b41f4cc37c26a55c2a67b\n split: test\n type: ai-forever/rubq-retrieval\n metrics:\n - type: main_score\n value: 72.306\n - type: map_at_1\n value: 44.187\n - type: map_at_10\n value: 64.836\n - type: map_at_100\n value: 65.771\n - type: map_at_1000\n value: 65.8\n - type: map_at_20\n value: 65.497\n - type: map_at_3\n value: 59.692\n - type: map_at_5\n value: 63.105\n - type: mrr_at_1\n value: 62.23404255319149\n - type: mrr_at_10\n value: 73.40810161732159\n - type: mrr_at_100\n value: 73.67949305473395\n - type: mrr_at_1000\n value: 73.68707852294746\n - type: mrr_at_20\n value: 73.60429051697479\n - type: mrr_at_3\n value: 71.47360126083535\n - type: mrr_at_5\n value: 72.8447596532704\n - type: nauc_map_at_1000_diff1\n value: 39.838449035736886\n - type: nauc_map_at_1000_max\n value: 32.29962306877408\n - type: nauc_map_at_1000_std\n value: -6.324859592714388\n - 
type: nauc_map_at_100_diff1\n value: 39.824361938745426\n - type: nauc_map_at_100_max\n value: 32.32055222704763\n - type: nauc_map_at_100_std\n value: -6.301641111869559\n - type: nauc_map_at_10_diff1\n value: 39.50155328718487\n - type: nauc_map_at_10_max\n value: 31.745730244960672\n - type: nauc_map_at_10_std\n value: -6.867215137329693\n - type: nauc_map_at_1_diff1\n value: 47.66181128677822\n - type: nauc_map_at_1_max\n value: 21.75204233166764\n - type: nauc_map_at_1_std\n value: -8.06951079061697\n - type: nauc_map_at_20_diff1\n value: 39.78364637902108\n - type: nauc_map_at_20_max\n value: 32.39065528029405\n - type: nauc_map_at_20_std\n value: -6.368994332729006\n - type: nauc_map_at_3_diff1\n value: 39.51829474433183\n - type: nauc_map_at_3_max\n value: 28.633292697821673\n - type: nauc_map_at_3_std\n value: -7.2561170814963925\n - type: nauc_map_at_5_diff1\n value: 39.288433237676266\n - type: nauc_map_at_5_max\n value: 31.007702201615515\n - type: nauc_map_at_5_std\n value: -7.235131195162474\n - type: nauc_mrr_at_1000_diff1\n value: 49.599102391215226\n - type: nauc_mrr_at_1000_max\n value: 38.25521825911133\n - type: nauc_mrr_at_1000_std\n value: -10.448180939809435\n - type: nauc_mrr_at_100_diff1\n value: 49.5957067716212\n - type: nauc_mrr_at_100_max\n value: 38.26760703964535\n - type: nauc_mrr_at_100_std\n value: -10.438443051971081\n - type: nauc_mrr_at_10_diff1\n value: 49.35269710190271\n - type: nauc_mrr_at_10_max\n value: 38.43782589127069\n - type: nauc_mrr_at_10_std\n value: -10.404402063509815\n - type: nauc_mrr_at_1_diff1\n value: 53.32206103688421\n - type: nauc_mrr_at_1_max\n value: 33.52402390241035\n - type: nauc_mrr_at_1_std\n value: -12.73473393949936\n - type: nauc_mrr_at_20_diff1\n value: 49.550630850826636\n - type: nauc_mrr_at_20_max\n value: 38.35964703941151\n - type: nauc_mrr_at_20_std\n value: -10.444577766284766\n - type: nauc_mrr_at_3_diff1\n value: 49.12029127633829\n - type: nauc_mrr_at_3_max\n value: 
38.01631275124067\n - type: nauc_mrr_at_3_std\n value: -10.523724301481309\n - type: nauc_mrr_at_5_diff1\n value: 49.04606949432458\n - type: nauc_mrr_at_5_max\n value: 38.33647550077891\n - type: nauc_mrr_at_5_std\n value: -10.47076409263114\n - type: nauc_ndcg_at_1000_diff1\n value: 41.342785916264226\n - type: nauc_ndcg_at_1000_max\n value: 35.75731064862711\n - type: nauc_ndcg_at_1000_std\n value: -5.45573422899229\n - type: nauc_ndcg_at_100_diff1\n value: 40.972974559636086\n - type: nauc_ndcg_at_100_max\n value: 36.32938573321036\n - type: nauc_ndcg_at_100_std\n value: -4.749631537590004\n - type: nauc_ndcg_at_10_diff1\n value: 39.67813474464166\n - type: nauc_ndcg_at_10_max\n value: 35.480200504848966\n - type: nauc_ndcg_at_10_std\n value: -6.318561293935512\n - type: nauc_ndcg_at_1_diff1\n value: 53.45970160222764\n - type: nauc_ndcg_at_1_max\n value: 33.14759013278075\n - type: nauc_ndcg_at_1_std\n value: -12.579833891774847\n - type: nauc_ndcg_at_20_diff1\n value: 40.67492861219249\n - type: nauc_ndcg_at_20_max\n value: 36.84960799838019\n - type: nauc_ndcg_at_20_std\n value: -5.202530835850179\n - type: nauc_ndcg_at_3_diff1\n value: 39.574906207408844\n - type: nauc_ndcg_at_3_max\n value: 31.76512164509258\n - type: nauc_ndcg_at_3_std\n value: -7.656143208565999\n - type: nauc_ndcg_at_5_diff1\n value: 39.096348529742095\n - type: nauc_ndcg_at_5_max\n value: 34.075926475544165\n - type: nauc_ndcg_at_5_std\n value: -7.238045445366631\n - type: nauc_precision_at_1000_diff1\n value: -14.283799754212609\n - type: nauc_precision_at_1000_max\n value: 6.449741756717101\n - type: nauc_precision_at_1000_std\n value: 4.862828679759048\n - type: nauc_precision_at_100_diff1\n value: -13.23173132700258\n - type: nauc_precision_at_100_max\n value: 11.058898534529195\n - type: nauc_precision_at_100_std\n value: 7.343683941814956\n - type: nauc_precision_at_10_diff1\n value: -7.202951643546464\n - type: nauc_precision_at_10_max\n value: 17.499446869433278\n - type: 
nauc_precision_at_10_std\n value: 2.8367985220406307\n - type: nauc_precision_at_1_diff1\n value: 53.45970160222764\n - type: nauc_precision_at_1_max\n value: 33.14759013278075\n - type: nauc_precision_at_1_std\n value: -12.579833891774847\n - type: nauc_precision_at_20_diff1\n value: -9.477122699154124\n - type: nauc_precision_at_20_max\n value: 16.80556031564312\n - type: nauc_precision_at_20_std\n value: 6.420218284416923\n - type: nauc_precision_at_3_diff1\n value: 5.5276143574150245\n - type: nauc_precision_at_3_max\n value: 23.65952688481666\n - type: nauc_precision_at_3_std\n value: -1.8730348729295785\n - type: nauc_precision_at_5_diff1\n value: -2.4537029093721308\n - type: nauc_precision_at_5_max\n value: 21.41469327545133\n - type: nauc_precision_at_5_std\n value: 0.1543890645722277\n - type: nauc_recall_at_1000_diff1\n value: -1.7474947956413491\n - type: nauc_recall_at_1000_max\n value: 46.22670991970479\n - type: nauc_recall_at_1000_std\n value: 62.582840705588794\n - type: nauc_recall_at_100_diff1\n value: 16.116089801097345\n - type: nauc_recall_at_100_max\n value: 52.54794580975103\n - type: nauc_recall_at_100_std\n value: 33.720245696003246\n - type: nauc_recall_at_10_diff1\n value: 23.134924318655482\n - type: nauc_recall_at_10_max\n value: 38.73754275649077\n - type: nauc_recall_at_10_std\n value: 0.6137471711639239\n - type: nauc_recall_at_1_diff1\n value: 47.66181128677822\n - type: nauc_recall_at_1_max\n value: 21.75204233166764\n - type: nauc_recall_at_1_std\n value: -8.06951079061697\n - type: nauc_recall_at_20_diff1\n value: 24.130616271355017\n - type: nauc_recall_at_20_max\n value: 48.306178640146136\n - type: nauc_recall_at_20_std\n value: 9.290819557000022\n - type: nauc_recall_at_3_diff1\n value: 29.767415016250226\n - type: nauc_recall_at_3_max\n value: 28.54289782140701\n - type: nauc_recall_at_3_std\n value: -5.1395675072005576\n - type: nauc_recall_at_5_diff1\n value: 25.410613126870174\n - type: nauc_recall_at_5_max\n value: 
33.24658754857624\n - type: nauc_recall_at_5_std\n value: -4.211226036746632\n - type: ndcg_at_1\n value: 62.175000000000004\n - type: ndcg_at_10\n value: 72.306\n - type: ndcg_at_100\n value: 75.074\n - type: ndcg_at_1000\n value: 75.581\n - type: ndcg_at_20\n value: 73.875\n - type: ndcg_at_3\n value: 65.641\n - type: ndcg_at_5\n value: 69.48299999999999\n - type: precision_at_1\n value: 62.175000000000004\n - type: precision_at_10\n value: 13.907\n - type: precision_at_100\n value: 1.591\n - type: precision_at_1000\n value: 0.166\n - type: precision_at_20\n value: 7.446999999999999\n - type: precision_at_3\n value: 35.619\n - type: precision_at_5\n value: 24.917\n - type: recall_at_1\n value: 44.187\n - type: recall_at_10\n value: 85.10600000000001\n - type: recall_at_100\n value: 95.488\n - type: recall_at_1000\n value: 98.831\n - type: recall_at_20\n value: 90.22200000000001\n - type: recall_at_3\n value: 68.789\n - type: recall_at_5\n value: 77.85499999999999\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB RuReviewsClassification (default)\n revision: f6d2c31f4dc6b88f468552750bfec05b4b41b05a\n split: test\n type: ai-forever/ru-reviews-classification\n metrics:\n - type: accuracy\n value: 67.5830078125\n - type: f1\n value: 67.56931936632446\n - type: f1_weighted\n value: 67.57137733752779\n - type: main_score\n value: 67.5830078125\n task:\n type: Classification\n - dataset:\n config: default\n name: MTEB RuSTSBenchmarkSTS (default)\n revision: 7cf24f325c6da6195df55bef3d86b5e0616f3018\n split: test\n type: ai-forever/ru-stsbenchmark-sts\n metrics:\n - type: cosine_pearson\n value: 85.90493484626788\n - type: cosine_spearman\n value: 86.21965691667411\n - type: euclidean_pearson\n value: 86.07499842984909\n - type: euclidean_spearman\n value: 86.55506818735688\n - type: main_score\n value: 86.21965691667411\n - type: manhattan_pearson\n value: 85.95976420231729\n - type: manhattan_spearman\n value: 86.48604243661234\n - type: pearson\n 
value: 85.90493484626788\n - type: spearman\n value: 86.21965691667411\n task:\n type: STS\n - dataset:\n config: default\n name: MTEB RuSciBenchGRNTIClassification (default)\n revision: 673a610d6d3dd91a547a0d57ae1b56f37ebbf6a1\n split: test\n type: ai-forever/ru-scibench-grnti-classification\n metrics:\n - type: accuracy\n value: 59.1943359375\n - type: f1\n value: 58.894480861440414\n - type: f1_weighted\n value: 58.903615560240866\n - type: main_score\n value: 59.1943359375\n task:\n type: Classification\n - dataset:\n config: default\n name: MTEB RuSciBenchGRNTIClusteringP2P (default)\n revision: 673a610d6d3dd91a547a0d57ae1b56f37ebbf6a1\n split: test\n type: ai-forever/ru-scibench-grnti-classification\n metrics:\n - type: main_score\n value: 57.99209448663228\n - type: v_measure\n value: 57.99209448663228\n - type: v_measure_std\n value: 1.0381163861993816\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB RuSciBenchOECDClassification (default)\n revision: 26c88e99dcaba32bb45d0e1bfc21902337f6d471\n split: test\n type: ai-forever/ru-scibench-oecd-classification\n metrics:\n - type: accuracy\n value: 45.556640625\n - type: f1\n value: 45.159163104085906\n - type: f1_weighted\n value: 45.16098316398626\n - type: main_score\n value: 45.556640625\n task:\n type: Classification\n - dataset:\n config: default\n name: MTEB RuSciBenchOECDClusteringP2P (default)\n revision: 26c88e99dcaba32bb45d0e1bfc21902337f6d471\n split: test\n type: ai-forever/ru-scibench-oecd-classification\n metrics:\n - type: main_score\n value: 50.787548070488974\n - type: v_measure\n value: 50.787548070488974\n - type: v_measure_std\n value: 0.8569958168946827\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB SCIDOCS (default)\n revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88\n split: test\n type: mteb/scidocs\n metrics:\n - type: map_at_1\n value: 4.843\n - type: map_at_10\n value: 11.752\n - type: map_at_100\n value: 13.919\n - type: map_at_1000\n 
value: 14.198\n - type: map_at_20\n value: 12.898000000000001\n - type: map_at_3\n value: 8.603\n - type: map_at_5\n value: 10.069\n - type: mrr_at_1\n value: 23.799999999999997\n - type: mrr_at_10\n value: 34.449999999999996\n - type: mrr_at_100\n value: 35.64\n - type: mrr_at_1000\n value: 35.691\n - type: mrr_at_20\n value: 35.213\n - type: mrr_at_3\n value: 31.383\n - type: mrr_at_5\n value: 33.062999999999995\n - type: ndcg_at_1\n value: 23.799999999999997\n - type: ndcg_at_10\n value: 19.811\n - type: ndcg_at_100\n value: 28.108\n - type: ndcg_at_1000\n value: 33.1\n - type: ndcg_at_20\n value: 22.980999999999998\n - type: ndcg_at_3\n value: 19.153000000000002\n - type: ndcg_at_5\n value: 16.408\n - type: precision_at_1\n value: 23.799999999999997\n - type: precision_at_10\n value: 10.16\n - type: precision_at_100\n value: 2.1999999999999997\n - type: precision_at_1000\n value: 0.34099999999999997\n - type: precision_at_20\n value: 6.915\n - type: precision_at_3\n value: 17.8\n - type: precision_at_5\n value: 14.14\n - type: recall_at_1\n value: 4.843\n - type: recall_at_10\n value: 20.595\n - type: recall_at_100\n value: 44.66\n - type: recall_at_1000\n value: 69.152\n - type: recall_at_20\n value: 28.04\n - type: recall_at_3\n value: 10.833\n - type: recall_at_5\n value: 14.346999999999998\n - type: main_score\n value: 19.811\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB SICK-E-PL (default)\n revision: 71bba34b0ece6c56dfcf46d9758a27f7a90f17e9\n split: test\n type: PL-MTEB/sicke-pl-pairclassification\n metrics:\n - type: cosine_accuracy\n value: 80.90093762739502\n - type: cosine_accuracy_threshold\n value: 94.40930485725403\n - type: cosine_ap\n value: 71.15400909912427\n - type: cosine_f1\n value: 66.8213457076566\n - type: cosine_f1_threshold\n value: 91.53673648834229\n - type: cosine_precision\n value: 62.4922504649721\n - type: cosine_recall\n value: 71.7948717948718\n - type: dot_accuracy\n value: 78.41418671015083\n - type: 
dot_accuracy_threshold\n value: 42924.45068359375\n - type: dot_ap\n value: 63.34003025365763\n - type: dot_f1\n value: 62.518258837277244\n - type: dot_f1_threshold\n value: 40900.738525390625\n - type: dot_precision\n value: 52.99653293709758\n - type: dot_recall\n value: 76.21082621082621\n - type: euclidean_accuracy\n value: 80.67672238075826\n - type: euclidean_accuracy_threshold\n value: 696.0524559020996\n - type: euclidean_ap\n value: 70.88762835990224\n - type: euclidean_f1\n value: 66.711051930759\n - type: euclidean_f1_threshold\n value: 878.5581588745117\n - type: euclidean_precision\n value: 62.625\n - type: euclidean_recall\n value: 71.36752136752136\n - type: main_score\n value: 71.15400909912427\n - type: manhattan_accuracy\n value: 80.65633917651854\n - type: manhattan_accuracy_threshold\n value: 17277.72674560547\n - type: manhattan_ap\n value: 70.67105336611716\n - type: manhattan_f1\n value: 66.51346027577151\n - type: manhattan_f1_threshold\n value: 21687.957763671875\n - type: manhattan_precision\n value: 61.69305724725944\n - type: manhattan_recall\n value: 72.15099715099716\n - type: max_accuracy\n value: 80.90093762739502\n - type: max_ap\n value: 71.15400909912427\n - type: max_f1\n value: 66.8213457076566\n - type: max_precision\n value: 62.625\n - type: max_recall\n value: 76.21082621082621\n - type: similarity_accuracy\n value: 80.90093762739502\n - type: similarity_accuracy_threshold\n value: 94.40930485725403\n - type: similarity_ap\n value: 71.15400909912427\n - type: similarity_f1\n value: 66.8213457076566\n - type: similarity_f1_threshold\n value: 91.53673648834229\n - type: similarity_precision\n value: 62.4922504649721\n - type: similarity_recall\n value: 71.7948717948718\n task:\n type: PairClassification\n - dataset:\n config: default\n name: MTEB SICK-R (default)\n revision: 20a6d6f312dd54037fe07a32d58e5e168867909d\n split: test\n type: mteb/sickr-sts\n metrics:\n - type: cosine_pearson\n value: 92.3339946866199\n - type: 
cosine_spearman\n value: 89.61697355115497\n - type: euclidean_pearson\n value: 90.3264916449669\n - type: euclidean_spearman\n value: 89.36270451308866\n - type: main_score\n value: 89.61697355115497\n - type: manhattan_pearson\n value: 90.18909339052534\n - type: manhattan_spearman\n value: 89.28337093097377\n - type: pearson\n value: 92.3339946866199\n - type: spearman\n value: 89.61697355115497\n task:\n type: STS\n - dataset:\n config: default\n name: MTEB SICK-R-PL (default)\n revision: fd5c2441b7eeff8676768036142af4cfa42c1339\n split: test\n type: PL-MTEB/sickr-pl-sts\n metrics:\n - type: cosine_pearson\n value: 85.27883048457821\n - type: cosine_spearman\n value: 80.53204892678619\n - type: euclidean_pearson\n value: 82.78520705216168\n - type: euclidean_spearman\n value: 80.27848359873212\n - type: main_score\n value: 80.53204892678619\n - type: manhattan_pearson\n value: 82.63270640583454\n - type: manhattan_spearman\n value: 80.21507977473146\n - type: pearson\n value: 85.27883048457821\n - type: spearman\n value: 80.53204892678619\n task:\n type: STS\n - dataset:\n config: default\n name: MTEB SICKFr (default)\n revision: e077ab4cf4774a1e36d86d593b150422fafd8e8a\n split: test\n type: Lajavaness/SICK-fr\n metrics:\n - type: cosine_pearson\n value: 88.77029361817212\n - type: cosine_spearman\n value: 83.9453600346894\n - type: euclidean_pearson\n value: 85.85331086208573\n - type: euclidean_spearman\n value: 83.70852031985308\n - type: main_score\n value: 83.9453600346894\n - type: manhattan_pearson\n value: 85.66222265885914\n - type: manhattan_spearman\n value: 83.60833111525962\n - type: pearson\n value: 88.77029361817212\n - type: spearman\n value: 83.9453600346894\n task:\n type: STS\n - dataset:\n config: default\n name: MTEB STS12 (default)\n revision: a0d554a64d88156834ff5ae9920b964011b16384\n split: test\n type: mteb/sts12-sts\n metrics:\n - type: cosine_pearson\n value: 88.76435859522375\n - type: cosine_spearman\n value: 82.43768167804375\n - 
type: euclidean_pearson\n value: 87.43566183874832\n - type: euclidean_spearman\n value: 82.82166873757507\n - type: main_score\n value: 82.43768167804375\n - type: manhattan_pearson\n value: 87.39450871380951\n - type: manhattan_spearman\n value: 82.89253043430163\n - type: pearson\n value: 88.76435859522375\n - type: spearman\n value: 82.43768167804375\n task:\n type: STS\n - dataset:\n config: default\n name: MTEB STS13 (default)\n revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca\n split: test\n type: mteb/sts13-sts\n metrics:\n - type: cosine_pearson\n value: 88.86627241652141\n - type: cosine_spearman\n value: 89.49011599120688\n - type: euclidean_pearson\n value: 89.3314120073772\n - type: euclidean_spearman\n value: 89.8226502776963\n - type: main_score\n value: 89.49011599120688\n - type: manhattan_pearson\n value: 89.2252179076963\n - type: manhattan_spearman\n value: 89.74573844021225\n - type: pearson\n value: 88.86627241652141\n - type: spearman\n value: 89.49011599120688\n task:\n type: STS\n - dataset:\n config: default\n name: MTEB STS14 (default)\n revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375\n split: test\n type: mteb/sts14-sts\n metrics:\n - type: cosine_pearson\n value: 87.22891405215968\n - type: cosine_spearman\n value: 84.9467188157614\n - type: euclidean_pearson\n value: 87.20330004726237\n - type: euclidean_spearman\n value: 85.34806059461808\n - type: main_score\n value: 84.9467188157614\n - type: manhattan_pearson\n value: 87.15224666107623\n - type: manhattan_spearman\n value: 85.34596898699708\n - type: pearson\n value: 87.22891405215968\n - type: spearman\n value: 84.9467188157614\n task:\n type: STS\n - dataset:\n config: default\n name: MTEB STS15 (default)\n revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3\n split: test\n type: mteb/sts15-sts\n metrics:\n - type: cosine_pearson\n value: 88.14066430111033\n - type: cosine_spearman\n value: 89.31337445552545\n - type: euclidean_pearson\n value: 89.08039335366983\n - type: 
euclidean_spearman\n value: 89.6658762856415\n - type: main_score\n value: 89.31337445552545\n - type: manhattan_pearson\n value: 89.08057438154486\n - type: manhattan_spearman\n value: 89.68673984203022\n - type: pearson\n value: 88.14066430111033\n - type: spearman\n value: 89.31337445552545\n task:\n type: STS\n - dataset:\n config: default\n name: MTEB STS16 (default)\n revision: 4d8694f8f0e0100860b497b999b3dbed754a0513\n split: test\n type: mteb/sts16-sts\n metrics:\n - type: cosine_pearson\n value: 85.14908856657084\n - type: cosine_spearman\n value: 86.84648320786727\n - type: euclidean_pearson\n value: 86.11454713131947\n - type: euclidean_spearman\n value: 86.77738862047961\n - type: main_score\n value: 86.84648320786727\n - type: manhattan_pearson\n value: 86.07804821916372\n - type: manhattan_spearman\n value: 86.78676064310474\n - type: pearson\n value: 85.14908856657084\n - type: spearman\n value: 86.84648320786727\n task:\n type: STS\n - dataset:\n config: en-en\n name: MTEB STS17 (en-en)\n revision: faeb762787bd10488a50c8b5be4a3b82e411949c\n split: test\n type: mteb/sts17-crosslingual-sts\n metrics:\n - type: cosine_pearson\n value: 89.61633502468356\n - type: cosine_spearman\n value: 89.99772663224805\n - type: euclidean_pearson\n value: 90.14056501501044\n - type: euclidean_spearman\n value: 90.04496896837503\n - type: main_score\n value: 89.99772663224805\n - type: manhattan_pearson\n value: 90.08964860311801\n - type: manhattan_spearman\n value: 90.00091712362196\n - type: pearson\n value: 89.61633502468356\n - type: spearman\n value: 89.99772663224805\n task:\n type: STS\n - dataset:\n config: es-en\n name: MTEB STS17 (es-en)\n revision: faeb762787bd10488a50c8b5be4a3b82e411949c\n split: test\n type: mteb/sts17-crosslingual-sts\n metrics:\n - type: cosine_pearson\n value: 86.44548026840202\n - type: cosine_spearman\n value: 87.26263108768539\n - type: euclidean_pearson\n value: 86.42844593583838\n - type: euclidean_spearman\n value: 
86.89388428664364\n - type: main_score\n value: 87.26263108768539\n - type: manhattan_pearson\n value: 86.47186940800881\n - type: manhattan_spearman\n value: 87.02163091089946\n - type: pearson\n value: 86.44548026840202\n - type: spearman\n value: 87.26263108768539\n task:\n type: STS\n - dataset:\n config: en-de\n name: MTEB STS17 (en-de)\n revision: faeb762787bd10488a50c8b5be4a3b82e411949c\n split: test\n type: mteb/sts17-crosslingual-sts\n metrics:\n - type: cosine_pearson\n value: 87.89345132532758\n - type: cosine_spearman\n value: 87.96246221327699\n - type: euclidean_pearson\n value: 88.49013032701419\n - type: euclidean_spearman\n value: 87.81981265317344\n - type: main_score\n value: 87.96246221327699\n - type: manhattan_pearson\n value: 88.31360914178538\n - type: manhattan_spearman\n value: 87.62734530005075\n - type: pearson\n value: 87.89345132532758\n - type: spearman\n value: 87.96246221327699\n task:\n type: STS\n - dataset:\n config: es-es\n name: MTEB STS17 (es-es)\n revision: faeb762787bd10488a50c8b5be4a3b82e411949c\n split: test\n type: mteb/sts17-crosslingual-sts\n metrics:\n - type: cosine_pearson\n value: 88.4084678497171\n - type: cosine_spearman\n value: 88.77640638748285\n - type: euclidean_pearson\n value: 89.60124312475843\n - type: euclidean_spearman\n value: 88.4321442688528\n - type: main_score\n value: 88.77640638748285\n - type: manhattan_pearson\n value: 89.62375118021299\n - type: manhattan_spearman\n value: 88.46998118661577\n - type: pearson\n value: 88.4084678497171\n - type: spearman\n value: 88.77640638748285\n task:\n type: STS\n - dataset:\n config: fr-en\n name: MTEB STS17 (fr-en)\n revision: faeb762787bd10488a50c8b5be4a3b82e411949c\n split: test\n type: mteb/sts17-crosslingual-sts\n metrics:\n - type: cosine_pearson\n value: 87.30688801326498\n - type: cosine_spearman\n value: 87.55684697258378\n - type: euclidean_pearson\n value: 87.89672951056794\n - type: euclidean_spearman\n value: 87.28050429201674\n - type: 
main_score\n value: 87.55684697258378\n - type: manhattan_pearson\n value: 87.74292745320572\n - type: manhattan_spearman\n value: 87.16383993876582\n - type: pearson\n value: 87.30688801326498\n - type: spearman\n value: 87.55684697258378\n task:\n type: STS\n - dataset:\n config: zh-en\n name: MTEB STS22 (zh-en)\n revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3\n split: test\n type: mteb/sts22-crosslingual-sts\n metrics:\n - type: cosine_pearson\n value: 73.46180375170147\n - type: cosine_spearman\n value: 73.39559590127081\n - type: euclidean_pearson\n value: 73.72613901293681\n - type: euclidean_spearman\n value: 71.85465165176795\n - type: main_score\n value: 73.39559590127081\n - type: manhattan_pearson\n value: 73.07859140869076\n - type: manhattan_spearman\n value: 71.22047343718893\n - type: pearson\n value: 73.46180375170147\n - type: spearman\n value: 73.39559590127081\n task:\n type: STS\n - dataset:\n config: zh\n name: MTEB STS22 (zh)\n revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3\n split: test\n type: mteb/sts22-crosslingual-sts\n metrics:\n - type: cosine_pearson\n value: 62.47531620842637\n - type: cosine_spearman\n value: 66.22504667157702\n - type: euclidean_pearson\n value: 66.76201254783692\n - type: euclidean_spearman\n value: 66.86115760269463\n - type: main_score\n value: 66.22504667157702\n - type: manhattan_pearson\n value: 66.73847836793489\n - type: manhattan_spearman\n value: 66.7677116377695\n - type: pearson\n value: 62.47531620842637\n - type: spearman\n value: 66.22504667157702\n task:\n type: STS\n - dataset:\n config: es\n name: MTEB STS22 (es)\n revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3\n split: test\n type: mteb/sts22-crosslingual-sts\n metrics:\n - type: cosine_pearson\n value: 69.89707002436481\n - type: cosine_spearman\n value: 72.2054865735116\n - type: euclidean_pearson\n value: 71.81856615570756\n - type: euclidean_spearman\n value: 72.72593304629407\n - type: main_score\n value: 72.2054865735116\n - 
type: manhattan_pearson\n value: 72.00362684700072\n - type: manhattan_spearman\n value: 72.62783534769964\n - type: pearson\n value: 69.89707002436481\n - type: spearman\n value: 72.2054865735116\n task:\n type: STS\n - dataset:\n config: fr\n name: MTEB STS22 (fr)\n revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3\n split: test\n type: mteb/sts22-crosslingual-sts\n metrics:\n - type: cosine_pearson\n value: 81.59623734395916\n - type: cosine_spearman\n value: 83.28946105111358\n - type: euclidean_pearson\n value: 79.377330171466\n - type: euclidean_spearman\n value: 81.81029781662205\n - type: main_score\n value: 83.28946105111358\n - type: manhattan_pearson\n value: 78.96970881689698\n - type: manhattan_spearman\n value: 81.91773236079703\n - type: pearson\n value: 81.59623734395916\n - type: spearman\n value: 83.28946105111358\n task:\n type: STS\n - dataset:\n config: de-fr\n name: MTEB STS22 (de-fr)\n revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3\n split: test\n type: mteb/sts22-crosslingual-sts\n metrics:\n - type: cosine_pearson\n value: 55.03825643126142\n - type: cosine_spearman\n value: 58.25792501780429\n - type: euclidean_pearson\n value: 50.38007603973409\n - type: euclidean_spearman\n value: 59.39961789383097\n - type: main_score\n value: 58.25792501780429\n - type: manhattan_pearson\n value: 50.518568927999155\n - type: manhattan_spearman\n value: 59.84185466003894\n - type: pearson\n value: 55.03825643126142\n - type: spearman\n value: 58.25792501780429\n task:\n type: STS\n - dataset:\n config: pl-en\n name: MTEB STS22 (pl-en)\n revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3\n split: test\n type: mteb/sts22-crosslingual-sts\n metrics:\n - type: cosine_pearson\n value: 77.77233721490776\n - type: cosine_spearman\n value: 76.17596588017625\n - type: euclidean_pearson\n value: 74.47600468156611\n - type: euclidean_spearman\n value: 72.61278728057012\n - type: main_score\n value: 76.17596588017625\n - type: manhattan_pearson\n value: 
74.48118910099699\n - type: manhattan_spearman\n value: 73.33167419101696\n - type: pearson\n value: 77.77233721490776\n - type: spearman\n value: 76.17596588017625\n task:\n type: STS\n - dataset:\n config: pl\n name: MTEB STS22 (pl)\n revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3\n split: test\n type: mteb/sts22-crosslingual-sts\n metrics:\n - type: cosine_pearson\n value: 42.87453608131507\n - type: cosine_spearman\n value: 45.137849894401185\n - type: euclidean_pearson\n value: 31.66964197694796\n - type: euclidean_spearman\n value: 44.1014900837869\n - type: main_score\n value: 45.137849894401185\n - type: manhattan_pearson\n value: 31.007199259384745\n - type: manhattan_spearman\n value: 43.48181523288926\n - type: pearson\n value: 42.87453608131507\n - type: spearman\n value: 45.137849894401185\n task:\n type: STS\n - dataset:\n config: en\n name: MTEB STS22 (en)\n revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3\n split: test\n type: mteb/sts22-crosslingual-sts\n metrics:\n - type: cosine_pearson\n value: 66.87400150638176\n - type: cosine_spearman\n value: 67.27861354834066\n - type: euclidean_pearson\n value: 66.81789582140216\n - type: euclidean_spearman\n value: 66.44220479858708\n - type: main_score\n value: 67.27861354834066\n - type: manhattan_pearson\n value: 66.92509859033235\n - type: manhattan_spearman\n value: 66.46841124185076\n - type: pearson\n value: 66.87400150638176\n - type: spearman\n value: 67.27861354834066\n task:\n type: STS\n - dataset:\n config: ru\n name: MTEB STS22 (ru)\n revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3\n split: test\n type: mteb/sts22-crosslingual-sts\n metrics:\n - type: cosine_pearson\n value: 61.819804551576084\n - type: cosine_spearman\n value: 65.0864146772135\n - type: euclidean_pearson\n value: 62.518151090361876\n - type: euclidean_spearman\n value: 65.13608138548017\n - type: main_score\n value: 65.0864146772135\n - type: manhattan_pearson\n value: 62.51413246915267\n - type: 
manhattan_spearman\n value: 65.19077543064323\n - type: pearson\n value: 61.819804551576084\n - type: spearman\n value: 65.0864146772135\n task:\n type: STS\n - dataset:\n config: de\n name: MTEB STS22 (de)\n revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3\n split: test\n type: mteb/sts22-crosslingual-sts\n metrics:\n - type: cosine_pearson\n value: 54.85728696035389\n - type: cosine_spearman\n value: 61.60906359227576\n - type: euclidean_pearson\n value: 52.57582587901851\n - type: euclidean_spearman\n value: 61.41823097598308\n - type: main_score\n value: 61.60906359227576\n - type: manhattan_pearson\n value: 52.500978361080506\n - type: manhattan_spearman\n value: 61.30365596659758\n - type: pearson\n value: 54.85728696035389\n - type: spearman\n value: 61.60906359227576\n task:\n type: STS\n - dataset:\n config: fr-pl\n name: MTEB STS22 (fr-pl)\n revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3\n split: test\n type: mteb/sts22-crosslingual-sts\n metrics:\n - type: cosine_pearson\n value: 67.68016005631422\n - type: cosine_spearman\n value: 84.51542547285167\n - type: euclidean_pearson\n value: 66.19871164667245\n - type: euclidean_spearman\n value: 73.24670207647144\n - type: main_score\n value: 84.51542547285167\n - type: manhattan_pearson\n value: 67.0443525268974\n - type: manhattan_spearman\n value: 73.24670207647144\n - type: pearson\n value: 67.68016005631422\n - type: spearman\n value: 84.51542547285167\n task:\n type: STS\n - dataset:\n config: de-pl\n name: MTEB STS22 (de-pl)\n revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3\n split: test\n type: mteb/sts22-crosslingual-sts\n metrics:\n - type: cosine_pearson\n value: 47.49467414030747\n - type: cosine_spearman\n value: 56.81512095681289\n - type: euclidean_pearson\n value: 48.42860221765214\n - type: euclidean_spearman\n value: 58.63197306329092\n - type: main_score\n value: 56.81512095681289\n - type: manhattan_pearson\n value: 48.39594959260441\n - type: manhattan_spearman\n value: 
58.63197306329092\n - type: pearson\n value: 47.49467414030747\n - type: spearman\n value: 56.81512095681289\n task:\n type: STS\n - dataset:\n config: es-en\n name: MTEB STS22 (es-en)\n revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3\n split: test\n type: mteb/sts22-crosslingual-sts\n metrics:\n - type: cosine_pearson\n value: 76.8364678896155\n - type: cosine_spearman\n value: 78.45516413087114\n - type: euclidean_pearson\n value: 78.62779318576634\n - type: euclidean_spearman\n value: 78.88760695649488\n - type: main_score\n value: 78.45516413087114\n - type: manhattan_pearson\n value: 78.62131335760031\n - type: manhattan_spearman\n value: 78.81861844200388\n - type: pearson\n value: 76.8364678896155\n - type: spearman\n value: 78.45516413087114\n task:\n type: STS\n - dataset:\n config: de-en\n name: MTEB STS22 (de-en)\n revision: de9d86b3b84231dc21f76c7b7af1f28e2f57f6e3\n split: test\n type: mteb/sts22-crosslingual-sts\n metrics:\n - type: cosine_pearson\n value: 65.16640313911604\n - type: cosine_spearman\n value: 60.887608967403914\n - type: euclidean_pearson\n value: 67.49902244990913\n - type: euclidean_spearman\n value: 59.2458787136538\n - type: main_score\n value: 60.887608967403914\n - type: manhattan_pearson\n value: 67.34313506388378\n - type: manhattan_spearman\n value: 59.05283429200166\n - type: pearson\n value: 65.16640313911604\n - type: spearman\n value: 60.887608967403914\n task:\n type: STS\n - dataset:\n config: default\n name: MTEB QBQTC (default)\n revision: 790b0510dc52b1553e8c49f3d2afb48c0e5c48b7\n split: test\n type: C-MTEB/QBQTC\n metrics:\n - type: cosine_pearson\n value: 34.20049144526891\n - type: cosine_spearman\n value: 36.41802814113771\n - type: euclidean_pearson\n value: 34.569942139590626\n - type: euclidean_spearman\n value: 36.06141660786936\n - type: main_score\n value: 36.41802814113771\n - type: manhattan_pearson\n value: 34.537041543916003\n - type: manhattan_spearman\n value: 36.033418927773825\n - type: pearson\n 
value: 34.20049144526891\n - type: spearman\n value: 36.41802814113771\n task:\n type: STS\n - dataset:\n config: default\n name: MTEB STSB (default)\n revision: 0cde68302b3541bb8b3c340dc0644b0b745b3dc0\n split: test\n type: C-MTEB/STSB\n metrics:\n - type: cosine_pearson\n value: 81.5092853013241\n - type: cosine_spearman\n value: 83.54005474244292\n - type: euclidean_pearson\n value: 83.7246578378554\n - type: euclidean_spearman\n value: 84.46767551087716\n - type: main_score\n value: 83.54005474244292\n - type: manhattan_pearson\n value: 83.65922665594636\n - type: manhattan_spearman\n value: 84.42431449101848\n - type: pearson\n value: 81.5092853013241\n - type: spearman\n value: 83.54005474244292\n task:\n type: STS\n - dataset:\n config: default\n name: MTEB STSBenchmark (default)\n revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831\n split: test\n type: mteb/stsbenchmark-sts\n metrics:\n - type: cosine_pearson\n value: 87.70246866744966\n - type: cosine_spearman\n value: 89.44070045346106\n - type: euclidean_pearson\n value: 89.56956519641007\n - type: euclidean_spearman\n value: 89.95830112784283\n - type: main_score\n value: 89.44070045346106\n - type: manhattan_pearson\n value: 89.48264471425145\n - type: manhattan_spearman\n value: 89.87900732483114\n - type: pearson\n value: 87.70246866744966\n - type: spearman\n value: 89.44070045346106\n task:\n type: STS\n - dataset:\n config: de\n name: MTEB STSBenchmarkMultilingualSTS (de)\n revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c\n split: test\n type: mteb/stsb_multi_mt\n metrics:\n - type: cosine_pearson\n value: 86.83701990805217\n - type: cosine_spearman\n value: 87.80280785492258\n - type: euclidean_pearson\n value: 87.77325330043514\n - type: euclidean_spearman\n value: 88.3564607283144\n - type: main_score\n value: 87.80280785492258\n - type: manhattan_pearson\n value: 87.6745449945946\n - type: manhattan_spearman\n value: 88.30660465978795\n - type: pearson\n value: 86.83701990805217\n - type: 
spearman\n value: 87.80280785492258\n task:\n type: STS\n - dataset:\n config: zh\n name: MTEB STSBenchmarkMultilingualSTS (zh)\n revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c\n split: test\n type: mteb/stsb_multi_mt\n metrics:\n - type: cosine_pearson\n value: 84.27751020600267\n - type: cosine_spearman\n value: 85.63500407412486\n - type: euclidean_pearson\n value: 85.21829891649696\n - type: euclidean_spearman\n value: 85.9384575715382\n - type: main_score\n value: 85.63500407412486\n - type: manhattan_pearson\n value: 85.10797194089801\n - type: manhattan_spearman\n value: 85.8770162042784\n - type: pearson\n value: 84.27751020600267\n - type: spearman\n value: 85.63500407412486\n task:\n type: STS\n - dataset:\n config: fr\n name: MTEB STSBenchmarkMultilingualSTS (fr)\n revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c\n split: test\n type: mteb/stsb_multi_mt\n metrics:\n - type: cosine_pearson\n value: 86.56833656723254\n - type: cosine_spearman\n value: 87.4393978501382\n - type: euclidean_pearson\n value: 87.45171512751267\n - type: euclidean_spearman\n value: 88.13106516566947\n - type: main_score\n value: 87.4393978501382\n - type: manhattan_pearson\n value: 87.33010961793333\n - type: manhattan_spearman\n value: 88.06707425102182\n - type: pearson\n value: 86.56833656723254\n - type: spearman\n value: 87.4393978501382\n task:\n type: STS\n - dataset:\n config: pl\n name: MTEB STSBenchmarkMultilingualSTS (pl)\n revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c\n split: test\n type: mteb/stsb_multi_mt\n metrics:\n - type: cosine_pearson\n value: 85.45065540325523\n - type: cosine_spearman\n value: 85.47881076789359\n - type: euclidean_pearson\n value: 85.1999493863155\n - type: euclidean_spearman\n value: 85.7874947669187\n - type: main_score\n value: 85.47881076789359\n - type: manhattan_pearson\n value: 85.06075305990376\n - type: manhattan_spearman\n value: 85.71563015639558\n - type: pearson\n value: 85.45065540325523\n - type: spearman\n value: 
85.47881076789359\n task:\n type: STS\n - dataset:\n config: es\n name: MTEB STSBenchmarkMultilingualSTS (es)\n revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c\n split: test\n type: mteb/stsb_multi_mt\n metrics:\n - type: cosine_pearson\n value: 87.11952824079832\n - type: cosine_spearman\n value: 87.9643473573153\n - type: euclidean_pearson\n value: 88.11750364639971\n - type: euclidean_spearman\n value: 88.63695109016498\n - type: main_score\n value: 87.9643473573153\n - type: manhattan_pearson\n value: 88.00294453126699\n - type: manhattan_spearman\n value: 88.53750241758391\n - type: pearson\n value: 87.11952824079832\n - type: spearman\n value: 87.9643473573153\n task:\n type: STS\n - dataset:\n config: ru\n name: MTEB STSBenchmarkMultilingualSTS (ru)\n revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c\n split: test\n type: mteb/stsb_multi_mt\n metrics:\n - type: cosine_pearson\n value: 85.99804354414991\n - type: cosine_spearman\n value: 86.30252111551002\n - type: euclidean_pearson\n value: 86.1880652037762\n - type: euclidean_spearman\n value: 86.69556223944502\n - type: main_score\n value: 86.30252111551002\n - type: manhattan_pearson\n value: 86.0736400320898\n - type: manhattan_spearman\n value: 86.61747927593393\n - type: pearson\n value: 85.99804354414991\n - type: spearman\n value: 86.30252111551002\n task:\n type: STS\n - dataset:\n config: en\n name: MTEB STSBenchmarkMultilingualSTS (en)\n revision: 29afa2569dcedaaa2fe6a3dcfebab33d28b82e8c\n split: test\n type: mteb/stsb_multi_mt\n metrics:\n - type: cosine_pearson\n value: 87.70246861738103\n - type: cosine_spearman\n value: 89.44070045346106\n - type: euclidean_pearson\n value: 89.56956518833663\n - type: euclidean_spearman\n value: 89.95830112784283\n - type: main_score\n value: 89.44070045346106\n - type: manhattan_pearson\n value: 89.48264470792915\n - type: manhattan_spearman\n value: 89.87900732483114\n - type: pearson\n value: 87.70246861738103\n - type: spearman\n value: 
89.44070045346106\n task:\n type: STS\n - dataset:\n config: default\n name: MTEB SciDocsRR (default)\n revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab\n split: test\n type: mteb/scidocs-reranking\n metrics:\n - type: map\n value: 84.88064122814694\n - type: mrr\n value: 95.84832651009123\n - type: main_score\n value: 84.88064122814694\n task:\n type: Reranking\n - dataset:\n config: default\n name: MTEB SciFact (default)\n revision: 0228b52cf27578f30900b9e5271d331663a030d7\n split: test\n type: mteb/scifact\n metrics:\n - type: map_at_1\n value: 57.289\n - type: map_at_10\n value: 67.88499999999999\n - type: map_at_100\n value: 68.477\n - type: map_at_1000\n value: 68.50500000000001\n - type: map_at_20\n value: 68.33500000000001\n - type: map_at_3\n value: 65.08\n - type: map_at_5\n value: 67.001\n - type: mrr_at_1\n value: 59.667\n - type: mrr_at_10\n value: 68.626\n - type: mrr_at_100\n value: 69.082\n - type: mrr_at_1000\n value: 69.108\n - type: mrr_at_20\n value: 68.958\n - type: mrr_at_3\n value: 66.667\n - type: mrr_at_5\n value: 67.983\n - type: ndcg_at_1\n value: 59.667\n - type: ndcg_at_10\n value: 72.309\n - type: ndcg_at_100\n value: 74.58399999999999\n - type: ndcg_at_1000\n value: 75.25500000000001\n - type: ndcg_at_20\n value: 73.656\n - type: ndcg_at_3\n value: 67.791\n - type: ndcg_at_5\n value: 70.45\n - type: precision_at_1\n value: 59.667\n - type: precision_at_10\n value: 9.567\n - type: precision_at_100\n value: 1.073\n - type: precision_at_1000\n value: 0.11299999999999999\n - type: precision_at_20\n value: 5.083\n - type: precision_at_3\n value: 26.333000000000002\n - type: precision_at_5\n value: 17.666999999999998\n - type: recall_at_1\n value: 57.289\n - type: recall_at_10\n value: 84.756\n - type: recall_at_100\n value: 94.5\n - type: recall_at_1000\n value: 99.667\n - type: recall_at_20\n value: 89.7\n - type: recall_at_3\n value: 73.22800000000001\n - type: recall_at_5\n value: 79.444\n - type: main_score\n value: 72.309\n task:\n 
type: Retrieval\n - dataset:\n config: default\n name: MTEB SpanishNewsClusteringP2P (default)\n revision: bf8ca8ddc5b7da4f7004720ddf99bbe0483480e6\n split: test\n type: jinaai/spanish_news_clustering\n metrics:\n - type: main_score\n value: 45.04477709795154\n - type: v_measure\n value: 45.04477709795154\n - type: v_measure_std\n value: 0.0\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB SpanishPassageRetrievalS2S (default)\n revision: 9cddf2ce5209ade52c2115ccfa00eb22c6d3a837\n split: test\n type: jinaai/spanish_passage_retrieval\n metrics:\n - type: main_score\n value: 69.83\n - type: map_at_1\n value: 15.736\n - type: map_at_10\n value: 52.027\n - type: map_at_100\n value: 65.08800000000001\n - type: map_at_1000\n value: 65.08800000000001\n - type: map_at_20\n value: 60.79900000000001\n - type: map_at_3\n value: 32.869\n - type: map_at_5\n value: 41.436\n - type: mrr_at_1\n value: 75.44910179640718\n - type: mrr_at_10\n value: 84.43446440452426\n - type: mrr_at_100\n value: 84.48052612723271\n - type: mrr_at_1000\n value: 84.48052612723271\n - type: mrr_at_20\n value: 84.48052612723271\n - type: mrr_at_3\n value: 83.13373253493013\n - type: mrr_at_5\n value: 84.3013972055888\n - type: nauc_map_at_1000_diff1\n value: 50.611540149694356\n - type: nauc_map_at_1000_max\n value: 2.1102430434260238\n - type: nauc_map_at_1000_std\n value: -18.88993521335793\n - type: nauc_map_at_100_diff1\n value: 50.611540149694356\n - type: nauc_map_at_100_max\n value: 2.1102430434260238\n - type: nauc_map_at_100_std\n value: -18.88993521335793\n - type: nauc_map_at_10_diff1\n value: 59.13518981755268\n - type: nauc_map_at_10_max\n value: -9.810386627392807\n - type: nauc_map_at_10_std\n value: -38.31810152345078\n - type: nauc_map_at_1_diff1\n value: 74.96782567287174\n - type: nauc_map_at_1_max\n value: -29.648279252607875\n - type: nauc_map_at_1_std\n value: -54.017459339141595\n - type: nauc_map_at_20_diff1\n value: 55.26694458629849\n - type: 
nauc_map_at_20_max\n value: -1.9490244535020729\n - type: nauc_map_at_20_std\n value: -25.22211659104076\n - type: nauc_map_at_3_diff1\n value: 71.67607885031732\n - type: nauc_map_at_3_max\n value: -25.078101661694507\n - type: nauc_map_at_3_std\n value: -50.55408861920259\n - type: nauc_map_at_5_diff1\n value: 61.50111515417668\n - type: nauc_map_at_5_max\n value: -16.4114670513168\n - type: nauc_map_at_5_std\n value: -44.391416134859135\n - type: nauc_mrr_at_1000_diff1\n value: 74.18848063283234\n - type: nauc_mrr_at_1000_max\n value: 21.929205946778005\n - type: nauc_mrr_at_1000_std\n value: -36.27399268489433\n - type: nauc_mrr_at_100_diff1\n value: 74.18848063283234\n - type: nauc_mrr_at_100_max\n value: 21.929205946778005\n - type: nauc_mrr_at_100_std\n value: -36.27399268489433\n - type: nauc_mrr_at_10_diff1\n value: 74.27231582268745\n - type: nauc_mrr_at_10_max\n value: 21.481133301135337\n - type: nauc_mrr_at_10_std\n value: -36.72070854872902\n - type: nauc_mrr_at_1_diff1\n value: 76.54855950439561\n - type: nauc_mrr_at_1_max\n value: 26.99938321212366\n - type: nauc_mrr_at_1_std\n value: -33.098742603429635\n - type: nauc_mrr_at_20_diff1\n value: 74.18848063283234\n - type: nauc_mrr_at_20_max\n value: 21.929205946778005\n - type: nauc_mrr_at_20_std\n value: -36.27399268489433\n - type: nauc_mrr_at_3_diff1\n value: 72.05379526740143\n - type: nauc_mrr_at_3_max\n value: 18.875831185752528\n - type: nauc_mrr_at_3_std\n value: -37.27302006456391\n - type: nauc_mrr_at_5_diff1\n value: 74.25342356682029\n - type: nauc_mrr_at_5_max\n value: 20.756340085088738\n - type: nauc_mrr_at_5_std\n value: -37.99507208540703\n - type: nauc_ndcg_at_1000_diff1\n value: 53.259363764380275\n - type: nauc_ndcg_at_1000_max\n value: 12.936954959423218\n - type: nauc_ndcg_at_1000_std\n value: -16.953898675672153\n - type: nauc_ndcg_at_100_diff1\n value: 53.259363764380275\n - type: nauc_ndcg_at_100_max\n value: 12.936954959423218\n - type: nauc_ndcg_at_100_std\n value: 
-16.953898675672153\n - type: nauc_ndcg_at_10_diff1\n value: 53.70942345413554\n - type: nauc_ndcg_at_10_max\n value: -3.8465093347016186\n - type: nauc_ndcg_at_10_std\n value: -31.208127919994755\n - type: nauc_ndcg_at_1_diff1\n value: 75.30551289259554\n - type: nauc_ndcg_at_1_max\n value: 25.53292054129834\n - type: nauc_ndcg_at_1_std\n value: -33.285498788395145\n - type: nauc_ndcg_at_20_diff1\n value: 57.62409278278133\n - type: nauc_ndcg_at_20_max\n value: 2.8040586426056233\n - type: nauc_ndcg_at_20_std\n value: -26.270875776221704\n - type: nauc_ndcg_at_3_diff1\n value: 48.42294834754225\n - type: nauc_ndcg_at_3_max\n value: 16.912467881065822\n - type: nauc_ndcg_at_3_std\n value: -13.324841189277873\n - type: nauc_ndcg_at_5_diff1\n value: 47.512819802794596\n - type: nauc_ndcg_at_5_max\n value: 14.645518203506594\n - type: nauc_ndcg_at_5_std\n value: -17.641450435599275\n - type: nauc_precision_at_1000_diff1\n value: -34.43320975829637\n - type: nauc_precision_at_1000_max\n value: 29.08585622578186\n - type: nauc_precision_at_1000_std\n value: 46.55117940162061\n - type: nauc_precision_at_100_diff1\n value: -34.433209758296364\n - type: nauc_precision_at_100_max\n value: 29.085856225781885\n - type: nauc_precision_at_100_std\n value: 46.55117940162065\n - type: nauc_precision_at_10_diff1\n value: -21.895306304096902\n - type: nauc_precision_at_10_max\n value: 33.190476527593745\n - type: nauc_precision_at_10_std\n value: 37.64916268614298\n - type: nauc_precision_at_1_diff1\n value: 75.30551289259554\n - type: nauc_precision_at_1_max\n value: 25.53292054129834\n - type: nauc_precision_at_1_std\n value: -33.285498788395145\n - type: nauc_precision_at_20_diff1\n value: -27.63076748060466\n - type: nauc_precision_at_20_max\n value: 30.689810416086154\n - type: nauc_precision_at_20_std\n value: 46.164191636131626\n - type: nauc_precision_at_3_diff1\n value: 20.547345067837288\n - type: nauc_precision_at_3_max\n value: 26.177050942827528\n - type: 
nauc_precision_at_3_std\n value: 5.960466052973099\n - type: nauc_precision_at_5_diff1\n value: -8.928755534002669\n - type: nauc_precision_at_5_max\n value: 40.83262650073459\n - type: nauc_precision_at_5_std\n value: 26.158537031161494\n - type: nauc_recall_at_1000_diff1\n value: .nan\n - type: nauc_recall_at_1000_max\n value: .nan\n - type: nauc_recall_at_1000_std\n value: .nan\n - type: nauc_recall_at_100_diff1\n value: .nan\n - type: nauc_recall_at_100_max\n value: .nan\n - type: nauc_recall_at_100_std\n value: .nan\n - type: nauc_recall_at_10_diff1\n value: 53.08654386169444\n - type: nauc_recall_at_10_max\n value: -23.276269379519356\n - type: nauc_recall_at_10_std\n value: -50.80707792706157\n - type: nauc_recall_at_1_diff1\n value: 74.96782567287174\n - type: nauc_recall_at_1_max\n value: -29.648279252607875\n - type: nauc_recall_at_1_std\n value: -54.017459339141595\n - type: nauc_recall_at_20_diff1\n value: 51.60121897059633\n - type: nauc_recall_at_20_max\n value: -14.241779530735387\n - type: nauc_recall_at_20_std\n value: -37.877451525215456\n - type: nauc_recall_at_3_diff1\n value: 66.99474984329694\n - type: nauc_recall_at_3_max\n value: -30.802787353187966\n - type: nauc_recall_at_3_std\n value: -53.58737792129713\n - type: nauc_recall_at_5_diff1\n value: 54.64214444958567\n - type: nauc_recall_at_5_max\n value: -23.341309362104703\n - type: nauc_recall_at_5_std\n value: -51.381363923145265\n - type: ndcg_at_1\n value: 76.048\n - type: ndcg_at_10\n value: 69.83\n - type: ndcg_at_100\n value: 82.11500000000001\n - type: ndcg_at_1000\n value: 82.11500000000001\n - type: ndcg_at_20\n value: 75.995\n - type: ndcg_at_3\n value: 69.587\n - type: ndcg_at_5\n value: 69.062\n - type: precision_at_1\n value: 76.048\n - type: precision_at_10\n value: 43.653\n - type: precision_at_100\n value: 7.718999999999999\n - type: precision_at_1000\n value: 0.772\n - type: precision_at_20\n value: 31.108000000000004\n - type: precision_at_3\n value: 63.87199999999999\n 
- type: precision_at_5\n value: 56.407\n - type: recall_at_1\n value: 15.736\n - type: recall_at_10\n value: 66.873\n - type: recall_at_100\n value: 100.0\n - type: recall_at_1000\n value: 100.0\n - type: recall_at_20\n value: 85.01100000000001\n - type: recall_at_3\n value: 36.441\n - type: recall_at_5\n value: 49.109\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB SprintDuplicateQuestions (default)\n revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46\n split: test\n type: mteb/sprintduplicatequestions-pairclassification\n metrics:\n - type: cosine_accuracy\n value: 99.87326732673267\n - type: cosine_accuracy_threshold\n value: 86.0752820968628\n - type: cosine_ap\n value: 96.98758090713252\n - type: cosine_f1\n value: 93.52881698685542\n - type: cosine_f1_threshold\n value: 86.0752820968628\n - type: cosine_precision\n value: 94.58077709611452\n - type: cosine_recall\n value: 92.5\n - type: dot_accuracy\n value: 99.82574257425742\n - type: dot_accuracy_threshold\n value: 40484.73815917969\n - type: dot_ap\n value: 95.68959907254845\n - type: dot_f1\n value: 91.31293188548865\n - type: dot_f1_threshold\n value: 40336.810302734375\n - type: dot_precision\n value: 90.15594541910332\n - type: dot_recall\n value: 92.5\n - type: euclidean_accuracy\n value: 99.87128712871286\n - type: euclidean_accuracy_threshold\n value: 1162.5749588012695\n - type: euclidean_ap\n value: 96.92640435656577\n - type: euclidean_f1\n value: 93.4475806451613\n - type: euclidean_f1_threshold\n value: 1162.5749588012695\n - type: euclidean_precision\n value: 94.20731707317073\n - type: euclidean_recall\n value: 92.7\n - type: main_score\n value: 96.98758090713252\n - type: manhattan_accuracy\n value: 99.86930693069307\n - type: manhattan_accuracy_threshold\n value: 28348.71826171875\n - type: manhattan_ap\n value: 96.93832673967925\n - type: manhattan_f1\n value: 93.33333333333333\n - type: manhattan_f1_threshold\n value: 28348.71826171875\n - type: manhattan_precision\n 
value: 94.28571428571428\n - type: manhattan_recall\n value: 92.4\n - type: max_accuracy\n value: 99.87326732673267\n - type: max_ap\n value: 96.98758090713252\n - type: max_f1\n value: 93.52881698685542\n - type: max_precision\n value: 94.58077709611452\n - type: max_recall\n value: 92.7\n - type: similarity_accuracy\n value: 99.87326732673267\n - type: similarity_accuracy_threshold\n value: 86.0752820968628\n - type: similarity_ap\n value: 96.98758090713252\n - type: similarity_f1\n value: 93.52881698685542\n - type: similarity_f1_threshold\n value: 86.0752820968628\n - type: similarity_precision\n value: 94.58077709611452\n - type: similarity_recall\n value: 92.5\n task:\n type: PairClassification\n - dataset:\n config: default\n name: MTEB StackExchangeClustering (default)\n revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259\n split: test\n type: mteb/stackexchange-clustering\n metrics:\n - type: main_score\n value: 65.6560129719848\n - type: v_measure\n value: 65.6560129719848\n - type: v_measure_std\n value: 4.781229811487539\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB StackExchangeClusteringP2P (default)\n revision: 815ca46b2622cec33ccafc3735d572c266efdb44\n split: test\n type: mteb/stackexchange-clustering-p2p\n metrics:\n - type: main_score\n value: 35.07546243853692\n - type: v_measure\n value: 35.07546243853692\n - type: v_measure_std\n value: 1.1978740356240998\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB StackOverflowDupQuestions (default)\n revision: e185fbe320c72810689fc5848eb6114e1ef5ec69\n split: test\n type: mteb/stackoverflowdupquestions-reranking\n metrics:\n - type: map\n value: 51.771005199508835\n - type: mrr\n value: 52.65443298531534\n - type: main_score\n value: 51.771005199508835\n task:\n type: Reranking\n - dataset:\n config: default\n name: MTEB SummEval (default)\n revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c\n split: test\n type: mteb/summeval\n metrics:\n - type: 
cosine_pearson\n value: 29.48686238342228\n - type: cosine_spearman\n value: 29.706543509170054\n - type: dot_pearson\n value: 27.95853155597859\n - type: dot_spearman\n value: 27.604287986935162\n - type: main_score\n value: 29.706543509170054\n - type: pearson\n value: 29.48686238342228\n - type: spearman\n value: 29.706543509170054\n task:\n type: Summarization\n - dataset:\n config: default\n name: MTEB SummEvalFr (default)\n revision: b385812de6a9577b6f4d0f88c6a6e35395a94054\n split: test\n type: lyon-nlp/summarization-summeval-fr-p2p\n metrics:\n - type: cosine_pearson\n value: 31.551301434917868\n - type: cosine_spearman\n value: 30.709049789175186\n - type: dot_pearson\n value: 27.77050901756549\n - type: dot_spearman\n value: 26.715505953561795\n - type: main_score\n value: 30.709049789175186\n - type: pearson\n value: 31.551301434917868\n - type: spearman\n value: 30.709049789175186\n task:\n type: Summarization\n - dataset:\n config: default\n name: MTEB SyntecReranking (default)\n revision: b205c5084a0934ce8af14338bf03feb19499c84d\n split: test\n type: lyon-nlp/mteb-fr-reranking-syntec-s2p\n metrics:\n - type: map\n value: 73.31666666666666\n - type: mrr\n value: 73.31666666666666\n - type: main_score\n value: 73.31666666666666\n task:\n type: Reranking\n - dataset:\n config: default\n name: MTEB SyntecRetrieval (default)\n revision: 19661ccdca4dfc2d15122d776b61685f48c68ca9\n split: test\n type: lyon-nlp/mteb-fr-retrieval-syntec-s2p\n metrics:\n - type: main_score\n value: 83.851\n - type: map_at_1\n value: 68.0\n - type: map_at_10\n value: 79.187\n - type: map_at_100\n value: 79.32900000000001\n - type: map_at_1000\n value: 79.32900000000001\n - type: map_at_20\n value: 79.32900000000001\n - type: map_at_3\n value: 77.333\n - type: map_at_5\n value: 78.93299999999999\n - type: mrr_at_1\n value: 68.0\n - type: mrr_at_10\n value: 79.18730158730159\n - type: mrr_at_100\n value: 79.32945845004669\n - type: mrr_at_1000\n value: 79.32945845004669\n - type: 
mrr_at_20\n value: 79.32945845004669\n - type: mrr_at_3\n value: 77.33333333333333\n - type: mrr_at_5\n value: 78.93333333333332\n - type: nauc_map_at_1000_diff1\n value: 63.31103256935259\n - type: nauc_map_at_1000_max\n value: 11.073749121365623\n - type: nauc_map_at_1000_std\n value: 7.4973309839738\n - type: nauc_map_at_100_diff1\n value: 63.31103256935259\n - type: nauc_map_at_100_max\n value: 11.073749121365623\n - type: nauc_map_at_100_std\n value: 7.4973309839738\n - type: nauc_map_at_10_diff1\n value: 62.91585737195978\n - type: nauc_map_at_10_max\n value: 11.770664508983133\n - type: nauc_map_at_10_std\n value: 8.179883948527962\n - type: nauc_map_at_1_diff1\n value: 66.1236265634718\n - type: nauc_map_at_1_max\n value: 7.000207311173955\n - type: nauc_map_at_1_std\n value: 6.54412272821497\n - type: nauc_map_at_20_diff1\n value: 63.31103256935259\n - type: nauc_map_at_20_max\n value: 11.073749121365623\n - type: nauc_map_at_20_std\n value: 7.4973309839738\n - type: nauc_map_at_3_diff1\n value: 62.14039574010254\n - type: nauc_map_at_3_max\n value: 11.06996398110187\n - type: nauc_map_at_3_std\n value: 7.288759297085769\n - type: nauc_map_at_5_diff1\n value: 63.0401271126211\n - type: nauc_map_at_5_max\n value: 10.779317801858609\n - type: nauc_map_at_5_std\n value: 6.476660484760681\n - type: nauc_mrr_at_1000_diff1\n value: 63.31103256935259\n - type: nauc_mrr_at_1000_max\n value: 11.073749121365623\n - type: nauc_mrr_at_1000_std\n value: 7.4973309839738\n - type: nauc_mrr_at_100_diff1\n value: 63.31103256935259\n - type: nauc_mrr_at_100_max\n value: 11.073749121365623\n - type: nauc_mrr_at_100_std\n value: 7.4973309839738\n - type: nauc_mrr_at_10_diff1\n value: 62.91585737195978\n - type: nauc_mrr_at_10_max\n value: 11.770664508983133\n - type: nauc_mrr_at_10_std\n value: 8.179883948527962\n - type: nauc_mrr_at_1_diff1\n value: 66.1236265634718\n - type: nauc_mrr_at_1_max\n value: 7.000207311173955\n - type: nauc_mrr_at_1_std\n value: 6.54412272821497\n 
- type: nauc_mrr_at_20_diff1\n value: 63.31103256935259\n - type: nauc_mrr_at_20_max\n value: 11.073749121365623\n - type: nauc_mrr_at_20_std\n value: 7.4973309839738\n - type: nauc_mrr_at_3_diff1\n value: 62.14039574010254\n - type: nauc_mrr_at_3_max\n value: 11.06996398110187\n - type: nauc_mrr_at_3_std\n value: 7.288759297085769\n - type: nauc_mrr_at_5_diff1\n value: 63.0401271126211\n - type: nauc_mrr_at_5_max\n value: 10.779317801858609\n - type: nauc_mrr_at_5_std\n value: 6.476660484760681\n - type: nauc_ndcg_at_1000_diff1\n value: 62.9544299483241\n - type: nauc_ndcg_at_1000_max\n value: 11.577079766964538\n - type: nauc_ndcg_at_1000_std\n value: 7.703856790100716\n - type: nauc_ndcg_at_100_diff1\n value: 62.9544299483241\n - type: nauc_ndcg_at_100_max\n value: 11.577079766964538\n - type: nauc_ndcg_at_100_std\n value: 7.703856790100716\n - type: nauc_ndcg_at_10_diff1\n value: 61.29907952217381\n - type: nauc_ndcg_at_10_max\n value: 14.760627422715425\n - type: nauc_ndcg_at_10_std\n value: 10.805573898143368\n - type: nauc_ndcg_at_1_diff1\n value: 66.1236265634718\n - type: nauc_ndcg_at_1_max\n value: 7.000207311173955\n - type: nauc_ndcg_at_1_std\n value: 6.54412272821497\n - type: nauc_ndcg_at_20_diff1\n value: 62.9544299483241\n - type: nauc_ndcg_at_20_max\n value: 11.577079766964538\n - type: nauc_ndcg_at_20_std\n value: 7.703856790100716\n - type: nauc_ndcg_at_3_diff1\n value: 60.25643527856101\n - type: nauc_ndcg_at_3_max\n value: 12.236302709487546\n - type: nauc_ndcg_at_3_std\n value: 7.36883189112067\n - type: nauc_ndcg_at_5_diff1\n value: 61.65220590318238\n - type: nauc_ndcg_at_5_max\n value: 11.39969101913945\n - type: nauc_ndcg_at_5_std\n value: 5.406207922379402\n - type: nauc_precision_at_1000_diff1\n value: .nan\n - type: nauc_precision_at_1000_max\n value: .nan\n - type: nauc_precision_at_1000_std\n value: .nan\n - type: nauc_precision_at_100_diff1\n value: .nan\n - type: nauc_precision_at_100_max\n value: .nan\n - type: 
nauc_precision_at_100_std\n value: .nan\n - type: nauc_precision_at_10_diff1\n value: 19.14098972922579\n - type: nauc_precision_at_10_max\n value: 100.0\n - type: nauc_precision_at_10_std\n value: 93.46405228758135\n - type: nauc_precision_at_1_diff1\n value: 66.1236265634718\n - type: nauc_precision_at_1_max\n value: 7.000207311173955\n - type: nauc_precision_at_1_std\n value: 6.54412272821497\n - type: nauc_precision_at_20_diff1\n value: 100.0\n - type: nauc_precision_at_20_max\n value: 100.0\n - type: nauc_precision_at_20_std\n value: 100.0\n - type: nauc_precision_at_3_diff1\n value: 50.29636629155561\n - type: nauc_precision_at_3_max\n value: 18.00532600292076\n - type: nauc_precision_at_3_std\n value: 7.649686453053768\n - type: nauc_precision_at_5_diff1\n value: 43.522408963585356\n - type: nauc_precision_at_5_max\n value: 16.923436041082983\n - type: nauc_precision_at_5_std\n value: -10.854341736694092\n - type: nauc_recall_at_1000_diff1\n value: .nan\n - type: nauc_recall_at_1000_max\n value: .nan\n - type: nauc_recall_at_1000_std\n value: .nan\n - type: nauc_recall_at_100_diff1\n value: .nan\n - type: nauc_recall_at_100_max\n value: .nan\n - type: nauc_recall_at_100_std\n value: .nan\n - type: nauc_recall_at_10_diff1\n value: 19.1409897292252\n - type: nauc_recall_at_10_max\n value: 100.0\n - type: nauc_recall_at_10_std\n value: 93.46405228758134\n - type: nauc_recall_at_1_diff1\n value: 66.1236265634718\n - type: nauc_recall_at_1_max\n value: 7.000207311173955\n - type: nauc_recall_at_1_std\n value: 6.54412272821497\n - type: nauc_recall_at_20_diff1\n value: .nan\n - type: nauc_recall_at_20_max\n value: .nan\n - type: nauc_recall_at_20_std\n value: .nan\n - type: nauc_recall_at_3_diff1\n value: 50.29636629155569\n - type: nauc_recall_at_3_max\n value: 18.005326002920754\n - type: nauc_recall_at_3_std\n value: 7.649686453053851\n - type: nauc_recall_at_5_diff1\n value: 43.5224089635856\n - type: nauc_recall_at_5_max\n value: 16.92343604108335\n - type: 
nauc_recall_at_5_std\n value: -10.854341736694499\n - type: ndcg_at_1\n value: 68.0\n - type: ndcg_at_10\n value: 83.851\n - type: ndcg_at_100\n value: 84.36099999999999\n - type: ndcg_at_1000\n value: 84.36099999999999\n - type: ndcg_at_20\n value: 84.36099999999999\n - type: ndcg_at_3\n value: 80.333\n - type: ndcg_at_5\n value: 83.21600000000001\n - type: precision_at_1\n value: 68.0\n - type: precision_at_10\n value: 9.8\n - type: precision_at_100\n value: 1.0\n - type: precision_at_1000\n value: 0.1\n - type: precision_at_20\n value: 5.0\n - type: precision_at_3\n value: 29.666999999999998\n - type: precision_at_5\n value: 19.2\n - type: recall_at_1\n value: 68.0\n - type: recall_at_10\n value: 98.0\n - type: recall_at_100\n value: 100.0\n - type: recall_at_1000\n value: 100.0\n - type: recall_at_20\n value: 100.0\n - type: recall_at_3\n value: 89.0\n - type: recall_at_5\n value: 96.0\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB T2Reranking (default)\n revision: 76631901a18387f85eaa53e5450019b87ad58ef9\n split: dev\n type: C-MTEB/T2Reranking\n metrics:\n - type: map\n value: 65.3088203970324\n - type: mrr\n value: 74.79505862376546\n - type: main_score\n value: 65.3088203970324\n task:\n type: Reranking\n - dataset:\n config: default\n name: MTEB T2Retrieval (default)\n revision: 8731a845f1bf500a4f111cf1070785c793d10e64\n split: dev\n type: C-MTEB/T2Retrieval\n metrics:\n - type: main_score\n value: 83.163\n - type: map_at_1\n value: 26.875\n - type: map_at_10\n value: 75.454\n - type: map_at_100\n value: 79.036\n - type: map_at_1000\n value: 79.111\n - type: map_at_20\n value: 78.145\n - type: map_at_3\n value: 53.181\n - type: map_at_5\n value: 65.362\n - type: mrr_at_1\n value: 88.90057864281957\n - type: mrr_at_10\n value: 91.53186397301344\n - type: mrr_at_100\n value: 91.62809075510003\n - type: mrr_at_1000\n value: 91.63198173030787\n - type: mrr_at_20\n value: 91.59414668799909\n - type: mrr_at_3\n value: 91.0792565316499\n - 
type: mrr_at_5\n value: 91.35718043135199\n - type: nauc_map_at_1000_diff1\n value: 12.364843957982409\n - type: nauc_map_at_1000_max\n value: 52.07043464458799\n - type: nauc_map_at_1000_std\n value: 16.040095055100494\n - type: nauc_map_at_100_diff1\n value: 12.370621073823022\n - type: nauc_map_at_100_max\n value: 51.960738727635636\n - type: nauc_map_at_100_std\n value: 15.935832440430747\n - type: nauc_map_at_10_diff1\n value: 16.852819486606585\n - type: nauc_map_at_10_max\n value: 40.11184760756059\n - type: nauc_map_at_10_std\n value: 0.9306648364102376\n - type: nauc_map_at_1_diff1\n value: 52.87356542654683\n - type: nauc_map_at_1_max\n value: -22.210039746171255\n - type: nauc_map_at_1_std\n value: -38.11345358035342\n - type: nauc_map_at_20_diff1\n value: 13.045089059562837\n - type: nauc_map_at_20_max\n value: 49.591383082160036\n - type: nauc_map_at_20_std\n value: 12.54330050352008\n - type: nauc_map_at_3_diff1\n value: 38.08172234377615\n - type: nauc_map_at_3_max\n value: -6.868621684867697\n - type: nauc_map_at_3_std\n value: -35.4712388845996\n - type: nauc_map_at_5_diff1\n value: 29.665551705577474\n - type: nauc_map_at_5_max\n value: 10.958628576519045\n - type: nauc_map_at_5_std\n value: -25.113120842097057\n - type: nauc_mrr_at_1000_diff1\n value: 47.39372999496945\n - type: nauc_mrr_at_1000_max\n value: 83.11274997493808\n - type: nauc_mrr_at_1000_std\n value: 39.74195374546631\n - type: nauc_mrr_at_100_diff1\n value: 47.396678946057676\n - type: nauc_mrr_at_100_max\n value: 83.1192584274415\n - type: nauc_mrr_at_100_std\n value: 39.75840860374685\n - type: nauc_mrr_at_10_diff1\n value: 47.35365644138715\n - type: nauc_mrr_at_10_max\n value: 83.189165639531\n - type: nauc_mrr_at_10_std\n value: 39.83653157887758\n - type: nauc_mrr_at_1_diff1\n value: 47.98740362820094\n - type: nauc_mrr_at_1_max\n value: 80.32340034580369\n - type: nauc_mrr_at_1_std\n value: 34.57857131423388\n - type: nauc_mrr_at_20_diff1\n value: 47.399132055537194\n - 
type: nauc_mrr_at_20_max\n value: 83.16329919869686\n - type: nauc_mrr_at_20_std\n value: 39.84204692042734\n - type: nauc_mrr_at_3_diff1\n value: 47.09295580511751\n - type: nauc_mrr_at_3_max\n value: 82.95831045602642\n - type: nauc_mrr_at_3_std\n value: 38.98036804692351\n - type: nauc_mrr_at_5_diff1\n value: 47.20100268549764\n - type: nauc_mrr_at_5_max\n value: 83.16652480381642\n - type: nauc_mrr_at_5_std\n value: 39.55690491560902\n - type: nauc_ndcg_at_1000_diff1\n value: 17.201962509184547\n - type: nauc_ndcg_at_1000_max\n value: 63.75820559259539\n - type: nauc_ndcg_at_1000_std\n value: 29.28676096486067\n - type: nauc_ndcg_at_100_diff1\n value: 16.76847216096811\n - type: nauc_ndcg_at_100_max\n value: 62.646517934470744\n - type: nauc_ndcg_at_100_std\n value: 28.7441617667637\n - type: nauc_ndcg_at_10_diff1\n value: 16.559511980751886\n - type: nauc_ndcg_at_10_max\n value: 54.35027464277944\n - type: nauc_ndcg_at_10_std\n value: 16.98089333577716\n - type: nauc_ndcg_at_1_diff1\n value: 47.98740362820094\n - type: nauc_ndcg_at_1_max\n value: 80.32340034580369\n - type: nauc_ndcg_at_1_std\n value: 34.57857131423388\n - type: nauc_ndcg_at_20_diff1\n value: 16.721525245428243\n - type: nauc_ndcg_at_20_max\n value: 57.683661870555724\n - type: nauc_ndcg_at_20_std\n value: 21.736044200026853\n - type: nauc_ndcg_at_3_diff1\n value: 12.488009696556192\n - type: nauc_ndcg_at_3_max\n value: 69.2365575305502\n - type: nauc_ndcg_at_3_std\n value: 30.622418945055323\n - type: nauc_ndcg_at_5_diff1\n value: 12.364114556230609\n - type: nauc_ndcg_at_5_max\n value: 62.33360746285387\n - type: nauc_ndcg_at_5_std\n value: 24.898000803570227\n - type: nauc_precision_at_1000_diff1\n value: -35.14745130154524\n - type: nauc_precision_at_1000_max\n value: 48.811507982849065\n - type: nauc_precision_at_1000_std\n value: 62.43036496029399\n - type: nauc_precision_at_100_diff1\n value: -35.15276411320076\n - type: nauc_precision_at_100_max\n value: 50.87010333741109\n - type: 
nauc_precision_at_100_std\n value: 63.418221030407175\n - type: nauc_precision_at_10_diff1\n value: -34.84255710936113\n - type: nauc_precision_at_10_max\n value: 56.588401051428825\n - type: nauc_precision_at_10_std\n value: 57.4763370653757\n - type: nauc_precision_at_1_diff1\n value: 47.98740362820094\n - type: nauc_precision_at_1_max\n value: 80.32340034580369\n - type: nauc_precision_at_1_std\n value: 34.57857131423388\n - type: nauc_precision_at_20_diff1\n value: -35.165762365233505\n - type: nauc_precision_at_20_max\n value: 54.148762449660424\n - type: nauc_precision_at_20_std\n value: 61.569719669368716\n - type: nauc_precision_at_3_diff1\n value: -28.63023175340299\n - type: nauc_precision_at_3_max\n value: 68.69825987618499\n - type: nauc_precision_at_3_std\n value: 48.15479495755423\n - type: nauc_precision_at_5_diff1\n value: -34.13811355456687\n - type: nauc_precision_at_5_max\n value: 62.369363941490604\n - type: nauc_precision_at_5_std\n value: 52.282904411187914\n - type: nauc_recall_at_1000_diff1\n value: 8.686444579162663\n - type: nauc_recall_at_1000_max\n value: 59.58864478011338\n - type: nauc_recall_at_1000_std\n value: 56.692774954297455\n - type: nauc_recall_at_100_diff1\n value: 8.820596225758342\n - type: nauc_recall_at_100_max\n value: 53.15048885657892\n - type: nauc_recall_at_100_std\n value: 39.78931159236714\n - type: nauc_recall_at_10_diff1\n value: 16.022301106315027\n - type: nauc_recall_at_10_max\n value: 29.83242342459543\n - type: nauc_recall_at_10_std\n value: -4.805965555875844\n - type: nauc_recall_at_1_diff1\n value: 52.87356542654683\n - type: nauc_recall_at_1_max\n value: -22.210039746171255\n - type: nauc_recall_at_1_std\n value: -38.11345358035342\n - type: nauc_recall_at_20_diff1\n value: 10.35772828627265\n - type: nauc_recall_at_20_max\n value: 43.06420839754062\n - type: nauc_recall_at_20_std\n value: 15.040522218235692\n - type: nauc_recall_at_3_diff1\n value: 36.23953684770224\n - type: nauc_recall_at_3_max\n 
value: -11.709269151700374\n - type: nauc_recall_at_3_std\n value: -38.13943178150384\n - type: nauc_recall_at_5_diff1\n value: 28.644872415763384\n - type: nauc_recall_at_5_max\n value: 2.062151266111129\n - type: nauc_recall_at_5_std\n value: -30.81114034774277\n - type: ndcg_at_1\n value: 88.901\n - type: ndcg_at_10\n value: 83.163\n - type: ndcg_at_100\n value: 86.854\n - type: ndcg_at_1000\n value: 87.602\n - type: ndcg_at_20\n value: 84.908\n - type: ndcg_at_3\n value: 84.848\n - type: ndcg_at_5\n value: 83.372\n - type: precision_at_1\n value: 88.901\n - type: precision_at_10\n value: 41.343\n - type: precision_at_100\n value: 4.957000000000001\n - type: precision_at_1000\n value: 0.513\n - type: precision_at_20\n value: 22.955000000000002\n - type: precision_at_3\n value: 74.29599999999999\n - type: precision_at_5\n value: 62.251999999999995\n - type: recall_at_1\n value: 26.875\n - type: recall_at_10\n value: 81.902\n - type: recall_at_100\n value: 93.988\n - type: recall_at_1000\n value: 97.801\n - type: recall_at_20\n value: 87.809\n - type: recall_at_3\n value: 54.869\n - type: recall_at_5\n value: 68.728\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB TERRa (default)\n revision: 7b58f24536063837d644aab9a023c62199b2a612\n split: dev\n type: ai-forever/terra-pairclassification\n metrics:\n - type: cosine_accuracy\n value: 60.586319218241044\n - type: cosine_accuracy_threshold\n value: 82.49806761741638\n - type: cosine_ap\n value: 58.73198048427448\n - type: cosine_f1\n value: 67.37967914438502\n - type: cosine_f1_threshold\n value: 77.46461033821106\n - type: cosine_precision\n value: 57.01357466063348\n - type: cosine_recall\n value: 82.35294117647058\n - type: dot_accuracy\n value: 60.26058631921825\n - type: dot_accuracy_threshold\n value: 35627.020263671875\n - type: dot_ap\n value: 57.418783612898224\n - type: dot_f1\n value: 66.51982378854623\n - type: dot_f1_threshold\n value: 27620.843505859375\n - type: dot_precision\n 
value: 50.16611295681063\n - type: dot_recall\n value: 98.69281045751634\n - type: euclidean_accuracy\n value: 60.26058631921825\n - type: euclidean_accuracy_threshold\n value: 1255.4466247558594\n - type: euclidean_ap\n value: 58.748656145387955\n - type: euclidean_f1\n value: 66.99029126213591\n - type: euclidean_f1_threshold\n value: 1565.1330947875977\n - type: euclidean_precision\n value: 53.28185328185329\n - type: euclidean_recall\n value: 90.19607843137256\n - type: main_score\n value: 58.8479126365766\n - type: manhattan_accuracy\n value: 59.934853420195445\n - type: manhattan_accuracy_threshold\n value: 29897.271728515625\n - type: manhattan_ap\n value: 58.8479126365766\n - type: manhattan_f1\n value: 66.81318681318683\n - type: manhattan_f1_threshold\n value: 46291.802978515625\n - type: manhattan_precision\n value: 50.331125827814574\n - type: manhattan_recall\n value: 99.34640522875817\n - type: max_accuracy\n value: 60.586319218241044\n - type: max_ap\n value: 58.8479126365766\n - type: max_f1\n value: 67.37967914438502\n - type: max_precision\n value: 57.01357466063348\n - type: max_recall\n value: 99.34640522875817\n - type: similarity_accuracy\n value: 60.586319218241044\n - type: similarity_accuracy_threshold\n value: 82.49806761741638\n - type: similarity_ap\n value: 58.73198048427448\n - type: similarity_f1\n value: 67.37967914438502\n - type: similarity_f1_threshold\n value: 77.46461033821106\n - type: similarity_precision\n value: 57.01357466063348\n - type: similarity_recall\n value: 82.35294117647058\n task:\n type: PairClassification\n - dataset:\n config: default\n name: MTEB TNews (default)\n revision: 317f262bf1e6126357bbe89e875451e4b0938fe4\n split: validation\n type: C-MTEB/TNews-classification\n metrics:\n - type: accuracy\n value: 45.967999999999996\n - type: f1\n value: 44.699306100915706\n - type: f1_weighted\n value: 46.03730319014832\n - type: main_score\n value: 45.967999999999996\n task:\n type: Classification\n - dataset:\n 
config: default\n name: MTEB TRECCOVID (default)\n revision: bb9466bac8153a0349341eb1b22e06409e78ef4e\n split: test\n type: mteb/trec-covid\n metrics:\n - type: map_at_1\n value: 0.251\n - type: map_at_10\n value: 1.9480000000000002\n - type: map_at_100\n value: 11.082\n - type: map_at_1000\n value: 26.700000000000003\n - type: map_at_20\n value: 3.3529999999999998\n - type: map_at_3\n value: 0.679\n - type: map_at_5\n value: 1.079\n - type: mrr_at_1\n value: 94.0\n - type: mrr_at_10\n value: 95.786\n - type: mrr_at_100\n value: 95.786\n - type: mrr_at_1000\n value: 95.786\n - type: mrr_at_20\n value: 95.786\n - type: mrr_at_3\n value: 95.0\n - type: mrr_at_5\n value: 95.5\n - type: ndcg_at_1\n value: 91.0\n - type: ndcg_at_10\n value: 77.71900000000001\n - type: ndcg_at_100\n value: 57.726\n - type: ndcg_at_1000\n value: 52.737\n - type: ndcg_at_20\n value: 72.54\n - type: ndcg_at_3\n value: 83.397\n - type: ndcg_at_5\n value: 80.806\n - type: precision_at_1\n value: 94.0\n - type: precision_at_10\n value: 81.0\n - type: precision_at_100\n value: 59.199999999999996\n - type: precision_at_1000\n value: 23.244\n - type: precision_at_20\n value: 75.2\n - type: precision_at_3\n value: 88.0\n - type: precision_at_5\n value: 84.8\n - type: recall_at_1\n value: 0.251\n - type: recall_at_10\n value: 2.1229999999999998\n - type: recall_at_100\n value: 14.496999999999998\n - type: recall_at_1000\n value: 50.09\n - type: recall_at_20\n value: 3.8309999999999995\n - type: recall_at_3\n value: 0.696\n - type: recall_at_5\n value: 1.1400000000000001\n - type: main_score\n value: 77.71900000000001\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB TenKGnadClusteringP2P (default)\n revision: 5c59e41555244b7e45c9a6be2d720ab4bafae558\n split: test\n type: slvnwhrl/tenkgnad-clustering-p2p\n metrics:\n - type: main_score\n value: 43.763609722295215\n - type: v_measure\n value: 43.763609722295215\n - type: v_measure_std\n value: 2.8751199473862457\n task:\n type: 
Clustering\n - dataset:\n config: default\n name: MTEB TenKGnadClusteringS2S (default)\n revision: 6cddbe003f12b9b140aec477b583ac4191f01786\n split: test\n type: slvnwhrl/tenkgnad-clustering-s2s\n metrics:\n - type: main_score\n value: 39.762424448504355\n - type: v_measure\n value: 39.762424448504355\n - type: v_measure_std\n value: 3.30146124979502\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB ThuNewsClusteringP2P (default)\n revision: 5798586b105c0434e4f0fe5e767abe619442cf93\n split: test\n type: C-MTEB/ThuNewsClusteringP2P\n metrics:\n - type: main_score\n value: 63.133819258289456\n - type: v_measure\n value: 63.133819258289456\n - type: v_measure_std\n value: 1.8854253356479695\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB ThuNewsClusteringS2S (default)\n revision: 8a8b2caeda43f39e13c4bc5bea0f8a667896e10d\n split: test\n type: C-MTEB/ThuNewsClusteringS2S\n metrics:\n - type: main_score\n value: 58.98195851785808\n - type: v_measure\n value: 58.98195851785808\n - type: v_measure_std\n value: 1.6237600076393737\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB Touche2020 (default)\n revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f\n split: test\n type: mteb/touche2020\n metrics:\n - type: map_at_1\n value: 3.3550000000000004\n - type: map_at_10\n value: 10.08\n - type: map_at_100\n value: 16.136\n - type: map_at_1000\n value: 17.605\n - type: map_at_20\n value: 12.561\n - type: map_at_3\n value: 5.641\n - type: map_at_5\n value: 7.3260000000000005\n - type: mrr_at_1\n value: 46.939\n - type: mrr_at_10\n value: 58.152\n - type: mrr_at_100\n value: 58.594\n - type: mrr_at_1000\n value: 58.601000000000006\n - type: mrr_at_20\n value: 58.279\n - type: mrr_at_3\n value: 55.102\n - type: mrr_at_5\n value: 56.531\n - type: ndcg_at_1\n value: 44.897999999999996\n - type: ndcg_at_10\n value: 26.298\n - type: ndcg_at_100\n value: 37.596000000000004\n - type: ndcg_at_1000\n value: 49.424\n - type: 
ndcg_at_20\n value: 27.066000000000003\n - type: ndcg_at_3\n value: 31.528\n - type: ndcg_at_5\n value: 28.219\n - type: precision_at_1\n value: 46.939\n - type: precision_at_10\n value: 22.245\n - type: precision_at_100\n value: 7.531000000000001\n - type: precision_at_1000\n value: 1.5350000000000001\n - type: precision_at_20\n value: 17.041\n - type: precision_at_3\n value: 30.612000000000002\n - type: precision_at_5\n value: 26.122\n - type: recall_at_1\n value: 3.3550000000000004\n - type: recall_at_10\n value: 16.41\n - type: recall_at_100\n value: 47.272\n - type: recall_at_1000\n value: 83.584\n - type: recall_at_20\n value: 24.091\n - type: recall_at_3\n value: 6.8180000000000005\n - type: recall_at_5\n value: 9.677\n - type: main_score\n value: 26.298\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB ToxicConversationsClassification (default)\n revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de\n split: test\n type: mteb/toxic_conversations_50k\n metrics:\n - type: accuracy\n value: 91.2890625\n - type: ap\n value: 33.95547153875715\n - type: ap_weighted\n value: 33.95547153875715\n - type: f1\n value: 75.10768597556462\n - type: f1_weighted\n value: 92.00161208992606\n - type: main_score\n value: 91.2890625\n task:\n type: Classification\n - dataset:\n config: default\n name: MTEB TweetSentimentExtractionClassification (default)\n revision: d604517c81ca91fe16a244d1248fc021f9ecee7a\n split: test\n type: mteb/tweet_sentiment_extraction\n metrics:\n - type: accuracy\n value: 71.3978494623656\n - type: f1\n value: 71.7194818511814\n - type: f1_weighted\n value: 71.13860187349744\n - type: main_score\n value: 71.3978494623656\n task:\n type: Classification\n - dataset:\n config: default\n name: MTEB TwentyNewsgroupsClustering (default)\n revision: 6125ec4e24fa026cec8a478383ee943acfbd5449\n split: test\n type: mteb/twentynewsgroups-clustering\n metrics:\n - type: main_score\n value: 52.4921688720602\n - type: v_measure\n value: 
52.4921688720602\n - type: v_measure_std\n value: 0.992768152658908\n task:\n type: Clustering\n - dataset:\n config: default\n name: MTEB TwitterSemEval2015 (default)\n revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1\n split: test\n type: mteb/twittersemeval2015-pairclassification\n metrics:\n - type: cosine_accuracy\n value: 85.11652858079513\n - type: cosine_accuracy_threshold\n value: 87.90839910507202\n - type: cosine_ap\n value: 70.90459908851724\n - type: cosine_f1\n value: 65.66581227877457\n - type: cosine_f1_threshold\n value: 85.13308763504028\n - type: cosine_precision\n value: 61.094708153531684\n - type: cosine_recall\n value: 70.97625329815304\n - type: dot_accuracy\n value: 83.41181379269239\n - type: dot_accuracy_threshold\n value: 43110.113525390625\n - type: dot_ap\n value: 65.64869491143095\n - type: dot_f1\n value: 62.05308447460914\n - type: dot_f1_threshold\n value: 41412.542724609375\n - type: dot_precision\n value: 57.38623626989464\n - type: dot_recall\n value: 67.54617414248021\n - type: euclidean_accuracy\n value: 85.15229182809799\n - type: euclidean_accuracy_threshold\n value: 1043.08500289917\n - type: euclidean_ap\n value: 70.71204383269375\n - type: euclidean_f1\n value: 65.20304568527919\n - type: euclidean_f1_threshold\n value: 1179.2595863342285\n - type: euclidean_precision\n value: 62.81173594132029\n - type: euclidean_recall\n value: 67.78364116094987\n - type: main_score\n value: 70.90459908851724\n - type: manhattan_accuracy\n value: 85.1820945341837\n - type: manhattan_accuracy_threshold\n value: 26115.0390625\n - type: manhattan_ap\n value: 70.66113937117431\n - type: manhattan_f1\n value: 65.33383628819313\n - type: manhattan_f1_threshold\n value: 29105.181884765625\n - type: manhattan_precision\n value: 62.40691808791736\n - type: manhattan_recall\n value: 68.54881266490766\n - type: max_accuracy\n value: 85.1820945341837\n - type: max_ap\n value: 70.90459908851724\n - type: max_f1\n value: 65.66581227877457\n - type: 
max_precision\n value: 62.81173594132029\n - type: max_recall\n value: 70.97625329815304\n - type: similarity_accuracy\n value: 85.11652858079513\n - type: similarity_accuracy_threshold\n value: 87.90839910507202\n - type: similarity_ap\n value: 70.90459908851724\n - type: similarity_f1\n value: 65.66581227877457\n - type: similarity_f1_threshold\n value: 85.13308763504028\n - type: similarity_precision\n value: 61.094708153531684\n - type: similarity_recall\n value: 70.97625329815304\n task:\n type: PairClassification\n - dataset:\n config: default\n name: MTEB TwitterURLCorpus (default)\n revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf\n split: test\n type: mteb/twitterurlcorpus-pairclassification\n metrics:\n - type: cosine_accuracy\n value: 88.10299996119068\n - type: cosine_accuracy_threshold\n value: 84.34982895851135\n - type: cosine_ap\n value: 84.13755787769226\n - type: cosine_f1\n value: 76.0967548076923\n - type: cosine_f1_threshold\n value: 82.8936219215393\n - type: cosine_precision\n value: 74.28864769727193\n - type: cosine_recall\n value: 77.99507237449954\n - type: dot_accuracy\n value: 86.64182869561843\n - type: dot_accuracy_threshold\n value: 38794.677734375\n - type: dot_ap\n value: 80.20301567411457\n - type: dot_f1\n value: 73.50650291634967\n - type: dot_f1_threshold\n value: 37447.23205566406\n - type: dot_precision\n value: 69.41498460485802\n - type: dot_recall\n value: 78.11056359716662\n - type: euclidean_accuracy\n value: 87.9361198432103\n - type: euclidean_accuracy_threshold\n value: 1184.421157836914\n - type: euclidean_ap\n value: 83.79582690117218\n - type: euclidean_f1\n value: 75.81431709042175\n - type: euclidean_f1_threshold\n value: 1258.2727432250977\n - type: euclidean_precision\n value: 73.39099099099099\n - type: euclidean_recall\n value: 78.40314136125654\n - type: main_score\n value: 84.13755787769226\n - type: manhattan_accuracy\n value: 87.96134590755618\n - type: manhattan_accuracy_threshold\n value: 
29077.291870117188\n - type: manhattan_ap\n value: 83.79487172269923\n - type: manhattan_f1\n value: 75.82421603424935\n - type: manhattan_f1_threshold\n value: 31224.124145507812\n - type: manhattan_precision\n value: 72.24740255212329\n - type: manhattan_recall\n value: 79.77363720357253\n - type: max_accuracy\n value: 88.10299996119068\n - type: max_ap\n value: 84.13755787769226\n - type: max_f1\n value: 76.0967548076923\n - type: max_precision\n value: 74.28864769727193\n - type: max_recall\n value: 79.77363720357253\n - type: similarity_accuracy\n value: 88.10299996119068\n - type: similarity_accuracy_threshold\n value: 84.34982895851135\n - type: similarity_ap\n value: 84.13755787769226\n - type: similarity_f1\n value: 76.0967548076923\n - type: similarity_f1_threshold\n value: 82.8936219215393\n - type: similarity_precision\n value: 74.28864769727193\n - type: similarity_recall\n value: 77.99507237449954\n task:\n type: PairClassification\n - dataset:\n config: default\n name: MTEB VideoRetrieval (default)\n revision: 58c2597a5943a2ba48f4668c3b90d796283c5639\n split: dev\n type: C-MTEB/VideoRetrieval\n metrics:\n - type: main_score\n value: 70.433\n - type: map_at_1\n value: 55.7\n - type: map_at_10\n value: 66.013\n - type: map_at_100\n value: 66.534\n - type: map_at_1000\n value: 66.547\n - type: map_at_20\n value: 66.334\n - type: map_at_3\n value: 64.2\n - type: map_at_5\n value: 65.445\n - type: mrr_at_1\n value: 55.7\n - type: mrr_at_10\n value: 66.01329365079364\n - type: mrr_at_100\n value: 66.53350061744233\n - type: mrr_at_1000\n value: 66.54744831962995\n - type: mrr_at_20\n value: 66.3335147364675\n - type: mrr_at_3\n value: 64.2\n - type: mrr_at_5\n value: 65.44500000000002\n - type: nauc_map_at_1000_diff1\n value: 76.26428836976245\n - type: nauc_map_at_1000_max\n value: 35.41847367373575\n - type: nauc_map_at_1000_std\n value: -33.04639860831992\n - type: nauc_map_at_100_diff1\n value: 76.25793229023193\n - type: nauc_map_at_100_max\n value: 
35.43663260110076\n - type: nauc_map_at_100_std\n value: -33.04238139882945\n - type: nauc_map_at_10_diff1\n value: 76.2108281297711\n - type: nauc_map_at_10_max\n value: 35.59442419423183\n - type: nauc_map_at_10_std\n value: -33.32346518997277\n - type: nauc_map_at_1_diff1\n value: 79.17728405262736\n - type: nauc_map_at_1_max\n value: 31.880738163589527\n - type: nauc_map_at_1_std\n value: -30.891888718004584\n - type: nauc_map_at_20_diff1\n value: 76.2181333410193\n - type: nauc_map_at_20_max\n value: 35.43448818430876\n - type: nauc_map_at_20_std\n value: -33.35682442863193\n - type: nauc_map_at_3_diff1\n value: 76.10046541433466\n - type: nauc_map_at_3_max\n value: 34.6831278555291\n - type: nauc_map_at_3_std\n value: -34.030826044831116\n - type: nauc_map_at_5_diff1\n value: 75.96513023582064\n - type: nauc_map_at_5_max\n value: 34.66920832438069\n - type: nauc_map_at_5_std\n value: -33.79799777830796\n - type: nauc_mrr_at_1000_diff1\n value: 76.26428836976245\n - type: nauc_mrr_at_1000_max\n value: 35.41847367373575\n - type: nauc_mrr_at_1000_std\n value: -33.04639860831992\n - type: nauc_mrr_at_100_diff1\n value: 76.25793229023193\n - type: nauc_mrr_at_100_max\n value: 35.43663260110076\n - type: nauc_mrr_at_100_std\n value: -33.04238139882945\n - type: nauc_mrr_at_10_diff1\n value: 76.2108281297711\n - type: nauc_mrr_at_10_max\n value: 35.59442419423183\n - type: nauc_mrr_at_10_std\n value: -33.32346518997277\n - type: nauc_mrr_at_1_diff1\n value: 79.17728405262736\n - type: nauc_mrr_at_1_max\n value: 31.880738163589527\n - type: nauc_mrr_at_1_std\n value: -30.891888718004584\n - type: nauc_mrr_at_20_diff1\n value: 76.2181333410193\n - type: nauc_mrr_at_20_max\n value: 35.43448818430876\n - type: nauc_mrr_at_20_std\n value: -33.35682442863193\n - type: nauc_mrr_at_3_diff1\n value: 76.10046541433466\n - type: nauc_mrr_at_3_max\n value: 34.6831278555291\n - type: nauc_mrr_at_3_std\n value: -34.030826044831116\n - type: nauc_mrr_at_5_diff1\n value: 
75.96513023582064\n - type: nauc_mrr_at_5_max\n value: 34.66920832438069\n - type: nauc_mrr_at_5_std\n value: -33.79799777830796\n - type: nauc_ndcg_at_1000_diff1\n value: 75.68118206798317\n - type: nauc_ndcg_at_1000_max\n value: 37.12252980787349\n - type: nauc_ndcg_at_1000_std\n value: -31.457578337430505\n - type: nauc_ndcg_at_100_diff1\n value: 75.46730761564156\n - type: nauc_ndcg_at_100_max\n value: 37.549890025544265\n - type: nauc_ndcg_at_100_std\n value: -31.35066985945112\n - type: nauc_ndcg_at_10_diff1\n value: 75.09890404887037\n - type: nauc_ndcg_at_10_max\n value: 38.024147790014204\n - type: nauc_ndcg_at_10_std\n value: -33.67408368593356\n - type: nauc_ndcg_at_1_diff1\n value: 79.17728405262736\n - type: nauc_ndcg_at_1_max\n value: 31.880738163589527\n - type: nauc_ndcg_at_1_std\n value: -30.891888718004584\n - type: nauc_ndcg_at_20_diff1\n value: 75.12977548171354\n - type: nauc_ndcg_at_20_max\n value: 37.524926748917956\n - type: nauc_ndcg_at_20_std\n value: -33.771344674947485\n - type: nauc_ndcg_at_3_diff1\n value: 74.94037476984154\n - type: nauc_ndcg_at_3_max\n value: 35.60345554050552\n - type: nauc_ndcg_at_3_std\n value: -35.256991346321854\n - type: nauc_ndcg_at_5_diff1\n value: 74.54265907753783\n - type: nauc_ndcg_at_5_max\n value: 35.57662819978585\n - type: nauc_ndcg_at_5_std\n value: -34.879794448418465\n - type: nauc_precision_at_1000_diff1\n value: 74.52277207179142\n - type: nauc_precision_at_1000_max\n value: 94.25510945118707\n - type: nauc_precision_at_1000_std\n value: 91.6874157070222\n - type: nauc_precision_at_100_diff1\n value: 65.98346655735419\n - type: nauc_precision_at_100_max\n value: 78.81168727653687\n - type: nauc_precision_at_100_std\n value: 27.241465691967708\n - type: nauc_precision_at_10_diff1\n value: 69.55050319096688\n - type: nauc_precision_at_10_max\n value: 51.827749140893374\n - type: nauc_precision_at_10_std\n value: -34.60818605792837\n - type: nauc_precision_at_1_diff1\n value: 79.17728405262736\n - 
type: nauc_precision_at_1_max\n value: 31.880738163589527\n - type: nauc_precision_at_1_std\n value: -30.891888718004584\n - type: nauc_precision_at_20_diff1\n value: 68.08078305042736\n - type: nauc_precision_at_20_max\n value: 52.83318878288501\n - type: nauc_precision_at_20_std\n value: -35.46070292817927\n - type: nauc_precision_at_3_diff1\n value: 70.76249609881901\n - type: nauc_precision_at_3_max\n value: 38.86561868624655\n - type: nauc_precision_at_3_std\n value: -39.68917853446992\n - type: nauc_precision_at_5_diff1\n value: 68.39110629013278\n - type: nauc_precision_at_5_max\n value: 39.28677163904683\n - type: nauc_precision_at_5_std\n value: -39.39101423819562\n - type: nauc_recall_at_1000_diff1\n value: 74.52277207179175\n - type: nauc_recall_at_1000_max\n value: 94.25510945118776\n - type: nauc_recall_at_1000_std\n value: 91.68741570702382\n - type: nauc_recall_at_100_diff1\n value: 65.9834665573548\n - type: nauc_recall_at_100_max\n value: 78.81168727653679\n - type: nauc_recall_at_100_std\n value: 27.241465691967598\n - type: nauc_recall_at_10_diff1\n value: 69.55050319096708\n - type: nauc_recall_at_10_max\n value: 51.82774914089347\n - type: nauc_recall_at_10_std\n value: -34.6081860579283\n - type: nauc_recall_at_1_diff1\n value: 79.17728405262736\n - type: nauc_recall_at_1_max\n value: 31.880738163589527\n - type: nauc_recall_at_1_std\n value: -30.891888718004584\n - type: nauc_recall_at_20_diff1\n value: 68.08078305042746\n - type: nauc_recall_at_20_max\n value: 52.833188782885244\n - type: nauc_recall_at_20_std\n value: -35.46070292817895\n - type: nauc_recall_at_3_diff1\n value: 70.76249609881896\n - type: nauc_recall_at_3_max\n value: 38.865618686246464\n - type: nauc_recall_at_3_std\n value: -39.68917853446999\n - type: nauc_recall_at_5_diff1\n value: 68.39110629013274\n - type: nauc_recall_at_5_max\n value: 39.28677163904688\n - type: nauc_recall_at_5_std\n value: -39.39101423819562\n - type: ndcg_at_1\n value: 55.7\n - type: ndcg_at_10\n 
value: 70.433\n - type: ndcg_at_100\n value: 72.975\n - type: ndcg_at_1000\n value: 73.283\n - type: ndcg_at_20\n value: 71.58\n - type: ndcg_at_3\n value: 66.83099999999999\n - type: ndcg_at_5\n value: 69.085\n - type: precision_at_1\n value: 55.7\n - type: precision_at_10\n value: 8.4\n - type: precision_at_100\n value: 0.959\n - type: precision_at_1000\n value: 0.098\n - type: precision_at_20\n value: 4.425\n - type: precision_at_3\n value: 24.8\n - type: precision_at_5\n value: 15.98\n - type: recall_at_1\n value: 55.7\n - type: recall_at_10\n value: 84.0\n - type: recall_at_100\n value: 95.89999999999999\n - type: recall_at_1000\n value: 98.2\n - type: recall_at_20\n value: 88.5\n - type: recall_at_3\n value: 74.4\n - type: recall_at_5\n value: 79.9\n task:\n type: Retrieval\n - dataset:\n config: default\n name: MTEB Waimai (default)\n revision: 339287def212450dcaa9df8c22bf93e9980c7023\n split: test\n type: C-MTEB/waimai-classification\n metrics:\n - type: accuracy\n value: 86.58999999999999\n - type: ap\n value: 70.02619249927523\n - type: ap_weighted\n value: 70.02619249927523\n - type: f1\n value: 84.97572770889423\n - type: f1_weighted\n value: 86.6865713531272\n - type: main_score\n value: 86.58999999999999\n task:\n type: Classification\n - dataset:\n config: en\n name: MTEB XMarket (en)\n revision: dfe57acff5b62c23732a7b7d3e3fb84ff501708b\n split: test\n type: jinaai/xmarket_ml\n metrics:\n - type: main_score\n value: 34.772999999999996\n - type: map_at_1\n value: 7.2620000000000005\n - type: map_at_10\n value: 17.98\n - type: map_at_100\n value: 24.828\n - type: map_at_1000\n value: 26.633000000000003\n - type: map_at_20\n value: 20.699\n - type: map_at_3\n value: 12.383\n - type: map_at_5\n value: 14.871\n - type: mrr_at_1\n value: 34.718100890207715\n - type: mrr_at_10\n value: 43.9336827525092\n - type: mrr_at_100\n value: 44.66474011066837\n - type: mrr_at_1000\n value: 44.7075592197356\n - type: mrr_at_20\n value: 44.35984436569346\n - type: 
mrr_at_3\n value: 41.73901893981052\n - type: mrr_at_5\n value: 43.025973550207134\n - type: nauc_map_at_1000_diff1\n value: 13.899869081196364\n - type: nauc_map_at_1000_max\n value: 46.60452816386231\n - type: nauc_map_at_1000_std\n value: 24.87925799401773\n - type: nauc_map_at_100_diff1\n value: 16.164805650871084\n - type: nauc_map_at_100_max\n value: 44.720912958558095\n - type: nauc_map_at_100_std\n value: 20.236734536210477\n - type: nauc_map_at_10_diff1\n value: 23.58580520913581\n - type: nauc_map_at_10_max\n value: 31.276151869914216\n - type: nauc_map_at_10_std\n value: -0.1833326246041355\n - type: nauc_map_at_1_diff1\n value: 37.02663305598722\n - type: nauc_map_at_1_max\n value: 14.931071531116528\n - type: nauc_map_at_1_std\n value: -12.478790028708453\n - type: nauc_map_at_20_diff1\n value: 20.718297881540593\n - type: nauc_map_at_20_max\n value: 36.62264094841859\n - type: nauc_map_at_20_std\n value: 6.658514770057742\n - type: nauc_map_at_3_diff1\n value: 29.379034581120006\n - type: nauc_map_at_3_max\n value: 21.387214269548803\n - type: nauc_map_at_3_std\n value: -9.3404121914247\n - type: nauc_map_at_5_diff1\n value: 26.627169792839485\n - type: nauc_map_at_5_max\n value: 25.393331109666388\n - type: nauc_map_at_5_std\n value: -6.023485287246353\n - type: nauc_mrr_at_1000_diff1\n value: 12.047232036652295\n - type: nauc_mrr_at_1000_max\n value: 46.611862580860645\n - type: nauc_mrr_at_1000_std\n value: 27.89146066442305\n - type: nauc_mrr_at_100_diff1\n value: 12.05261747449997\n - type: nauc_mrr_at_100_max\n value: 46.61328535381203\n - type: nauc_mrr_at_100_std\n value: 27.886145596874535\n - type: nauc_mrr_at_10_diff1\n value: 12.006935553036941\n - type: nauc_mrr_at_10_max\n value: 46.53351686240496\n - type: nauc_mrr_at_10_std\n value: 27.708742470257462\n - type: nauc_mrr_at_1_diff1\n value: 13.323408127738782\n - type: nauc_mrr_at_1_max\n value: 43.78884661002012\n - type: nauc_mrr_at_1_std\n value: 25.164417588165673\n - type: 
nauc_mrr_at_20_diff1\n value: 12.036022973968011\n - type: nauc_mrr_at_20_max\n value: 46.56537838037131\n - type: nauc_mrr_at_20_std\n value: 27.78189157249635\n - type: nauc_mrr_at_3_diff1\n value: 11.943896700976381\n - type: nauc_mrr_at_3_max\n value: 46.33644663073225\n - type: nauc_mrr_at_3_std\n value: 27.523915405053845\n - type: nauc_mrr_at_5_diff1\n value: 12.03108009033769\n - type: nauc_mrr_at_5_max\n value: 46.49103616896692\n - type: nauc_mrr_at_5_std\n value: 27.630879129863366\n - type: nauc_ndcg_at_1000_diff1\n value: 9.766823796017324\n - type: nauc_ndcg_at_1000_max\n value: 52.85844801910602\n - type: nauc_ndcg_at_1000_std\n value: 36.43271437761207\n - type: nauc_ndcg_at_100_diff1\n value: 12.035059298282036\n - type: nauc_ndcg_at_100_max\n value: 50.05520240705682\n - type: nauc_ndcg_at_100_std\n value: 29.87678724506636\n - type: nauc_ndcg_at_10_diff1\n value: 10.281893031139424\n - type: nauc_ndcg_at_10_max\n value: 47.02153679426017\n - type: nauc_ndcg_at_10_std\n value: 26.624948330369126\n - type: nauc_ndcg_at_1_diff1\n value: 13.323408127738782\n - type: nauc_ndcg_at_1_max\n value: 43.78884661002012\n - type: nauc_ndcg_at_1_std\n value: 25.164417588165673\n - type: nauc_ndcg_at_20_diff1\n value: 11.463524849646598\n - type: nauc_ndcg_at_20_max\n value: 47.415073186019704\n - type: nauc_ndcg_at_20_std\n value: 26.359019620164307\n - type: nauc_ndcg_at_3_diff1\n value: 9.689199913805394\n - type: nauc_ndcg_at_3_max\n value: 45.68151849572808\n - type: nauc_ndcg_at_3_std\n value: 26.559193219799486\n - type: nauc_ndcg_at_5_diff1\n value: 9.448823370356575\n - type: nauc_ndcg_at_5_max\n value: 46.19999662690141\n - type: nauc_ndcg_at_5_std\n value: 26.8411706726069\n - type: nauc_precision_at_1000_diff1\n value: -20.379065598727024\n - type: nauc_precision_at_1000_max\n value: 13.162562437268427\n - type: nauc_precision_at_1000_std\n value: 22.658226157785812\n - type: nauc_precision_at_100_diff1\n value: -16.458155977309282\n - type: 
nauc_precision_at_100_max\n value: 35.97956789169889\n - type: nauc_precision_at_100_std\n value: 48.878375009979194\n - type: nauc_precision_at_10_diff1\n value: -7.810992317607771\n - type: nauc_precision_at_10_max\n value: 49.307339277444754\n - type: nauc_precision_at_10_std\n value: 42.82533951854582\n - type: nauc_precision_at_1_diff1\n value: 13.323408127738782\n - type: nauc_precision_at_1_max\n value: 43.78884661002012\n - type: nauc_precision_at_1_std\n value: 25.164417588165673\n - type: nauc_precision_at_20_diff1\n value: -11.43933465149542\n - type: nauc_precision_at_20_max\n value: 46.93722753460038\n - type: nauc_precision_at_20_std\n value: 47.36223769029678\n - type: nauc_precision_at_3_diff1\n value: 1.3230178593599737\n - type: nauc_precision_at_3_max\n value: 48.49039534395576\n - type: nauc_precision_at_3_std\n value: 33.161384183129194\n - type: nauc_precision_at_5_diff1\n value: -3.185516457926519\n - type: nauc_precision_at_5_max\n value: 49.5814309394308\n - type: nauc_precision_at_5_std\n value: 37.57637865900281\n - type: nauc_recall_at_1000_diff1\n value: 7.839499443984168\n - type: nauc_recall_at_1000_max\n value: 52.67165467640894\n - type: nauc_recall_at_1000_std\n value: 48.85318316702583\n - type: nauc_recall_at_100_diff1\n value: 14.117557049589418\n - type: nauc_recall_at_100_max\n value: 40.59046301348715\n - type: nauc_recall_at_100_std\n value: 24.379680901739505\n - type: nauc_recall_at_10_diff1\n value: 20.04536052614054\n - type: nauc_recall_at_10_max\n value: 25.54148839721574\n - type: nauc_recall_at_10_std\n value: -1.938182527562211\n - type: nauc_recall_at_1_diff1\n value: 37.02663305598722\n - type: nauc_recall_at_1_max\n value: 14.931071531116528\n - type: nauc_recall_at_1_std\n value: -12.478790028708453\n - type: nauc_recall_at_20_diff1\n value: 17.959977483235566\n - type: nauc_recall_at_20_max\n value: 29.88502687870809\n - type: nauc_recall_at_20_std\n value: 4.26527395196852\n - type: nauc_recall_at_3_diff1\n 
value: 26.297810954500456\n - type: nauc_recall_at_3_max\n value: 18.819406079307402\n - type: nauc_recall_at_3_std\n value: -10.002237229729081\n - type: nauc_recall_at_5_diff1\n value: 22.739080899568485\n - type: nauc_recall_at_5_max\n value: 21.0322968243985\n - type: nauc_recall_at_5_std\n value: -6.927749435306422\n - type: ndcg_at_1\n value: 34.717999999999996\n - type: ndcg_at_10\n value: 34.772999999999996\n - type: ndcg_at_100\n value: 39.407\n - type: ndcg_at_1000\n value: 44.830999999999996\n - type: ndcg_at_20\n value: 35.667\n - type: ndcg_at_3\n value: 34.332\n - type: ndcg_at_5\n value: 34.408\n - type: precision_at_1\n value: 34.717999999999996\n - type: precision_at_10\n value: 23.430999999999997\n - type: precision_at_100\n value: 9.31\n - type: precision_at_1000\n value: 2.259\n - type: precision_at_20\n value: 18.826999999999998\n - type: precision_at_3\n value: 30.553\n - type: precision_at_5\n value: 27.792\n - type: recall_at_1\n value: 7.2620000000000005\n - type: recall_at_10\n value: 26.384\n - type: recall_at_100\n value: 52.506\n - type: recall_at_1000\n value: 73.38\n - type: recall_at_20\n value: 34.032000000000004\n - type: recall_at_3\n value: 14.821000000000002\n - type: recall_at_5\n value: 19.481\n task:\n type: Retrieval\n - dataset:\n config: de\n name: MTEB XMarket (de)\n revision: dfe57acff5b62c23732a7b7d3e3fb84ff501708b\n split: test\n type: jinaai/xmarket_ml\n metrics:\n - type: main_score\n value: 28.316000000000003\n - type: map_at_1\n value: 8.667\n - type: map_at_10\n value: 17.351\n - type: map_at_100\n value: 21.02\n - type: map_at_1000\n value: 21.951\n - type: map_at_20\n value: 18.994\n - type: map_at_3\n value: 13.23\n - type: map_at_5\n value: 15.17\n - type: mrr_at_1\n value: 27.27272727272727\n - type: mrr_at_10\n value: 36.10858487561485\n - type: mrr_at_100\n value: 36.92033814316568\n - type: mrr_at_1000\n value: 36.972226653870365\n - type: mrr_at_20\n value: 36.58914906427944\n - type: mrr_at_3\n value: 
33.642969201552305\n - type: mrr_at_5\n value: 35.13417554289494\n - type: nauc_map_at_1000_diff1\n value: 23.345116790998063\n - type: nauc_map_at_1000_max\n value: 44.447240670835725\n - type: nauc_map_at_1000_std\n value: 18.34636500680144\n - type: nauc_map_at_100_diff1\n value: 24.458120909292347\n - type: nauc_map_at_100_max\n value: 43.31851431140378\n - type: nauc_map_at_100_std\n value: 15.654778355549965\n - type: nauc_map_at_10_diff1\n value: 29.376508937265044\n - type: nauc_map_at_10_max\n value: 36.650196725140795\n - type: nauc_map_at_10_std\n value: 4.682465435374843\n - type: nauc_map_at_1_diff1\n value: 40.382365672683214\n - type: nauc_map_at_1_max\n value: 22.894341150096785\n - type: nauc_map_at_1_std\n value: -5.610725673968323\n - type: nauc_map_at_20_diff1\n value: 27.197033425732908\n - type: nauc_map_at_20_max\n value: 39.71672400647207\n - type: nauc_map_at_20_std\n value: 8.944436813309933\n - type: nauc_map_at_3_diff1\n value: 34.49739294661502\n - type: nauc_map_at_3_max\n value: 29.006972420735284\n - type: nauc_map_at_3_std\n value: -3.0372650571243986\n - type: nauc_map_at_5_diff1\n value: 32.764901537277105\n - type: nauc_map_at_5_max\n value: 32.658533295918154\n - type: nauc_map_at_5_std\n value: 0.029626452286996906\n - type: nauc_mrr_at_1000_diff1\n value: 19.521229956280603\n - type: nauc_mrr_at_1000_max\n value: 44.39409866211472\n - type: nauc_mrr_at_1000_std\n value: 23.580697307036058\n - type: nauc_mrr_at_100_diff1\n value: 19.51312676591073\n - type: nauc_mrr_at_100_max\n value: 44.39559153963895\n - type: nauc_mrr_at_100_std\n value: 23.57913711397437\n - type: nauc_mrr_at_10_diff1\n value: 19.584635617935145\n - type: nauc_mrr_at_10_max\n value: 44.44842226236198\n - type: nauc_mrr_at_10_std\n value: 23.382684909390434\n - type: nauc_mrr_at_1_diff1\n value: 20.92594790923806\n - type: nauc_mrr_at_1_max\n value: 40.593939625252816\n - type: nauc_mrr_at_1_std\n value: 20.37467598073644\n - type: nauc_mrr_at_20_diff1\n 
value: 19.590641822115725\n - type: nauc_mrr_at_20_max\n value: 44.42512299604718\n - type: nauc_mrr_at_20_std\n value: 23.45564260800024\n - type: nauc_mrr_at_3_diff1\n value: 20.005307129527232\n - type: nauc_mrr_at_3_max\n value: 43.68300366192776\n - type: nauc_mrr_at_3_std\n value: 22.297190480842005\n - type: nauc_mrr_at_5_diff1\n value: 19.852896386271716\n - type: nauc_mrr_at_5_max\n value: 44.20641808920062\n - type: nauc_mrr_at_5_std\n value: 22.966517330852895\n - type: nauc_ndcg_at_1000_diff1\n value: 17.800116251376103\n - type: nauc_ndcg_at_1000_max\n value: 50.98332718061365\n - type: nauc_ndcg_at_1000_std\n value: 31.464484658102577\n - type: nauc_ndcg_at_100_diff1\n value: 19.555159680541088\n - type: nauc_ndcg_at_100_max\n value: 48.56377130899141\n - type: nauc_ndcg_at_100_std\n value: 25.77572748714817\n - type: nauc_ndcg_at_10_diff1\n value: 20.003008726679415\n - type: nauc_ndcg_at_10_max\n value: 45.1293725480628\n - type: nauc_ndcg_at_10_std\n value: 21.149213260765872\n - type: nauc_ndcg_at_1_diff1\n value: 21.00986278773023\n - type: nauc_ndcg_at_1_max\n value: 40.524637076774894\n - type: nauc_ndcg_at_1_std\n value: 20.29682194006685\n - type: nauc_ndcg_at_20_diff1\n value: 20.659734137312284\n - type: nauc_ndcg_at_20_max\n value: 45.73108736599869\n - type: nauc_ndcg_at_20_std\n value: 21.200736170346133\n - type: nauc_ndcg_at_3_diff1\n value: 19.200120542882544\n - type: nauc_ndcg_at_3_max\n value: 42.89772612963168\n - type: nauc_ndcg_at_3_std\n value: 20.713292754978983\n - type: nauc_ndcg_at_5_diff1\n value: 19.96329647992544\n - type: nauc_ndcg_at_5_max\n value: 44.296627037787324\n - type: nauc_ndcg_at_5_std\n value: 21.200135784971973\n - type: nauc_precision_at_1000_diff1\n value: -11.543221249009427\n - type: nauc_precision_at_1000_max\n value: 9.132801614448221\n - type: nauc_precision_at_1000_std\n value: 21.203720655381055\n - type: nauc_precision_at_100_diff1\n value: -12.510945425786039\n - type: nauc_precision_at_100_max\n 
value: 31.42530963666252\n - type: nauc_precision_at_100_std\n value: 44.99672783467617\n - type: nauc_precision_at_10_diff1\n value: -4.025802651746804\n - type: nauc_precision_at_10_max\n value: 47.50967924227793\n - type: nauc_precision_at_10_std\n value: 41.1558559268985\n - type: nauc_precision_at_1_diff1\n value: 21.00986278773023\n - type: nauc_precision_at_1_max\n value: 40.524637076774894\n - type: nauc_precision_at_1_std\n value: 20.29682194006685\n - type: nauc_precision_at_20_diff1\n value: -8.059482951110002\n - type: nauc_precision_at_20_max\n value: 44.28832115946278\n - type: nauc_precision_at_20_std\n value: 45.2005585353651\n - type: nauc_precision_at_3_diff1\n value: 8.53530005716248\n - type: nauc_precision_at_3_max\n value: 46.48353678905102\n - type: nauc_precision_at_3_std\n value: 28.868791323881972\n - type: nauc_precision_at_5_diff1\n value: 3.093619954821814\n - type: nauc_precision_at_5_max\n value: 48.43294475817019\n - type: nauc_precision_at_5_std\n value: 34.83430452745434\n - type: nauc_recall_at_1000_diff1\n value: 9.93680206699751\n - type: nauc_recall_at_1000_max\n value: 52.97840222394363\n - type: nauc_recall_at_1000_std\n value: 46.370023604436255\n - type: nauc_recall_at_100_diff1\n value: 14.100542445524972\n - type: nauc_recall_at_100_max\n value: 42.853775131475224\n - type: nauc_recall_at_100_std\n value: 26.93029971231028\n - type: nauc_recall_at_10_diff1\n value: 22.774547475714716\n - type: nauc_recall_at_10_max\n value: 33.984586405015044\n - type: nauc_recall_at_10_std\n value: 5.332325172373655\n - type: nauc_recall_at_1_diff1\n value: 40.382365672683214\n - type: nauc_recall_at_1_max\n value: 22.894341150096785\n - type: nauc_recall_at_1_std\n value: -5.610725673968323\n - type: nauc_recall_at_20_diff1\n value: 19.751060483835936\n - type: nauc_recall_at_20_max\n value: 36.18774034635102\n - type: nauc_recall_at_20_std\n value: 10.362242090308577\n - type: nauc_recall_at_3_diff1\n value: 30.29462372902671\n - type: 
nauc_recall_at_3_max\n value: 27.377175450099635\n - type: nauc_recall_at_3_std\n value: -3.015752705993425\n - type: nauc_recall_at_5_diff1\n value: 28.096893312615723\n - type: nauc_recall_at_5_max\n value: 30.485075571512425\n - type: nauc_recall_at_5_std\n value: 0.09106417003502826\n - type: ndcg_at_1\n value: 27.248\n - type: ndcg_at_10\n value: 28.316000000000003\n - type: ndcg_at_100\n value: 33.419\n - type: ndcg_at_1000\n value: 38.134\n - type: ndcg_at_20\n value: 29.707\n - type: ndcg_at_3\n value: 26.93\n - type: ndcg_at_5\n value: 27.363\n - type: precision_at_1\n value: 27.248\n - type: precision_at_10\n value: 15.073\n - type: precision_at_100\n value: 5.061\n - type: precision_at_1000\n value: 1.325\n - type: precision_at_20\n value: 11.407\n - type: precision_at_3\n value: 21.823\n - type: precision_at_5\n value: 18.984\n - type: recall_at_1\n value: 8.667\n - type: recall_at_10\n value: 26.984\n - type: recall_at_100\n value: 49.753\n - type: recall_at_1000\n value: 70.354\n - type: recall_at_20\n value: 33.955999999999996\n - type: recall_at_3\n value: 16.086\n - type: recall_at_5\n value: 20.544999999999998\n task:\n type: Retrieval\n - dataset:\n config: es\n name: MTEB XMarket (es)\n revision: dfe57acff5b62c23732a7b7d3e3fb84ff501708b\n split: test\n type: jinaai/xmarket_ml\n metrics:\n - type: main_score\n value: 26.592\n - type: map_at_1\n value: 8.081000000000001\n - type: map_at_10\n value: 16.486\n - type: map_at_100\n value: 19.996\n - type: map_at_1000\n value: 20.889\n - type: map_at_20\n value: 18.088\n - type: map_at_3\n value: 12.864\n - type: map_at_5\n value: 14.515\n - type: mrr_at_1\n value: 24.643356643356643\n - type: mrr_at_10\n value: 33.755599955599926\n - type: mrr_at_100\n value: 34.55914769326114\n - type: mrr_at_1000\n value: 34.614384237219745\n - type: mrr_at_20\n value: 34.228909650276194\n - type: mrr_at_3\n value: 31.445221445221456\n - type: mrr_at_5\n value: 32.71375291375297\n - type: nauc_map_at_1000_diff1\n 
value: 19.17751654240679\n - type: nauc_map_at_1000_max\n value: 43.493743561136434\n - type: nauc_map_at_1000_std\n value: 21.14477911550252\n - type: nauc_map_at_100_diff1\n value: 20.259227234415395\n - type: nauc_map_at_100_max\n value: 42.510860292169106\n - type: nauc_map_at_100_std\n value: 18.63085160442346\n - type: nauc_map_at_10_diff1\n value: 24.12419385640694\n - type: nauc_map_at_10_max\n value: 35.99892932069915\n - type: nauc_map_at_10_std\n value: 8.488520124325058\n - type: nauc_map_at_1_diff1\n value: 35.09239143996649\n - type: nauc_map_at_1_max\n value: 23.72498533914286\n - type: nauc_map_at_1_std\n value: -4.164387883546102\n - type: nauc_map_at_20_diff1\n value: 22.411418237320817\n - type: nauc_map_at_20_max\n value: 39.12496266094892\n - type: nauc_map_at_20_std\n value: 12.371656353894227\n - type: nauc_map_at_3_diff1\n value: 28.106972376813506\n - type: nauc_map_at_3_max\n value: 29.57824316865409\n - type: nauc_map_at_3_std\n value: 1.8928791254813127\n - type: nauc_map_at_5_diff1\n value: 26.4958239149419\n - type: nauc_map_at_5_max\n value: 32.45906016649239\n - type: nauc_map_at_5_std\n value: 4.612735963224018\n - type: nauc_mrr_at_1000_diff1\n value: 17.614812607094446\n - type: nauc_mrr_at_1000_max\n value: 41.13031556228715\n - type: nauc_mrr_at_1000_std\n value: 22.564112871230318\n - type: nauc_mrr_at_100_diff1\n value: 17.614044568011085\n - type: nauc_mrr_at_100_max\n value: 41.129436273086796\n - type: nauc_mrr_at_100_std\n value: 22.566763500658766\n - type: nauc_mrr_at_10_diff1\n value: 17.61869494452089\n - type: nauc_mrr_at_10_max\n value: 41.091542329381426\n - type: nauc_mrr_at_10_std\n value: 22.370473458633594\n - type: nauc_mrr_at_1_diff1\n value: 20.321421442201913\n - type: nauc_mrr_at_1_max\n value: 38.36531448180009\n - type: nauc_mrr_at_1_std\n value: 18.422203207777688\n - type: nauc_mrr_at_20_diff1\n value: 17.614767736091625\n - type: nauc_mrr_at_20_max\n value: 41.11221420736687\n - type: 
nauc_mrr_at_20_std\n value: 22.44271891522012\n - type: nauc_mrr_at_3_diff1\n value: 17.98184651584625\n - type: nauc_mrr_at_3_max\n value: 40.424293610470144\n - type: nauc_mrr_at_3_std\n value: 21.554750947206706\n - type: nauc_mrr_at_5_diff1\n value: 17.72088314927416\n - type: nauc_mrr_at_5_max\n value: 40.662724739072694\n - type: nauc_mrr_at_5_std\n value: 21.822957528431928\n - type: nauc_ndcg_at_1000_diff1\n value: 15.310699428328398\n - type: nauc_ndcg_at_1000_max\n value: 48.83921393349997\n - type: nauc_ndcg_at_1000_std\n value: 32.22600294110774\n - type: nauc_ndcg_at_100_diff1\n value: 16.62672763977423\n - type: nauc_ndcg_at_100_max\n value: 47.36060653537392\n - type: nauc_ndcg_at_100_std\n value: 27.879865162871575\n - type: nauc_ndcg_at_10_diff1\n value: 16.436684176028116\n - type: nauc_ndcg_at_10_max\n value: 43.00026520872974\n - type: nauc_ndcg_at_10_std\n value: 22.507354939162806\n - type: nauc_ndcg_at_1_diff1\n value: 20.321421442201913\n - type: nauc_ndcg_at_1_max\n value: 38.36531448180009\n - type: nauc_ndcg_at_1_std\n value: 18.422203207777688\n - type: nauc_ndcg_at_20_diff1\n value: 17.127747123248835\n - type: nauc_ndcg_at_20_max\n value: 44.57322943752733\n - type: nauc_ndcg_at_20_std\n value: 23.146541187377036\n - type: nauc_ndcg_at_3_diff1\n value: 16.372742984728514\n - type: nauc_ndcg_at_3_max\n value: 40.91938017883993\n - type: nauc_ndcg_at_3_std\n value: 21.50917089194154\n - type: nauc_ndcg_at_5_diff1\n value: 16.40486505525073\n - type: nauc_ndcg_at_5_max\n value: 41.94597203181329\n - type: nauc_ndcg_at_5_std\n value: 22.068260809047562\n - type: nauc_precision_at_1000_diff1\n value: -15.9415313729527\n - type: nauc_precision_at_1000_max\n value: 12.653329948983643\n - type: nauc_precision_at_1000_std\n value: 26.371820703256173\n - type: nauc_precision_at_100_diff1\n value: -11.851070166675289\n - type: nauc_precision_at_100_max\n value: 32.164365923950115\n - type: nauc_precision_at_100_std\n value: 45.930226426725426\n - 
type: nauc_precision_at_10_diff1\n value: -3.1352660378259163\n - type: nauc_precision_at_10_max\n value: 45.48359878733272\n - type: nauc_precision_at_10_std\n value: 40.2917038044196\n - type: nauc_precision_at_1_diff1\n value: 20.321421442201913\n - type: nauc_precision_at_1_max\n value: 38.36531448180009\n - type: nauc_precision_at_1_std\n value: 18.422203207777688\n - type: nauc_precision_at_20_diff1\n value: -7.087513342144751\n - type: nauc_precision_at_20_max\n value: 43.66272019058357\n - type: nauc_precision_at_20_std\n value: 44.22863351071686\n - type: nauc_precision_at_3_diff1\n value: 7.836185032609045\n - type: nauc_precision_at_3_max\n value: 44.85412904097269\n - type: nauc_precision_at_3_std\n value: 30.209139149500057\n - type: nauc_precision_at_5_diff1\n value: 3.028150537253791\n - type: nauc_precision_at_5_max\n value: 45.73661708882973\n - type: nauc_precision_at_5_std\n value: 34.65500311185052\n - type: nauc_recall_at_1000_diff1\n value: 9.526124668370704\n - type: nauc_recall_at_1000_max\n value: 51.4190208452196\n - type: nauc_recall_at_1000_std\n value: 45.694891695646426\n - type: nauc_recall_at_100_diff1\n value: 12.68466215400009\n - type: nauc_recall_at_100_max\n value: 42.79112054268112\n - type: nauc_recall_at_100_std\n value: 28.61954251400998\n - type: nauc_recall_at_10_diff1\n value: 17.95124413416829\n - type: nauc_recall_at_10_max\n value: 33.1192036755167\n - type: nauc_recall_at_10_std\n value: 9.3588175959525\n - type: nauc_recall_at_1_diff1\n value: 35.09239143996649\n - type: nauc_recall_at_1_max\n value: 23.72498533914286\n - type: nauc_recall_at_1_std\n value: -4.164387883546102\n - type: nauc_recall_at_20_diff1\n value: 16.24916980445646\n - type: nauc_recall_at_20_max\n value: 36.51316122236076\n - type: nauc_recall_at_20_std\n value: 13.641588062425736\n - type: nauc_recall_at_3_diff1\n value: 23.263199724138786\n - type: nauc_recall_at_3_max\n value: 27.67354561610614\n - type: nauc_recall_at_3_std\n value: 
3.103127242654415\n - type: nauc_recall_at_5_diff1\n value: 20.719704839229635\n - type: nauc_recall_at_5_max\n value: 29.66480839111333\n - type: nauc_recall_at_5_std\n value: 5.514884455797986\n - type: ndcg_at_1\n value: 24.643\n - type: ndcg_at_10\n value: 26.592\n - type: ndcg_at_100\n value: 31.887\n - type: ndcg_at_1000\n value: 36.695\n - type: ndcg_at_20\n value: 28.166000000000004\n - type: ndcg_at_3\n value: 25.238\n - type: ndcg_at_5\n value: 25.545\n - type: precision_at_1\n value: 24.643\n - type: precision_at_10\n value: 13.730999999999998\n - type: precision_at_100\n value: 4.744000000000001\n - type: precision_at_1000\n value: 1.167\n - type: precision_at_20\n value: 10.562000000000001\n - type: precision_at_3\n value: 20.288999999999998\n - type: precision_at_5\n value: 17.337\n - type: recall_at_1\n value: 8.081000000000001\n - type: recall_at_10\n value: 25.911\n - type: recall_at_100\n value: 48.176\n - type: recall_at_1000\n value: 69.655\n - type: recall_at_20\n value: 32.924\n - type: recall_at_3\n value: 16.125\n - type: recall_at_5\n value: 19.988\n task:\n type: Retrieval\n - dataset:\n config: deu-deu\n name: MTEB XPQARetrieval (deu-deu)\n revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f\n split: test\n type: jinaai/xpqa\n metrics:\n - type: main_score\n value: 84.552\n - type: map_at_1\n value: 59.023\n - type: map_at_10\n value: 81.051\n - type: map_at_100\n value: 81.539\n - type: map_at_1000\n value: 81.54299999999999\n - type: map_at_20\n value: 81.401\n - type: map_at_3\n value: 76.969\n - type: map_at_5\n value: 80.07600000000001\n - type: mrr_at_1\n value: 77.67624020887729\n - type: mrr_at_10\n value: 83.30509967259314\n - type: mrr_at_100\n value: 83.58599391639456\n - type: mrr_at_1000\n value: 83.58970114722587\n - type: mrr_at_20\n value: 83.50275980440317\n - type: mrr_at_3\n value: 82.07136640557006\n - type: mrr_at_5\n value: 82.94604003481287\n - type: nauc_map_at_1000_diff1\n value: 63.12885104269942\n - type: 
nauc_map_at_1000_max\n value: 57.7017996674959\n - type: nauc_map_at_1000_std\n value: -24.951068985070513\n - type: nauc_map_at_100_diff1\n value: 63.12866509393162\n - type: nauc_map_at_100_max\n value: 57.70176426013332\n - type: nauc_map_at_100_std\n value: -24.96012290790273\n - type: nauc_map_at_10_diff1\n value: 62.847709436211204\n - type: nauc_map_at_10_max\n value: 57.408873624779524\n - type: nauc_map_at_10_std\n value: -25.635130363219062\n - type: nauc_map_at_1_diff1\n value: 71.89683981857102\n - type: nauc_map_at_1_max\n value: 20.204460967432645\n - type: nauc_map_at_1_std\n value: -23.07894656629493\n - type: nauc_map_at_20_diff1\n value: 63.00504457011043\n - type: nauc_map_at_20_max\n value: 57.66009512514262\n - type: nauc_map_at_20_std\n value: -25.100138593754885\n - type: nauc_map_at_3_diff1\n value: 63.199874607788274\n - type: nauc_map_at_3_max\n value: 47.54482033763308\n - type: nauc_map_at_3_std\n value: -27.714557098916963\n - type: nauc_map_at_5_diff1\n value: 63.01006523518669\n - type: nauc_map_at_5_max\n value: 56.501965964288495\n - type: nauc_map_at_5_std\n value: -25.367825762790925\n - type: nauc_mrr_at_1000_diff1\n value: 66.24988063948112\n - type: nauc_mrr_at_1000_max\n value: 63.56921667744273\n - type: nauc_mrr_at_1000_std\n value: -22.073973768031863\n - type: nauc_mrr_at_100_diff1\n value: 66.24919554296275\n - type: nauc_mrr_at_100_max\n value: 63.57382447608361\n - type: nauc_mrr_at_100_std\n value: -22.084627248538187\n - type: nauc_mrr_at_10_diff1\n value: 66.0143885124066\n - type: nauc_mrr_at_10_max\n value: 63.51277586011898\n - type: nauc_mrr_at_10_std\n value: -22.477523960705454\n - type: nauc_mrr_at_1_diff1\n value: 68.25415199323474\n - type: nauc_mrr_at_1_max\n value: 63.069019003272416\n - type: nauc_mrr_at_1_std\n value: -18.77085924093244\n - type: nauc_mrr_at_20_diff1\n value: 66.16203167351055\n - type: nauc_mrr_at_20_max\n value: 63.607477776215845\n - type: nauc_mrr_at_20_std\n value: 
-22.15083176017266\n - type: nauc_mrr_at_3_diff1\n value: 66.39368842782302\n - type: nauc_mrr_at_3_max\n value: 63.11411066585295\n - type: nauc_mrr_at_3_std\n value: -22.63174342814071\n - type: nauc_mrr_at_5_diff1\n value: 66.17932562332354\n - type: nauc_mrr_at_5_max\n value: 63.70434825329594\n - type: nauc_mrr_at_5_std\n value: -21.704012812430438\n - type: nauc_ndcg_at_1000_diff1\n value: 63.958010361549356\n - type: nauc_ndcg_at_1000_max\n value: 60.516445000134624\n - type: nauc_ndcg_at_1000_std\n value: -24.264672248289923\n - type: nauc_ndcg_at_100_diff1\n value: 63.97654644758022\n - type: nauc_ndcg_at_100_max\n value: 60.62187552803407\n - type: nauc_ndcg_at_100_std\n value: -24.317149225778312\n - type: nauc_ndcg_at_10_diff1\n value: 62.505321221321566\n - type: nauc_ndcg_at_10_max\n value: 59.77891112351258\n - type: nauc_ndcg_at_10_std\n value: -26.90910005589911\n - type: nauc_ndcg_at_1_diff1\n value: 68.25415199323474\n - type: nauc_ndcg_at_1_max\n value: 63.069019003272416\n - type: nauc_ndcg_at_1_std\n value: -18.77085924093244\n - type: nauc_ndcg_at_20_diff1\n value: 63.04281805056225\n - type: nauc_ndcg_at_20_max\n value: 60.600957307444226\n - type: nauc_ndcg_at_20_std\n value: -24.954862079889203\n - type: nauc_ndcg_at_3_diff1\n value: 62.970441139740316\n - type: nauc_ndcg_at_3_max\n value: 57.543715669055295\n - type: nauc_ndcg_at_3_std\n value: -25.659388431714703\n - type: nauc_ndcg_at_5_diff1\n value: 62.82652127664541\n - type: nauc_ndcg_at_5_max\n value: 58.6970443258532\n - type: nauc_ndcg_at_5_std\n value: -25.66329354851023\n - type: nauc_precision_at_1000_diff1\n value: -33.38530947486223\n - type: nauc_precision_at_1000_max\n value: 25.972468024345414\n - type: nauc_precision_at_1000_std\n value: 17.460222955117978\n - type: nauc_precision_at_100_diff1\n value: -32.45175999251703\n - type: nauc_precision_at_100_max\n value: 26.367996120487337\n - type: nauc_precision_at_100_std\n value: 17.097957946391208\n - type: 
nauc_precision_at_10_diff1\n value: -26.97411235289487\n - type: nauc_precision_at_10_max\n value: 31.504961687240762\n - type: nauc_precision_at_10_std\n value: 11.125341183874687\n - type: nauc_precision_at_1_diff1\n value: 68.25415199323474\n - type: nauc_precision_at_1_max\n value: 63.069019003272416\n - type: nauc_precision_at_1_std\n value: -18.77085924093244\n - type: nauc_precision_at_20_diff1\n value: -29.8678078736273\n - type: nauc_precision_at_20_max\n value: 29.031222186584504\n - type: nauc_precision_at_20_std\n value: 14.943600563087928\n - type: nauc_precision_at_3_diff1\n value: -15.92947221299854\n - type: nauc_precision_at_3_max\n value: 37.73833494235097\n - type: nauc_precision_at_3_std\n value: 3.1573228443500847\n - type: nauc_precision_at_5_diff1\n value: -22.269156821101642\n - type: nauc_precision_at_5_max\n value: 35.65821838116355\n - type: nauc_precision_at_5_std\n value: 9.265930386198972\n - type: nauc_recall_at_1000_diff1\n value: .nan\n - type: nauc_recall_at_1000_max\n value: .nan\n - type: nauc_recall_at_1000_std\n value: .nan\n - type: nauc_recall_at_100_diff1\n value: 66.17058859539249\n - type: nauc_recall_at_100_max\n value: 78.066942935192\n - type: nauc_recall_at_100_std\n value: -22.213377762074686\n - type: nauc_recall_at_10_diff1\n value: 50.82149700700275\n - type: nauc_recall_at_10_max\n value: 56.68053325008221\n - type: nauc_recall_at_10_std\n value: -41.81657941433277\n - type: nauc_recall_at_1_diff1\n value: 71.89683981857102\n - type: nauc_recall_at_1_max\n value: 20.204460967432645\n - type: nauc_recall_at_1_std\n value: -23.07894656629493\n - type: nauc_recall_at_20_diff1\n value: 48.28076011857885\n - type: nauc_recall_at_20_max\n value: 63.29641555519295\n - type: nauc_recall_at_20_std\n value: -32.953559708819405\n - type: nauc_recall_at_3_diff1\n value: 58.15516956312558\n - type: nauc_recall_at_3_max\n value: 42.66315890283056\n - type: nauc_recall_at_3_std\n value: -32.16572530544806\n - type: 
nauc_recall_at_5_diff1\n value: 55.900844052439766\n - type: nauc_recall_at_5_max\n value: 55.23702018862884\n - type: nauc_recall_at_5_std\n value: -30.105929528165\n - type: ndcg_at_1\n value: 77.676\n - type: ndcg_at_10\n value: 84.552\n - type: ndcg_at_100\n value: 86.232\n - type: ndcg_at_1000\n value: 86.33800000000001\n - type: ndcg_at_20\n value: 85.515\n - type: ndcg_at_3\n value: 81.112\n - type: ndcg_at_5\n value: 82.943\n - type: precision_at_1\n value: 77.676\n - type: precision_at_10\n value: 15.17\n - type: precision_at_100\n value: 1.6230000000000002\n - type: precision_at_1000\n value: 0.163\n - type: precision_at_20\n value: 7.858999999999999\n - type: precision_at_3\n value: 42.994\n - type: precision_at_5\n value: 28.747\n - type: recall_at_1\n value: 59.023\n - type: recall_at_10\n value: 92.465\n - type: recall_at_100\n value: 99.18400000000001\n - type: recall_at_1000\n value: 100.0\n - type: recall_at_20\n value: 95.844\n - type: recall_at_3\n value: 81.826\n - type: recall_at_5\n value: 88.22\n task:\n type: Retrieval\n - dataset:\n config: deu-eng\n name: MTEB XPQARetrieval (deu-eng)\n revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f\n split: test\n type: jinaai/xpqa\n metrics:\n - type: main_score\n value: 82.149\n - type: map_at_1\n value: 56.277\n - type: map_at_10\n value: 78.36999999999999\n - type: map_at_100\n value: 78.94\n - type: map_at_1000\n value: 78.95\n - type: map_at_20\n value: 78.818\n - type: map_at_3\n value: 74.25\n - type: map_at_5\n value: 77.11099999999999\n - type: mrr_at_1\n value: 74.28198433420366\n - type: mrr_at_10\n value: 80.57487877657589\n - type: mrr_at_100\n value: 80.94025764149008\n - type: mrr_at_1000\n value: 80.94608738871234\n - type: mrr_at_20\n value: 80.86240675885023\n - type: mrr_at_3\n value: 79.4604003481288\n - type: mrr_at_5\n value: 80.10008703220191\n - type: nauc_map_at_1000_diff1\n value: 60.44369249057189\n - type: nauc_map_at_1000_max\n value: 49.822240441830246\n - type: 
nauc_map_at_1000_std\n value: -27.34026380762817\n - type: nauc_map_at_100_diff1\n value: 60.44635668050401\n - type: nauc_map_at_100_max\n value: 49.838675926660684\n - type: nauc_map_at_100_std\n value: -27.310365556055583\n - type: nauc_map_at_10_diff1\n value: 60.18546951726522\n - type: nauc_map_at_10_max\n value: 49.72075398096832\n - type: nauc_map_at_10_std\n value: -27.86056102461558\n - type: nauc_map_at_1_diff1\n value: 71.2906657099758\n - type: nauc_map_at_1_max\n value: 18.970399251589\n - type: nauc_map_at_1_std\n value: -27.260776614286602\n - type: nauc_map_at_20_diff1\n value: 60.3525975566164\n - type: nauc_map_at_20_max\n value: 49.852487866710646\n - type: nauc_map_at_20_std\n value: -27.305173830170332\n - type: nauc_map_at_3_diff1\n value: 60.66803500571236\n - type: nauc_map_at_3_max\n value: 41.18191941521972\n - type: nauc_map_at_3_std\n value: -28.71383593401732\n - type: nauc_map_at_5_diff1\n value: 60.57216514504887\n - type: nauc_map_at_5_max\n value: 47.99837400446299\n - type: nauc_map_at_5_std\n value: -28.756183015949986\n - type: nauc_mrr_at_1000_diff1\n value: 63.77031955602516\n - type: nauc_mrr_at_1000_max\n value: 54.26907383811417\n - type: nauc_mrr_at_1000_std\n value: -26.227442087164714\n - type: nauc_mrr_at_100_diff1\n value: 63.77196650108669\n - type: nauc_mrr_at_100_max\n value: 54.281801457913126\n - type: nauc_mrr_at_100_std\n value: -26.216077891830793\n - type: nauc_mrr_at_10_diff1\n value: 63.50095284903051\n - type: nauc_mrr_at_10_max\n value: 54.3186301730016\n - type: nauc_mrr_at_10_std\n value: -26.29570241722173\n - type: nauc_mrr_at_1_diff1\n value: 65.15855770999057\n - type: nauc_mrr_at_1_max\n value: 53.213286738515066\n - type: nauc_mrr_at_1_std\n value: -24.683178252901943\n - type: nauc_mrr_at_20_diff1\n value: 63.74936550280859\n - type: nauc_mrr_at_20_max\n value: 54.355343751439065\n - type: nauc_mrr_at_20_std\n value: -26.197316900009817\n - type: nauc_mrr_at_3_diff1\n value: 63.912612979082695\n - 
type: nauc_mrr_at_3_max\n value: 53.75399024225975\n - type: nauc_mrr_at_3_std\n value: -27.194143264554675\n - type: nauc_mrr_at_5_diff1\n value: 63.72491059053639\n - type: nauc_mrr_at_5_max\n value: 53.66107604019352\n - type: nauc_mrr_at_5_std\n value: -26.92281560584754\n - type: nauc_ndcg_at_1000_diff1\n value: 61.304218998714354\n - type: nauc_ndcg_at_1000_max\n value: 52.409135743660386\n - type: nauc_ndcg_at_1000_std\n value: -26.539796489464056\n - type: nauc_ndcg_at_100_diff1\n value: 61.40355045085304\n - type: nauc_ndcg_at_100_max\n value: 52.79402259608008\n - type: nauc_ndcg_at_100_std\n value: -25.927273456979965\n - type: nauc_ndcg_at_10_diff1\n value: 59.93675608684116\n - type: nauc_ndcg_at_10_max\n value: 52.617848197542706\n - type: nauc_ndcg_at_10_std\n value: -27.314820020095887\n - type: nauc_ndcg_at_1_diff1\n value: 65.15855770999057\n - type: nauc_ndcg_at_1_max\n value: 53.213286738515066\n - type: nauc_ndcg_at_1_std\n value: -24.683178252901943\n - type: nauc_ndcg_at_20_diff1\n value: 60.85093704358376\n - type: nauc_ndcg_at_20_max\n value: 53.14529242671602\n - type: nauc_ndcg_at_20_std\n value: -25.93187916231906\n - type: nauc_ndcg_at_3_diff1\n value: 60.42301123518882\n - type: nauc_ndcg_at_3_max\n value: 49.59021992975956\n - type: nauc_ndcg_at_3_std\n value: -27.397117967810363\n - type: nauc_ndcg_at_5_diff1\n value: 60.78655153154219\n - type: nauc_ndcg_at_5_max\n value: 49.54194799556953\n - type: nauc_ndcg_at_5_std\n value: -29.467910172913413\n - type: nauc_precision_at_1000_diff1\n value: -34.35027108027456\n - type: nauc_precision_at_1000_max\n value: 23.762671066858815\n - type: nauc_precision_at_1000_std\n value: 16.1704780298982\n - type: nauc_precision_at_100_diff1\n value: -32.66610016754961\n - type: nauc_precision_at_100_max\n value: 25.504044603109588\n - type: nauc_precision_at_100_std\n value: 16.932402988816786\n - type: nauc_precision_at_10_diff1\n value: -25.720903145017342\n - type: nauc_precision_at_10_max\n 
value: 30.37029690599926\n - type: nauc_precision_at_10_std\n value: 10.560753160200314\n - type: nauc_precision_at_1_diff1\n value: 65.15855770999057\n - type: nauc_precision_at_1_max\n value: 53.213286738515066\n - type: nauc_precision_at_1_std\n value: -24.683178252901943\n - type: nauc_precision_at_20_diff1\n value: -29.577582332619084\n - type: nauc_precision_at_20_max\n value: 27.984145595920417\n - type: nauc_precision_at_20_std\n value: 15.083711704044727\n - type: nauc_precision_at_3_diff1\n value: -14.736267532892697\n - type: nauc_precision_at_3_max\n value: 36.12211021824307\n - type: nauc_precision_at_3_std\n value: 3.068643876519412\n - type: nauc_precision_at_5_diff1\n value: -19.846707283120825\n - type: nauc_precision_at_5_max\n value: 33.573804532177896\n - type: nauc_precision_at_5_std\n value: 5.700545622744924\n - type: nauc_recall_at_1000_diff1\n value: .nan\n - type: nauc_recall_at_1000_max\n value: .nan\n - type: nauc_recall_at_1000_std\n value: .nan\n - type: nauc_recall_at_100_diff1\n value: 68.24749796604452\n - type: nauc_recall_at_100_max\n value: 83.30024864929815\n - type: nauc_recall_at_100_std\n value: 21.23763053711522\n - type: nauc_recall_at_10_diff1\n value: 50.704049683241436\n - type: nauc_recall_at_10_max\n value: 57.64578984555556\n - type: nauc_recall_at_10_std\n value: -26.632759037746073\n - type: nauc_recall_at_1_diff1\n value: 71.2906657099758\n - type: nauc_recall_at_1_max\n value: 18.970399251589\n - type: nauc_recall_at_1_std\n value: -27.260776614286602\n - type: nauc_recall_at_20_diff1\n value: 54.124480837579505\n - type: nauc_recall_at_20_max\n value: 66.4641515433479\n - type: nauc_recall_at_20_std\n value: -14.615911455379393\n - type: nauc_recall_at_3_diff1\n value: 56.54358788321059\n - type: nauc_recall_at_3_max\n value: 37.765735322465744\n - type: nauc_recall_at_3_std\n value: -30.824147408598574\n - type: nauc_recall_at_5_diff1\n value: 56.392894535029214\n - type: nauc_recall_at_5_max\n value: 
45.959268387521554\n - type: nauc_recall_at_5_std\n value: -33.58175576925282\n - type: ndcg_at_1\n value: 74.28200000000001\n - type: ndcg_at_10\n value: 82.149\n - type: ndcg_at_100\n value: 84.129\n - type: ndcg_at_1000\n value: 84.307\n - type: ndcg_at_20\n value: 83.39999999999999\n - type: ndcg_at_3\n value: 78.583\n - type: ndcg_at_5\n value: 80.13900000000001\n - type: precision_at_1\n value: 74.28200000000001\n - type: precision_at_10\n value: 14.960999999999999\n - type: precision_at_100\n value: 1.6119999999999999\n - type: precision_at_1000\n value: 0.163\n - type: precision_at_20\n value: 7.813000000000001\n - type: precision_at_3\n value: 41.819\n - type: precision_at_5\n value: 27.911\n - type: recall_at_1\n value: 56.277\n - type: recall_at_10\n value: 90.729\n - type: recall_at_100\n value: 98.792\n - type: recall_at_1000\n value: 100.0\n - type: recall_at_20\n value: 95.148\n - type: recall_at_3\n value: 79.989\n - type: recall_at_5\n value: 85.603\n task:\n type: Retrieval\n - dataset:\n config: eng-deu\n name: MTEB XPQARetrieval (eng-deu)\n revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f\n split: test\n type: jinaai/xpqa\n metrics:\n - type: main_score\n value: 60.428000000000004\n - type: map_at_1\n value: 33.453\n - type: map_at_10\n value: 54.217000000000006\n - type: map_at_100\n value: 55.832\n - type: map_at_1000\n value: 55.884\n - type: map_at_20\n value: 55.236\n - type: map_at_3\n value: 48.302\n - type: map_at_5\n value: 51.902\n - type: mrr_at_1\n value: 53.916449086161876\n - type: mrr_at_10\n value: 61.4685647975465\n - type: mrr_at_100\n value: 62.13718159287348\n - type: mrr_at_1000\n value: 62.15799113826325\n - type: mrr_at_20\n value: 61.885388764243544\n - type: mrr_at_3\n value: 59.44299390774582\n - type: mrr_at_5\n value: 60.26544821583981\n - type: nauc_map_at_1000_diff1\n value: 39.824412602121804\n - type: nauc_map_at_1000_max\n value: 39.49332709959374\n - type: nauc_map_at_1000_std\n value: -17.27462623749702\n - 
type: nauc_map_at_100_diff1\n value: 39.80528910003463\n - type: nauc_map_at_100_max\n value: 39.51471609156093\n - type: nauc_map_at_100_std\n value: -17.275536933094937\n - type: nauc_map_at_10_diff1\n value: 39.28558292349772\n - type: nauc_map_at_10_max\n value: 38.13220294838968\n - type: nauc_map_at_10_std\n value: -18.235985574392863\n - type: nauc_map_at_1_diff1\n value: 43.68892397816937\n - type: nauc_map_at_1_max\n value: 14.478978190224353\n - type: nauc_map_at_1_std\n value: -18.435031919225477\n - type: nauc_map_at_20_diff1\n value: 39.8733530971344\n - type: nauc_map_at_20_max\n value: 39.30513202591992\n - type: nauc_map_at_20_std\n value: -17.62362848144766\n - type: nauc_map_at_3_diff1\n value: 40.31116611188815\n - type: nauc_map_at_3_max\n value: 31.107314675202165\n - type: nauc_map_at_3_std\n value: -19.52930881946966\n - type: nauc_map_at_5_diff1\n value: 39.1241499095765\n - type: nauc_map_at_5_max\n value: 37.330543901034055\n - type: nauc_map_at_5_std\n value: -17.893862772447548\n - type: nauc_mrr_at_1000_diff1\n value: 43.07490530140024\n - type: nauc_mrr_at_1000_max\n value: 42.28469195779226\n - type: nauc_mrr_at_1000_std\n value: -15.583217110180737\n - type: nauc_mrr_at_100_diff1\n value: 43.068836494603886\n - type: nauc_mrr_at_100_max\n value: 42.29612450479168\n - type: nauc_mrr_at_100_std\n value: -15.57218089438229\n - type: nauc_mrr_at_10_diff1\n value: 42.88685919151777\n - type: nauc_mrr_at_10_max\n value: 41.89944452003811\n - type: nauc_mrr_at_10_std\n value: -15.909673572763165\n - type: nauc_mrr_at_1_diff1\n value: 45.67646898532131\n - type: nauc_mrr_at_1_max\n value: 43.0541870425035\n - type: nauc_mrr_at_1_std\n value: -15.597124291613563\n - type: nauc_mrr_at_20_diff1\n value: 43.14141873150977\n - type: nauc_mrr_at_20_max\n value: 42.33063543184022\n - type: nauc_mrr_at_20_std\n value: -15.607612016107304\n - type: nauc_mrr_at_3_diff1\n value: 43.18370928261982\n - type: nauc_mrr_at_3_max\n value: 42.18529980773961\n 
- type: nauc_mrr_at_3_std\n value: -15.900151400673629\n - type: nauc_mrr_at_5_diff1\n value: 42.43443044877765\n - type: nauc_mrr_at_5_max\n value: 42.05818605278972\n - type: nauc_mrr_at_5_std\n value: -15.436502733299893\n - type: nauc_ndcg_at_1000_diff1\n value: 40.60606676178781\n - type: nauc_ndcg_at_1000_max\n value: 41.71923393878376\n - type: nauc_ndcg_at_1000_std\n value: -15.694740326899556\n - type: nauc_ndcg_at_100_diff1\n value: 40.15270376312309\n - type: nauc_ndcg_at_100_max\n value: 42.234126305709225\n - type: nauc_ndcg_at_100_std\n value: -15.436051984708952\n - type: nauc_ndcg_at_10_diff1\n value: 39.142259831299455\n - type: nauc_ndcg_at_10_max\n value: 38.61470104273746\n - type: nauc_ndcg_at_10_std\n value: -18.577452829132742\n - type: nauc_ndcg_at_1_diff1\n value: 45.67646898532131\n - type: nauc_ndcg_at_1_max\n value: 43.0541870425035\n - type: nauc_ndcg_at_1_std\n value: -15.597124291613563\n - type: nauc_ndcg_at_20_diff1\n value: 40.805159395901306\n - type: nauc_ndcg_at_20_max\n value: 41.58685629374952\n - type: nauc_ndcg_at_20_std\n value: -16.862408156222592\n - type: nauc_ndcg_at_3_diff1\n value: 39.12028215488432\n - type: nauc_ndcg_at_3_max\n value: 39.70580596343164\n - type: nauc_ndcg_at_3_std\n value: -16.705546903936213\n - type: nauc_ndcg_at_5_diff1\n value: 38.42075404927361\n - type: nauc_ndcg_at_5_max\n value: 38.064219879504385\n - type: nauc_ndcg_at_5_std\n value: -17.20282111665876\n - type: nauc_precision_at_1000_diff1\n value: -4.419224540552891\n - type: nauc_precision_at_1000_max\n value: 35.686022591225246\n - type: nauc_precision_at_1000_std\n value: 15.023520191032972\n - type: nauc_precision_at_100_diff1\n value: -2.9027602601603895\n - type: nauc_precision_at_100_max\n value: 39.99864013028808\n - type: nauc_precision_at_100_std\n value: 13.863497117255525\n - type: nauc_precision_at_10_diff1\n value: 5.539104839809501\n - type: nauc_precision_at_10_max\n value: 42.41625740557432\n - type: 
nauc_precision_at_10_std\n value: 1.0894693748662556\n - type: nauc_precision_at_1_diff1\n value: 45.67646898532131\n - type: nauc_precision_at_1_max\n value: 43.0541870425035\n - type: nauc_precision_at_1_std\n value: -15.597124291613563\n - type: nauc_precision_at_20_diff1\n value: 4.734562571681868\n - type: nauc_precision_at_20_max\n value: 44.35081213316202\n - type: nauc_precision_at_20_std\n value: 6.642891478284595\n - type: nauc_precision_at_3_diff1\n value: 13.936559341472101\n - type: nauc_precision_at_3_max\n value: 45.426668552497524\n - type: nauc_precision_at_3_std\n value: -5.219785419247125\n - type: nauc_precision_at_5_diff1\n value: 8.366706789546015\n - type: nauc_precision_at_5_max\n value: 46.161942989326896\n - type: nauc_precision_at_5_std\n value: -0.193140343545876\n - type: nauc_recall_at_1000_diff1\n value: 45.61785312444842\n - type: nauc_recall_at_1000_max\n value: 75.68258976531774\n - type: nauc_recall_at_1000_std\n value: 37.469059422121575\n - type: nauc_recall_at_100_diff1\n value: 26.798748531805096\n - type: nauc_recall_at_100_max\n value: 54.72134095197765\n - type: nauc_recall_at_100_std\n value: -1.5967608233799417\n - type: nauc_recall_at_10_diff1\n value: 32.13211696200521\n - type: nauc_recall_at_10_max\n value: 31.13866254975895\n - type: nauc_recall_at_10_std\n value: -22.31404161136118\n - type: nauc_recall_at_1_diff1\n value: 43.68892397816937\n - type: nauc_recall_at_1_max\n value: 14.478978190224353\n - type: nauc_recall_at_1_std\n value: -18.435031919225477\n - type: nauc_recall_at_20_diff1\n value: 38.597996930461385\n - type: nauc_recall_at_20_max\n value: 42.49849027366794\n - type: nauc_recall_at_20_std\n value: -16.536471900752154\n - type: nauc_recall_at_3_diff1\n value: 35.343730012759266\n - type: nauc_recall_at_3_max\n value: 26.898722085043392\n - type: nauc_recall_at_3_std\n value: -19.4459792273884\n - type: nauc_recall_at_5_diff1\n value: 31.8310298012186\n - type: nauc_recall_at_5_max\n value: 
32.67800489655844\n - type: nauc_recall_at_5_std\n value: -16.800929103347283\n - type: ndcg_at_1\n value: 53.916\n - type: ndcg_at_10\n value: 60.428000000000004\n - type: ndcg_at_100\n value: 65.95\n - type: ndcg_at_1000\n value: 66.88\n - type: ndcg_at_20\n value: 62.989\n - type: ndcg_at_3\n value: 55.204\n - type: ndcg_at_5\n value: 56.42700000000001\n - type: precision_at_1\n value: 53.916\n - type: precision_at_10\n value: 14.346999999999998\n - type: precision_at_100\n value: 1.849\n - type: precision_at_1000\n value: 0.196\n - type: precision_at_20\n value: 8.022\n - type: precision_at_3\n value: 34.552\n - type: precision_at_5\n value: 24.569\n - type: recall_at_1\n value: 33.453\n - type: recall_at_10\n value: 71.07900000000001\n - type: recall_at_100\n value: 93.207\n - type: recall_at_1000\n value: 99.60799999999999\n - type: recall_at_20\n value: 79.482\n - type: recall_at_3\n value: 53.98\n - type: recall_at_5\n value: 60.781\n task:\n type: Retrieval\n - dataset:\n config: eng-pol\n name: MTEB XPQARetrieval (eng-pol)\n revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f\n split: test\n type: jinaai/xpqa\n metrics:\n - type: main_score\n value: 34.042\n - type: map_at_1\n value: 13.236\n - type: map_at_10\n value: 27.839999999999996\n - type: map_at_100\n value: 30.171999999999997\n - type: map_at_1000\n value: 30.349999999999998\n - type: map_at_20\n value: 29.044999999999998\n - type: map_at_3\n value: 22.58\n - type: map_at_5\n value: 25.83\n - type: mrr_at_1\n value: 30.318471337579616\n - type: mrr_at_10\n value: 37.4983823678091\n - type: mrr_at_100\n value: 38.5784523175009\n - type: mrr_at_1000\n value: 38.63608698968148\n - type: mrr_at_20\n value: 38.02996157871825\n - type: mrr_at_3\n value: 34.798301486199584\n - type: mrr_at_5\n value: 36.39702760084925\n - type: nauc_map_at_1000_diff1\n value: 21.07199789609177\n - type: nauc_map_at_1000_max\n value: 25.959233507893277\n - type: nauc_map_at_1000_std\n value: -28.011925372852826\n - type: 
nauc_map_at_100_diff1\n value: 21.086788412737548\n - type: nauc_map_at_100_max\n value: 25.8611620203686\n - type: nauc_map_at_100_std\n value: -28.179239912057515\n - type: nauc_map_at_10_diff1\n value: 21.23841745922078\n - type: nauc_map_at_10_max\n value: 25.44290342378288\n - type: nauc_map_at_10_std\n value: -28.75578689110275\n - type: nauc_map_at_1_diff1\n value: 28.87454015638211\n - type: nauc_map_at_1_max\n value: 17.50681123879997\n - type: nauc_map_at_1_std\n value: -30.382831850562432\n - type: nauc_map_at_20_diff1\n value: 21.076559713540455\n - type: nauc_map_at_20_max\n value: 25.538154202494535\n - type: nauc_map_at_20_std\n value: -28.518764617658555\n - type: nauc_map_at_3_diff1\n value: 22.159185358766468\n - type: nauc_map_at_3_max\n value: 23.01652660927249\n - type: nauc_map_at_3_std\n value: -29.567722713221862\n - type: nauc_map_at_5_diff1\n value: 21.35578810370897\n - type: nauc_map_at_5_max\n value: 25.550550437767395\n - type: nauc_map_at_5_std\n value: -28.7889035461355\n - type: nauc_mrr_at_1000_diff1\n value: 22.28633009221923\n - type: nauc_mrr_at_1000_max\n value: 26.920205393136392\n - type: nauc_mrr_at_1000_std\n value: -25.887791634977642\n - type: nauc_mrr_at_100_diff1\n value: 22.2754975739755\n - type: nauc_mrr_at_100_max\n value: 26.90235716615346\n - type: nauc_mrr_at_100_std\n value: -25.891596020584345\n - type: nauc_mrr_at_10_diff1\n value: 22.415076305593534\n - type: nauc_mrr_at_10_max\n value: 26.504643796222222\n - type: nauc_mrr_at_10_std\n value: -26.6046081215833\n - type: nauc_mrr_at_1_diff1\n value: 23.406748619244368\n - type: nauc_mrr_at_1_max\n value: 29.058228240823553\n - type: nauc_mrr_at_1_std\n value: -26.450169820901078\n - type: nauc_mrr_at_20_diff1\n value: 22.29233141817678\n - type: nauc_mrr_at_20_max\n value: 26.69021351064081\n - type: nauc_mrr_at_20_std\n value: -26.086596227376656\n - type: nauc_mrr_at_3_diff1\n value: 22.20746187500145\n - type: nauc_mrr_at_3_max\n value: 27.143725946169457\n 
- type: nauc_mrr_at_3_std\n value: -26.7017708594376\n - type: nauc_mrr_at_5_diff1\n value: 22.71898965233195\n - type: nauc_mrr_at_5_max\n value: 26.932386658571662\n - type: nauc_mrr_at_5_std\n value: -26.725541058780234\n - type: nauc_ndcg_at_1000_diff1\n value: 20.541734305148466\n - type: nauc_ndcg_at_1000_max\n value: 27.180534238090758\n - type: nauc_ndcg_at_1000_std\n value: -23.74197745177845\n - type: nauc_ndcg_at_100_diff1\n value: 20.570052839937468\n - type: nauc_ndcg_at_100_max\n value: 26.21605034405486\n - type: nauc_ndcg_at_100_std\n value: -25.359817188805028\n - type: nauc_ndcg_at_10_diff1\n value: 21.241423075073467\n - type: nauc_ndcg_at_10_max\n value: 24.599199195239475\n - type: nauc_ndcg_at_10_std\n value: -28.404540333309008\n - type: nauc_ndcg_at_1_diff1\n value: 23.406748619244368\n - type: nauc_ndcg_at_1_max\n value: 29.058228240823553\n - type: nauc_ndcg_at_1_std\n value: -26.450169820901078\n - type: nauc_ndcg_at_20_diff1\n value: 20.740460046196873\n - type: nauc_ndcg_at_20_max\n value: 24.82380195169634\n - type: nauc_ndcg_at_20_std\n value: -27.376298834244313\n - type: nauc_ndcg_at_3_diff1\n value: 19.994948682426504\n - type: nauc_ndcg_at_3_max\n value: 26.153790759405105\n - type: nauc_ndcg_at_3_std\n value: -27.194548404540885\n - type: nauc_ndcg_at_5_diff1\n value: 21.48414272096384\n - type: nauc_ndcg_at_5_max\n value: 25.239652015076373\n - type: nauc_ndcg_at_5_std\n value: -28.2620160957961\n - type: nauc_precision_at_1000_diff1\n value: -0.7557639926687744\n - type: nauc_precision_at_1000_max\n value: 24.265591636994436\n - type: nauc_precision_at_1000_std\n value: 16.833104654292654\n - type: nauc_precision_at_100_diff1\n value: 4.647847665941115\n - type: nauc_precision_at_100_max\n value: 24.42192644844434\n - type: nauc_precision_at_100_std\n value: 0.2718848568876648\n - type: nauc_precision_at_10_diff1\n value: 9.465969286722654\n - type: nauc_precision_at_10_max\n value: 27.448993150448043\n - type: 
nauc_precision_at_10_std\n value: -16.519099596502212\n - type: nauc_precision_at_1_diff1\n value: 23.406748619244368\n - type: nauc_precision_at_1_max\n value: 29.058228240823553\n - type: nauc_precision_at_1_std\n value: -26.450169820901078\n - type: nauc_precision_at_20_diff1\n value: 8.021421615668114\n - type: nauc_precision_at_20_max\n value: 26.18556481398635\n - type: nauc_precision_at_20_std\n value: -12.207152108668367\n - type: nauc_precision_at_3_diff1\n value: 11.783572803634241\n - type: nauc_precision_at_3_max\n value: 29.259715774978893\n - type: nauc_precision_at_3_std\n value: -20.407524967717425\n - type: nauc_precision_at_5_diff1\n value: 10.371728615220821\n - type: nauc_precision_at_5_max\n value: 30.270642833482864\n - type: nauc_precision_at_5_std\n value: -18.407334880575494\n - type: nauc_recall_at_1000_diff1\n value: 6.008969959111555\n - type: nauc_recall_at_1000_max\n value: 39.79691734058127\n - type: nauc_recall_at_1000_std\n value: 32.43591825510109\n - type: nauc_recall_at_100_diff1\n value: 15.2374566058917\n - type: nauc_recall_at_100_max\n value: 23.058785539503717\n - type: nauc_recall_at_100_std\n value: -15.962888794058165\n - type: nauc_recall_at_10_diff1\n value: 19.46184821807753\n - type: nauc_recall_at_10_max\n value: 19.001003513986866\n - type: nauc_recall_at_10_std\n value: -27.753332786663876\n - type: nauc_recall_at_1_diff1\n value: 28.87454015638211\n - type: nauc_recall_at_1_max\n value: 17.50681123879997\n - type: nauc_recall_at_1_std\n value: -30.382831850562432\n - type: nauc_recall_at_20_diff1\n value: 17.237090858517405\n - type: nauc_recall_at_20_max\n value: 18.42118474134871\n - type: nauc_recall_at_20_std\n value: -24.862787724031957\n - type: nauc_recall_at_3_diff1\n value: 18.813019521758577\n - type: nauc_recall_at_3_max\n value: 19.198572333053544\n - type: nauc_recall_at_3_std\n value: -28.5644958605618\n - type: nauc_recall_at_5_diff1\n value: 20.247501986329482\n - type: nauc_recall_at_5_max\n 
value: 21.121526202170358\n - type: nauc_recall_at_5_std\n value: -27.220378617864853\n - type: ndcg_at_1\n value: 30.318\n - type: ndcg_at_10\n value: 34.042\n - type: ndcg_at_100\n value: 42.733\n - type: ndcg_at_1000\n value: 46.015\n - type: ndcg_at_20\n value: 37.053999999999995\n - type: ndcg_at_3\n value: 29.254\n - type: ndcg_at_5\n value: 30.514000000000003\n - type: precision_at_1\n value: 30.318\n - type: precision_at_10\n value: 10.981\n - type: precision_at_100\n value: 1.889\n - type: precision_at_1000\n value: 0.234\n - type: precision_at_20\n value: 6.643000000000001\n - type: precision_at_3\n value: 22.166\n - type: precision_at_5\n value: 17.477999999999998\n - type: recall_at_1\n value: 13.236\n - type: recall_at_10\n value: 41.461\n - type: recall_at_100\n value: 75.008\n - type: recall_at_1000\n value: 96.775\n - type: recall_at_20\n value: 50.754\n - type: recall_at_3\n value: 26.081\n - type: recall_at_5\n value: 33.168\n task:\n type: Retrieval\n - dataset:\n config: eng-cmn\n name: MTEB XPQARetrieval (eng-cmn)\n revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f\n split: test\n type: jinaai/xpqa\n metrics:\n - type: main_score\n value: 37.504\n - type: map_at_1\n value: 16.019\n - type: map_at_10\n value: 30.794\n - type: map_at_100\n value: 33.157\n - type: map_at_1000\n value: 33.324999999999996\n - type: map_at_20\n value: 32.161\n - type: map_at_3\n value: 25.372\n - type: map_at_5\n value: 28.246\n - type: mrr_at_1\n value: 30.461165048543688\n - type: mrr_at_10\n value: 39.393107566651224\n - type: mrr_at_100\n value: 40.570039540602295\n - type: mrr_at_1000\n value: 40.6306116407744\n - type: mrr_at_20\n value: 40.09428159978876\n - type: mrr_at_3\n value: 37.176375404530745\n - type: mrr_at_5\n value: 38.09870550161812\n - type: nauc_map_at_1000_diff1\n value: 30.82306881892873\n - type: nauc_map_at_1000_max\n value: 5.877636000666466\n - type: nauc_map_at_1000_std\n value: -30.7140513386797\n - type: nauc_map_at_100_diff1\n value: 
30.85192449151961\n - type: nauc_map_at_100_max\n value: 5.809195131550909\n - type: nauc_map_at_100_std\n value: -30.838556702972063\n - type: nauc_map_at_10_diff1\n value: 30.50359163635058\n - type: nauc_map_at_10_max\n value: 6.373491595869303\n - type: nauc_map_at_10_std\n value: -29.89368007827676\n - type: nauc_map_at_1_diff1\n value: 38.60240510083884\n - type: nauc_map_at_1_max\n value: 10.407392664609139\n - type: nauc_map_at_1_std\n value: -17.76327278732833\n - type: nauc_map_at_20_diff1\n value: 30.897489125753598\n - type: nauc_map_at_20_max\n value: 5.9303381898248\n - type: nauc_map_at_20_std\n value: -30.863345188760515\n - type: nauc_map_at_3_diff1\n value: 32.8150951852729\n - type: nauc_map_at_3_max\n value: 7.671931402215177\n - type: nauc_map_at_3_std\n value: -25.654809758216533\n - type: nauc_map_at_5_diff1\n value: 31.19558194781019\n - type: nauc_map_at_5_max\n value: 6.426885613116939\n - type: nauc_map_at_5_std\n value: -28.609027858850016\n - type: nauc_mrr_at_1000_diff1\n value: 30.7596332048733\n - type: nauc_mrr_at_1000_max\n value: 1.1970748115580212\n - type: nauc_mrr_at_1000_std\n value: -34.647570668150216\n - type: nauc_mrr_at_100_diff1\n value: 30.74693370788581\n - type: nauc_mrr_at_100_max\n value: 1.1673272262754841\n - type: nauc_mrr_at_100_std\n value: -34.67761028542745\n - type: nauc_mrr_at_10_diff1\n value: 30.537820575183076\n - type: nauc_mrr_at_10_max\n value: 1.0261868725502707\n - type: nauc_mrr_at_10_std\n value: -34.999990560631204\n - type: nauc_mrr_at_1_diff1\n value: 35.51868580113285\n - type: nauc_mrr_at_1_max\n value: 5.117103773147307\n - type: nauc_mrr_at_1_std\n value: -30.633913466736956\n - type: nauc_mrr_at_20_diff1\n value: 30.67318175430903\n - type: nauc_mrr_at_20_max\n value: 1.0979983974981327\n - type: nauc_mrr_at_20_std\n value: -34.8388339739997\n - type: nauc_mrr_at_3_diff1\n value: 30.884642006045702\n - type: nauc_mrr_at_3_max\n value: 1.7970996544095983\n - type: nauc_mrr_at_3_std\n value: 
-34.290172894906085\n - type: nauc_mrr_at_5_diff1\n value: 30.89687518368571\n - type: nauc_mrr_at_5_max\n value: 1.2123714988495347\n - type: nauc_mrr_at_5_std\n value: -35.01704580471926\n - type: nauc_ndcg_at_1000_diff1\n value: 29.214476799077342\n - type: nauc_ndcg_at_1000_max\n value: 3.6379035546112872\n - type: nauc_ndcg_at_1000_std\n value: -32.35757522049194\n - type: nauc_ndcg_at_100_diff1\n value: 29.130004541376298\n - type: nauc_ndcg_at_100_max\n value: 2.9580589185293045\n - type: nauc_ndcg_at_100_std\n value: -33.26884643871724\n - type: nauc_ndcg_at_10_diff1\n value: 28.521001084366393\n - type: nauc_ndcg_at_10_max\n value: 3.630223957267483\n - type: nauc_ndcg_at_10_std\n value: -33.14524140940815\n - type: nauc_ndcg_at_1_diff1\n value: 35.51868580113285\n - type: nauc_ndcg_at_1_max\n value: 5.117103773147307\n - type: nauc_ndcg_at_1_std\n value: -30.633913466736956\n - type: nauc_ndcg_at_20_diff1\n value: 29.194462756848782\n - type: nauc_ndcg_at_20_max\n value: 2.61162903136461\n - type: nauc_ndcg_at_20_std\n value: -34.59161403211834\n - type: nauc_ndcg_at_3_diff1\n value: 30.183555327135203\n - type: nauc_ndcg_at_3_max\n value: 5.61949040917093\n - type: nauc_ndcg_at_3_std\n value: -30.350117794058175\n - type: nauc_ndcg_at_5_diff1\n value: 29.74420394139971\n - type: nauc_ndcg_at_5_max\n value: 3.952183813937688\n - type: nauc_ndcg_at_5_std\n value: -31.807833795302038\n - type: nauc_precision_at_1000_diff1\n value: -5.467049121617333\n - type: nauc_precision_at_1000_max\n value: -3.993986884198271\n - type: nauc_precision_at_1000_std\n value: -13.703967324212224\n - type: nauc_precision_at_100_diff1\n value: 1.5585428307943647\n - type: nauc_precision_at_100_max\n value: -4.250455723613214\n - type: nauc_precision_at_100_std\n value: -22.294689856776493\n - type: nauc_precision_at_10_diff1\n value: 11.076036917255259\n - type: nauc_precision_at_10_max\n value: -1.5859394644365377\n - type: nauc_precision_at_10_std\n value: 
-34.94912594413202\n - type: nauc_precision_at_1_diff1\n value: 35.51868580113285\n - type: nauc_precision_at_1_max\n value: 5.117103773147307\n - type: nauc_precision_at_1_std\n value: -30.633913466736956\n - type: nauc_precision_at_20_diff1\n value: 9.311484455773828\n - type: nauc_precision_at_20_max\n value: -3.678383428592432\n - type: nauc_precision_at_20_std\n value: -33.700002761401635\n - type: nauc_precision_at_3_diff1\n value: 19.2787260874381\n - type: nauc_precision_at_3_max\n value: 0.18292109396940018\n - type: nauc_precision_at_3_std\n value: -35.23939824276542\n - type: nauc_precision_at_5_diff1\n value: 14.97930592298584\n - type: nauc_precision_at_5_max\n value: -1.63540635880963\n - type: nauc_precision_at_5_std\n value: -35.908283558321315\n - type: nauc_recall_at_1000_diff1\n value: 26.63056473607804\n - type: nauc_recall_at_1000_max\n value: 62.7304558520689\n - type: nauc_recall_at_1000_std\n value: 58.12421701377561\n - type: nauc_recall_at_100_diff1\n value: 21.42127379898579\n - type: nauc_recall_at_100_max\n value: 1.4748203516921914\n - type: nauc_recall_at_100_std\n value: -27.56467339041136\n - type: nauc_recall_at_10_diff1\n value: 21.20479652609812\n - type: nauc_recall_at_10_max\n value: 1.7394881489709888\n - type: nauc_recall_at_10_std\n value: -32.15116902585072\n - type: nauc_recall_at_1_diff1\n value: 38.60240510083884\n - type: nauc_recall_at_1_max\n value: 10.407392664609139\n - type: nauc_recall_at_1_std\n value: -17.76327278732833\n - type: nauc_recall_at_20_diff1\n value: 23.049652721582632\n - type: nauc_recall_at_20_max\n value: -1.7715787106286838\n - type: nauc_recall_at_20_std\n value: -36.14203686002867\n - type: nauc_recall_at_3_diff1\n value: 26.522179829461873\n - type: nauc_recall_at_3_max\n value: 6.078208732431124\n - type: nauc_recall_at_3_std\n value: -25.02625711226274\n - type: nauc_recall_at_5_diff1\n value: 24.19538553561693\n - type: nauc_recall_at_5_max\n value: 2.4963810785503524\n - type: 
nauc_recall_at_5_std\n value: -30.449635496921257\n - type: ndcg_at_1\n value: 30.461\n - type: ndcg_at_10\n value: 37.504\n - type: ndcg_at_100\n value: 46.156000000000006\n - type: ndcg_at_1000\n value: 48.985\n - type: ndcg_at_20\n value: 41.025\n - type: ndcg_at_3\n value: 32.165\n - type: ndcg_at_5\n value: 33.072\n - type: precision_at_1\n value: 30.461\n - type: precision_at_10\n value: 11.032\n - type: precision_at_100\n value: 1.8870000000000002\n - type: precision_at_1000\n value: 0.22499999999999998\n - type: precision_at_20\n value: 6.833\n - type: precision_at_3\n value: 22.532\n - type: precision_at_5\n value: 16.966\n - type: recall_at_1\n value: 16.019\n - type: recall_at_10\n value: 47.557\n - type: recall_at_100\n value: 80.376\n - type: recall_at_1000\n value: 98.904\n - type: recall_at_20\n value: 58.48100000000001\n - type: recall_at_3\n value: 30.682\n - type: recall_at_5\n value: 36.714999999999996\n task:\n type: Retrieval\n - dataset:\n config: eng-spa\n name: MTEB XPQARetrieval (eng-spa)\n revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f\n split: test\n type: jinaai/xpqa\n metrics:\n - type: main_score\n value: 53.359\n - type: map_at_1\n value: 22.892000000000003\n - type: map_at_10\n value: 45.773\n - type: map_at_100\n value: 47.778999999999996\n - type: map_at_1000\n value: 47.882999999999996\n - type: map_at_20\n value: 46.869\n - type: map_at_3\n value: 37.643\n - type: map_at_5\n value: 43.120999999999995\n - type: mrr_at_1\n value: 47.28877679697352\n - type: mrr_at_10\n value: 56.95890630316857\n - type: mrr_at_100\n value: 57.71103367009639\n - type: mrr_at_1000\n value: 57.73661441948852\n - type: mrr_at_20\n value: 57.37701091311334\n - type: mrr_at_3\n value: 54.74989491382929\n - type: mrr_at_5\n value: 56.08659100462372\n - type: nauc_map_at_1000_diff1\n value: 27.8347129954991\n - type: nauc_map_at_1000_max\n value: 38.04300600762859\n - type: nauc_map_at_1000_std\n value: -18.294653328262868\n - type: 
nauc_map_at_100_diff1\n value: 27.818449297770858\n - type: nauc_map_at_100_max\n value: 38.03533462156633\n - type: nauc_map_at_100_std\n value: -18.332989980880644\n - type: nauc_map_at_10_diff1\n value: 27.520664180018358\n - type: nauc_map_at_10_max\n value: 37.67109855753314\n - type: nauc_map_at_10_std\n value: -18.496721673888683\n - type: nauc_map_at_1_diff1\n value: 37.56020148060502\n - type: nauc_map_at_1_max\n value: 10.298394230150745\n - type: nauc_map_at_1_std\n value: -20.41359936101547\n - type: nauc_map_at_20_diff1\n value: 27.615023038189722\n - type: nauc_map_at_20_max\n value: 37.808525116320254\n - type: nauc_map_at_20_std\n value: -18.49235775420803\n - type: nauc_map_at_3_diff1\n value: 30.797347567428424\n - type: nauc_map_at_3_max\n value: 29.374407828869497\n - type: nauc_map_at_3_std\n value: -19.75905772914969\n - type: nauc_map_at_5_diff1\n value: 28.431802888884803\n - type: nauc_map_at_5_max\n value: 35.57723911610521\n - type: nauc_map_at_5_std\n value: -19.093588845366824\n - type: nauc_mrr_at_1000_diff1\n value: 33.263611009054586\n - type: nauc_mrr_at_1000_max\n value: 40.620639901613664\n - type: nauc_mrr_at_1000_std\n value: -17.083016011032036\n - type: nauc_mrr_at_100_diff1\n value: 33.25375012559163\n - type: nauc_mrr_at_100_max\n value: 40.62376205172005\n - type: nauc_mrr_at_100_std\n value: -17.091930575226684\n - type: nauc_mrr_at_10_diff1\n value: 33.05787202690095\n - type: nauc_mrr_at_10_max\n value: 40.4516362611674\n - type: nauc_mrr_at_10_std\n value: -17.088910666499892\n - type: nauc_mrr_at_1_diff1\n value: 36.424151087824555\n - type: nauc_mrr_at_1_max\n value: 40.955715626650445\n - type: nauc_mrr_at_1_std\n value: -16.56636409111209\n - type: nauc_mrr_at_20_diff1\n value: 33.12029456858138\n - type: nauc_mrr_at_20_max\n value: 40.56409347292635\n - type: nauc_mrr_at_20_std\n value: -17.102034817242068\n - type: nauc_mrr_at_3_diff1\n value: 33.52377926814156\n - type: nauc_mrr_at_3_max\n value: 
40.824911575046876\n - type: nauc_mrr_at_3_std\n value: -16.855935748811092\n - type: nauc_mrr_at_5_diff1\n value: 33.08646471768442\n - type: nauc_mrr_at_5_max\n value: 40.59323589955881\n - type: nauc_mrr_at_5_std\n value: -16.77829710500156\n - type: nauc_ndcg_at_1000_diff1\n value: 28.741186244590207\n - type: nauc_ndcg_at_1000_max\n value: 40.0113825410539\n - type: nauc_ndcg_at_1000_std\n value: -17.15655081742458\n - type: nauc_ndcg_at_100_diff1\n value: 28.680521359782972\n - type: nauc_ndcg_at_100_max\n value: 39.94751899984445\n - type: nauc_ndcg_at_100_std\n value: -17.82813814043932\n - type: nauc_ndcg_at_10_diff1\n value: 27.22858072673168\n - type: nauc_ndcg_at_10_max\n value: 38.600188968554725\n - type: nauc_ndcg_at_10_std\n value: -18.517203924893614\n - type: nauc_ndcg_at_1_diff1\n value: 36.424151087824555\n - type: nauc_ndcg_at_1_max\n value: 40.955715626650445\n - type: nauc_ndcg_at_1_std\n value: -16.56636409111209\n - type: nauc_ndcg_at_20_diff1\n value: 27.56875900623774\n - type: nauc_ndcg_at_20_max\n value: 38.95264310199067\n - type: nauc_ndcg_at_20_std\n value: -18.709973965688445\n - type: nauc_ndcg_at_3_diff1\n value: 28.682842749851574\n - type: nauc_ndcg_at_3_max\n value: 38.361215408395964\n - type: nauc_ndcg_at_3_std\n value: -16.800291231827515\n - type: nauc_ndcg_at_5_diff1\n value: 28.178239259093484\n - type: nauc_ndcg_at_5_max\n value: 36.77096292606479\n - type: nauc_ndcg_at_5_std\n value: -18.718861696641145\n - type: nauc_precision_at_1000_diff1\n value: -7.3686253252869305\n - type: nauc_precision_at_1000_max\n value: 31.98896996987639\n - type: nauc_precision_at_1000_std\n value: 13.125659676392267\n - type: nauc_precision_at_100_diff1\n value: -2.8239113056969156\n - type: nauc_precision_at_100_max\n value: 36.95062472971812\n - type: nauc_precision_at_100_std\n value: 7.230228733647562\n - type: nauc_precision_at_10_diff1\n value: 2.5515545798843555\n - type: nauc_precision_at_10_max\n value: 45.46146019314904\n - type: 
nauc_precision_at_10_std\n value: -1.3249340536211553\n - type: nauc_precision_at_1_diff1\n value: 36.424151087824555\n - type: nauc_precision_at_1_max\n value: 40.955715626650445\n - type: nauc_precision_at_1_std\n value: -16.56636409111209\n - type: nauc_precision_at_20_diff1\n value: 0.7202861770489576\n - type: nauc_precision_at_20_max\n value: 41.9937596214609\n - type: nauc_precision_at_20_std\n value: 0.2756400069730064\n - type: nauc_precision_at_3_diff1\n value: 12.89221206929447\n - type: nauc_precision_at_3_max\n value: 48.57775126381142\n - type: nauc_precision_at_3_std\n value: -8.042242254131068\n - type: nauc_precision_at_5_diff1\n value: 7.063616193387763\n - type: nauc_precision_at_5_max\n value: 47.26496887331675\n - type: nauc_precision_at_5_std\n value: -4.735805200913049\n - type: nauc_recall_at_1000_diff1\n value: 2.6650052980682224\n - type: nauc_recall_at_1000_max\n value: 81.94826279951472\n - type: nauc_recall_at_1000_std\n value: 48.46012388224573\n - type: nauc_recall_at_100_diff1\n value: 24.516371948375827\n - type: nauc_recall_at_100_max\n value: 39.17639620389552\n - type: nauc_recall_at_100_std\n value: -17.884197602579533\n - type: nauc_recall_at_10_diff1\n value: 19.93892097640112\n - type: nauc_recall_at_10_max\n value: 33.079079440022106\n - type: nauc_recall_at_10_std\n value: -20.22227622801884\n - type: nauc_recall_at_1_diff1\n value: 37.56020148060502\n - type: nauc_recall_at_1_max\n value: 10.298394230150745\n - type: nauc_recall_at_1_std\n value: -20.41359936101547\n - type: nauc_recall_at_20_diff1\n value: 20.363784035670633\n - type: nauc_recall_at_20_max\n value: 33.39352971625336\n - type: nauc_recall_at_20_std\n value: -21.712050932168875\n - type: nauc_recall_at_3_diff1\n value: 26.220072121604655\n - type: nauc_recall_at_3_max\n value: 25.853218030218507\n - type: nauc_recall_at_3_std\n value: -17.830613372910907\n - type: nauc_recall_at_5_diff1\n value: 22.25850162680252\n - type: nauc_recall_at_5_max\n value: 
30.89620539042785\n - type: nauc_recall_at_5_std\n value: -19.16786434439169\n - type: ndcg_at_1\n value: 47.288999999999994\n - type: ndcg_at_10\n value: 53.359\n - type: ndcg_at_100\n value: 60.25899999999999\n - type: ndcg_at_1000\n value: 61.902\n - type: ndcg_at_20\n value: 56.025000000000006\n - type: ndcg_at_3\n value: 47.221999999999994\n - type: ndcg_at_5\n value: 49.333\n - type: precision_at_1\n value: 47.288999999999994\n - type: precision_at_10\n value: 16.003\n - type: precision_at_100\n value: 2.221\n - type: precision_at_1000\n value: 0.246\n - type: precision_at_20\n value: 8.985\n - type: precision_at_3\n value: 34.510000000000005\n - type: precision_at_5\n value: 26.961000000000002\n - type: recall_at_1\n value: 22.892000000000003\n - type: recall_at_10\n value: 62.928\n - type: recall_at_100\n value: 89.105\n - type: recall_at_1000\n value: 99.319\n - type: recall_at_20\n value: 71.387\n - type: recall_at_3\n value: 43.492999999999995\n - type: recall_at_5\n value: 53.529\n task:\n type: Retrieval\n - dataset:\n config: eng-fra\n name: MTEB XPQARetrieval (eng-fra)\n revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f\n split: test\n type: jinaai/xpqa\n metrics:\n - type: main_score\n value: 54.888000000000005\n - type: map_at_1\n value: 26.079\n - type: map_at_10\n value: 47.434\n - type: map_at_100\n value: 49.376\n - type: map_at_1000\n value: 49.461\n - type: map_at_20\n value: 48.634\n - type: map_at_3\n value: 40.409\n - type: map_at_5\n value: 44.531\n - type: mrr_at_1\n value: 46.86248331108144\n - type: mrr_at_10\n value: 56.45506177548896\n - type: mrr_at_100\n value: 57.20360629445577\n - type: mrr_at_1000\n value: 57.227004696897986\n - type: mrr_at_20\n value: 56.905302765737865\n - type: mrr_at_3\n value: 54.09434801958164\n - type: mrr_at_5\n value: 55.40943480195811\n - type: nauc_map_at_1000_diff1\n value: 37.739936045535885\n - type: nauc_map_at_1000_max\n value: 35.92625003516368\n - type: nauc_map_at_1000_std\n value: 
-15.825119611638398\n - type: nauc_map_at_100_diff1\n value: 37.71697833661983\n - type: nauc_map_at_100_max\n value: 35.91174068136317\n - type: nauc_map_at_100_std\n value: -15.838841891589006\n - type: nauc_map_at_10_diff1\n value: 37.52309268219689\n - type: nauc_map_at_10_max\n value: 35.4887130483351\n - type: nauc_map_at_10_std\n value: -16.61132378136234\n - type: nauc_map_at_1_diff1\n value: 42.705087329207984\n - type: nauc_map_at_1_max\n value: 12.047671550242974\n - type: nauc_map_at_1_std\n value: -17.156030827065834\n - type: nauc_map_at_20_diff1\n value: 37.59446680137666\n - type: nauc_map_at_20_max\n value: 35.80559546695052\n - type: nauc_map_at_20_std\n value: -16.158338316249786\n - type: nauc_map_at_3_diff1\n value: 38.618415267131816\n - type: nauc_map_at_3_max\n value: 27.030227996183925\n - type: nauc_map_at_3_std\n value: -18.962500694157857\n - type: nauc_map_at_5_diff1\n value: 37.980845601534256\n - type: nauc_map_at_5_max\n value: 32.82374761283266\n - type: nauc_map_at_5_std\n value: -17.856875825229565\n - type: nauc_mrr_at_1000_diff1\n value: 40.26059509279346\n - type: nauc_mrr_at_1000_max\n value: 39.28453752990871\n - type: nauc_mrr_at_1000_std\n value: -13.306217279524212\n - type: nauc_mrr_at_100_diff1\n value: 40.23390833398881\n - type: nauc_mrr_at_100_max\n value: 39.26041461025653\n - type: nauc_mrr_at_100_std\n value: -13.317700798873153\n - type: nauc_mrr_at_10_diff1\n value: 40.163737640180145\n - type: nauc_mrr_at_10_max\n value: 39.27138538165913\n - type: nauc_mrr_at_10_std\n value: -13.472971360323038\n - type: nauc_mrr_at_1_diff1\n value: 42.95339241383707\n - type: nauc_mrr_at_1_max\n value: 40.62982307619158\n - type: nauc_mrr_at_1_std\n value: -10.429597045942748\n - type: nauc_mrr_at_20_diff1\n value: 40.23703505923782\n - type: nauc_mrr_at_20_max\n value: 39.27051308063652\n - type: nauc_mrr_at_20_std\n value: -13.390197643922038\n - type: nauc_mrr_at_3_diff1\n value: 40.5721313555661\n - type: 
nauc_mrr_at_3_max\n value: 39.254774354468594\n - type: nauc_mrr_at_3_std\n value: -13.773803807863827\n - type: nauc_mrr_at_5_diff1\n value: 40.41081287079734\n - type: nauc_mrr_at_5_max\n value: 39.515241132077335\n - type: nauc_mrr_at_5_std\n value: -13.306544090087336\n - type: nauc_ndcg_at_1000_diff1\n value: 38.04772268296103\n - type: nauc_ndcg_at_1000_max\n value: 38.03364565521176\n - type: nauc_ndcg_at_1000_std\n value: -14.203182726102263\n - type: nauc_ndcg_at_100_diff1\n value: 37.51752795463643\n - type: nauc_ndcg_at_100_max\n value: 37.809671511710604\n - type: nauc_ndcg_at_100_std\n value: -13.880578225081408\n - type: nauc_ndcg_at_10_diff1\n value: 36.78438984005559\n - type: nauc_ndcg_at_10_max\n value: 36.98105155993232\n - type: nauc_ndcg_at_10_std\n value: -16.886308645939113\n - type: nauc_ndcg_at_1_diff1\n value: 42.95339241383707\n - type: nauc_ndcg_at_1_max\n value: 40.62982307619158\n - type: nauc_ndcg_at_1_std\n value: -10.429597045942748\n - type: nauc_ndcg_at_20_diff1\n value: 36.94164323893683\n - type: nauc_ndcg_at_20_max\n value: 37.333583379288285\n - type: nauc_ndcg_at_20_std\n value: -15.853318071434716\n - type: nauc_ndcg_at_3_diff1\n value: 36.905604845477384\n - type: nauc_ndcg_at_3_max\n value: 35.10252586688781\n - type: nauc_ndcg_at_3_std\n value: -17.128435988977742\n - type: nauc_ndcg_at_5_diff1\n value: 37.96742463612705\n - type: nauc_ndcg_at_5_max\n value: 34.65945109443365\n - type: nauc_ndcg_at_5_std\n value: -17.916428667861183\n - type: nauc_precision_at_1000_diff1\n value: -3.740861894117653\n - type: nauc_precision_at_1000_max\n value: 31.993854396874177\n - type: nauc_precision_at_1000_std\n value: 17.445629474196448\n - type: nauc_precision_at_100_diff1\n value: -0.4825948747911606\n - type: nauc_precision_at_100_max\n value: 35.834638448782954\n - type: nauc_precision_at_100_std\n value: 16.82718796079511\n - type: nauc_precision_at_10_diff1\n value: 8.285949866268147\n - type: nauc_precision_at_10_max\n value: 
45.3292519726866\n - type: nauc_precision_at_10_std\n value: 4.5574850748441555\n - type: nauc_precision_at_1_diff1\n value: 42.95339241383707\n - type: nauc_precision_at_1_max\n value: 40.62982307619158\n - type: nauc_precision_at_1_std\n value: -10.429597045942748\n - type: nauc_precision_at_20_diff1\n value: 4.890590733611442\n - type: nauc_precision_at_20_max\n value: 41.83051757078859\n - type: nauc_precision_at_20_std\n value: 9.197347125630467\n - type: nauc_precision_at_3_diff1\n value: 17.79940075411976\n - type: nauc_precision_at_3_max\n value: 45.224103632426946\n - type: nauc_precision_at_3_std\n value: -5.017203435609909\n - type: nauc_precision_at_5_diff1\n value: 13.548063145911929\n - type: nauc_precision_at_5_max\n value: 46.84837547409909\n - type: nauc_precision_at_5_std\n value: -0.8925939386354484\n - type: nauc_recall_at_1000_diff1\n value: 74.48441717138078\n - type: nauc_recall_at_1000_max\n value: 74.66717137705027\n - type: nauc_recall_at_1000_std\n value: 0.24030117471512125\n - type: nauc_recall_at_100_diff1\n value: 22.553777341988656\n - type: nauc_recall_at_100_max\n value: 31.67861029246527\n - type: nauc_recall_at_100_std\n value: 0.2707450517253687\n - type: nauc_recall_at_10_diff1\n value: 28.490866614443235\n - type: nauc_recall_at_10_max\n value: 31.722970141434352\n - type: nauc_recall_at_10_std\n value: -21.97893365028007\n - type: nauc_recall_at_1_diff1\n value: 42.705087329207984\n - type: nauc_recall_at_1_max\n value: 12.047671550242974\n - type: nauc_recall_at_1_std\n value: -17.156030827065834\n - type: nauc_recall_at_20_diff1\n value: 27.44043454173112\n - type: nauc_recall_at_20_max\n value: 31.454281772040716\n - type: nauc_recall_at_20_std\n value: -20.1735695305415\n - type: nauc_recall_at_3_diff1\n value: 34.08447534706394\n - type: nauc_recall_at_3_max\n value: 21.793973773840865\n - type: nauc_recall_at_3_std\n value: -22.753978372378906\n - type: nauc_recall_at_5_diff1\n value: 33.59686526199479\n - type: 
nauc_recall_at_5_max\n value: 29.188889073761302\n - type: nauc_recall_at_5_std\n value: -21.96156333744562\n - type: ndcg_at_1\n value: 46.861999999999995\n - type: ndcg_at_10\n value: 54.888000000000005\n - type: ndcg_at_100\n value: 61.477000000000004\n - type: ndcg_at_1000\n value: 62.768\n - type: ndcg_at_20\n value: 57.812\n - type: ndcg_at_3\n value: 48.721\n - type: ndcg_at_5\n value: 50.282000000000004\n - type: precision_at_1\n value: 46.861999999999995\n - type: precision_at_10\n value: 15.167\n - type: precision_at_100\n value: 2.072\n - type: precision_at_1000\n value: 0.22499999999999998\n - type: precision_at_20\n value: 8.672\n - type: precision_at_3\n value: 33.066\n - type: precision_at_5\n value: 24.726\n - type: recall_at_1\n value: 26.079\n - type: recall_at_10\n value: 66.095\n - type: recall_at_100\n value: 91.65299999999999\n - type: recall_at_1000\n value: 99.83999999999999\n - type: recall_at_20\n value: 75.28\n - type: recall_at_3\n value: 46.874\n - type: recall_at_5\n value: 55.062\n task:\n type: Retrieval\n - dataset:\n config: pol-eng\n name: MTEB XPQARetrieval (pol-eng)\n revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f\n split: test\n type: jinaai/xpqa\n metrics:\n - type: main_score\n value: 50.831\n - type: map_at_1\n value: 25.549\n - type: map_at_10\n value: 44.432\n - type: map_at_100\n value: 46.431\n - type: map_at_1000\n value: 46.525\n - type: map_at_20\n value: 45.595\n - type: map_at_3\n value: 38.574000000000005\n - type: map_at_5\n value: 42.266999999999996\n - type: mrr_at_1\n value: 43.5006435006435\n - type: mrr_at_10\n value: 51.561255132683684\n - type: mrr_at_100\n value: 52.59912482635216\n - type: mrr_at_1000\n value: 52.631337587043056\n - type: mrr_at_20\n value: 52.23234440063273\n - type: mrr_at_3\n value: 48.97039897039895\n - type: mrr_at_5\n value: 50.31531531531527\n - type: nauc_map_at_1000_diff1\n value: 35.907901295900174\n - type: nauc_map_at_1000_max\n value: 24.573763602041687\n - type: 
nauc_map_at_1000_std\n value: -29.524077960309313\n - type: nauc_map_at_100_diff1\n value: 35.86869121827827\n - type: nauc_map_at_100_max\n value: 24.532343818487494\n - type: nauc_map_at_100_std\n value: -29.613979124488864\n - type: nauc_map_at_10_diff1\n value: 35.90171794022391\n - type: nauc_map_at_10_max\n value: 23.90914892943268\n - type: nauc_map_at_10_std\n value: -30.43698820061533\n - type: nauc_map_at_1_diff1\n value: 50.80313333312038\n - type: nauc_map_at_1_max\n value: 16.649890421888156\n - type: nauc_map_at_1_std\n value: -22.323989416471683\n - type: nauc_map_at_20_diff1\n value: 35.77755470212964\n - type: nauc_map_at_20_max\n value: 24.199895270297034\n - type: nauc_map_at_20_std\n value: -30.223411960170647\n - type: nauc_map_at_3_diff1\n value: 38.964124882315936\n - type: nauc_map_at_3_max\n value: 21.187432510177167\n - type: nauc_map_at_3_std\n value: -28.976663506389887\n - type: nauc_map_at_5_diff1\n value: 36.04644236616672\n - type: nauc_map_at_5_max\n value: 23.501186429317094\n - type: nauc_map_at_5_std\n value: -30.068144596060748\n - type: nauc_mrr_at_1000_diff1\n value: 41.36555452105447\n - type: nauc_mrr_at_1000_max\n value: 26.376799280402867\n - type: nauc_mrr_at_1000_std\n value: -30.008603028757424\n - type: nauc_mrr_at_100_diff1\n value: 41.35523965220727\n - type: nauc_mrr_at_100_max\n value: 26.402612115967706\n - type: nauc_mrr_at_100_std\n value: -29.991754627128024\n - type: nauc_mrr_at_10_diff1\n value: 41.001395127259315\n - type: nauc_mrr_at_10_max\n value: 26.104860505051384\n - type: nauc_mrr_at_10_std\n value: -30.38420449487516\n - type: nauc_mrr_at_1_diff1\n value: 44.882846373248206\n - type: nauc_mrr_at_1_max\n value: 26.61905322890808\n - type: nauc_mrr_at_1_std\n value: -28.724565662206153\n - type: nauc_mrr_at_20_diff1\n value: 41.278009142648834\n - type: nauc_mrr_at_20_max\n value: 26.284565529087295\n - type: nauc_mrr_at_20_std\n value: -30.19549140549242\n - type: nauc_mrr_at_3_diff1\n value: 
41.74663893951077\n - type: nauc_mrr_at_3_max\n value: 26.263048464325884\n - type: nauc_mrr_at_3_std\n value: -30.676733442965688\n - type: nauc_mrr_at_5_diff1\n value: 41.11461477846568\n - type: nauc_mrr_at_5_max\n value: 25.94713927964926\n - type: nauc_mrr_at_5_std\n value: -30.317066480767817\n - type: nauc_ndcg_at_1000_diff1\n value: 36.34161052445199\n - type: nauc_ndcg_at_1000_max\n value: 26.321036033696206\n - type: nauc_ndcg_at_1000_std\n value: -27.59146917115399\n - type: nauc_ndcg_at_100_diff1\n value: 35.66557800007035\n - type: nauc_ndcg_at_100_max\n value: 26.282211208336136\n - type: nauc_ndcg_at_100_std\n value: -27.905634124461333\n - type: nauc_ndcg_at_10_diff1\n value: 35.34872687407275\n - type: nauc_ndcg_at_10_max\n value: 24.018561915792272\n - type: nauc_ndcg_at_10_std\n value: -31.57712772869015\n - type: nauc_ndcg_at_1_diff1\n value: 44.882846373248206\n - type: nauc_ndcg_at_1_max\n value: 26.865602442152554\n - type: nauc_ndcg_at_1_std\n value: -28.509295454329152\n - type: nauc_ndcg_at_20_diff1\n value: 35.46177768045546\n - type: nauc_ndcg_at_20_max\n value: 24.921273675141542\n - type: nauc_ndcg_at_20_std\n value: -30.84348812979793\n - type: nauc_ndcg_at_3_diff1\n value: 36.84688489063923\n - type: nauc_ndcg_at_3_max\n value: 24.088513229463736\n - type: nauc_ndcg_at_3_std\n value: -30.05640995379297\n - type: nauc_ndcg_at_5_diff1\n value: 35.623143276796185\n - type: nauc_ndcg_at_5_max\n value: 23.76654250474061\n - type: nauc_ndcg_at_5_std\n value: -30.87847710074466\n - type: nauc_precision_at_1000_diff1\n value: -16.270532533886932\n - type: nauc_precision_at_1000_max\n value: 17.37365042394671\n - type: nauc_precision_at_1000_std\n value: 16.27166715693082\n - type: nauc_precision_at_100_diff1\n value: -13.175264889436313\n - type: nauc_precision_at_100_max\n value: 19.488571046893963\n - type: nauc_precision_at_100_std\n value: 9.055429698007798\n - type: nauc_precision_at_10_diff1\n value: 0.6806938753592942\n - type: 
nauc_precision_at_10_max\n value: 21.933083960522616\n - type: nauc_precision_at_10_std\n value: -18.2147036942157\n - type: nauc_precision_at_1_diff1\n value: 44.882846373248206\n - type: nauc_precision_at_1_max\n value: 26.865602442152554\n - type: nauc_precision_at_1_std\n value: -28.509295454329152\n - type: nauc_precision_at_20_diff1\n value: -4.318119150162302\n - type: nauc_precision_at_20_max\n value: 21.089702301041687\n - type: nauc_precision_at_20_std\n value: -10.333077681479546\n - type: nauc_precision_at_3_diff1\n value: 11.496076462671107\n - type: nauc_precision_at_3_max\n value: 23.018301549827008\n - type: nauc_precision_at_3_std\n value: -23.98652995416454\n - type: nauc_precision_at_5_diff1\n value: 4.271050668117355\n - type: nauc_precision_at_5_max\n value: 23.61051327966779\n - type: nauc_precision_at_5_std\n value: -21.557618503107847\n - type: nauc_recall_at_1000_diff1\n value: 62.23955911850697\n - type: nauc_recall_at_1000_max\n value: 83.20491723365542\n - type: nauc_recall_at_1000_std\n value: 66.5173462601958\n - type: nauc_recall_at_100_diff1\n value: 20.503778602988177\n - type: nauc_recall_at_100_max\n value: 29.379026288767506\n - type: nauc_recall_at_100_std\n value: -16.139120874540573\n - type: nauc_recall_at_10_diff1\n value: 27.659110249896557\n - type: nauc_recall_at_10_max\n value: 19.69557968026332\n - type: nauc_recall_at_10_std\n value: -33.95657132767551\n - type: nauc_recall_at_1_diff1\n value: 50.80313333312038\n - type: nauc_recall_at_1_max\n value: 16.649890421888156\n - type: nauc_recall_at_1_std\n value: -22.323989416471683\n - type: nauc_recall_at_20_diff1\n value: 27.084453724565176\n - type: nauc_recall_at_20_max\n value: 21.40080632474994\n - type: nauc_recall_at_20_std\n value: -32.83683639340239\n - type: nauc_recall_at_3_diff1\n value: 34.32950941333572\n - type: nauc_recall_at_3_max\n value: 18.55616615958199\n - type: nauc_recall_at_3_std\n value: -30.375983327454076\n - type: nauc_recall_at_5_diff1\n 
value: 29.44516734974564\n - type: nauc_recall_at_5_max\n value: 20.630543534300312\n - type: nauc_recall_at_5_std\n value: -31.30763062499127\n - type: ndcg_at_1\n value: 43.501\n - type: ndcg_at_10\n value: 50.831\n - type: ndcg_at_100\n value: 58.17099999999999\n - type: ndcg_at_1000\n value: 59.705\n - type: ndcg_at_20\n value: 54.047999999999995\n - type: ndcg_at_3\n value: 44.549\n - type: ndcg_at_5\n value: 46.861000000000004\n - type: precision_at_1\n value: 43.501\n - type: precision_at_10\n value: 12.895999999999999\n - type: precision_at_100\n value: 1.9\n - type: precision_at_1000\n value: 0.21\n - type: precision_at_20\n value: 7.593\n - type: precision_at_3\n value: 29.215000000000003\n - type: precision_at_5\n value: 21.57\n - type: recall_at_1\n value: 25.549\n - type: recall_at_10\n value: 61.795\n - type: recall_at_100\n value: 90.019\n - type: recall_at_1000\n value: 99.807\n - type: recall_at_20\n value: 72.096\n - type: recall_at_3\n value: 43.836999999999996\n - type: recall_at_5\n value: 51.714000000000006\n task:\n type: Retrieval\n - dataset:\n config: pol-pol\n name: MTEB XPQARetrieval (pol-pol)\n revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f\n split: test\n type: jinaai/xpqa\n metrics:\n - type: main_score\n value: 53.70399999999999\n - type: map_at_1\n value: 27.739000000000004\n - type: map_at_10\n value: 47.469\n - type: map_at_100\n value: 49.392\n - type: map_at_1000\n value: 49.483\n - type: map_at_20\n value: 48.646\n - type: map_at_3\n value: 41.467\n - type: map_at_5\n value: 45.467\n - type: mrr_at_1\n value: 47.00636942675159\n - type: mrr_at_10\n value: 54.63699322616519\n - type: mrr_at_100\n value: 55.54525182833755\n - type: mrr_at_1000\n value: 55.581331515356155\n - type: mrr_at_20\n value: 55.22918377451415\n - type: mrr_at_3\n value: 52.03821656050952\n - type: mrr_at_5\n value: 53.38216560509549\n - type: nauc_map_at_1000_diff1\n value: 45.03530825034854\n - type: nauc_map_at_1000_max\n value: 34.22740272603397\n 
- type: nauc_map_at_1000_std\n value: -30.428880484199244\n - type: nauc_map_at_100_diff1\n value: 44.978704455592805\n - type: nauc_map_at_100_max\n value: 34.20908357964765\n - type: nauc_map_at_100_std\n value: -30.47325365059666\n - type: nauc_map_at_10_diff1\n value: 44.9560579177672\n - type: nauc_map_at_10_max\n value: 33.70097588985278\n - type: nauc_map_at_10_std\n value: -31.205563222357885\n - type: nauc_map_at_1_diff1\n value: 57.94711780881773\n - type: nauc_map_at_1_max\n value: 21.60278071836319\n - type: nauc_map_at_1_std\n value: -23.273741268035923\n - type: nauc_map_at_20_diff1\n value: 44.97859054699532\n - type: nauc_map_at_20_max\n value: 34.153729150181846\n - type: nauc_map_at_20_std\n value: -30.97482545902907\n - type: nauc_map_at_3_diff1\n value: 47.52016138686765\n - type: nauc_map_at_3_max\n value: 30.176197065298417\n - type: nauc_map_at_3_std\n value: -29.90628984041898\n - type: nauc_map_at_5_diff1\n value: 45.36581638257985\n - type: nauc_map_at_5_max\n value: 33.697200263698036\n - type: nauc_map_at_5_std\n value: -31.165331120088453\n - type: nauc_mrr_at_1000_diff1\n value: 53.32889526818364\n - type: nauc_mrr_at_1000_max\n value: 36.104118340589736\n - type: nauc_mrr_at_1000_std\n value: -31.321132494516984\n - type: nauc_mrr_at_100_diff1\n value: 53.30695875258367\n - type: nauc_mrr_at_100_max\n value: 36.114890079024455\n - type: nauc_mrr_at_100_std\n value: -31.291749322117447\n - type: nauc_mrr_at_10_diff1\n value: 53.189084772141435\n - type: nauc_mrr_at_10_max\n value: 35.939061062282484\n - type: nauc_mrr_at_10_std\n value: -31.502185884653645\n - type: nauc_mrr_at_1_diff1\n value: 56.89368291041337\n - type: nauc_mrr_at_1_max\n value: 36.07581125496313\n - type: nauc_mrr_at_1_std\n value: -29.703764232519475\n - type: nauc_mrr_at_20_diff1\n value: 53.23955737199497\n - type: nauc_mrr_at_20_max\n value: 36.068824838215676\n - type: nauc_mrr_at_20_std\n value: -31.420039428197594\n - type: nauc_mrr_at_3_diff1\n value: 
53.74385074861207\n - type: nauc_mrr_at_3_max\n value: 35.57054587735015\n - type: nauc_mrr_at_3_std\n value: -32.356894834537684\n - type: nauc_mrr_at_5_diff1\n value: 53.66669556981826\n - type: nauc_mrr_at_5_max\n value: 36.02102289605049\n - type: nauc_mrr_at_5_std\n value: -32.030437067359124\n - type: nauc_ndcg_at_1000_diff1\n value: 46.34900536768847\n - type: nauc_ndcg_at_1000_max\n value: 35.6314995837715\n - type: nauc_ndcg_at_1000_std\n value: -28.965103958822624\n - type: nauc_ndcg_at_100_diff1\n value: 45.1587893788861\n - type: nauc_ndcg_at_100_max\n value: 35.62430753595297\n - type: nauc_ndcg_at_100_std\n value: -28.77303405812772\n - type: nauc_ndcg_at_10_diff1\n value: 44.928781590765965\n - type: nauc_ndcg_at_10_max\n value: 34.315200006430366\n - type: nauc_ndcg_at_10_std\n value: -32.05164097076614\n - type: nauc_ndcg_at_1_diff1\n value: 57.228262350455125\n - type: nauc_ndcg_at_1_max\n value: 35.645285703387366\n - type: nauc_ndcg_at_1_std\n value: -29.893553821348718\n - type: nauc_ndcg_at_20_diff1\n value: 44.959903633039865\n - type: nauc_ndcg_at_20_max\n value: 35.493022926282755\n - type: nauc_ndcg_at_20_std\n value: -31.54989291850644\n - type: nauc_ndcg_at_3_diff1\n value: 46.65266185996905\n - type: nauc_ndcg_at_3_max\n value: 33.74458119579594\n - type: nauc_ndcg_at_3_std\n value: -31.493683304534176\n - type: nauc_ndcg_at_5_diff1\n value: 46.08707037187612\n - type: nauc_ndcg_at_5_max\n value: 34.7401426055243\n - type: nauc_ndcg_at_5_std\n value: -32.44390676345172\n - type: nauc_precision_at_1000_diff1\n value: -12.11355300492561\n - type: nauc_precision_at_1000_max\n value: 14.490738062121233\n - type: nauc_precision_at_1000_std\n value: 14.448811005059097\n - type: nauc_precision_at_100_diff1\n value: -9.742085657181239\n - type: nauc_precision_at_100_max\n value: 18.030305489251223\n - type: nauc_precision_at_100_std\n value: 8.213089709529765\n - type: nauc_precision_at_10_diff1\n value: 5.153466672774969\n - type: 
nauc_precision_at_10_max\n value: 27.29412644661678\n - type: nauc_precision_at_10_std\n value: -15.505053884112355\n - type: nauc_precision_at_1_diff1\n value: 57.228262350455125\n - type: nauc_precision_at_1_max\n value: 35.645285703387366\n - type: nauc_precision_at_1_std\n value: -29.893553821348718\n - type: nauc_precision_at_20_diff1\n value: -0.6812430761066635\n - type: nauc_precision_at_20_max\n value: 25.81911286466295\n - type: nauc_precision_at_20_std\n value: -8.388506222482595\n - type: nauc_precision_at_3_diff1\n value: 18.263873866510576\n - type: nauc_precision_at_3_max\n value: 30.879576105862345\n - type: nauc_precision_at_3_std\n value: -24.0342929870108\n - type: nauc_precision_at_5_diff1\n value: 10.9905804265327\n - type: nauc_precision_at_5_max\n value: 30.88468087429045\n - type: nauc_precision_at_5_std\n value: -20.458684056213507\n - type: nauc_recall_at_1000_diff1\n value: -64.887668417171\n - type: nauc_recall_at_1000_max\n value: 52.25501730358092\n - type: nauc_recall_at_1000_std\n value: 85.13647916200132\n - type: nauc_recall_at_100_diff1\n value: 18.956777346127655\n - type: nauc_recall_at_100_max\n value: 36.10473493564588\n - type: nauc_recall_at_100_std\n value: -10.007474558899949\n - type: nauc_recall_at_10_diff1\n value: 33.810344497568046\n - type: nauc_recall_at_10_max\n value: 31.395430183214245\n - type: nauc_recall_at_10_std\n value: -33.12920524433795\n - type: nauc_recall_at_1_diff1\n value: 57.94711780881773\n - type: nauc_recall_at_1_max\n value: 21.60278071836319\n - type: nauc_recall_at_1_std\n value: -23.273741268035923\n - type: nauc_recall_at_20_diff1\n value: 31.449657437065397\n - type: nauc_recall_at_20_max\n value: 34.519574934321945\n - type: nauc_recall_at_20_std\n value: -33.43406862055647\n - type: nauc_recall_at_3_diff1\n value: 42.07841848382365\n - type: nauc_recall_at_3_max\n value: 28.7648772833266\n - type: nauc_recall_at_3_std\n value: -31.56367736320086\n - type: nauc_recall_at_5_diff1\n value: 
39.21392858246301\n - type: nauc_recall_at_5_max\n value: 34.28338202081927\n - type: nauc_recall_at_5_std\n value: -33.725680523721906\n - type: ndcg_at_1\n value: 46.879\n - type: ndcg_at_10\n value: 53.70399999999999\n - type: ndcg_at_100\n value: 60.532\n - type: ndcg_at_1000\n value: 61.997\n - type: ndcg_at_20\n value: 56.818999999999996\n - type: ndcg_at_3\n value: 47.441\n - type: ndcg_at_5\n value: 49.936\n - type: precision_at_1\n value: 46.879\n - type: precision_at_10\n value: 13.376\n - type: precision_at_100\n value: 1.8980000000000001\n - type: precision_at_1000\n value: 0.208\n - type: precision_at_20\n value: 7.771\n - type: precision_at_3\n value: 30.658\n - type: precision_at_5\n value: 22.828\n - type: recall_at_1\n value: 27.739000000000004\n - type: recall_at_10\n value: 64.197\n - type: recall_at_100\n value: 90.54100000000001\n - type: recall_at_1000\n value: 99.90400000000001\n - type: recall_at_20\n value: 74.178\n - type: recall_at_3\n value: 46.312\n - type: recall_at_5\n value: 54.581999999999994\n task:\n type: Retrieval\n - dataset:\n config: cmn-eng\n name: MTEB XPQARetrieval (cmn-eng)\n revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f\n split: test\n type: jinaai/xpqa\n metrics:\n - type: main_score\n value: 64.64\n - type: map_at_1\n value: 35.858000000000004\n - type: map_at_10\n value: 58.547000000000004\n - type: map_at_100\n value: 60.108\n - type: map_at_1000\n value: 60.153999999999996\n - type: map_at_20\n value: 59.528000000000006\n - type: map_at_3\n value: 51.578\n - type: map_at_5\n value: 56.206999999999994\n - type: mrr_at_1\n value: 56.95121951219512\n - type: mrr_at_10\n value: 64.93975029036001\n - type: mrr_at_100\n value: 65.63357055718294\n - type: mrr_at_1000\n value: 65.64844109026834\n - type: mrr_at_20\n value: 65.41280668715439\n - type: mrr_at_3\n value: 62.68292682926826\n - type: mrr_at_5\n value: 64.1585365853658\n - type: nauc_map_at_1000_diff1\n value: 45.82740870907091\n - type: 
nauc_map_at_1000_max\n value: 21.9696540066807\n - type: nauc_map_at_1000_std\n value: -32.028262356639495\n - type: nauc_map_at_100_diff1\n value: 45.802053117616396\n - type: nauc_map_at_100_max\n value: 21.946002070290966\n - type: nauc_map_at_100_std\n value: -32.06190418866229\n - type: nauc_map_at_10_diff1\n value: 46.017774155748945\n - type: nauc_map_at_10_max\n value: 21.876909086095544\n - type: nauc_map_at_10_std\n value: -32.13913568843985\n - type: nauc_map_at_1_diff1\n value: 56.34671160956164\n - type: nauc_map_at_1_max\n value: 17.6796949796236\n - type: nauc_map_at_1_std\n value: -13.741140688066045\n - type: nauc_map_at_20_diff1\n value: 46.027469176858716\n - type: nauc_map_at_20_max\n value: 21.80738432042703\n - type: nauc_map_at_20_std\n value: -32.430379634015395\n - type: nauc_map_at_3_diff1\n value: 48.40096725254027\n - type: nauc_map_at_3_max\n value: 21.15442803574233\n - type: nauc_map_at_3_std\n value: -26.205850292181417\n - type: nauc_map_at_5_diff1\n value: 45.77800041356389\n - type: nauc_map_at_5_max\n value: 22.11718771798752\n - type: nauc_map_at_5_std\n value: -30.32876338031471\n - type: nauc_mrr_at_1000_diff1\n value: 49.748274798877944\n - type: nauc_mrr_at_1000_max\n value: 24.547774167219906\n - type: nauc_mrr_at_1000_std\n value: -32.728447209433504\n - type: nauc_mrr_at_100_diff1\n value: 49.734549290377856\n - type: nauc_mrr_at_100_max\n value: 24.536933315055222\n - type: nauc_mrr_at_100_std\n value: -32.74076335880697\n - type: nauc_mrr_at_10_diff1\n value: 49.82827711456392\n - type: nauc_mrr_at_10_max\n value: 24.536773657485075\n - type: nauc_mrr_at_10_std\n value: -33.05707547166962\n - type: nauc_mrr_at_1_diff1\n value: 51.954289992321044\n - type: nauc_mrr_at_1_max\n value: 26.336255074856886\n - type: nauc_mrr_at_1_std\n value: -29.042962019692446\n - type: nauc_mrr_at_20_diff1\n value: 49.70938465628863\n - type: nauc_mrr_at_20_max\n value: 24.433219849576947\n - type: nauc_mrr_at_20_std\n value: 
-32.94123791846049\n - type: nauc_mrr_at_3_diff1\n value: 50.289486880347134\n - type: nauc_mrr_at_3_max\n value: 24.978796972860142\n - type: nauc_mrr_at_3_std\n value: -32.11305594784892\n - type: nauc_mrr_at_5_diff1\n value: 49.95013396316144\n - type: nauc_mrr_at_5_max\n value: 24.514452761198303\n - type: nauc_mrr_at_5_std\n value: -32.865859962984146\n - type: nauc_ndcg_at_1000_diff1\n value: 45.73806489233998\n - type: nauc_ndcg_at_1000_max\n value: 22.404941391043867\n - type: nauc_ndcg_at_1000_std\n value: -33.063445720849685\n - type: nauc_ndcg_at_100_diff1\n value: 45.1046206923062\n - type: nauc_ndcg_at_100_max\n value: 22.081133719684658\n - type: nauc_ndcg_at_100_std\n value: -33.299291459450146\n - type: nauc_ndcg_at_10_diff1\n value: 46.140608688357496\n - type: nauc_ndcg_at_10_max\n value: 21.442489279388916\n - type: nauc_ndcg_at_10_std\n value: -35.115870342856006\n - type: nauc_ndcg_at_1_diff1\n value: 51.954289992321044\n - type: nauc_ndcg_at_1_max\n value: 26.336255074856886\n - type: nauc_ndcg_at_1_std\n value: -29.042962019692446\n - type: nauc_ndcg_at_20_diff1\n value: 45.966784725457046\n - type: nauc_ndcg_at_20_max\n value: 21.166632858613145\n - type: nauc_ndcg_at_20_std\n value: -35.65112890375392\n - type: nauc_ndcg_at_3_diff1\n value: 46.7404863978999\n - type: nauc_ndcg_at_3_max\n value: 22.701743709129456\n - type: nauc_ndcg_at_3_std\n value: -30.907633466983192\n - type: nauc_ndcg_at_5_diff1\n value: 45.86487199083486\n - type: nauc_ndcg_at_5_max\n value: 22.088804840002513\n - type: nauc_ndcg_at_5_std\n value: -32.3853481632832\n - type: nauc_precision_at_1000_diff1\n value: -25.69710612774455\n - type: nauc_precision_at_1000_max\n value: 1.3964400247388091\n - type: nauc_precision_at_1000_std\n value: -8.873947511634814\n - type: nauc_precision_at_100_diff1\n value: -24.013497191077978\n - type: nauc_precision_at_100_max\n value: 2.0197725715909343\n - type: nauc_precision_at_100_std\n value: -11.387423148770633\n - type: 
nauc_precision_at_10_diff1\n value: -6.47728645242781\n - type: nauc_precision_at_10_max\n value: 6.815261443768304\n - type: nauc_precision_at_10_std\n value: -26.825062292855943\n - type: nauc_precision_at_1_diff1\n value: 51.954289992321044\n - type: nauc_precision_at_1_max\n value: 26.336255074856886\n - type: nauc_precision_at_1_std\n value: -29.042962019692446\n - type: nauc_precision_at_20_diff1\n value: -12.355232044747511\n - type: nauc_precision_at_20_max\n value: 4.022126850949725\n - type: nauc_precision_at_20_std\n value: -23.688935769326772\n - type: nauc_precision_at_3_diff1\n value: 7.662671665835864\n - type: nauc_precision_at_3_max\n value: 14.372394760986248\n - type: nauc_precision_at_3_std\n value: -28.635125665532453\n - type: nauc_precision_at_5_diff1\n value: -1.4592476425511611\n - type: nauc_precision_at_5_max\n value: 11.124310161474174\n - type: nauc_precision_at_5_std\n value: -27.89526669318053\n - type: nauc_recall_at_1000_diff1\n value: -19.58450046684932\n - type: nauc_recall_at_1000_max\n value: 70.71661998133165\n - type: nauc_recall_at_1000_std\n value: 93.05555555556315\n - type: nauc_recall_at_100_diff1\n value: 15.06356457571853\n - type: nauc_recall_at_100_max\n value: 14.051414749344806\n - type: nauc_recall_at_100_std\n value: -29.461874235153008\n - type: nauc_recall_at_10_diff1\n value: 41.29842726117901\n - type: nauc_recall_at_10_max\n value: 15.768699673830898\n - type: nauc_recall_at_10_std\n value: -42.11585661287712\n - type: nauc_recall_at_1_diff1\n value: 56.34671160956164\n - type: nauc_recall_at_1_max\n value: 17.6796949796236\n - type: nauc_recall_at_1_std\n value: -13.741140688066045\n - type: nauc_recall_at_20_diff1\n value: 38.8078283585263\n - type: nauc_recall_at_20_max\n value: 12.06816084005326\n - type: nauc_recall_at_20_std\n value: -48.20956170056591\n - type: nauc_recall_at_3_diff1\n value: 44.71028758038993\n - type: nauc_recall_at_3_max\n value: 19.1059093689162\n - type: nauc_recall_at_3_std\n 
value: -26.795164453784253\n - type: nauc_recall_at_5_diff1\n value: 41.06320797773054\n - type: nauc_recall_at_5_max\n value: 19.117028272530998\n - type: nauc_recall_at_5_std\n value: -33.985747504612156\n - type: ndcg_at_1\n value: 56.95099999999999\n - type: ndcg_at_10\n value: 64.64\n - type: ndcg_at_100\n value: 70.017\n - type: ndcg_at_1000\n value: 70.662\n - type: ndcg_at_20\n value: 67.256\n - type: ndcg_at_3\n value: 58.269000000000005\n - type: ndcg_at_5\n value: 60.94199999999999\n - type: precision_at_1\n value: 56.95099999999999\n - type: precision_at_10\n value: 15.671\n - type: precision_at_100\n value: 2.002\n - type: precision_at_1000\n value: 0.208\n - type: precision_at_20\n value: 8.689\n - type: precision_at_3\n value: 36.341\n - type: precision_at_5\n value: 26.854\n - type: recall_at_1\n value: 35.858000000000004\n - type: recall_at_10\n value: 75.02\n - type: recall_at_100\n value: 95.76\n - type: recall_at_1000\n value: 99.837\n - type: recall_at_20\n value: 83.732\n - type: recall_at_3\n value: 57.093\n - type: recall_at_5\n value: 66.193\n task:\n type: Retrieval\n - dataset:\n config: cmn-cmn\n name: MTEB XPQARetrieval (cmn-cmn)\n revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f\n split: test\n type: jinaai/xpqa\n metrics:\n - type: main_score\n value: 69.446\n - type: map_at_1\n value: 39.995999999999995\n - type: map_at_10\n value: 64.033\n - type: map_at_100\n value: 65.51599999999999\n - type: map_at_1000\n value: 65.545\n - type: map_at_20\n value: 64.958\n - type: map_at_3\n value: 57.767\n - type: map_at_5\n value: 61.998\n - type: mrr_at_1\n value: 63.3495145631068\n - type: mrr_at_10\n value: 70.21146363075978\n - type: mrr_at_100\n value: 70.82810974202124\n - type: mrr_at_1000\n value: 70.83816803303915\n - type: mrr_at_20\n value: 70.60140248428802\n - type: mrr_at_3\n value: 68.66909385113267\n - type: mrr_at_5\n value: 69.56108414239482\n - type: nauc_map_at_1000_diff1\n value: 51.649897072831465\n - type: 
nauc_map_at_1000_max\n value: 38.25222728655331\n - type: nauc_map_at_1000_std\n value: -39.10327919949334\n - type: nauc_map_at_100_diff1\n value: 51.644205886401465\n - type: nauc_map_at_100_max\n value: 38.23611154355255\n - type: nauc_map_at_100_std\n value: -39.1677073977285\n - type: nauc_map_at_10_diff1\n value: 51.81444145636039\n - type: nauc_map_at_10_max\n value: 38.03382104326485\n - type: nauc_map_at_10_std\n value: -38.999395639812015\n - type: nauc_map_at_1_diff1\n value: 59.785298201044704\n - type: nauc_map_at_1_max\n value: 23.273537759937785\n - type: nauc_map_at_1_std\n value: -17.838712689290194\n - type: nauc_map_at_20_diff1\n value: 51.680208795601004\n - type: nauc_map_at_20_max\n value: 38.23334583518634\n - type: nauc_map_at_20_std\n value: -39.24344495939061\n - type: nauc_map_at_3_diff1\n value: 52.180913298194056\n - type: nauc_map_at_3_max\n value: 33.45482478000481\n - type: nauc_map_at_3_std\n value: -31.682911030586297\n - type: nauc_map_at_5_diff1\n value: 50.804900676175436\n - type: nauc_map_at_5_max\n value: 37.68924816012326\n - type: nauc_map_at_5_std\n value: -36.85016896616712\n - type: nauc_mrr_at_1000_diff1\n value: 56.371477471577535\n - type: nauc_mrr_at_1000_max\n value: 42.773877962050086\n - type: nauc_mrr_at_1000_std\n value: -40.41765081873682\n - type: nauc_mrr_at_100_diff1\n value: 56.3619751528192\n - type: nauc_mrr_at_100_max\n value: 42.76298794859916\n - type: nauc_mrr_at_100_std\n value: -40.44070582448831\n - type: nauc_mrr_at_10_diff1\n value: 56.33810523477712\n - type: nauc_mrr_at_10_max\n value: 42.76591937795783\n - type: nauc_mrr_at_10_std\n value: -40.69339583030244\n - type: nauc_mrr_at_1_diff1\n value: 58.90399906884378\n - type: nauc_mrr_at_1_max\n value: 43.38806571165292\n - type: nauc_mrr_at_1_std\n value: -38.224015285584\n - type: nauc_mrr_at_20_diff1\n value: 56.32629070537032\n - type: nauc_mrr_at_20_max\n value: 42.79615263472604\n - type: nauc_mrr_at_20_std\n value: -40.496777397603076\n - 
type: nauc_mrr_at_3_diff1\n value: 55.96989454480743\n - type: nauc_mrr_at_3_max\n value: 42.49832220744744\n - type: nauc_mrr_at_3_std\n value: -39.883799467132384\n - type: nauc_mrr_at_5_diff1\n value: 56.003080766475755\n - type: nauc_mrr_at_5_max\n value: 42.73308051011805\n - type: nauc_mrr_at_5_std\n value: -39.87179511166683\n - type: nauc_ndcg_at_1000_diff1\n value: 52.49054229225255\n - type: nauc_ndcg_at_1000_max\n value: 39.61644750719859\n - type: nauc_ndcg_at_1000_std\n value: -40.89845763194674\n - type: nauc_ndcg_at_100_diff1\n value: 52.33511250864434\n - type: nauc_ndcg_at_100_max\n value: 39.25530146124452\n - type: nauc_ndcg_at_100_std\n value: -41.92444498004374\n - type: nauc_ndcg_at_10_diff1\n value: 52.62031505931842\n - type: nauc_ndcg_at_10_max\n value: 38.667195545396766\n - type: nauc_ndcg_at_10_std\n value: -42.59503924641507\n - type: nauc_ndcg_at_1_diff1\n value: 58.90399906884378\n - type: nauc_ndcg_at_1_max\n value: 43.38806571165292\n - type: nauc_ndcg_at_1_std\n value: -38.224015285584\n - type: nauc_ndcg_at_20_diff1\n value: 52.15061629809436\n - type: nauc_ndcg_at_20_max\n value: 39.09332400054708\n - type: nauc_ndcg_at_20_std\n value: -42.80018671618001\n - type: nauc_ndcg_at_3_diff1\n value: 51.04210728138207\n - type: nauc_ndcg_at_3_max\n value: 38.19034802567046\n - type: nauc_ndcg_at_3_std\n value: -38.179821090765216\n - type: nauc_ndcg_at_5_diff1\n value: 51.04399574045204\n - type: nauc_ndcg_at_5_max\n value: 38.42492210204548\n - type: nauc_ndcg_at_5_std\n value: -38.868073241617715\n - type: nauc_precision_at_1000_diff1\n value: -25.151369907213734\n - type: nauc_precision_at_1000_max\n value: 9.012549147054989\n - type: nauc_precision_at_1000_std\n value: -9.319786589947698\n - type: nauc_precision_at_100_diff1\n value: -23.20945211843088\n - type: nauc_precision_at_100_max\n value: 9.860701593969862\n - type: nauc_precision_at_100_std\n value: -13.073877818347231\n - type: nauc_precision_at_10_diff1\n value: 
-6.970781124246847\n - type: nauc_precision_at_10_max\n value: 19.392675322254487\n - type: nauc_precision_at_10_std\n value: -26.74943490717657\n - type: nauc_precision_at_1_diff1\n value: 58.90399906884378\n - type: nauc_precision_at_1_max\n value: 43.38806571165292\n - type: nauc_precision_at_1_std\n value: -38.224015285584\n - type: nauc_precision_at_20_diff1\n value: -13.046456108081102\n - type: nauc_precision_at_20_max\n value: 15.69439950383875\n - type: nauc_precision_at_20_std\n value: -23.836004512018093\n - type: nauc_precision_at_3_diff1\n value: 3.5444232965528846\n - type: nauc_precision_at_3_max\n value: 27.08858445453865\n - type: nauc_precision_at_3_std\n value: -29.12757283665593\n - type: nauc_precision_at_5_diff1\n value: -3.6853986353320267\n - type: nauc_precision_at_5_max\n value: 24.32059689571271\n - type: nauc_precision_at_5_std\n value: -27.46188072134163\n - type: nauc_recall_at_1000_diff1\n value: 86.93515141907919\n - type: nauc_recall_at_1000_max\n value: 100.0\n - type: nauc_recall_at_1000_std\n value: 100.0\n - type: nauc_recall_at_100_diff1\n value: 39.7052887613879\n - type: nauc_recall_at_100_max\n value: 18.40943977796887\n - type: nauc_recall_at_100_std\n value: -88.74014854144974\n - type: nauc_recall_at_10_diff1\n value: 48.85342500870892\n - type: nauc_recall_at_10_max\n value: 32.69617204234419\n - type: nauc_recall_at_10_std\n value: -51.9937231860804\n - type: nauc_recall_at_1_diff1\n value: 59.785298201044704\n - type: nauc_recall_at_1_max\n value: 23.273537759937785\n - type: nauc_recall_at_1_std\n value: -17.838712689290194\n - type: nauc_recall_at_20_diff1\n value: 45.40839773314378\n - type: nauc_recall_at_20_max\n value: 33.02458321493215\n - type: nauc_recall_at_20_std\n value: -55.97800739448166\n - type: nauc_recall_at_3_diff1\n value: 47.05565693416531\n - type: nauc_recall_at_3_max\n value: 28.743850400344297\n - type: nauc_recall_at_3_std\n value: -32.436470486397475\n - type: nauc_recall_at_5_diff1\n value: 
45.30223758669577\n - type: nauc_recall_at_5_max\n value: 33.6567274747059\n - type: nauc_recall_at_5_std\n value: -39.946712017948514\n - type: ndcg_at_1\n value: 63.349999999999994\n - type: ndcg_at_10\n value: 69.446\n - type: ndcg_at_100\n value: 74.439\n - type: ndcg_at_1000\n value: 74.834\n - type: ndcg_at_20\n value: 71.763\n - type: ndcg_at_3\n value: 64.752\n - type: ndcg_at_5\n value: 66.316\n - type: precision_at_1\n value: 63.349999999999994\n - type: precision_at_10\n value: 16.286\n - type: precision_at_100\n value: 2.024\n - type: precision_at_1000\n value: 0.207\n - type: precision_at_20\n value: 8.908000000000001\n - type: precision_at_3\n value: 40.655\n - type: precision_at_5\n value: 28.859\n - type: recall_at_1\n value: 39.995999999999995\n - type: recall_at_10\n value: 78.107\n - type: recall_at_100\n value: 97.538\n - type: recall_at_1000\n value: 99.96000000000001\n - type: recall_at_20\n value: 85.72\n - type: recall_at_3\n value: 63.291\n - type: recall_at_5\n value: 70.625\n task:\n type: Retrieval\n - dataset:\n config: spa-eng\n name: MTEB XPQARetrieval (spa-eng)\n revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f\n split: test\n type: jinaai/xpqa\n metrics:\n - type: main_score\n value: 68.258\n - type: map_at_1\n value: 33.06\n - type: map_at_10\n value: 61.590999999999994\n - type: map_at_100\n value: 63.341\n - type: map_at_1000\n value: 63.385999999999996\n - type: map_at_20\n value: 62.77700000000001\n - type: map_at_3\n value: 52.547999999999995\n - type: map_at_5\n value: 58.824\n - type: mrr_at_1\n value: 63.80832282471627\n - type: mrr_at_10\n value: 70.76848015372607\n - type: mrr_at_100\n value: 71.33996704518061\n - type: mrr_at_1000\n value: 71.35368444388072\n - type: mrr_at_20\n value: 71.18191741103522\n - type: mrr_at_3\n value: 68.83144178226142\n - type: mrr_at_5\n value: 69.88440521227405\n - type: nauc_map_at_1000_diff1\n value: 41.59255746310511\n - type: nauc_map_at_1000_max\n value: 42.064075373358065\n - 
type: nauc_map_at_1000_std\n value: -25.130730194381723\n - type: nauc_map_at_100_diff1\n value: 41.56447648820406\n - type: nauc_map_at_100_max\n value: 42.06711634651607\n - type: nauc_map_at_100_std\n value: -25.14871585556968\n - type: nauc_map_at_10_diff1\n value: 41.28968387107058\n - type: nauc_map_at_10_max\n value: 41.511538272139774\n - type: nauc_map_at_10_std\n value: -25.99906440164276\n - type: nauc_map_at_1_diff1\n value: 51.09859596320021\n - type: nauc_map_at_1_max\n value: 12.406789321338222\n - type: nauc_map_at_1_std\n value: -18.227486548655076\n - type: nauc_map_at_20_diff1\n value: 41.39469672947315\n - type: nauc_map_at_20_max\n value: 41.98309315808902\n - type: nauc_map_at_20_std\n value: -25.44704720985219\n - type: nauc_map_at_3_diff1\n value: 43.16164995512842\n - type: nauc_map_at_3_max\n value: 30.935400935562818\n - type: nauc_map_at_3_std\n value: -23.53095555148866\n - type: nauc_map_at_5_diff1\n value: 41.23474352142375\n - type: nauc_map_at_5_max\n value: 39.03088859147947\n - type: nauc_map_at_5_std\n value: -26.046526443708366\n - type: nauc_mrr_at_1000_diff1\n value: 51.79649678213789\n - type: nauc_mrr_at_1000_max\n value: 50.50340748045259\n - type: nauc_mrr_at_1000_std\n value: -24.777183703493407\n - type: nauc_mrr_at_100_diff1\n value: 51.78609028166551\n - type: nauc_mrr_at_100_max\n value: 50.51732896833555\n - type: nauc_mrr_at_100_std\n value: -24.760054686874717\n - type: nauc_mrr_at_10_diff1\n value: 51.705268395036995\n - type: nauc_mrr_at_10_max\n value: 50.35818415293149\n - type: nauc_mrr_at_10_std\n value: -25.170367120250404\n - type: nauc_mrr_at_1_diff1\n value: 53.91475115581825\n - type: nauc_mrr_at_1_max\n value: 49.122529616282016\n - type: nauc_mrr_at_1_std\n value: -22.377647552937155\n - type: nauc_mrr_at_20_diff1\n value: 51.778984221197774\n - type: nauc_mrr_at_20_max\n value: 50.5070957827813\n - type: nauc_mrr_at_20_std\n value: -24.908935023607285\n - type: nauc_mrr_at_3_diff1\n value: 
51.82683773090423\n - type: nauc_mrr_at_3_max\n value: 50.77993196421369\n - type: nauc_mrr_at_3_std\n value: -24.3925832021831\n - type: nauc_mrr_at_5_diff1\n value: 51.722232683543034\n - type: nauc_mrr_at_5_max\n value: 50.334865493961864\n - type: nauc_mrr_at_5_std\n value: -25.513593495703297\n - type: nauc_ndcg_at_1000_diff1\n value: 44.21851582991263\n - type: nauc_ndcg_at_1000_max\n value: 45.73539068637836\n - type: nauc_ndcg_at_1000_std\n value: -24.716522467580397\n - type: nauc_ndcg_at_100_diff1\n value: 43.8002401615357\n - type: nauc_ndcg_at_100_max\n value: 45.801409410061915\n - type: nauc_ndcg_at_100_std\n value: -24.73171742499903\n - type: nauc_ndcg_at_10_diff1\n value: 42.540922778755885\n - type: nauc_ndcg_at_10_max\n value: 44.348836943874595\n - type: nauc_ndcg_at_10_std\n value: -28.05403666494785\n - type: nauc_ndcg_at_1_diff1\n value: 53.91475115581825\n - type: nauc_ndcg_at_1_max\n value: 49.122529616282016\n - type: nauc_ndcg_at_1_std\n value: -22.377647552937155\n - type: nauc_ndcg_at_20_diff1\n value: 43.10347921163421\n - type: nauc_ndcg_at_20_max\n value: 45.53253270265022\n - type: nauc_ndcg_at_20_std\n value: -26.63902791862846\n - type: nauc_ndcg_at_3_diff1\n value: 42.41720274782384\n - type: nauc_ndcg_at_3_max\n value: 42.91778219334943\n - type: nauc_ndcg_at_3_std\n value: -24.793252033594076\n - type: nauc_ndcg_at_5_diff1\n value: 42.51515034945093\n - type: nauc_ndcg_at_5_max\n value: 41.62080576508792\n - type: nauc_ndcg_at_5_std\n value: -28.209669314955065\n - type: nauc_precision_at_1000_diff1\n value: -14.89794075433148\n - type: nauc_precision_at_1000_max\n value: 27.85387929356412\n - type: nauc_precision_at_1000_std\n value: 10.728618597190849\n - type: nauc_precision_at_100_diff1\n value: -13.075270046295856\n - type: nauc_precision_at_100_max\n value: 29.77208946756632\n - type: nauc_precision_at_100_std\n value: 8.491662697326039\n - type: nauc_precision_at_10_diff1\n value: -4.0826025188781205\n - type: 
nauc_precision_at_10_max\n value: 39.04278085180075\n - type: nauc_precision_at_10_std\n value: -5.925408651372333\n - type: nauc_precision_at_1_diff1\n value: 53.91475115581825\n - type: nauc_precision_at_1_max\n value: 49.122529616282016\n - type: nauc_precision_at_1_std\n value: -22.377647552937155\n - type: nauc_precision_at_20_diff1\n value: -7.93186440645135\n - type: nauc_precision_at_20_max\n value: 35.81281308891365\n - type: nauc_precision_at_20_std\n value: 0.1241277857515697\n - type: nauc_precision_at_3_diff1\n value: 7.563562511484409\n - type: nauc_precision_at_3_max\n value: 43.43738862378524\n - type: nauc_precision_at_3_std\n value: -11.958059731912615\n - type: nauc_precision_at_5_diff1\n value: -0.1801152449011624\n - type: nauc_precision_at_5_max\n value: 41.32486715619513\n - type: nauc_precision_at_5_std\n value: -10.088699021919552\n - type: nauc_recall_at_1000_diff1\n value: 86.93359696819986\n - type: nauc_recall_at_1000_max\n value: 100.0\n - type: nauc_recall_at_1000_std\n value: 72.21843645604022\n - type: nauc_recall_at_100_diff1\n value: 29.86050842714198\n - type: nauc_recall_at_100_max\n value: 48.106658251136245\n - type: nauc_recall_at_100_std\n value: -14.981886214880035\n - type: nauc_recall_at_10_diff1\n value: 33.67119240737528\n - type: nauc_recall_at_10_max\n value: 39.271984859561414\n - type: nauc_recall_at_10_std\n value: -35.6434883839217\n - type: nauc_recall_at_1_diff1\n value: 51.09859596320021\n - type: nauc_recall_at_1_max\n value: 12.406789321338222\n - type: nauc_recall_at_1_std\n value: -18.227486548655076\n - type: nauc_recall_at_20_diff1\n value: 33.211979983240724\n - type: nauc_recall_at_20_max\n value: 43.47676074743184\n - type: nauc_recall_at_20_std\n value: -33.88107138395349\n - type: nauc_recall_at_3_diff1\n value: 39.22513750146998\n - type: nauc_recall_at_3_max\n value: 27.066674083840166\n - type: nauc_recall_at_3_std\n value: -26.963282529629893\n - type: nauc_recall_at_5_diff1\n value: 
36.53718917129459\n - type: nauc_recall_at_5_max\n value: 35.40550013169686\n - type: nauc_recall_at_5_std\n value: -34.209159379410806\n - type: ndcg_at_1\n value: 63.808\n - type: ndcg_at_10\n value: 68.258\n - type: ndcg_at_100\n value: 73.38799999999999\n - type: ndcg_at_1000\n value: 74.03\n - type: ndcg_at_20\n value: 70.968\n - type: ndcg_at_3\n value: 62.33\n - type: ndcg_at_5\n value: 64.096\n - type: precision_at_1\n value: 63.808\n - type: precision_at_10\n value: 19.243\n - type: precision_at_100\n value: 2.367\n - type: precision_at_1000\n value: 0.245\n - type: precision_at_20\n value: 10.599\n - type: precision_at_3\n value: 44.515\n - type: precision_at_5\n value: 33.467999999999996\n - type: recall_at_1\n value: 33.06\n - type: recall_at_10\n value: 77.423\n - type: recall_at_100\n value: 95.923\n - type: recall_at_1000\n value: 99.874\n - type: recall_at_20\n value: 85.782\n - type: recall_at_3\n value: 57.098000000000006\n - type: recall_at_5\n value: 67.472\n task:\n type: Retrieval\n - dataset:\n config: spa-spa\n name: MTEB XPQARetrieval (spa-spa)\n revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f\n split: test\n type: jinaai/xpqa\n metrics:\n - type: main_score\n value: 72.004\n - type: map_at_1\n value: 36.248000000000005\n - type: map_at_10\n value: 65.679\n - type: map_at_100\n value: 67.22399999999999\n - type: map_at_1000\n value: 67.264\n - type: map_at_20\n value: 66.705\n - type: map_at_3\n value: 56.455\n - type: map_at_5\n value: 62.997\n - type: mrr_at_1\n value: 67.71752837326608\n - type: mrr_at_10\n value: 74.59782021257429\n - type: mrr_at_100\n value: 75.0640960767943\n - type: mrr_at_1000\n value: 75.07324799466076\n - type: mrr_at_20\n value: 74.9323963386884\n - type: mrr_at_3\n value: 72.95081967213115\n - type: mrr_at_5\n value: 73.82723833543506\n - type: nauc_map_at_1000_diff1\n value: 43.111810717567714\n - type: nauc_map_at_1000_max\n value: 44.835247208972476\n - type: nauc_map_at_1000_std\n value: 
-32.798405973931985\n - type: nauc_map_at_100_diff1\n value: 43.090223482932764\n - type: nauc_map_at_100_max\n value: 44.83392441557943\n - type: nauc_map_at_100_std\n value: -32.81149166676563\n - type: nauc_map_at_10_diff1\n value: 42.87841934951979\n - type: nauc_map_at_10_max\n value: 43.9838653389494\n - type: nauc_map_at_10_std\n value: -33.588084643627084\n - type: nauc_map_at_1_diff1\n value: 54.509245848379095\n - type: nauc_map_at_1_max\n value: 10.05921648322742\n - type: nauc_map_at_1_std\n value: -24.652326014826762\n - type: nauc_map_at_20_diff1\n value: 43.07468612984794\n - type: nauc_map_at_20_max\n value: 44.75663122615032\n - type: nauc_map_at_20_std\n value: -33.11788887878321\n - type: nauc_map_at_3_diff1\n value: 44.63272828938906\n - type: nauc_map_at_3_max\n value: 32.1584369869227\n - type: nauc_map_at_3_std\n value: -30.761662210142944\n - type: nauc_map_at_5_diff1\n value: 42.77296997803048\n - type: nauc_map_at_5_max\n value: 41.78894616737652\n - type: nauc_map_at_5_std\n value: -33.56459774477362\n - type: nauc_mrr_at_1000_diff1\n value: 53.097544131833494\n - type: nauc_mrr_at_1000_max\n value: 50.61134979184588\n - type: nauc_mrr_at_1000_std\n value: -35.6221191487669\n - type: nauc_mrr_at_100_diff1\n value: 53.096609856182106\n - type: nauc_mrr_at_100_max\n value: 50.61951585642645\n - type: nauc_mrr_at_100_std\n value: -35.62396157508327\n - type: nauc_mrr_at_10_diff1\n value: 52.771534471912304\n - type: nauc_mrr_at_10_max\n value: 50.430863224435726\n - type: nauc_mrr_at_10_std\n value: -36.027992076620365\n - type: nauc_mrr_at_1_diff1\n value: 55.05316238884337\n - type: nauc_mrr_at_1_max\n value: 49.461858515275196\n - type: nauc_mrr_at_1_std\n value: -31.87492636319712\n - type: nauc_mrr_at_20_diff1\n value: 53.083253469629746\n - type: nauc_mrr_at_20_max\n value: 50.62156424256193\n - type: nauc_mrr_at_20_std\n value: -35.879153692447154\n - type: nauc_mrr_at_3_diff1\n value: 52.98283109188415\n - type: nauc_mrr_at_3_max\n 
value: 50.83561260429378\n - type: nauc_mrr_at_3_std\n value: -35.30839538038797\n - type: nauc_mrr_at_5_diff1\n value: 52.93270510879709\n - type: nauc_mrr_at_5_max\n value: 50.54595596761199\n - type: nauc_mrr_at_5_std\n value: -35.84059376434395\n - type: nauc_ndcg_at_1000_diff1\n value: 45.343685089209416\n - type: nauc_ndcg_at_1000_max\n value: 47.801141576669465\n - type: nauc_ndcg_at_1000_std\n value: -33.512958862879195\n - type: nauc_ndcg_at_100_diff1\n value: 45.255590461515894\n - type: nauc_ndcg_at_100_max\n value: 47.99240031881967\n - type: nauc_ndcg_at_100_std\n value: -33.614465006695205\n - type: nauc_ndcg_at_10_diff1\n value: 43.93472511731019\n - type: nauc_ndcg_at_10_max\n value: 45.92599752897053\n - type: nauc_ndcg_at_10_std\n value: -36.43629114491574\n - type: nauc_ndcg_at_1_diff1\n value: 55.05316238884337\n - type: nauc_ndcg_at_1_max\n value: 49.461858515275196\n - type: nauc_ndcg_at_1_std\n value: -31.87492636319712\n - type: nauc_ndcg_at_20_diff1\n value: 44.93534591273201\n - type: nauc_ndcg_at_20_max\n value: 47.55153940713458\n - type: nauc_ndcg_at_20_std\n value: -35.56392448745206\n - type: nauc_ndcg_at_3_diff1\n value: 43.17916122133396\n - type: nauc_ndcg_at_3_max\n value: 45.603634205103276\n - type: nauc_ndcg_at_3_std\n value: -32.473227507181214\n - type: nauc_ndcg_at_5_diff1\n value: 44.10242961669216\n - type: nauc_ndcg_at_5_max\n value: 43.61666669031808\n - type: nauc_ndcg_at_5_std\n value: -35.98808321497782\n - type: nauc_precision_at_1000_diff1\n value: -23.264714449991146\n - type: nauc_precision_at_1000_max\n value: 28.505729576735465\n - type: nauc_precision_at_1000_std\n value: 11.987379232920926\n - type: nauc_precision_at_100_diff1\n value: -21.156119174614627\n - type: nauc_precision_at_100_max\n value: 30.711646221646255\n - type: nauc_precision_at_100_std\n value: 9.650486536340322\n - type: nauc_precision_at_10_diff1\n value: -10.98001328477502\n - type: nauc_precision_at_10_max\n value: 39.25638073760597\n - 
type: nauc_precision_at_10_std\n value: -4.3456859257488\n - type: nauc_precision_at_1_diff1\n value: 55.05316238884337\n - type: nauc_precision_at_1_max\n value: 49.461858515275196\n - type: nauc_precision_at_1_std\n value: -31.87492636319712\n - type: nauc_precision_at_20_diff1\n value: -14.97565390664424\n - type: nauc_precision_at_20_max\n value: 36.383835295942355\n - type: nauc_precision_at_20_std\n value: 1.525158880381114\n - type: nauc_precision_at_3_diff1\n value: 1.0448345623903483\n - type: nauc_precision_at_3_max\n value: 45.69772060667404\n - type: nauc_precision_at_3_std\n value: -13.002685018948293\n - type: nauc_precision_at_5_diff1\n value: -5.434185597628904\n - type: nauc_precision_at_5_max\n value: 42.99162431099203\n - type: nauc_precision_at_5_std\n value: -9.789308817624534\n - type: nauc_recall_at_1000_diff1\n value: 12.309303236094845\n - type: nauc_recall_at_1000_max\n value: 100.0\n - type: nauc_recall_at_1000_std\n value: 86.93359696819986\n - type: nauc_recall_at_100_diff1\n value: 39.093544920901415\n - type: nauc_recall_at_100_max\n value: 55.62814395062938\n - type: nauc_recall_at_100_std\n value: -22.6919033301514\n - type: nauc_recall_at_10_diff1\n value: 35.50100141633622\n - type: nauc_recall_at_10_max\n value: 39.25750019586647\n - type: nauc_recall_at_10_std\n value: -43.01273078031791\n - type: nauc_recall_at_1_diff1\n value: 54.509245848379095\n - type: nauc_recall_at_1_max\n value: 10.05921648322742\n - type: nauc_recall_at_1_std\n value: -24.652326014826762\n - type: nauc_recall_at_20_diff1\n value: 38.1281707132327\n - type: nauc_recall_at_20_max\n value: 43.97950642900301\n - type: nauc_recall_at_20_std\n value: -44.049952771307574\n - type: nauc_recall_at_3_diff1\n value: 40.01986938242728\n - type: nauc_recall_at_3_max\n value: 27.517114421061173\n - type: nauc_recall_at_3_std\n value: -32.99056780232045\n - type: nauc_recall_at_5_diff1\n value: 38.52035606499483\n - type: nauc_recall_at_5_max\n value: 
37.05834604678859\n - type: nauc_recall_at_5_std\n value: -39.86196378897912\n - type: ndcg_at_1\n value: 67.718\n - type: ndcg_at_10\n value: 72.004\n - type: ndcg_at_100\n value: 76.554\n - type: ndcg_at_1000\n value: 77.07300000000001\n - type: ndcg_at_20\n value: 74.37899999999999\n - type: ndcg_at_3\n value: 66.379\n - type: ndcg_at_5\n value: 68.082\n - type: precision_at_1\n value: 67.718\n - type: precision_at_10\n value: 19.849\n - type: precision_at_100\n value: 2.3800000000000003\n - type: precision_at_1000\n value: 0.245\n - type: precision_at_20\n value: 10.813\n - type: precision_at_3\n value: 46.574\n - type: precision_at_5\n value: 34.83\n - type: recall_at_1\n value: 36.248000000000005\n - type: recall_at_10\n value: 80.252\n - type: recall_at_100\n value: 96.73\n - type: recall_at_1000\n value: 99.874\n - type: recall_at_20\n value: 87.703\n - type: recall_at_3\n value: 60.815\n - type: recall_at_5\n value: 71.16\n task:\n type: Retrieval\n - dataset:\n config: fra-eng\n name: MTEB XPQARetrieval (fra-eng)\n revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f\n split: test\n type: jinaai/xpqa\n metrics:\n - type: main_score\n value: 73.729\n - type: map_at_1\n value: 43.964999999999996\n - type: map_at_10\n value: 67.803\n - type: map_at_100\n value: 69.188\n - type: map_at_1000\n value: 69.21000000000001\n - type: map_at_20\n value: 68.747\n - type: map_at_3\n value: 60.972\n - type: map_at_5\n value: 65.39399999999999\n - type: mrr_at_1\n value: 68.4913217623498\n - type: mrr_at_10\n value: 75.2600822260368\n - type: mrr_at_100\n value: 75.6599169808848\n - type: mrr_at_1000\n value: 75.66720883727534\n - type: mrr_at_20\n value: 75.52375865860405\n - type: mrr_at_3\n value: 73.54250111259452\n - type: mrr_at_5\n value: 74.51713395638626\n - type: nauc_map_at_1000_diff1\n value: 46.81533703002097\n - type: nauc_map_at_1000_max\n value: 46.30794757084772\n - type: nauc_map_at_1000_std\n value: -14.953470500312335\n - type: nauc_map_at_100_diff1\n 
value: 46.82464740277745\n - type: nauc_map_at_100_max\n value: 46.32852879948254\n - type: nauc_map_at_100_std\n value: -14.950035098066172\n - type: nauc_map_at_10_diff1\n value: 46.31406143369831\n - type: nauc_map_at_10_max\n value: 45.337593270786634\n - type: nauc_map_at_10_std\n value: -16.011789445907876\n - type: nauc_map_at_1_diff1\n value: 57.097134715065835\n - type: nauc_map_at_1_max\n value: 21.93931500350721\n - type: nauc_map_at_1_std\n value: -15.134457251301637\n - type: nauc_map_at_20_diff1\n value: 46.47030891134173\n - type: nauc_map_at_20_max\n value: 46.29169960276292\n - type: nauc_map_at_20_std\n value: -15.14241106541829\n - type: nauc_map_at_3_diff1\n value: 50.27064228648596\n - type: nauc_map_at_3_max\n value: 39.43058773971639\n - type: nauc_map_at_3_std\n value: -16.16545993089126\n - type: nauc_map_at_5_diff1\n value: 46.974867679747426\n - type: nauc_map_at_5_max\n value: 44.31091104855002\n - type: nauc_map_at_5_std\n value: -16.50175337658926\n - type: nauc_mrr_at_1000_diff1\n value: 55.20294005110399\n - type: nauc_mrr_at_1000_max\n value: 51.947725719119966\n - type: nauc_mrr_at_1000_std\n value: -14.586112939597232\n - type: nauc_mrr_at_100_diff1\n value: 55.20426251109304\n - type: nauc_mrr_at_100_max\n value: 51.95648725402534\n - type: nauc_mrr_at_100_std\n value: -14.579769236539143\n - type: nauc_mrr_at_10_diff1\n value: 54.93870506205835\n - type: nauc_mrr_at_10_max\n value: 51.89312772900638\n - type: nauc_mrr_at_10_std\n value: -14.692635010092939\n - type: nauc_mrr_at_1_diff1\n value: 56.54945935175171\n - type: nauc_mrr_at_1_max\n value: 51.28134504197991\n - type: nauc_mrr_at_1_std\n value: -12.909042186563061\n - type: nauc_mrr_at_20_diff1\n value: 55.10667018041461\n - type: nauc_mrr_at_20_max\n value: 51.98236870783707\n - type: nauc_mrr_at_20_std\n value: -14.599377575198025\n - type: nauc_mrr_at_3_diff1\n value: 55.67124311746892\n - type: nauc_mrr_at_3_max\n value: 51.77903236246767\n - type: 
nauc_mrr_at_3_std\n value: -14.94452633860763\n - type: nauc_mrr_at_5_diff1\n value: 55.42849172366371\n - type: nauc_mrr_at_5_max\n value: 51.76902965753959\n - type: nauc_mrr_at_5_std\n value: -15.357993534727072\n - type: nauc_ndcg_at_1000_diff1\n value: 48.736844959280326\n - type: nauc_ndcg_at_1000_max\n value: 48.92891159935398\n - type: nauc_ndcg_at_1000_std\n value: -13.983968675611056\n - type: nauc_ndcg_at_100_diff1\n value: 48.73859328503975\n - type: nauc_ndcg_at_100_max\n value: 49.31867149556439\n - type: nauc_ndcg_at_100_std\n value: -13.72387564912742\n - type: nauc_ndcg_at_10_diff1\n value: 46.50313862975287\n - type: nauc_ndcg_at_10_max\n value: 47.13599793554596\n - type: nauc_ndcg_at_10_std\n value: -16.317919977400113\n - type: nauc_ndcg_at_1_diff1\n value: 56.54945935175171\n - type: nauc_ndcg_at_1_max\n value: 51.28134504197991\n - type: nauc_ndcg_at_1_std\n value: -12.909042186563061\n - type: nauc_ndcg_at_20_diff1\n value: 47.01727117133912\n - type: nauc_ndcg_at_20_max\n value: 49.121366036709105\n - type: nauc_ndcg_at_20_std\n value: -14.411078677638775\n - type: nauc_ndcg_at_3_diff1\n value: 49.229581145458276\n - type: nauc_ndcg_at_3_max\n value: 47.427609717032\n - type: nauc_ndcg_at_3_std\n value: -16.52066627289908\n - type: nauc_ndcg_at_5_diff1\n value: 48.0152514127505\n - type: nauc_ndcg_at_5_max\n value: 46.12152407850816\n - type: nauc_ndcg_at_5_std\n value: -17.613295491954656\n - type: nauc_precision_at_1000_diff1\n value: -25.959006032642463\n - type: nauc_precision_at_1000_max\n value: 12.81002362947137\n - type: nauc_precision_at_1000_std\n value: 12.575312826061513\n - type: nauc_precision_at_100_diff1\n value: -24.35413527283394\n - type: nauc_precision_at_100_max\n value: 14.878359236477303\n - type: nauc_precision_at_100_std\n value: 12.384426050018428\n - type: nauc_precision_at_10_diff1\n value: -17.93220761770618\n - type: nauc_precision_at_10_max\n value: 23.523485811847294\n - type: nauc_precision_at_10_std\n 
value: 4.424456968716939\n - type: nauc_precision_at_1_diff1\n value: 56.54945935175171\n - type: nauc_precision_at_1_max\n value: 51.28134504197991\n - type: nauc_precision_at_1_std\n value: -12.909042186563061\n - type: nauc_precision_at_20_diff1\n value: -21.776871398686936\n - type: nauc_precision_at_20_max\n value: 21.18436338264366\n - type: nauc_precision_at_20_std\n value: 9.937274986573321\n - type: nauc_precision_at_3_diff1\n value: -1.2411845580934435\n - type: nauc_precision_at_3_max\n value: 34.962281941875\n - type: nauc_precision_at_3_std\n value: -2.447892908501237\n - type: nauc_precision_at_5_diff1\n value: -11.134164534114085\n - type: nauc_precision_at_5_max\n value: 30.22079740070525\n - type: nauc_precision_at_5_std\n value: -0.24232594421765946\n - type: nauc_recall_at_1000_diff1\n value: .nan\n - type: nauc_recall_at_1000_max\n value: .nan\n - type: nauc_recall_at_1000_std\n value: .nan\n - type: nauc_recall_at_100_diff1\n value: 43.3647412452869\n - type: nauc_recall_at_100_max\n value: 63.50094950500327\n - type: nauc_recall_at_100_std\n value: 2.3911909633714044\n - type: nauc_recall_at_10_diff1\n value: 33.993445071666855\n - type: nauc_recall_at_10_max\n value: 41.38694129134144\n - type: nauc_recall_at_10_std\n value: -19.308698266099096\n - type: nauc_recall_at_1_diff1\n value: 57.097134715065835\n - type: nauc_recall_at_1_max\n value: 21.93931500350721\n - type: nauc_recall_at_1_std\n value: -15.134457251301637\n - type: nauc_recall_at_20_diff1\n value: 32.03888531880772\n - type: nauc_recall_at_20_max\n value: 49.660787482562085\n - type: nauc_recall_at_20_std\n value: -12.641456758778382\n - type: nauc_recall_at_3_diff1\n value: 47.94527082900579\n - type: nauc_recall_at_3_max\n value: 36.51733131437679\n - type: nauc_recall_at_3_std\n value: -18.65511713247495\n - type: nauc_recall_at_5_diff1\n value: 42.04545772092305\n - type: nauc_recall_at_5_max\n value: 41.21440912972303\n - type: nauc_recall_at_5_std\n value: 
-21.47386527081128\n - type: ndcg_at_1\n value: 68.491\n - type: ndcg_at_10\n value: 73.729\n - type: ndcg_at_100\n value: 77.684\n - type: ndcg_at_1000\n value: 78.084\n - type: ndcg_at_20\n value: 75.795\n - type: ndcg_at_3\n value: 68.568\n - type: ndcg_at_5\n value: 70.128\n - type: precision_at_1\n value: 68.491\n - type: precision_at_10\n value: 16.996\n - type: precision_at_100\n value: 2.023\n - type: precision_at_1000\n value: 0.207\n - type: precision_at_20\n value: 9.246\n - type: precision_at_3\n value: 41.923\n - type: precision_at_5\n value: 29.826000000000004\n - type: recall_at_1\n value: 43.964999999999996\n - type: recall_at_10\n value: 82.777\n - type: recall_at_100\n value: 97.287\n - type: recall_at_1000\n value: 100.0\n - type: recall_at_20\n value: 89.183\n - type: recall_at_3\n value: 65.803\n - type: recall_at_5\n value: 74.119\n task:\n type: Retrieval\n - dataset:\n config: fra-fra\n name: MTEB XPQARetrieval (fr)\n revision: c99d599f0a6ab9b85b065da6f9d94f9cf731679f\n split: test\n type: jinaai/xpqa\n metrics:\n - type: main_score\n value: 77.581\n - type: map_at_1\n value: 46.444\n - type: map_at_10\n value: 72.084\n - type: map_at_100\n value: 73.175\n - type: map_at_1000\n value: 73.193\n - type: map_at_20\n value: 72.77799999999999\n - type: map_at_3\n value: 65.242\n - type: map_at_5\n value: 69.926\n - type: mrr_at_1\n value: 71.82910547396529\n - type: mrr_at_10\n value: 78.66594612923046\n - type: mrr_at_100\n value: 78.97334934049613\n - type: mrr_at_1000\n value: 78.97687021803557\n - type: mrr_at_20\n value: 78.85701141744282\n - type: mrr_at_3\n value: 76.96929238985311\n - type: mrr_at_5\n value: 77.99732977303067\n - type: nauc_map_at_1000_diff1\n value: 49.090956807097804\n - type: nauc_map_at_1000_max\n value: 52.01095354889508\n - type: nauc_map_at_1000_std\n value: -12.182870421711026\n - type: nauc_map_at_100_diff1\n value: 49.091664766684566\n - type: nauc_map_at_100_max\n value: 52.017499797253755\n - type: 
nauc_map_at_100_std\n value: -12.188342487271528\n - type: nauc_map_at_10_diff1\n value: 48.6619338205362\n - type: nauc_map_at_10_max\n value: 50.93591260329888\n - type: nauc_map_at_10_std\n value: -12.899399261673365\n - type: nauc_map_at_1_diff1\n value: 61.89699552471587\n - type: nauc_map_at_1_max\n value: 22.387748207421946\n - type: nauc_map_at_1_std\n value: -17.139518194308437\n - type: nauc_map_at_20_diff1\n value: 48.72828404686453\n - type: nauc_map_at_20_max\n value: 51.781074586075434\n - type: nauc_map_at_20_std\n value: -12.174270605093136\n - type: nauc_map_at_3_diff1\n value: 53.11509580126934\n - type: nauc_map_at_3_max\n value: 42.1768380145106\n - type: nauc_map_at_3_std\n value: -14.98340833032363\n - type: nauc_map_at_5_diff1\n value: 49.60521390803235\n - type: nauc_map_at_5_max\n value: 49.80360562029127\n - type: nauc_map_at_5_std\n value: -13.900652140457618\n - type: nauc_mrr_at_1000_diff1\n value: 58.10782478654255\n - type: nauc_mrr_at_1000_max\n value: 61.31083013535486\n - type: nauc_mrr_at_1000_std\n value: -9.624904298545921\n - type: nauc_mrr_at_100_diff1\n value: 58.11041683306092\n - type: nauc_mrr_at_100_max\n value: 61.31590199755797\n - type: nauc_mrr_at_100_std\n value: -9.625991053580865\n - type: nauc_mrr_at_10_diff1\n value: 57.883701815695375\n - type: nauc_mrr_at_10_max\n value: 61.36276126424689\n - type: nauc_mrr_at_10_std\n value: -9.495072468420386\n - type: nauc_mrr_at_1_diff1\n value: 60.18176977079093\n - type: nauc_mrr_at_1_max\n value: 59.697615236642555\n - type: nauc_mrr_at_1_std\n value: -9.396133077966779\n - type: nauc_mrr_at_20_diff1\n value: 57.964817434006754\n - type: nauc_mrr_at_20_max\n value: 61.34073539502932\n - type: nauc_mrr_at_20_std\n value: -9.602378876645131\n - type: nauc_mrr_at_3_diff1\n value: 58.44338049427257\n - type: nauc_mrr_at_3_max\n value: 60.92272989411293\n - type: nauc_mrr_at_3_std\n value: -9.928970439416162\n - type: nauc_mrr_at_5_diff1\n value: 58.01513016866578\n - type: 
nauc_mrr_at_5_max\n value: 61.46805302986586\n - type: nauc_mrr_at_5_std\n value: -9.842227002440984\n - type: nauc_ndcg_at_1000_diff1\n value: 50.99293152828167\n - type: nauc_ndcg_at_1000_max\n value: 56.14232784664811\n - type: nauc_ndcg_at_1000_std\n value: -10.529213072410288\n - type: nauc_ndcg_at_100_diff1\n value: 50.99385944312529\n - type: nauc_ndcg_at_100_max\n value: 56.34825518954588\n - type: nauc_ndcg_at_100_std\n value: -10.398943874846047\n - type: nauc_ndcg_at_10_diff1\n value: 48.51273364357823\n - type: nauc_ndcg_at_10_max\n value: 53.77871849486298\n - type: nauc_ndcg_at_10_std\n value: -11.82105972112472\n - type: nauc_ndcg_at_1_diff1\n value: 60.18176977079093\n - type: nauc_ndcg_at_1_max\n value: 59.697615236642555\n - type: nauc_ndcg_at_1_std\n value: -9.396133077966779\n - type: nauc_ndcg_at_20_diff1\n value: 49.04268319033412\n - type: nauc_ndcg_at_20_max\n value: 55.47011381097071\n - type: nauc_ndcg_at_20_std\n value: -10.486452945493042\n - type: nauc_ndcg_at_3_diff1\n value: 50.95112745400584\n - type: nauc_ndcg_at_3_max\n value: 53.45473828705577\n - type: nauc_ndcg_at_3_std\n value: -13.420699384045728\n - type: nauc_ndcg_at_5_diff1\n value: 50.313156212000074\n - type: nauc_ndcg_at_5_max\n value: 52.78539129309866\n - type: nauc_ndcg_at_5_std\n value: -13.586274096509122\n - type: nauc_precision_at_1000_diff1\n value: -31.13772049254778\n - type: nauc_precision_at_1000_max\n value: 17.2847598361294\n - type: nauc_precision_at_1000_std\n value: 15.497531773816887\n - type: nauc_precision_at_100_diff1\n value: -29.98812263553739\n - type: nauc_precision_at_100_max\n value: 19.048620003227654\n - type: nauc_precision_at_100_std\n value: 15.38499952171958\n - type: nauc_precision_at_10_diff1\n value: -25.33028097412579\n - type: nauc_precision_at_10_max\n value: 26.077919168306853\n - type: nauc_precision_at_10_std\n value: 11.35352933466097\n - type: nauc_precision_at_1_diff1\n value: 60.18176977079093\n - type: 
nauc_precision_at_1_max\n value: 59.697615236642555\n - type: nauc_precision_at_1_std\n value: -9.396133077966779\n - type: nauc_precision_at_20_diff1\n value: -28.417606311068905\n - type: nauc_precision_at_20_max\n value: 23.958679828637692\n - type: nauc_precision_at_20_std\n value: 14.442021499194205\n - type: nauc_precision_at_3_diff1\n value: -8.127396049790482\n - type: nauc_precision_at_3_max\n value: 37.348067982957076\n - type: nauc_precision_at_3_std\n value: 4.747913619596849\n - type: nauc_precision_at_5_diff1\n value: -16.902418446058395\n - type: nauc_precision_at_5_max\n value: 32.73583852552014\n - type: nauc_precision_at_5_std\n value: 7.031446423850052\n - type: nauc_recall_at_1000_diff1\n value: -14.485978369112514\n - type: nauc_recall_at_1000_max\n value: 78.59123887333172\n - type: nauc_recall_at_1000_std\n value: 90.7384575424963\n - type: nauc_recall_at_100_diff1\n value: 41.47842281590715\n - type: nauc_recall_at_100_max\n value: 67.47271545727422\n - type: nauc_recall_at_100_std\n value: 14.555561992253999\n - type: nauc_recall_at_10_diff1\n value: 33.05308907973924\n - type: nauc_recall_at_10_max\n value: 45.49878918493155\n - type: nauc_recall_at_10_std\n value: -11.560069806810926\n - type: nauc_recall_at_1_diff1\n value: 61.89699552471587\n - type: nauc_recall_at_1_max\n value: 22.387748207421946\n - type: nauc_recall_at_1_std\n value: -17.139518194308437\n - type: nauc_recall_at_20_diff1\n value: 31.305721376453754\n - type: nauc_recall_at_20_max\n value: 51.24817763724019\n - type: nauc_recall_at_20_std\n value: -5.0809908162023145\n - type: nauc_recall_at_3_diff1\n value: 49.27109038342917\n - type: nauc_recall_at_3_max\n value: 37.69188317998447\n - type: nauc_recall_at_3_std\n value: -17.119900758664336\n - type: nauc_recall_at_5_diff1\n value: 42.74501803377967\n - type: nauc_recall_at_5_max\n value: 46.877008503354844\n - type: nauc_recall_at_5_std\n value: -15.704892082115975\n - type: ndcg_at_1\n value: 71.829\n - type: 
ndcg_at_10\n value: 77.581\n - type: ndcg_at_100\n value: 80.75\n - type: ndcg_at_1000\n value: 81.026\n - type: ndcg_at_20\n value: 79.092\n - type: ndcg_at_3\n value: 72.81\n - type: ndcg_at_5\n value: 74.22999999999999\n - type: precision_at_1\n value: 71.829\n - type: precision_at_10\n value: 17.717\n - type: precision_at_100\n value: 2.031\n - type: precision_at_1000\n value: 0.207\n - type: precision_at_20\n value: 9.399000000000001\n - type: precision_at_3\n value: 44.458999999999996\n - type: precision_at_5\n value: 31.535000000000004\n - type: recall_at_1\n value: 46.444\n - type: recall_at_10\n value: 86.275\n - type: recall_at_100\n value: 98.017\n - type: recall_at_1000\n value: 99.8\n - type: recall_at_20\n value: 90.935\n - type: recall_at_3\n value: 70.167\n - type: recall_at_5\n value: 78.2\n task:\n type: Retrieval\n---\n\n

\n\n

\n\"Jina\n

\n\n\n

\nThe embedding model trained by Jina AI.\n

\n\n

\n# jina-embeddings-v3: Multilingual Embeddings With Task LoRA\n

\n\n## Quick Start\n\n[Blog](https://jina.ai/news/jina-embeddings-v3-a-frontier-multilingual-embedding-model/#parameter-dimensions) | [Azure](https://azuremarketplace.microsoft.com/en-us/marketplace/apps/jinaai.jina-embeddings-v3-vm) | [AWS SageMaker](https://aws.amazon.com/marketplace/pp/prodview-kdi3xkt62lo32) | [API](https://jina.ai/embeddings)\n\n\n## Intended Usage & Model Info\n\n\n`jina-embeddings-v3` is a **multilingual multi-task text embedding model** designed for a variety of NLP applications.\nBased on the [Jina-XLM-RoBERTa architecture](https://huggingface.co/jinaai/xlm-roberta-flash-implementation), \nthis model supports Rotary Position Embeddings to handle long input sequences up to **8192 tokens**.\nAdditionally, it features 5 LoRA adapters to generate task-specific embeddings efficiently.\n\n### Key Features:\n- **Extended Sequence Length:** Supports up to 8192 tokens with RoPE.\n- **Task-Specific Embedding:** Customize embeddings through the `task` argument with the following options:\n - `retrieval.query`: Used for query embeddings in asymmetric retrieval tasks\n - `retrieval.passage`: Used for passage embeddings in asymmetric retrieval tasks\n - `separation`: Used for embeddings in clustering and re-ranking applications\n - `classification`: Used for embeddings in classification tasks\n - `text-matching`: Used for embeddings in tasks that quantify similarity between two texts, such as STS or symmetric retrieval tasks\n- **Matryoshka Embeddings**: Supports flexible embedding sizes (`32, 64, 128, 256, 512, 768, 1024`), allowing for truncating embeddings to fit your application.\n\n### Supported Languages:\nWhile the foundation model supports 100 languages, we've focused our tuning efforts on the following 30 languages: \n**Arabic, Bengali, Chinese, Danish, Dutch, English, Finnish, French, Georgian, German, Greek, \nHindi, Indonesian, Italian, Japanese, Korean, Latvian, Norwegian, Polish, Portuguese, Romanian, \nRussian, Slovak, Spanish, Swedish, 
Thai, Turkish, Ukrainian, Urdu,** and **Vietnamese.**\n\n\n> **\u26a0\ufe0f Important Notice:** \n> We fixed a bug in the `encode` function [#60](https://huggingface.co/jinaai/jina-embeddings-v3/discussions/60) where **Matryoshka embedding truncation** occurred *after normalization*, leading to non-normalized truncated embeddings. This issue has been resolved in the latest code revision. \n> \n> If you have encoded data using the previous version and wish to maintain consistency, please use the specific code revision when loading the model: `AutoModel.from_pretrained('jinaai/jina-embeddings-v3', code_revision='da863dd04a4e5dce6814c6625adfba87b83838aa', ...)`\n\n\n## Usage\n\n**
Apply mean pooling when integrating the model.**\n

\n\n### Why Use Mean Pooling?\n\nMean pooling takes all token embeddings from the model's output and averages them at the sentence or paragraph level. \nThis approach has been shown to produce high-quality sentence embeddings.\n\nWe provide an `encode` function that handles this for you automatically.\n\nHowever, if you're working with the model directly, outside of the `encode` function, \nyou'll need to apply mean pooling manually. Here's how you can do it:\n\n\n```python\nimport torch\nimport torch.nn.functional as F\nfrom transformers import AutoTokenizer, AutoModel\n\n\ndef mean_pooling(model_output, attention_mask):\n token_embeddings = model_output[0]\n input_mask_expanded = (\n attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()\n )\n return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(\n input_mask_expanded.sum(1), min=1e-9\n )\n\n\nsentences = [\"How is the weather today?\", \"What is the current weather like today?\"]\n\ntokenizer = AutoTokenizer.from_pretrained(\"jinaai/jina-embeddings-v3\")\nmodel = AutoModel.from_pretrained(\"jinaai/jina-embeddings-v3\", trust_remote_code=True)\n\nencoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors=\"pt\")\ntask = 'retrieval.query'\ntask_id = model._adaptation_map[task]\nadapter_mask = torch.full((len(sentences),), task_id, dtype=torch.int32)\nwith torch.no_grad():\n model_output = model(**encoded_input, adapter_mask=adapter_mask)\n\nembeddings = mean_pooling(model_output, encoded_input[\"attention_mask\"])\nembeddings = F.normalize(embeddings, p=2, dim=1)\n\n```\n\n
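If you pool manually and also want Matryoshka-style truncation, keep the ordering from the bug notice above in mind: truncate first, then L2-normalize. Truncating an already-normalized vector leaves it with a norm below 1. A minimal NumPy sketch of the difference, using random stand-in vectors rather than real model outputs:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for mean-pooled, full-dimensional (1024-d) embeddings
full = rng.normal(size=(2, 1024))

truncate_dim = 256

# Correct order: truncate, then L2-normalize
correct = full[:, :truncate_dim]
correct = correct / np.linalg.norm(correct, axis=1, keepdims=True)

# Buggy order: normalize, then truncate -> norms end up well below 1
wrong = (full / np.linalg.norm(full, axis=1, keepdims=True))[:, :truncate_dim]

print(np.linalg.norm(correct, axis=1))  # unit norms
print(np.linalg.norm(wrong, axis=1))    # norms noticeably below 1 (about 0.5 here)
```

The same reasoning applies whether you truncate with `truncate_dim` in `encode` or slice the vectors yourself.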

\n
\n\nThe easiest way to start using `jina-embeddings-v3` is with the [Jina Embedding API](https://jina.ai/embeddings/).\n\nAlternatively, you can use `jina-embeddings-v3` directly via the Transformers package:\n```bash\n!pip install transformers torch einops\n!pip install 'numpy<2'\n```\nIf you run it on a GPU that supports [FlashAttention-2](https://github.com/Dao-AILab/flash-attention) (as of 2024-09-12: Ampere, Ada, or Hopper GPUs, e.g., A100, RTX 3090, RTX 4090, H100), you can additionally install `flash-attn`:\n\n```bash\n!pip install flash-attn --no-build-isolation\n```\n\n```python\nfrom transformers import AutoModel\n\n# Initialize the model\nmodel = AutoModel.from_pretrained(\"jinaai/jina-embeddings-v3\", trust_remote_code=True)\n\ntexts = [\n \"Follow the white rabbit.\", # English\n \"Sigue al conejo blanco.\", # Spanish\n \"Suis le lapin blanc.\", # French\n \"\u8ddf\u7740\u767d\u5154\u8d70\u3002\", # Chinese\n \"\u0627\u062a\u0628\u0639 \u0627\u0644\u0623\u0631\u0646\u0628 \u0627\u0644\u0623\u0628\u064a\u0636.\", # Arabic\n \"Folge dem wei\u00dfen Kaninchen.\", # German\n]\n\n# When calling the `encode` function, you can choose a `task` based on the use case:\n# 'retrieval.query', 'retrieval.passage', 'separation', 'classification', 'text-matching'\n# Alternatively, you can choose not to pass a `task`, and no specific LoRA adapter will be used.\nembeddings = model.encode(texts, task=\"text-matching\")\n\n# Compute similarities\nprint(embeddings[0] @ embeddings[1].T)\n```\n\nBy default, the model supports a maximum sequence length of 8192 tokens. \nHowever, if you want to truncate your input texts to a shorter length, you can pass the `max_length` parameter to the `encode` function:\n```python\nembeddings = model.encode([\"Very long ... 
document\"], max_length=2048)\n\n```\n\nIn case you want to use **Matryoshka embeddings** and switch to a different dimension, \nyou can adjust it by passing the `truncate_dim` parameter to the `encode` function:\n```python\nembeddings = model.encode(['Sample text'], truncate_dim=256)\n```\n\n\nThe latest version (3.1.0) of [SentenceTransformers](https://github.com/UKPLab/sentence-transformers) also supports `jina-embeddings-v3`:\n\n```bash\n!pip install -U sentence-transformers\n```\n\n```python\nfrom sentence_transformers import SentenceTransformer\n\nmodel = SentenceTransformer(\"jinaai/jina-embeddings-v3\", trust_remote_code=True)\n\ntask = \"retrieval.query\"\nembeddings = model.encode(\n [\"What is the weather like in Berlin today?\"],\n task=task,\n prompt_name=task,\n)\n```\n\nYou can fine-tune `jina-embeddings-v3` using [SentenceTransformerTrainer](https://sbert.net/docs/package_reference/sentence_transformer/trainer.html). \nTo fine-tune for a specific task, you should set the task before passing the model to the ST Trainer, either during initialization:\n```python\nmodel = SentenceTransformer(\"jinaai/jina-embeddings-v3\", trust_remote_code=True, model_kwargs={'default_task': 'classification'})\n```\nOr afterwards:\n```python\nmodel = SentenceTransformer(\"jinaai/jina-embeddings-v3\", trust_remote_code=True)\nmodel[0].default_task = 'classification'\n```\nThis way you can fine-tune the LoRA adapter for the chosen task.\n\nHowever, If you want to fine-tune the entire model, make sure the main parameters are set as trainable when loading the model:\n```python\nmodel = SentenceTransformer(\"jinaai/jina-embeddings-v3\", trust_remote_code=True, model_kwargs={'lora_main_params_trainable': True})\n```\nThis will allow fine-tuning the whole model instead of just the LoRA adapters.\n\n\n**
ONNX Inference.**\n

\n\nYou can use ONNX for efficient inference with `jina-embeddings-v3`:\n```python\nimport onnxruntime\nimport numpy as np\nfrom transformers import AutoTokenizer, PretrainedConfig\n\n# Mean pool function\ndef mean_pooling(model_output: np.ndarray, attention_mask: np.ndarray):\n token_embeddings = model_output\n input_mask_expanded = np.expand_dims(attention_mask, axis=-1)\n input_mask_expanded = np.broadcast_to(input_mask_expanded, token_embeddings.shape)\n sum_embeddings = np.sum(token_embeddings * input_mask_expanded, axis=1)\n sum_mask = np.clip(np.sum(input_mask_expanded, axis=1), a_min=1e-9, a_max=None)\n return sum_embeddings / sum_mask\n\n# Load tokenizer and model config\ntokenizer = AutoTokenizer.from_pretrained('jinaai/jina-embeddings-v3')\nconfig = PretrainedConfig.from_pretrained('jinaai/jina-embeddings-v3')\n\n# Tokenize input\ninput_text = tokenizer('sample text', return_tensors='np')\n\n# ONNX session\nmodel_path = 'jina-embeddings-v3/onnx/model.onnx'\nsession = onnxruntime.InferenceSession(model_path)\n\n# Prepare inputs for ONNX model\ntask_type = 'text-matching'\ntask_id = np.array(config.lora_adaptations.index(task_type), dtype=np.int64)\ninputs = {\n 'input_ids': input_text['input_ids'],\n 'attention_mask': input_text['attention_mask'],\n 'task_id': task_id\n}\n\n# Run model\noutputs = session.run(None, inputs)[0]\n\n# Apply mean pooling and normalization to the model outputs\nembeddings = mean_pooling(outputs, input_text[\"attention_mask\"])\nembeddings = embeddings / np.linalg.norm(embeddings, ord=2, axis=1, keepdims=True)\n```\n\n
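Because the embeddings produced above are L2-normalized, cosine similarity between two texts reduces to a plain dot product. A small self-contained helper illustrating this (the toy vectors below stand in for embedding rows; they are not real model outputs):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # General form; for unit-normalized vectors the denominator is 1,
    # so this matches the `embeddings[0] @ embeddings[1].T` pattern above.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
c = np.array([2.0, 0.0])

print(cosine_similarity(a, b))  # 0.0 (orthogonal)
print(cosine_similarity(a, c))  # 1.0 (same direction, different magnitude)
```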

\n
\n\n\n## Contact\n\nJoin our [Discord community](https://discord.jina.ai) and chat with other community members about ideas.\n\n## License\n\n`jina-embeddings-v3` is listed on AWS & Azure. If you need to use it beyond those platforms or on-premises within your company, note that the model is licensed under CC BY-NC 4.0. For commercial usage inquiries, feel free to [contact us](https://jina.ai/contact-sales/).\n\n## Citation\n\nIf you find `jina-embeddings-v3` useful in your research, please cite the following paper:\n\n```bibtex\n@misc{sturua2024jinaembeddingsv3multilingualembeddingstask,\n title={jina-embeddings-v3: Multilingual Embeddings With Task LoRA}, \n author={Saba Sturua and Isabelle Mohr and Mohammad Kalim Akram and Michael G\u00fcnther and Bo Wang and Markus Krimmel and Feng Wang and Georgios Mastrapas and Andreas Koukounas and Nan Wang and Han Xiao},\n year={2024},\n eprint={2409.10173},\n archivePrefix={arXiv},\n primaryClass={cs.CL},\n url={https://arxiv.org/abs/2409.10173}, \n}\n\n```\n", "metadata": "\"N/A\"", "depth": 0, "children": [ "kaleinaNyan/jina-v3-rullmarena-judge", "kaleinaNyan/jina-v3-rullmarena-judge-300924", "kaleinaNyan/jina-v3-rullmarena-judge-041024", "hs-hf/jina-embeddings-v3-distilled", "Thaweewat/jina-embedding-v3-m2v-128", "Thaweewat/jina-embedding-v3-m2v-256", "Thaweewat/jina-embedding-v3-m2v-512", "Thaweewat/jina-embedding-v3-m2v-768", "Thaweewat/jina-embedding-v3-m2v-1024", "BlackBeenie/jina-embeddings-v3-msmarco-v3-bpr", "CISCai/jina-embeddings-v3-query-distilled", "CISCai/jina-embeddings-v3-passage-distilled", "CISCai/jina-embeddings-v3-separation-distilled", "CISCai/jina-embeddings-v3-classification-distilled", "CISCai/jina-embeddings-v3-matching-distilled", "seregadgl/t2", "ELVISIO/jina_embeddings_v3_finetuned_online_contrastive_01", "ELVISIO/jina_embeddings_v3_finetuned_online_contrastive_imdb", "ELVISIO/jina_embeddings_v3_finetuned_online_contrastive_imdb_2", "angelitasr/jina-embeddings-v3_eeid", 
"tboquet/m2v-jina-embeddings-v3-pca-256", "Jrinky/jina_final_temp", "Abdelkareem/jina_v3_distilled", "johnpaulbin/jina-embeddings-v3-128", "Sajjad313/jina-embedding-v3", "csanz91/lampistero_rag_embeddings", "csanz91/lampistero_rag_embeddings_2", "NAMAA-Space/zarra" ], "children_count": 28, "adapters": [], "adapters_count": 0, "quantized": [ "alikia2x/jina-embedding-v3-m2v-256", "alikia2x/jina-embedding-v3-m2v-1024" ], "quantized_count": 2, "merges": [], "merges_count": 0, "total_derivatives": 30, "spaces": [], "spaces_count": 0, "parents": [], "base_model": "jinaai/jina-embeddings-v3", "base_model_relation": "base" }, { "model_id": "kaleinaNyan/jina-v3-rullmarena-judge", "gated": "unknown", "card": "---\nlicense: apache-2.0\nlanguage:\n- ru\n- en\nbase_model:\n- jinaai/jina-embeddings-v3\n---\n\n## **JinaJudge: Proxy Judgement for Russian LLM Arena**\n\n### **Description**\nThis model is trained to replicate the judgement patterns of GPT-4-1106-Preview in the [Russian LLM Arena](https://huggingface.co/spaces/Vikhrmodels/arenahardlb), designed for faster and more cost-effective evaluation of language models. While the model's focus is on Russian LLM evaluation, it can also be used for English-centric models.\n\n---\n\n### **Model Details**\n- **Architecture**: Utilizes a `jina-embeddings-v3` encoder for feature extraction, followed by 4 transformer-decoder blocks.\n- **Data Source**: The training data was collected from the Russian LLM Arena. Data contradictions were filtered, and transitive examples were added for better generalization.\n- **Judgement Classes**: Though the original arena includes five judgement categories (`A>>B`, `A>B`, `A=B`, `B>A`, `B>>A`), the model consolidates them into three simplified classes:\n - **A > B**\n - **A = B**\n - **B > A**\n- **Training**: The model underwent full-weight fine-tuning with the Adam optimizer over 30 epochs. 
A maximum sequence length of 4096 was set, and the best weights were chosen based on final performance.\n\n---\n\n### **Evaluation**\nThe validation process was based on **existing judgements** from the Russian LLM Arena, which were already available. These judgements were filtered and simplified to match the three-class structure used in training.\n\n**Models evaluated**:\n- **gemma-2-9b-it-sppo-iter3**\n- **glm-4-9b-chat**\n- **gpt-3.5-turbo-1106**\n- **mistral-7b-instruct-v0.3**\n- **storm-7b**\n\n**Validation Performance**:\n- **Accuracy**: 78.09%\n- **Precision**: 75.82%\n- **Recall**: 76.77%\n- **F1-score**: 76.27%\n\nFor the **test** phase, new judgements were generated using GPT-4 for the `kolibri-mistral-0427-upd` model.\n\n**Test Performance**:\n- **Accuracy**: 80.07%\n- **Precision**: 76.68%\n- **Recall**: 77.73%\n- **F1-score**: 77.08%\n\n---\n\n### **Error Analysis**\nUpon reviewing erroneous predictions, the following observations were made:\n1. **Preference for English**: In some cases, the model selects better English responses over superior Russian ones.\n2. **Difficulty with Paraphrasing**: The model occasionally struggles with distinguishing between paraphrased responses.\n3. 
**Ambiguous Prompts**: A significant portion of the errors arises from prompts in the Russian LLM Arena that don't allow for deterministic judgements, leading to noise in the evaluation data.\n\nWhile there is potential to improve alignment between this model and GPT-4, achieving an accuracy beyond 85% is unlikely due to the inherent noise in the benchmarks.\n\n---\n\n### **Usage Example**\n\n```python\nfrom transformers import AutoModel\n\njina = AutoModel.from_pretrained(\"kaleinaNyan/jina-v3-rullmarena-judge\", trust_remote_code=True)\n\nprompt_template = \"\"\"\n\n{user_prompt}\n\n\n{assistant_a}\n\n\n{assistant_b}\n\n\"\"\".strip()\n\nuser_prompt = \"your prompt\"\nassistant_a = \"assistant a response\"\nassistant_b = \"assistant b response\"\n\nexample = prompt_template.format(\n user_prompt=user_prompt,\n assistant_a=assistant_a,\n assistant_b=assistant_b,\n)\n\njudgement = jina([example])[0].argmax()\n\njudgement_map = {\n 0: \"A is better than B\",\n 1: \"A == B\",\n 2: \"B is better than A\"\n}\n\nprint(judgement_map[judgement])\n```", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": null, "base_model_relation": null }, { "model_id": "kaleinaNyan/jina-v3-rullmarena-judge-300924", "gated": "False", "card": "---\nlicense: apache-2.0\nlanguage:\n- ru\n- en\nbase_model:\n- jinaai/jina-embeddings-v3\n---\n\n## **JinaJudge: Proxy Judgement for Russian LLM Arena**\n\n### **Description**\nThis model is trained to replicate the judgement patterns of GPT-4-1106-Preview in the [Russian LLM Arena](https://huggingface.co/spaces/Vikhrmodels/arenahardlb), designed for faster and more cost-effective evaluation of language models. 
While the model's focus is on Russian LLM evaluation, it can also be used for English-centric models.\n\n---\n\n### **Model Details**\n\nThis is a small upgrade to the [kaleinaNyan/jina-v3-rullmarena-judge](https://huggingface.co/kaleinaNyan/jina-v3-rullmarena-judge) model:\n- Number of decoder blocks increased from 4 to 5.\n- Hidden activations dimensionality reduced from 1024 to 512 (via a projection layer after the JINA encoder).\n- The resulting model size went from 614M params to 589M params.\n- I also tweaked some training hyperparameters, but training data composition is the same.\n\nSurprisingly, these changes gave a tangible performance improvement, so I decided to upload the model. As it turned out (after evaluation on the train set), the previous model was not expressive enough.\n\n---\n\n### **Evaluation**\nThe validation process was based on **existing judgements** from the Russian LLM Arena, which were already available. These judgements were filtered and simplified to match the three-class structure used in training.\n\nNOTE: values in parentheses show relative improvement compared to the previous model.\n\n**Models evaluated**:\n- **gemma-2-9b-it-sppo-iter3**\n- **glm-4-9b-chat**\n- **gpt-3.5-turbo-1106**\n- **mistral-7b-instruct-v0.3**\n- **storm-7b**\n\n**Validation Performance**:\n- **Accuracy**: 80.76% (+2.67)\n- **Precision**: 78.56% (+2.74)\n- **Recall**: 79.48% (+2.71)\n- **F1-score**: 79.00% (+2.73)\n\nFor the **test** phase, new judgements were generated using GPT-4 for the `kolibri-mistral-0427-upd` model.\n\n**Test Performance**:\n- **Accuracy**: 82.72% (+2.64)\n- **Precision**: 80.11% (+3.43)\n- **Recall**: 82.42% (+4.69)\n- **F1-score**: 81.18% (+4.10)\n\n---\n\n### **Usage Example**\n\n```python\nfrom transformers import AutoModel\n\njina = AutoModel.from_pretrained(\"kaleinaNyan/jina-v3-rullmarena-judge-300924\", trust_remote_code=True)\n\nprompt_template = \"\"\"\n\n{user_prompt}\n\n\n{assistant_a}\n\n\n{assistant_b}\n\n\"\"\".strip()\n\nuser_prompt 
= \"your prompt\"\nassistant_a = \"assistant a response\"\nassistant_b = \"assistant b response\"\n\nexample = prompt_template.format(\n user_prompt=user_prompt,\n assistant_a=assistant_a,\n assistant_b=assistant_b,\n)\n\njudgement = jina([example])[0].argmax()\n\njudgement_map = {\n 0: \"A is better than B\",\n 1: \"A == B\",\n 2: \"B is better than A\"\n}\n\nprint(judgement_map[judgement])\n```\n\n---\n\n### **Generated ranking**\n\nThe ranking was obtained using a modified [Russian LLM Arena code](https://github.com/oKatanaaa/ru_llm_arena). \nAll judgements were regenerated using the jina-judge model.\n\n| Model | Score | 95% CI | Average #Tokens |\n|--------------------------------------|-------|----------------------|-----------------|\n| gpt-4-1106-preview | 81.6 | (-2.3, 3.0) | 541 |\n| gpt-4.0-mini | 76.0 | (-2.7, 2.4) | 448 |\n| qwen-2.5-72b-it | 72.5 | (-3.6, 3.6) | 557 |\n| gemma-2-9b-it-sppo-iter3 | 72.1 | (-3.7, 3.6) | 569 |\n| gemma-2-27b-it | 71.1 | (-3.3, 3.2) | 482 |\n| gemma-2-9b-it | 70.8 | (-3.4, 3.5) | 569 |\n| t-lite-instruct-0.1 | 68.3 | (-3.8, 4.5) | 810 |\n| suzume-llama-3-8b-multilingual-orpo | 62.9 | (-3.9, 4.0) | 682 |\n| glm-4-9b-chat | 60.5 | (-3.9, 4.0) | 516 |\n| sfr-iterative-dpo-llama-3-8b-r | 59.9 | (-4.0, 4.3) | 682 |\n| c4ai-command-r-v01 | 56.9 | (-4.2, 3.8) | 516 |\n| phi-3-medium-4k-instruct | 56.4 | (-2.8, 3.3) | 566 |\n| mistral-nemo-instruct-2407 | 56.1 | (-2.9, 3.4) | 682 |\n| yandex_gpt_pro | 51.7 | (-3.4, 3.4) | 345 |\n| suzume-llama-3-8b-multilingual | 51.3 | (-3.4, 4.0) | 489 |\n| hermes-2-theta-llama-3-8b | 50.9 | (-3.2, 3.4) | 485 |\n| starling-1m-7b-beta | 50.2 | (-3.3, 3.4) | 495 |\n| gpt-3.5-turbo-0125 | 50.0 | (0.0, 0.0) | 220 |\n| llama-3-instruct-8b-sppo-iter3 | 49.8 | (-3.4, 4.0) | 763 |\n| llama-3-8b-saiga-suzume-ties | 48.2 | (-4.1, 3.9) | 569 |\n| llama-3-smaug-8b | 46.6 | (-3.9, 3.8) | 763 |\n| vikhr-it-5.4-fp16-orpo-v2 | 46.6 | (-3.7, 4.0) | 379 |\n| aya-23-8b | 46.3 | (-3.8, 3.9) | 571 |\n| 
saiga-llama3-8b_v6 | 45.5 | (-3.8, 3.9) | 471 |\n| vikhr-it-5.2-fp16-cp | 43.8 | (-3.9, 4.0) | 543 |\n| qwen2-7b-instruct | 43.7 | (-2.5, 2.7) | 492 |\n| opencchat-3.5-0106 | 43.4 | (-3.3, 3.7) | 485 |\n| gpt-3.5-turbo-1106 | 41.7 | (-2.9, 3.5) | 220 |\n| kolibri-mistral-0427-upd | 41.5 | (-3.2, 3.5) | 551 |\n| paralex-llama-3-8b-sft | 40.6 | (-3.8, 3.3) | 688 |\n| mistral-7b-instruct-v0.3 | 40.3 | (-3.3, 3.4) | 469 |\n| llama-3-instruct-8b-simpo | 40.2 | (-2.9, 3.7) | 551 |\n| gigachat_pro | 40.2 | (-3.2, 3.5) | 294 |\n| hermes-2-pro-llama-3-8b | 39.5 | (-2.9, 3.4) | 689 |\n| vikhr-it-5.3-fp16-32k | 39.5 | (-2.8, 3.2) | 519 |\n| opencchat-3.6-8b-2204522 | 37.7 | (-3.3, 3.7) | 409 |\n| meta-llama-3-8b-instruct | 37.5 | (-3.1, 3.5) | 450 |\n| kolibri-vikhr-mistral-0427 | 37.1 | (-3.1, 3.8) | 488 |\n| neural-chat-v3.3 | 36.5 | (-2.7, 3.6) | 523 |\n| vikhr-it-5.1-fp16 | 36.4 | (-3.5, 3.5) | 448 |\n| gigachat-lite | 36.0 | (-2.8, 3.0) | 523 |\n| saiga-7b | 25.9 | (-3.1, 3.7) | 927 |\n| storm-7b | 25.1 | (-3.6, 4.1) | 419 |\n| snorkel-mistral-pairrm-dpo | 16.5 | (-3.8, 3.2) | 773 |\n", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "kaleinaNyan/jina-v3-rullmarena-judge", "base_model_relation": "finetune" }, { "model_id": "kaleinaNyan/jina-v3-rullmarena-judge-041024", "gated": "False", "card": "---\nlicense: apache-2.0\nlanguage:\n- ru\n- en\nbase_model:\n- jinaai/jina-embeddings-v3\n---\n\n## **JinaJudge: Proxy Judgement for Russian LLM Arena**\n\n### **Description**\nThis model is trained to replicate the judgement patterns of GPT-4-1106-Preview in the [Russian LLM Arena](https://huggingface.co/spaces/Vikhrmodels/arenahardlb), designed for faster and more cost-effective evaluation of language models. 
While the model's focus is on Russian LLM evaluation, it can also be used for English-centric models.\n\n---\n\n### **Model Details**\n\nThis is an iterative update of the [kaleinaNyan/jina-v3-rullmarena-judge-300924](https://huggingface.co/kaleinaNyan/jina-v3-rullmarena-judge-300924) model:\n- Increased amount of training data (not by much, approximately 1.5x).\n- Updated data composition to fix erroneous judgements where GPT-4 picked English responses over Russian ones.\n- Validation set was updated as well to exclude such errors.\n- Test set did not change (no bad judgements in that regard).\n\n---\n\n### **Evaluation**\nThe validation process was based on **existing judgements** from the Russian LLM Arena, which were already available. These judgements were filtered and simplified to match the three-class structure used in training.\n\nNOTE: values in parentheses show relative improvement compared to the previous model.\n\n**Models evaluated**:\n- **gemma-2-9b-it-sppo-iter3**\n- **glm-4-9b-chat**\n- **gpt-3.5-turbo-1106**\n- **mistral-7b-instruct-v0.3**\n- **storm-7b**\n\n**Validation Performance (old validation set)**:\n- **Accuracy**: 79.97% (-0.78)\n- **Precision**: 78.25% (-0.31)\n- **Recall**: 78.25% (-1.23)\n- **F1-score**: 78.25% (-0.75)\n\nNOTE: will report later what actually caused the drop (the subset of fixed judgements or something else)\n\n**Validation Performance (new validation set)**:\n- **Accuracy**: 83.59% (+2.48)\n- **Precision**: 80.97% (+2.14)\n- **Recall**: 80.97% (+1.22)\n- **F1-score**: 80.97% (+1.77)\n\nFor the **test** phase, new judgements were generated using GPT-4 for the `kolibri-mistral-0427-upd` model.\n\n**Test Performance**:\n- **Accuracy**: 85.09% (+2.37)\n- **Precision**: 83.20% (+3.09)\n- **Recall**: 83.20% (+0.78)\n- **F1-score**: 83.20% (+2.02)\n\n---\n\n### **Usage Example**\n\n```python\nfrom transformers import AutoModel\n\njina = AutoModel.from_pretrained(\"kaleinaNyan/jina-v3-rullmarena-judge-041024\", 
trust_remote_code=True)\n\nprompt_template = \"\"\"\n\n{user_prompt}\n\n\n{assistant_a}\n\n\n{assistant_b}\n\n\"\"\".strip()\n\nuser_prompt = \"your prompt\"\nassistant_a = \"assistant a response\"\nassistant_b = \"assistant b response\"\n\nexample = prompt_template.format(\n user_prompt=user_prompt,\n assistant_a=assistant_a,\n assistant_b=assistant_b,\n)\n\njudgement = jina([example])[0].argmax()\n\njudgement_map = {\n 0: \"A is better than B\",\n 1: \"A == B\",\n 2: \"B is better than A\"\n}\n\nprint(judgement_map[judgement])\n```\n\n---\n\n### **Generated ranking**\n\nThe ranking was obtained using a modified [Russian LLM Arena code](https://github.com/oKatanaaa/ru_llm_arena). \nAll judgements were regenerated using the jina-judge model. It takes about 16 minutes to regenerate the whole board (or 23 seconds per model) on an RTX3090.\n\n\n| Model | Score | 95% CI | Average #Tokens |\n|--------------------------------------------------|-------|----------------------|-----------------|\n| gpt-4-1106-preview | 82.8 | (-2.2, 2.3) | 541 |\n| gpt-4o-mini | 75.3 | (-2.5, 2.9) | 448 |\n| qwen-2.5-72b-it | 73.1 | (-3.4, 3.1) | 557 |\n| gemma-2-9b-it-sppo-iter3 | 70.6 | (-3.9, 2.8) | 509 |\n| gemma-2-27b-it | 68.7 | (-2.8, 3.8) | 472 |\n| t-lite-instruct-0.1 | 67.5 | (-3.8, 3.8) | 810 |\n| gemma-2-9b-it | 67.0 | (-3.7, 3.3) | 459 |\n| suzume-llama-3-8B-multilingual-orpo-borda-half | 62.4 | (-3.5, 3.7) | 682 |\n| glm-4-9b-chat | 61.5 | (-3.7, 3.0) | 568 |\n| phi-3-medium-4k-instruct | 60.4 | (-3.5, 3.7) | 566 |\n| sfr-iterative-dpo-llama-3-8b-r | 57.2 | (-3.9, 2.2) | 516 |\n| c4ai-command-r-v01 | 55.0 | (-3.9, 3.1) | 529 |\n| suzume-llama-3-8b-multilingual | 51.9 | (-2.8, 3.7) | 641 |\n| mistral-nemo-instruct-2407 | 51.9 | (-3.8, 3.7) | 403 |\n| yandex_gpt_pro | 50.3 | (-3.4, 3.1) | 345 |\n| gpt-3.5-turbo-0125 | 50.0 | (0.0, 0.0) | 220 |\n| hermes-2-theta-llama-3-8b | 49.3 | (-3.4, 3.9) | 485 |\n| starling-lm-7b-beta | 48.3 | (-3.8, 4.0) | 629 |\n| llama-3-8b-saiga-suzume-ties 
| 47.9 | (-3.9, 5.0) | 763 |\n| llama-3-smaug-8b | 47.6 | (-3.6, 3.1) | 524 |\n| vikhr-it-5.4-fp16-orpo-v2 | 46.8 | (-2.5, 2.7) | 379 |\n| aya-23-8b | 46.1 | (-3.9, 3.9) | 554 |\n| saiga_llama3_8b_v6 | 44.8 | (-3.4, 3.3) | 471 |\n| qwen2-7b-instruct | 43.6 | (-3.0, 2.7) | 340 |\n| vikhr-it-5.2-fp16-cp | 43.6 | (-4.1, 3.3) | 543 |\n| openchat-3.5-0106 | 42.8 | (-3.9, 3.3) | 492 |\n| kolibri-mistral-0427-upd | 42.3 | (-4.2, 3.2) | 551 |\n| paralex-llama-3-8b-sft | 41.8 | (-3.2, 3.7) | 688 |\n| llama-3-instruct-8b-sppo-iter3 | 41.7 | (-3.4, 3.3) | 502 |\n| gpt-3.5-turbo-1106 | 41.5 | (-2.9, 2.1) | 191 |\n| mistral-7b-instruct-v0.3 | 41.1 | (-4.3, 3.5) | 469 |\n| gigachat_pro | 40.9 | (-3.4, 3.6) | 294 |\n| openchat-3.6-8b-20240522 | 39.1 | (-3.2, 4.1) | 428 |\n| vikhr-it-5.3-fp16-32k | 38.8 | (-3.5, 3.3) | 519 |\n| hermes-2-pro-llama-3-8b | 38.4 | (-3.2, 3.1) | 463 |\n| kolibri-vikhr-mistral-0427 | 34.5 | (-2.9, 3.5) | 489 |\n| vikhr-it-5.3-fp16 | 33.5 | (-3.5, 3.8) | 523 |\n| llama-3-instruct-8b-simpo | 32.7 | (-3.9, 3.6) | 417 |\n| meta-llama-3-8b-instruct | 32.1 | (-3.4, 3.3) | 450 |\n| neural-chat-7b-v3-3 | 25.9 | (-2.7, 3.6) | 927 |\n| gigachat_lite | 25.4 | (-2.8, 2.5) | 276 |\n| snorkel-mistral-pairrm-dpo | 10.3 | (-2.0, 2.3) | 773 |\n| storm-7b | 3.7 | (-1.3, 1.6) | 419 |\n", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "kaleinaNyan/jina-v3-rullmarena-judge", "base_model_relation": "finetune" }, { "model_id": "hs-hf/jina-embeddings-v3-distilled", "gated": "False", "card": "---\nbase_model: jinaai/jina-embeddings-v3\nlanguage:\n- multilingual\n- af\n- am\n- ar\n- as\n- az\n- be\n- bg\n- bn\n- br\n- bs\n- ca\n- cs\n- cy\n- da\n- de\n- el\n- en\n- eo\n- es\n- et\n- eu\n- fa\n- fi\n- fr\n- fy\n- ga\n- gd\n- gl\n- 
gu\n- ha\n- he\n- hi\n- hr\n- hu\n- hy\n- id\n- is\n- it\n- ja\n- jv\n- ka\n- kk\n- km\n- kn\n- ko\n- ku\n- ky\n- la\n- lo\n- lt\n- lv\n- mg\n- mk\n- ml\n- mn\n- mr\n- ms\n- my\n- ne\n- nl\n- 'no'\n- om\n- or\n- pa\n- pl\n- ps\n- pt\n- ro\n- ru\n- sa\n- sd\n- si\n- sk\n- sl\n- so\n- sq\n- sr\n- su\n- sv\n- sw\n- ta\n- te\n- th\n- tl\n- tr\n- ug\n- uk\n- ur\n- uz\n- vi\n- xh\n- yi\n- zh\nlibrary_name: model2vec\nlicense: cc-by-nc-4.0\nmodel_name: jina-embeddings-v3-distilled\ntags:\n- embeddings\n- static-embeddings\n---\n\n# jina-embeddings-v3-distilled Model Card\n\nThis [Model2Vec](https://github.com/MinishLab/model2vec) model is a distilled version of the [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) Sentence Transformer. It uses static embeddings, allowing text embeddings to be computed orders of magnitude faster on both GPU and CPU. It is designed for applications where computational resources are limited or where real-time performance is critical.\n\n\n## Installation\n\nInstall model2vec using pip:\n```\npip install model2vec\n```\n\n## Usage\nLoad this model using the `from_pretrained` method:\n```python\nfrom model2vec import StaticModel\n\n# Load a pretrained Model2Vec model\nmodel = StaticModel.from_pretrained(\"jina-embeddings-v3-distilled\")\n\n# Compute text embeddings\nembeddings = model.encode([\"Example sentence\"])\n```\n\nAlternatively, you can distill your own model using the `distill` method:\n```python\nfrom model2vec.distill import distill\n\n# Choose a Sentence Transformer model\nmodel_name = \"BAAI/bge-base-en-v1.5\"\n\n# Distill the model\nm2v_model = distill(model_name=model_name, pca_dims=256)\n\n# Save the model\nm2v_model.save_pretrained(\"m2v_model\")\n```\n\n## How it works\n\nModel2vec creates a small, fast, and powerful model that outperforms other static embedding models by a large margin on all tasks we could find, while being much faster to create than traditional static embedding models such as 
GloVe. Best of all, you don't need any data to distill a model using Model2Vec.\n\nIt works by passing a vocabulary through a sentence transformer model, then reducing the dimensionality of the resulting embeddings using PCA, and finally weighting the embeddings using zipf weighting. During inference, we simply take the mean of all token embeddings occurring in a sentence.\n\n## Additional Resources\n\n- [All Model2Vec models on the hub](https://huggingface.co/models?library=model2vec)\n- [Model2Vec Repo](https://github.com/MinishLab/model2vec)\n- [Model2Vec Results](https://github.com/MinishLab/model2vec?tab=readme-ov-file#results)\n- [Model2Vec Tutorials](https://github.com/MinishLab/model2vec/tree/main/tutorials)\n\n## Library Authors\n\nModel2Vec was developed by the [Minish Lab](https://github.com/MinishLab) team consisting of [Stephan Tulkens](https://github.com/stephantul) and [Thomas van Dongen](https://github.com/Pringled).\n\n## Citation\n\nPlease cite the [Model2Vec repository](https://github.com/MinishLab/model2vec) if you use this model in your work.\n```\n@software{minishlab2024model2vec,\n authors = {Stephan Tulkens, Thomas van Dongen},\n title = {Model2Vec: Turn any Sentence Transformer into a Small Fast Model},\n year = {2024},\n url = {https://github.com/MinishLab/model2vec},\n}\n```", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "hs-hf/jina-embeddings-v3-distilled", "base_model_relation": "base" }, { "model_id": "Thaweewat/jina-embedding-v3-m2v-128", "gated": "False", "card": "---\nbase_model: jinaai/jina-embeddings-v3\nlibrary_name: sentence-transformers\npipeline_tag: sentence-similarity\ntags:\n- sentence-transformers\n- sentence-similarity\n- feature-extraction\nlanguage:\n- th\n- en\n---\n\n# 
SentenceTransformer based on jinaai/jina-embeddings-v3\n\nThis is a [sentence-transformers](https://www.SBERT.net) model finetuned from [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3). It maps sentences & paragraphs to a 128-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.\n\n## Model Details\n\n### Model Description\n- **Model Type:** Sentence Transformer\n- **Base model:** [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) \n- **Maximum Sequence Length:** inf tokens\n- **Output Dimensionality:** 128 tokens\n- **Similarity Function:** Cosine Similarity\n\n\n\n\n### Model Sources\n\n- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)\n- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)\n- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)\n\n### Full Model Architecture\n\n```\nSentenceTransformer(\n (0): StaticEmbedding(\n (embedding): EmbeddingBag(250002, 128, mode='mean')\n )\n)\n```\n\n## Usage\n\n### Direct Usage (Sentence Transformers)\n\nFirst install the Sentence Transformers library:\n\n```bash\npip install -U sentence-transformers\n```\n\nThen you can load this model and run inference.\n```python\nfrom sentence_transformers import SentenceTransformer\n\n# Download from the \ud83e\udd17 Hub\nmodel = SentenceTransformer(\"Thaweewat/jina-embedding-v3-m2v-128\")\n# Run inference\nsentences = [\n 'The weather is lovely today.',\n \"It's so sunny outside!\",\n 'He drove to the stadium.',\n]\nembeddings = model.encode(sentences)\nprint(embeddings.shape)\n# [3, 128]\n\n# Get the similarity scores for the embeddings\nsimilarities = model.similarity(embeddings, embeddings)\nprint(similarities.shape)\n# [3, 3]\n```\n\n\n\n\n\n\n\n\n\n\n\n## Training 
Details\n\n### Framework Versions\n- Python: 3.10.12\n- Sentence Transformers: 3.2.0\n- Transformers: 4.44.2\n- PyTorch: 2.4.1+cu121\n- Accelerate: 0.34.2\n- Datasets: \n- Tokenizers: 0.19.1\n\n## Citation\n\n### BibTeX\n\n\n\n\n\n", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "Thaweewat/jina-embedding-v3-m2v", "base_model_relation": "finetune" }, { "model_id": "Thaweewat/jina-embedding-v3-m2v-256", "gated": "False", "card": "---\nbase_model: jinaai/jina-embeddings-v3\nlibrary_name: sentence-transformers\npipeline_tag: sentence-similarity\ntags:\n- sentence-transformers\n- sentence-similarity\n- feature-extraction\n---\n\n# SentenceTransformer based on jinaai/jina-embeddings-v3\n\nThis is a [sentence-transformers](https://www.SBERT.net) model finetuned from [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3). 
It maps sentences & paragraphs to a 256-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.\n\n## Model Details\n\n### Model Description\n- **Model Type:** Sentence Transformer\n- **Base model:** [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) \n- **Maximum Sequence Length:** inf tokens\n- **Output Dimensionality:** 256 tokens\n- **Similarity Function:** Cosine Similarity\n\n\n\n\n### Model Sources\n\n- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)\n- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)\n- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)\n\n### Full Model Architecture\n\n```\nSentenceTransformer(\n (0): StaticEmbedding(\n (embedding): EmbeddingBag(250002, 256, mode='mean')\n )\n)\n```\n\n## Usage\n\n### Direct Usage (Sentence Transformers)\n\nFirst install the Sentence Transformers library:\n\n```bash\npip install -U sentence-transformers\n```\n\nThen you can load this model and run inference.\n```python\nfrom sentence_transformers import SentenceTransformer\n\n# Download from the \ud83e\udd17 Hub\nmodel = SentenceTransformer(\"Thaweewat/jina-embedding-v3-m2v-256\")\n# Run inference\nsentences = [\n 'The weather is lovely today.',\n \"It's so sunny outside!\",\n 'He drove to the stadium.',\n]\nembeddings = model.encode(sentences)\nprint(embeddings.shape)\n# [3, 256]\n\n# Get the similarity scores for the embeddings\nsimilarities = model.similarity(embeddings, embeddings)\nprint(similarities.shape)\n# [3, 3]\n```\n\n\n\n\n\n\n\n\n\n\n\n## Training Details\n\n### Framework Versions\n- Python: 3.10.12\n- Sentence Transformers: 3.2.0\n- Transformers: 4.44.2\n- PyTorch: 2.4.1+cu121\n- Accelerate: 0.34.2\n- Datasets: \n- Tokenizers: 0.19.1\n\n## Citation\n\n### 
BibTeX\n\n\n\n\n\n", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "Thaweewat/jina-embedding-v3-m2v", "base_model_relation": "finetune" }, { "model_id": "Thaweewat/jina-embedding-v3-m2v-512", "gated": "False", "card": "---\nbase_model: jinaai/jina-embeddings-v3\nlibrary_name: sentence-transformers\npipeline_tag: sentence-similarity\ntags:\n- sentence-transformers\n- sentence-similarity\n- feature-extraction\n---\n\n# SentenceTransformer based on jinaai/jina-embeddings-v3\n\nThis is a [sentence-transformers](https://www.SBERT.net) model finetuned from [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3). It maps sentences & paragraphs to a 512-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.\n\n## Model Details\n\n### Model Description\n- **Model Type:** Sentence Transformer\n- **Base model:** [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) \n- **Maximum Sequence Length:** inf tokens\n- **Output Dimensionality:** 512 tokens\n- **Similarity Function:** Cosine Similarity\n\n\n\n\n### Model Sources\n\n- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)\n- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)\n- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)\n\n### Full Model Architecture\n\n```\nSentenceTransformer(\n (0): StaticEmbedding(\n (embedding): EmbeddingBag(250002, 512, mode='mean')\n )\n)\n```\n\n## Usage\n\n### Direct Usage (Sentence Transformers)\n\nFirst install the Sentence Transformers 
library:\n\n```bash\npip install -U sentence-transformers\n```\n\nThen you can load this model and run inference.\n```python\nfrom sentence_transformers import SentenceTransformer\n\n# Download from the \ud83e\udd17 Hub\nmodel = SentenceTransformer(\"Thaweewat/jina-embedding-v3-m2v-512\")\n# Run inference\nsentences = [\n 'The weather is lovely today.',\n \"It's so sunny outside!\",\n 'He drove to the stadium.',\n]\nembeddings = model.encode(sentences)\nprint(embeddings.shape)\n# [3, 512]\n\n# Get the similarity scores for the embeddings\nsimilarities = model.similarity(embeddings, embeddings)\nprint(similarities.shape)\n# [3, 3]\n```\n\n\n\n\n\n\n\n\n\n\n\n## Training Details\n\n### Framework Versions\n- Python: 3.10.12\n- Sentence Transformers: 3.2.0\n- Transformers: 4.44.2\n- PyTorch: 2.4.1+cu121\n- Accelerate: 0.34.2\n- Datasets: \n- Tokenizers: 0.19.1\n\n## Citation\n\n### BibTeX\n\n\n\n\n\n", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "Thaweewat/jina-embedding-v3-m2v", "base_model_relation": "finetune" }, { "model_id": "Thaweewat/jina-embedding-v3-m2v-768", "gated": "False", "card": "---\nbase_model: jinaai/jina-embeddings-v3\nlibrary_name: sentence-transformers\npipeline_tag: sentence-similarity\ntags:\n- sentence-transformers\n- sentence-similarity\n- feature-extraction\n---\n\n# SentenceTransformer based on jinaai/jina-embeddings-v3\n\nThis is a [sentence-transformers](https://www.SBERT.net) model finetuned from [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3). 
It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.\n\n## Model Details\n\n### Model Description\n- **Model Type:** Sentence Transformer\n- **Base model:** [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) \n- **Maximum Sequence Length:** inf tokens\n- **Output Dimensionality:** 768 tokens\n- **Similarity Function:** Cosine Similarity\n\n\n\n\n### Model Sources\n\n- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)\n- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)\n- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)\n\n### Full Model Architecture\n\n```\nSentenceTransformer(\n (0): StaticEmbedding(\n (embedding): EmbeddingBag(250002, 768, mode='mean')\n )\n)\n```\n\n## Usage\n\n### Direct Usage (Sentence Transformers)\n\nFirst install the Sentence Transformers library:\n\n```bash\npip install -U sentence-transformers\n```\n\nThen you can load this model and run inference.\n```python\nfrom sentence_transformers import SentenceTransformer\n\n# Download from the \ud83e\udd17 Hub\nmodel = SentenceTransformer(\"Thaweewat/jina-embedding-v3-m2v-768\")\n# Run inference\nsentences = [\n 'The weather is lovely today.',\n \"It's so sunny outside!\",\n 'He drove to the stadium.',\n]\nembeddings = model.encode(sentences)\nprint(embeddings.shape)\n# [3, 768]\n\n# Get the similarity scores for the embeddings\nsimilarities = model.similarity(embeddings, embeddings)\nprint(similarities.shape)\n# [3, 3]\n```\n\n\n\n\n\n\n\n\n\n\n\n## Training Details\n\n### Framework Versions\n- Python: 3.10.12\n- Sentence Transformers: 3.2.0\n- Transformers: 4.44.2\n- PyTorch: 2.4.1+cu121\n- Accelerate: 0.34.2\n- Datasets: \n- Tokenizers: 0.19.1\n\n## Citation\n\n### 
BibTeX\n\n\n\n\n\n", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "Thaweewat/jina-embedding-v3-m2v", "base_model_relation": "finetune" }, { "model_id": "Thaweewat/jina-embedding-v3-m2v-1024", "gated": "False", "card": "---\nbase_model: jinaai/jina-embeddings-v3\nlibrary_name: sentence-transformers\npipeline_tag: sentence-similarity\ntags:\n- sentence-transformers\n- sentence-similarity\n- feature-extraction\n---\n\n# SentenceTransformer based on jinaai/jina-embeddings-v3\n\nThis is a [sentence-transformers](https://www.SBERT.net) model finetuned from [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3). It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.\n\n## Model Details\n\n### Model Description\n- **Model Type:** Sentence Transformer\n- **Base model:** [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) \n- **Maximum Sequence Length:** inf tokens\n- **Output Dimensionality:** 1024 tokens\n- **Similarity Function:** Cosine Similarity\n\n\n\n\n### Model Sources\n\n- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)\n- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)\n- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)\n\n### Full Model Architecture\n\n```\nSentenceTransformer(\n (0): StaticEmbedding(\n (embedding): EmbeddingBag(250002, 1024, mode='mean')\n )\n)\n```\n\n## Usage\n\n### Direct Usage (Sentence Transformers)\n\nFirst install the Sentence Transformers 
library:\n\n```bash\npip install -U sentence-transformers\n```\n\nThen you can load this model and run inference.\n```python\nfrom sentence_transformers import SentenceTransformer\n\n# Download from the \ud83e\udd17 Hub\nmodel = SentenceTransformer(\"Thaweewat/jina-embedding-v3-m2v-1024\")\n# Run inference\nsentences = [\n 'The weather is lovely today.',\n \"It's so sunny outside!\",\n 'He drove to the stadium.',\n]\nembeddings = model.encode(sentences)\nprint(embeddings.shape)\n# [3, 1024]\n\n# Get the similarity scores for the embeddings\nsimilarities = model.similarity(embeddings, embeddings)\nprint(similarities.shape)\n# [3, 3]\n```\n\n\n\n\n\n\n\n\n\n\n\n## Training Details\n\n### Framework Versions\n- Python: 3.10.12\n- Sentence Transformers: 3.2.0\n- Transformers: 4.44.2\n- PyTorch: 2.4.1+cu121\n- Accelerate: 0.34.2\n- Datasets: \n- Tokenizers: 0.19.1\n\n## Citation\n\n### BibTeX\n\n\n\n\n\n", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "Thaweewat/jina-embedding-v3-m2v", "base_model_relation": "finetune" }, { "model_id": "BlackBeenie/jina-embeddings-v3-msmarco-v3-bpr", "gated": "False", "card": "---\nbase_model: jinaai/jina-embeddings-v3\nlibrary_name: sentence-transformers\npipeline_tag: sentence-similarity\ntags:\n- sentence-transformers\n- sentence-similarity\n- feature-extraction\n- generated_from_trainer\n- dataset_size:498970\n- loss:BPRLoss\nwidget:\n- source_sentence: meaning of the prefix em\n sentences:\n - Word Origin and History for em- Expand. from French assimilation of en- to following\n labial (see en- (1)). Also a prefix used to form verbs from adjectives and nouns.\n representing Latin ex- assimilated to following -m- (see ex-).\n - 'Hawaii: Aloha! 
Whether you are hoping to travel to Hawaii for a tropical green\n Christmas or you are hoping to make this island paradise your home, we can help\n you find the information you need! The state of Hawaii, located in the middle\n of the Pacific Ocean, is farther away from any other landmass than any other island\n on the earth.'\n - 'Prefixes: Un, Dis, Im, Mis. A prefix is placed at the beginning of a word to\n change its meaning. For example, the suffix re- means either again or back as\n in return, repeat or refurbish. The following 4 prefixes are easy to confuse because\n they all have a negative meaning. un-.'\n- source_sentence: how long does engine take to cool down\n sentences:\n - It takes roughly 30 minutes for the laptop to cool down to a normal state.Or if\n you want to use it soon it could take I guess 10-15 minutes.\n - \"Turn off the engine. If you can pop the hood from the driver\u00e2\\x80\\x99s seat,\\\n \\ do so \u00e2\\x80\\x94 but don\u00e2\\x80\\x99t risk opening it by hand until the engine has\\\n \\ cooled, especially if you see steam wafting off the engine. It typically takes\\\n \\ a solid 30 minutes for an engine to cool down enough for it to be safe to handle.\"\n - Zeppelin was invented in 1900 by a military officer of German origin named Count\n Ferdinand von Zeppelin.It was a stiff framed airship, LZ-I that flew on 2nd July,\n 1900 carrying five passengers near Lake Constance in Germany. Zeppelins were used\n in the times of peace as well as war.eppelin was invented in 1900 by a military\n officer of German origin named Count Ferdinand von Zeppelin.\n- source_sentence: how long does it take to get an undergraduate\n sentences:\n - How Long Does It Take To Become a Nurse Anesthetist (CRNA)? How Long Does It Take\n To Become a Nurse Practitioner? How Long Does It Take To Become a Nutritionist?\n How Long Does It Take To Become A Pharmacist? How Long Does It Take To Become\n a Physician Assistant? 
How Long Does It Take To Become a Social Worker? (ANSWERED)\n How Long Does It Take To Become a Vet Tech? How Long Does It Take To Become An\n LPN? How Long Does It Take To Become an OB/GYN? How Long Does It Take To Become\n an Ultrasound Technician? How Long Does It Take To Get a Medical Degree? How Long\n Does It Take To Get a Nursing Degree? Your first stepping stone toward a rewarding\n nursing career is completing the education and becoming registered. Ill answer\n the age old question about how long it takes to get a registered nursing degree.\n - A depositary receipt (DR) is a type of negotiable (transferable) financial security\n that is traded on a local stock exchange but represents a security, usually in\n the form of equity, that is issued by a foreign publicly listed company. U.S.\n broker may also sell ADRs back into the local Russian market. This is known as\n cross-border trading. When this happens, an amount of ADRs is canceled by the\n depository and the local shares are released from the custodian bank and delivered\n back to the Russian broker who bought them.\n - Undergraduate Studies. To become a doctor, a student must first complete high\n school, then go on to college. During the typical four-year undergraduate period,\n the aspiring doctor will study topics such as anatomy, physiology, biology, chemistry\n and other college courses necessary for a degree, such as English or math.\n- source_sentence: fees definition\n sentences:\n - fees. 1 veterinarians' charges rendered to clients for services. 2 Justifiable\n professional fees are based on the amount of time spent on the case, with a varying\n fee per hour depending on the difficulty and complexity of the problem, and on\n the specialist superiority of the veterinarian.\n - 'Summary: The Catbird Seat by James Thurber is about Mr. 
Martin who has decided\n he must kill Mrs Barrows because she is destroying the firm he works for, but\n in the end he tricks his boss into thinking she has had a mental breakdown.'\n - Cost, in common usage, the monetary value of goods and services that producers\n and consumers purchase. In a basic economic sense, cost is the measure of the\n alternative opportunities foregone in the choice of one good or activity over\n others.\n- source_sentence: what is a fermentation lock used for\n sentences:\n - \"Remember, fermentation is a method of preserving food. Leaving it on your counter\\\n \\ gives it more time for the LAB activity to increase \u00e2\\x80\\x94 which, in turn,\\\n \\ lowers pH \u00e2\\x80\\x94 and prevents spoilage. As long as your jar can keep out\\\n \\ the oxygen, you shouldn\u00e2\\x80\\x99t be worried. Which leads me to\u00e2\\x80\u00a6.\"\n - The fermentation lock or airlock is a device used in beer brewing and wine making\n that allows carbon dioxide released by the beer to escape the fermenter, while\n not allowing air to enter the fermenter, thus avoiding oxidation. There are two\n main designs for the fermentation lock, or airlock.\n - The New River is formed by the confluence of the South Fork New River and the\n North Fork New River in Ashe County, North Carolina. It then flows north into\n southwestern Virginia, passing near Galax, Virginia and through a gorge in the\n Iron Mountains. Continuing north, the river enters Pulaski County, Virginia, where\n it is impounded by Claytor Dam, creating Claytor Lake.\n---\n\n# SentenceTransformer based on jinaai/jina-embeddings-v3\n\nThis is a [sentence-transformers](https://www.SBERT.net) model finetuned from [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3). 
It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.\n\n## Model Details\n\nFinetuned from [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) (trained with msmarco-v3 dataset).\n\n### Model Description\n- **Model Type:** Sentence Transformer\n- **Base model:** [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) \n- **Maximum Sequence Length:** 8194 tokens\n- **Output Dimensionality:** 1024 tokens\n- **Similarity Function:** Cosine Similarity\n\n\n\n\n### Model Sources\n\n- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)\n- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)\n- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)\n\n### Full Model Architecture\n\n```\nSentenceTransformer(\n (transformer): Transformer(\n (auto_model): XLMRobertaLoRA(\n (roberta): XLMRobertaModel(\n (embeddings): XLMRobertaEmbeddings(\n (word_embeddings): ParametrizedEmbedding(\n 250002, 1024, padding_idx=1\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (token_type_embeddings): ParametrizedEmbedding(\n 1, 1024\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (emb_drop): Dropout(p=0.1, inplace=False)\n (emb_ln): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n (encoder): XLMRobertaEncoder(\n (layers): ModuleList(\n (0-23): 24 x Block(\n (mixer): MHA(\n (rotary_emb): RotaryEmbedding()\n (Wqkv): ParametrizedLinearResidual(\n in_features=1024, out_features=3072, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (inner_attn): FlashSelfAttention(\n 
(drop): Dropout(p=0.1, inplace=False)\n )\n (inner_cross_attn): FlashCrossAttention(\n (drop): Dropout(p=0.1, inplace=False)\n )\n (out_proj): ParametrizedLinear(\n in_features=1024, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (dropout1): Dropout(p=0.1, inplace=False)\n (drop_path1): StochasticDepth(p=0.0, mode=row)\n (norm1): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n (mlp): Mlp(\n (fc1): ParametrizedLinear(\n in_features=1024, out_features=4096, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (fc2): ParametrizedLinear(\n in_features=4096, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (dropout2): Dropout(p=0.1, inplace=False)\n (drop_path2): StochasticDepth(p=0.0, mode=row)\n (norm2): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n )\n )\n )\n (pooler): XLMRobertaPooler(\n (dense): ParametrizedLinear(\n in_features=1024, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (activation): Tanh()\n )\n )\n )\n )\n (pooler): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})\n (normalizer): Normalize()\n)\n```\n\n## Usage\n\n### Direct Usage (Sentence Transformers)\n\nFirst install the Sentence Transformers library:\n\n```bash\npip install -U sentence-transformers\n```\n\nThen you can load this model and run inference.\n```python\nfrom sentence_transformers import SentenceTransformer\n\n# Download from the \ud83e\udd17 Hub\nmodel = 
SentenceTransformer(\"BlackBeenie/jina-embeddings-v3-msmarco-v3-bpr\")\n# Run inference\nsentences = [\n 'what is a fermentation lock used for',\n 'The fermentation lock or airlock is a device used in beer brewing and wine making that allows carbon dioxide released by the beer to escape the fermenter, while not allowing air to enter the fermenter, thus avoiding oxidation. There are two main designs for the fermentation lock, or airlock.',\n 'Remember, fermentation is a method of preserving food. Leaving it on your counter gives it more time for the LAB activity to increase \u00e2\\x80\\x94 which, in turn, lowers pH \u00e2\\x80\\x94 and prevents spoilage. As long as your jar can keep out the oxygen, you shouldn\u00e2\\x80\\x99t be worried. Which leads me to\u00e2\\x80\u00a6.',\n]\nembeddings = model.encode(sentences)\nprint(embeddings.shape)\n# [3, 1024]\n\n# Get the similarity scores for the embeddings\nsimilarities = model.similarity(embeddings, embeddings)\nprint(similarities.shape)\n# [3, 3]\n```\n\n\n\n\n\n\n\n\n\n\n\n## Training Details\n\n### Training Dataset\n\n#### Unnamed Dataset\n\n\n* Size: 498,970 training samples\n* Columns: sentence_0, sentence_1, and sentence_2\n* Approximate statistics based on the first 1000 samples:\n | | sentence_0 | sentence_1 | sentence_2 |\n |:--------|:---------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|\n | type | string | string | string |\n | details |
<ul><li>min: 4 tokens</li><li>mean: 9.93 tokens</li><li>max: 37 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 90.01 tokens</li><li>max: 239 tokens</li></ul> | <ul><li>min: 23 tokens</li><li>mean: 88.24 tokens</li><li>max: 258 tokens</li></ul> 
|\n* Samples:\n | sentence_0 | sentence_1 | sentence_2 |\n |:-------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|\n | how much does it cost to paint a interior house | Interior House Painting Cost Factors. Generally, it will take a minimum of two gallons of paint to cover a room. At the highest end, paint will cost anywhere between $30 and $60 per gallon and come in three different finishes: flat, semi-gloss or high-gloss.Flat finishes are the least shiny and are best suited for areas requiring frequent cleaning.rovide a few details about your project and receive competitive quotes from local pros. The average national cost to paint a home interior is $1,671, with most homeowners spending between $966 and $2,426. | Question DetailsAsked on 3/12/2014. Guest_... How much does it cost per square foot to paint the interior of a house? 
We just bought roughly a 1500 sg ft townhouse and want to get the entire house, including ceilings painted (including a roughly 400 sq ft finished basement not included in square footage). |\n | when is s corp taxes due | If you form a corporate entity for your small business, regardless of whether it's taxed as a C or S corporation, a tax return must be filed with the Internal Revenue Service on its due date each year. Corporate tax returns are always due on the 15th day of the third month following the close of the tax year. The actual day that the tax return filing deadline falls on, however, isn't the same for every corporation. | Before Jan. 1, 2026 After Dec. 31, 2025 Starting with 2016 tax returns, all. other C corps besides Dec. 31 and. June 30 year-ends (including those with. other fiscal year-ends) will be due on. the 15th of the 4th month after the. |\n | what are disaccharides | Disaccharides are formed when two monosaccharides are joined together and a molecule of water is removed, a process known as dehydration reaction. For example; milk sugar (lactose) is made from glucose and galactose whereas the sugar from sugar cane and sugar beets (sucrose) is made from glucose and fructose.altose, another notable disaccharide, is made up of two glucose molecules. The two monosaccharides are bonded via a dehydration reaction (also called a condensation reaction or dehydration synthesis) that leads to the loss of a molecule of water and formation of a glycosidic bond. | Disaccharides- Another type of carbohydrate. How many sugar units are disaccharides composed of?_____ What elements make up disaccharides? _____ How does the body use disaccharides? _____ There is no chemical test for disaccharides. Table sugar (white granulated sugar) is an example of a disaccharide. 
List some foods that contain a lot of disaccharides: _____ |\n* Loss: beir.losses.bpr_loss.BPRLoss\n\n### Training Hyperparameters\n#### Non-Default Hyperparameters\n\n- `eval_strategy`: steps\n- `per_device_train_batch_size`: 32\n- `per_device_eval_batch_size`: 32\n- `num_train_epochs`: 8\n- `multi_dataset_batch_sampler`: round_robin\n\n#### All Hyperparameters\n
- `overwrite_output_dir`: False\n- `do_predict`: False\n- `eval_strategy`: steps\n- `prediction_loss_only`: True\n- `per_device_train_batch_size`: 32\n- `per_device_eval_batch_size`: 32\n- `per_gpu_train_batch_size`: None\n- `per_gpu_eval_batch_size`: None\n- `gradient_accumulation_steps`: 1\n- `eval_accumulation_steps`: None\n- `torch_empty_cache_steps`: None\n- `learning_rate`: 5e-05\n- `weight_decay`: 0.0\n- `adam_beta1`: 0.9\n- `adam_beta2`: 0.999\n- `adam_epsilon`: 1e-08\n- `max_grad_norm`: 1\n- `num_train_epochs`: 8\n- `max_steps`: -1\n- `lr_scheduler_type`: linear\n- `lr_scheduler_kwargs`: {}\n- `warmup_ratio`: 0.0\n- `warmup_steps`: 0\n- `log_level`: passive\n- `log_level_replica`: warning\n- `log_on_each_node`: True\n- `logging_nan_inf_filter`: True\n- `save_safetensors`: True\n- `save_on_each_node`: False\n- `save_only_model`: False\n- `restore_callback_states_from_checkpoint`: False\n- `no_cuda`: False\n- `use_cpu`: False\n- `use_mps_device`: False\n- `seed`: 42\n- `data_seed`: None\n- `jit_mode_eval`: False\n- `use_ipex`: False\n- `bf16`: False\n- `fp16`: False\n- `fp16_opt_level`: O1\n- `half_precision_backend`: auto\n- `bf16_full_eval`: False\n- `fp16_full_eval`: False\n- `tf32`: None\n- `local_rank`: 0\n- `ddp_backend`: None\n- `tpu_num_cores`: None\n- `tpu_metrics_debug`: False\n- `debug`: []\n- `dataloader_drop_last`: False\n- `dataloader_num_workers`: 0\n- `dataloader_prefetch_factor`: None\n- `past_index`: -1\n- `disable_tqdm`: False\n- `remove_unused_columns`: True\n- `label_names`: None\n- `load_best_model_at_end`: False\n- `ignore_data_skip`: False\n- `fsdp`: []\n- `fsdp_min_num_params`: 0\n- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}\n- `fsdp_transformer_layer_cls_to_wrap`: None\n- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': 
None}\n- `deepspeed`: None\n- `label_smoothing_factor`: 0.0\n- `optim`: adamw_torch\n- `optim_args`: None\n- `adafactor`: False\n- `group_by_length`: False\n- `length_column_name`: length\n- `ddp_find_unused_parameters`: None\n- `ddp_bucket_cap_mb`: None\n- `ddp_broadcast_buffers`: False\n- `dataloader_pin_memory`: True\n- `dataloader_persistent_workers`: False\n- `skip_memory_metrics`: True\n- `use_legacy_prediction_loop`: False\n- `push_to_hub`: False\n- `resume_from_checkpoint`: None\n- `hub_model_id`: None\n- `hub_strategy`: every_save\n- `hub_private_repo`: False\n- `hub_always_push`: False\n- `gradient_checkpointing`: False\n- `gradient_checkpointing_kwargs`: None\n- `include_inputs_for_metrics`: False\n- `eval_do_concat_batches`: True\n- `fp16_backend`: auto\n- `push_to_hub_model_id`: None\n- `push_to_hub_organization`: None\n- `mp_parameters`: \n- `auto_find_batch_size`: False\n- `full_determinism`: False\n- `torchdynamo`: None\n- `ray_scope`: last\n- `ddp_timeout`: 1800\n- `torch_compile`: False\n- `torch_compile_backend`: None\n- `torch_compile_mode`: None\n- `dispatch_batches`: None\n- `split_batches`: None\n- `include_tokens_per_second`: False\n- `include_num_input_tokens_seen`: False\n- `neftune_noise_alpha`: None\n- `optim_target_modules`: None\n- `batch_eval_metrics`: False\n- `eval_on_start`: False\n- `eval_use_gather_object`: False\n- `batch_sampler`: batch_sampler\n- `multi_dataset_batch_sampler`: round_robin\n\n
\n\n### Training Logs\n
| Epoch | Step | Training Loss |\n|:------:|:------:|:-------------:|\n| 0.0321 | 500 | 1.7204 |\n| 0.0641 | 1000 | 0.6847 |\n| 0.0962 | 1500 | 0.4782 |\n| 0.1283 | 2000 | 0.4001 |\n| 0.1603 | 2500 | 0.3773 |\n| 0.1924 | 3000 | 0.3538 |\n| 0.2245 | 3500 | 0.3424 |\n| 0.2565 | 4000 | 0.3375 |\n| 0.2886 | 4500 | 0.3286 |\n| 0.3207 | 5000 | 0.3289 |\n| 0.3527 | 5500 | 0.3266 |\n| 0.3848 | 6000 | 0.3226 |\n| 0.4169 | 6500 | 0.3266 |\n| 0.4489 | 7000 | 0.3262 |\n| 0.4810 | 7500 | 0.3241 |\n| 0.5131 | 8000 | 0.3216 |\n| 0.5451 | 8500 | 0.3232 |\n| 0.5772 | 9000 | 0.3186 |\n| 0.6092 | 9500 | 0.3194 |\n| 0.6413 | 10000 | 0.314 |\n| 0.6734 | 10500 | 0.3217 |\n| 0.7054 | 11000 | 0.3156 |\n| 0.7375 | 11500 | 0.3244 |\n| 0.7696 | 12000 | 0.3189 |\n| 0.8016 | 12500 | 0.3235 |\n| 0.8337 | 13000 | 0.3305 |\n| 0.8658 | 13500 | 0.3284 |\n| 0.8978 | 14000 | 0.3213 |\n| 0.9299 | 14500 | 0.3283 |\n| 0.9620 | 15000 | 0.3219 |\n| 0.9940 | 15500 | 0.3247 |\n| 1.0 | 15593 | - |\n| 1.0261 | 16000 | 0.3287 |\n| 1.0582 | 16500 | 0.3346 |\n| 1.0902 | 17000 | 0.3245 |\n| 1.1223 | 17500 | 0.3202 |\n| 1.1544 | 18000 | 0.332 |\n| 1.1864 | 18500 | 0.3298 |\n| 1.2185 | 19000 | 0.332 |\n| 1.2506 | 19500 | 0.3258 |\n| 1.2826 | 20000 | 0.3291 |\n| 1.3147 | 20500 | 0.334 |\n| 1.3468 | 21000 | 0.3328 |\n| 1.3788 | 21500 | 0.3362 |\n| 1.4109 | 22000 | 0.3348 |\n| 1.4430 | 22500 | 0.3402 |\n| 1.4750 | 23000 | 0.3346 |\n| 1.5071 | 23500 | 0.339 |\n| 1.5392 | 24000 | 0.3406 |\n| 1.5712 | 24500 | 0.3239 |\n| 1.6033 | 25000 | 0.3275 |\n| 1.6353 | 25500 | 0.3287 |\n| 1.6674 | 26000 | 0.3271 |\n| 1.6995 | 26500 | 0.3337 |\n| 1.7315 | 27000 | 0.3352 |\n| 1.7636 | 27500 | 0.3244 |\n| 1.7957 | 28000 | 0.3418 |\n| 1.8277 | 28500 | 0.349 |\n| 1.8598 | 29000 | 0.3395 |\n| 1.8919 | 29500 | 0.3386 |\n| 1.9239 | 30000 | 0.3379 |\n| 1.9560 | 30500 | 0.3412 |\n| 1.9881 | 31000 | 0.3364 |\n| 2.0 | 31186 | - |\n| 2.0201 | 31500 | 0.3386 |\n| 2.0522 | 32000 | 0.3417 |\n| 2.0843 | 32500 | 0.3362 |\n| 2.1163 
| 33000 | 0.3251 |\n| 2.1484 | 33500 | 0.3563 |\n| 2.1805 | 34000 | 0.3341 |\n| 2.2125 | 34500 | 0.3478 |\n| 2.2446 | 35000 | 0.3389 |\n| 2.2767 | 35500 | 0.342 |\n| 2.3087 | 36000 | 0.3467 |\n| 2.3408 | 36500 | 0.3419 |\n| 2.3729 | 37000 | 0.3513 |\n| 2.4049 | 37500 | 0.3441 |\n| 2.4370 | 38000 | 0.3484 |\n| 2.4691 | 38500 | 0.3457 |\n| 2.5011 | 39000 | 0.3503 |\n| 2.5332 | 39500 | 0.3446 |\n| 2.5653 | 40000 | 0.3461 |\n| 2.5973 | 40500 | 0.3399 |\n| 2.6294 | 41000 | 0.3405 |\n| 2.6615 | 41500 | 0.3382 |\n| 2.6935 | 42000 | 0.3388 |\n| 2.7256 | 42500 | 0.3378 |\n| 2.7576 | 43000 | 0.336 |\n| 2.7897 | 43500 | 0.3471 |\n| 2.8218 | 44000 | 0.3563 |\n| 2.8538 | 44500 | 0.3465 |\n| 2.8859 | 45000 | 0.3501 |\n| 2.9180 | 45500 | 0.3439 |\n| 2.9500 | 46000 | 0.3546 |\n| 2.9821 | 46500 | 0.3414 |\n| 3.0 | 46779 | - |\n| 3.0142 | 47000 | 0.3498 |\n| 3.0462 | 47500 | 0.3484 |\n| 3.0783 | 48000 | 0.3496 |\n| 3.1104 | 48500 | 0.3392 |\n| 3.1424 | 49000 | 0.3583 |\n| 3.1745 | 49500 | 0.3505 |\n| 3.2066 | 50000 | 0.3547 |\n| 3.2386 | 50500 | 0.3469 |\n| 3.2707 | 51000 | 0.3489 |\n| 3.3028 | 51500 | 0.3473 |\n| 3.3348 | 52000 | 0.3579 |\n| 3.3669 | 52500 | 0.3523 |\n| 3.3990 | 53000 | 0.3427 |\n| 3.4310 | 53500 | 0.3685 |\n| 3.4631 | 54000 | 0.3479 |\n| 3.4952 | 54500 | 0.355 |\n| 3.5272 | 55000 | 0.3464 |\n| 3.5593 | 55500 | 0.3473 |\n| 3.5914 | 56000 | 0.348 |\n| 3.6234 | 56500 | 0.3426 |\n| 3.6555 | 57000 | 0.3394 |\n| 3.6876 | 57500 | 0.3454 |\n| 3.7196 | 58000 | 0.345 |\n| 3.7517 | 58500 | 0.3411 |\n| 3.7837 | 59000 | 0.3557 |\n| 3.8158 | 59500 | 0.3505 |\n| 3.8479 | 60000 | 0.3605 |\n| 3.8799 | 60500 | 0.3554 |\n| 3.9120 | 61000 | 0.349 |\n| 3.9441 | 61500 | 0.3629 |\n| 3.9761 | 62000 | 0.3456 |\n| 4.0 | 62372 | - |\n| 4.0082 | 62500 | 0.3562 |\n| 4.0403 | 63000 | 0.3531 |\n| 4.0723 | 63500 | 0.3569 |\n| 4.1044 | 64000 | 0.3494 |\n| 4.1365 | 64500 | 0.3513 |\n| 4.1685 | 65000 | 0.3599 |\n| 4.2006 | 65500 | 0.3487 |\n| 4.2327 | 66000 | 0.3561 |\n| 4.2647 | 66500 | 0.3583 
|\n| 4.2968 | 67000 | 0.3539 |\n| 4.3289 | 67500 | 0.3614 |\n| 4.3609 | 68000 | 0.3558 |\n| 4.3930 | 68500 | 0.3485 |\n| 4.4251 | 69000 | 0.3715 |\n| 4.4571 | 69500 | 0.3585 |\n| 4.4892 | 70000 | 0.3571 |\n| 4.5213 | 70500 | 0.3498 |\n| 4.5533 | 71000 | 0.3576 |\n| 4.5854 | 71500 | 0.3498 |\n| 4.6175 | 72000 | 0.3507 |\n| 4.6495 | 72500 | 0.3436 |\n| 4.6816 | 73000 | 0.3461 |\n| 4.7137 | 73500 | 0.3451 |\n| 4.7457 | 74000 | 0.3554 |\n| 4.7778 | 74500 | 0.354 |\n| 4.8099 | 75000 | 0.3514 |\n| 4.8419 | 75500 | 0.3688 |\n| 4.8740 | 76000 | 0.3573 |\n| 4.9060 | 76500 | 0.3557 |\n| 4.9381 | 77000 | 0.3607 |\n| 4.9702 | 77500 | 0.3488 |\n| 5.0 | 77965 | - |\n| 5.0022 | 78000 | 0.3555 |\n| 5.0343 | 78500 | 0.3596 |\n| 5.0664 | 79000 | 0.3572 |\n| 5.0984 | 79500 | 0.355 |\n| 5.1305 | 80000 | 0.3427 |\n| 5.1626 | 80500 | 0.3669 |\n| 5.1946 | 81000 | 0.3578 |\n| 5.2267 | 81500 | 0.3589 |\n| 5.2588 | 82000 | 0.3586 |\n| 5.2908 | 82500 | 0.3581 |\n| 5.3229 | 83000 | 0.3607 |\n| 5.3550 | 83500 | 0.3563 |\n| 5.3870 | 84000 | 0.3597 |\n| 5.4191 | 84500 | 0.3712 |\n| 5.4512 | 85000 | 0.3574 |\n| 5.4832 | 85500 | 0.359 |\n| 5.5153 | 86000 | 0.3598 |\n| 5.5474 | 86500 | 0.3604 |\n| 5.5794 | 87000 | 0.3535 |\n| 5.6115 | 87500 | 0.3606 |\n| 5.6436 | 88000 | 0.3469 |\n| 5.6756 | 88500 | 0.3568 |\n| 5.7077 | 89000 | 0.3497 |\n| 5.7398 | 89500 | 0.3597 |\n| 5.7718 | 90000 | 0.3582 |\n| 5.8039 | 90500 | 0.3556 |\n| 5.8360 | 91000 | 0.3716 |\n| 5.8680 | 91500 | 0.3615 |\n| 5.9001 | 92000 | 0.3532 |\n| 5.9321 | 92500 | 0.3747 |\n| 5.9642 | 93000 | 0.3521 |\n| 5.9963 | 93500 | 0.362 |\n| 6.0 | 93558 | - |\n| 6.0283 | 94000 | 0.3701 |\n| 6.0604 | 94500 | 0.3636 |\n| 6.0925 | 95000 | 0.3556 |\n| 6.1245 | 95500 | 0.3508 |\n| 6.1566 | 96000 | 0.3626 |\n| 6.1887 | 96500 | 0.3618 |\n| 6.2207 | 97000 | 0.3683 |\n| 6.2528 | 97500 | 0.362 |\n| 6.2849 | 98000 | 0.3534 |\n| 6.3169 | 98500 | 0.3643 |\n| 6.3490 | 99000 | 0.36 |\n| 6.3811 | 99500 | 0.3592 |\n| 6.4131 | 100000 | 0.3606 |\n| 6.4452 | 100500 
| 0.369 |\n| 6.4773 | 101000 | 0.3607 |\n| 6.5093 | 101500 | 0.3683 |\n| 6.5414 | 102000 | 0.3648 |\n| 6.5735 | 102500 | 0.3481 |\n| 6.6055 | 103000 | 0.3565 |\n| 6.6376 | 103500 | 0.3555 |\n| 6.6697 | 104000 | 0.347 |\n| 6.7017 | 104500 | 0.3585 |\n| 6.7338 | 105000 | 0.3553 |\n| 6.7659 | 105500 | 0.3539 |\n| 6.7979 | 106000 | 0.3638 |\n| 6.8300 | 106500 | 0.3674 |\n| 6.8621 | 107000 | 0.3674 |\n| 6.8941 | 107500 | 0.3617 |\n| 6.9262 | 108000 | 0.3655 |\n| 6.9583 | 108500 | 0.3593 |\n| 6.9903 | 109000 | 0.3603 |\n| 7.0 | 109151 | - |\n| 7.0224 | 109500 | 0.3614 |\n| 7.0544 | 110000 | 0.3655 |\n| 7.0865 | 110500 | 0.3597 |\n| 7.1186 | 111000 | 0.3443 |\n| 7.1506 | 111500 | 0.3781 |\n| 7.1827 | 112000 | 0.3587 |\n| 7.2148 | 112500 | 0.3676 |\n| 7.2468 | 113000 | 0.357 |\n| 7.2789 | 113500 | 0.3639 |\n| 7.3110 | 114000 | 0.3691 |\n| 7.3430 | 114500 | 0.3606 |\n| 7.3751 | 115000 | 0.3679 |\n| 7.4072 | 115500 | 0.3697 |\n| 7.4392 | 116000 | 0.3726 |\n| 7.4713 | 116500 | 0.3603 |\n| 7.5034 | 117000 | 0.3655 |\n| 7.5354 | 117500 | 0.3639 |\n| 7.5675 | 118000 | 0.3557 |\n| 7.5996 | 118500 | 0.358 |\n| 7.6316 | 119000 | 0.3526 |\n| 7.6637 | 119500 | 0.3579 |\n| 7.6958 | 120000 | 0.3584 |\n| 7.7278 | 120500 | 0.3507 |\n| 7.7599 | 121000 | 0.3472 |\n| 7.7920 | 121500 | 0.3757 |\n| 7.8240 | 122000 | 0.3717 |\n| 7.8561 | 122500 | 0.3646 |\n| 7.8882 | 123000 | 0.3662 |\n| 7.9202 | 123500 | 0.3668 |\n| 7.9523 | 124000 | 0.3677 |\n| 7.9844 | 124500 | 0.3588 |\n| 8.0 | 124744 | - |\n\n
\n\n### Framework Versions\n- Python: 3.10.12\n- Sentence Transformers: 3.2.0\n- Transformers: 4.44.2\n- PyTorch: 2.4.1+cu121\n- Accelerate: 0.34.2\n- Datasets: 3.0.1\n- Tokenizers: 0.19.1\n\n## Citation\n\n### BibTeX\n\n#### Sentence Transformers\n```bibtex\n@inproceedings{reimers-2019-sentence-bert,\n title = \"Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks\",\n author = \"Reimers, Nils and Gurevych, Iryna\",\n booktitle = \"Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing\",\n month = \"11\",\n year = \"2019\",\n publisher = \"Association for Computational Linguistics\",\n url = \"https://arxiv.org/abs/1908.10084\",\n}\n```\n\n\n\n\n\n", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "BlackBeenie/jina-embeddings-v3-msmarco-v3-bpr", "base_model_relation": "base" }, { "model_id": "CISCai/jina-embeddings-v3-query-distilled", "gated": "False", "card": "---\nbase_model: jinaai/jina-embeddings-v3\nlanguage:\n- multilingual\n- af\n- am\n- ar\n- as\n- az\n- be\n- bg\n- bn\n- br\n- bs\n- ca\n- cs\n- cy\n- da\n- de\n- el\n- en\n- eo\n- es\n- et\n- eu\n- fa\n- fi\n- fr\n- fy\n- ga\n- gd\n- gl\n- gu\n- ha\n- he\n- hi\n- hr\n- hu\n- hy\n- id\n- is\n- it\n- ja\n- jv\n- ka\n- kk\n- km\n- kn\n- ko\n- ku\n- ky\n- la\n- lo\n- lt\n- lv\n- mg\n- mk\n- ml\n- mn\n- mr\n- ms\n- my\n- ne\n- nl\n- 'no'\n- om\n- or\n- pa\n- pl\n- ps\n- pt\n- ro\n- ru\n- sa\n- sd\n- si\n- sk\n- sl\n- so\n- sq\n- sr\n- su\n- sv\n- sw\n- ta\n- te\n- th\n- tl\n- tr\n- ug\n- uk\n- ur\n- uz\n- vi\n- xh\n- yi\n- zh\nlibrary_name: model2vec\nmodel_name: jina-embeddings-v3-query-distilled\nlicense: cc-by-nc-4.0\ntags:\n- embeddings\n- static-embeddings\n- feature-extraction\n- sentence-similarity\n- 
sentence-transformers\n---\n\n# jina-embeddings-v3-query-distilled Model Card\n\nThis [Model2Vec](https://github.com/MinishLab/model2vec) model is a distilled version of the [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) Sentence Transformer with the `retrieval.query` task LoRA applied. It uses static embeddings, allowing text embeddings to be computed orders of magnitude faster on both GPU and CPU. It is designed for applications where computational resources are limited or where real-time performance is critical.\n\n\n## Installation\n\nInstall model2vec using pip:\n```\npip install model2vec\n```\n\n## Usage\nLoad this model using the `from_pretrained` method:\n```python\nfrom model2vec import StaticModel\n\n# Load a pretrained Model2Vec model\nmodel = StaticModel.from_pretrained(\"CISCai/jina-embeddings-v3-query-distilled\")\n\n# Compute text embeddings\nembeddings = model.encode([\"Example sentence\"])\n```\n\nThe following code snippet shows how to load a Model2Vec model into a Sentence Transformer model:\n```python\nfrom sentence_transformers import SentenceTransformer\nfrom sentence_transformers.models import StaticEmbedding\n\n# Initialize a StaticEmbedding module\nstatic_embedding = StaticEmbedding.from_model2vec(\"CISCai/jina-embeddings-v3-query-distilled\")\nmodel = SentenceTransformer(modules=[static_embedding])\nembeddings = model.encode([\"Example sentence\"])\n```\n\nAlternatively, you can distill your own model using the `distill` method:\n```python\nfrom model2vec.distill import distill\n\n# Choose a Sentence Transformer model\nmodel_name = \"BAAI/bge-base-en-v1.5\"\n\n# Distill the model\nm2v_model = distill(model_name=model_name, pca_dims=256)\n\n# Save the model\nm2v_model.save_pretrained(\"m2v_model\")\n```\n\n## How it works\n\nModel2vec creates a small, fast, and powerful model that outperforms other static embedding models by a large margin on all tasks we could find, while being much faster to create than 
traditional static embedding models such as GloVe. Best of all, you don't need any data to distill a model using Model2Vec.\n\nIt works by passing a vocabulary through a sentence transformer model, then reducing the dimensionality of the resulting embeddings using PCA, and finally weighting the embeddings using zipf weighting. During inference, we simply take the mean of all token embeddings occurring in a sentence.\n\n## Additional Resources\n\n- [All Model2Vec models on the hub](https://huggingface.co/models?library=model2vec)\n- [Model2Vec Repo](https://github.com/MinishLab/model2vec)\n- [Model2Vec Results](https://github.com/MinishLab/model2vec?tab=readme-ov-file#results)\n- [Model2Vec Tutorials](https://github.com/MinishLab/model2vec/tree/main/tutorials)\n\n## Library Authors\n\nModel2Vec was developed by the [Minish Lab](https://github.com/MinishLab) team consisting of [Stephan Tulkens](https://github.com/stephantul) and [Thomas van Dongen](https://github.com/Pringled).\n\n## Citation\n\nPlease cite the [Model2Vec repository](https://github.com/MinishLab/model2vec) if you use this model in your work.\n```\n@software{minishlab2024model2vec,\n authors = {Stephan Tulkens, Thomas van Dongen},\n title = {Model2Vec: Turn any Sentence Transformer into a Small Fast Model},\n year = {2024},\n url = {https://github.com/MinishLab/model2vec},\n}\n```", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "CISCai/jina-embeddings-v3-query-distilled", "base_model_relation": "base" }, { "model_id": "CISCai/jina-embeddings-v3-passage-distilled", "gated": "False", "card": "---\nbase_model: jinaai/jina-embeddings-v3\nlanguage:\n- multilingual\n- af\n- am\n- ar\n- as\n- az\n- be\n- bg\n- bn\n- br\n- bs\n- ca\n- cs\n- cy\n- da\n- de\n- el\n- 
en\n- eo\n- es\n- et\n- eu\n- fa\n- fi\n- fr\n- fy\n- ga\n- gd\n- gl\n- gu\n- ha\n- he\n- hi\n- hr\n- hu\n- hy\n- id\n- is\n- it\n- ja\n- jv\n- ka\n- kk\n- km\n- kn\n- ko\n- ku\n- ky\n- la\n- lo\n- lt\n- lv\n- mg\n- mk\n- ml\n- mn\n- mr\n- ms\n- my\n- ne\n- nl\n- 'no'\n- om\n- or\n- pa\n- pl\n- ps\n- pt\n- ro\n- ru\n- sa\n- sd\n- si\n- sk\n- sl\n- so\n- sq\n- sr\n- su\n- sv\n- sw\n- ta\n- te\n- th\n- tl\n- tr\n- ug\n- uk\n- ur\n- uz\n- vi\n- xh\n- yi\n- zh\nlibrary_name: model2vec\nmodel_name: jina-embeddings-v3-passage-distilled\nlicense: cc-by-nc-4.0\ntags:\n- embeddings\n- static-embeddings\n- feature-extraction\n- sentence-similarity\n- sentence-transformers\n---\n\n# jina-embeddings-v3-passage-distilled Model Card\n\nThis [Model2Vec](https://github.com/MinishLab/model2vec) model is a distilled version of the [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) Sentence Transformer with the `retrieval.passage` task LoRA applied. It uses static embeddings, allowing text embeddings to be computed orders of magnitude faster on both GPU and CPU. 
It is designed for applications where computational resources are limited or where real-time performance is critical.\n\n\n## Installation\n\nInstall model2vec using pip:\n```\npip install model2vec\n```\n\n## Usage\nLoad this model using the `from_pretrained` method:\n```python\nfrom model2vec import StaticModel\n\n# Load a pretrained Model2Vec model\nmodel = StaticModel.from_pretrained(\"CISCai/jina-embeddings-v3-passage-distilled\")\n\n# Compute text embeddings\nembeddings = model.encode([\"Example sentence\"])\n```\n\nThe following code snippet shows how to load a Model2Vec model into a Sentence Transformer model:\n```python\nfrom sentence_transformers import SentenceTransformer\nfrom sentence_transformers.models import StaticEmbedding\n\n# Initialize a StaticEmbedding module\nstatic_embedding = StaticEmbedding.from_model2vec(\"CISCai/jina-embeddings-v3-passage-distilled\")\nmodel = SentenceTransformer(modules=[static_embedding])\nembeddings = model.encode([\"Example sentence\"])\n```\n\nAlternatively, you can distill your own model using the `distill` method:\n```python\nfrom model2vec.distill import distill\n\n# Choose a Sentence Transformer model\nmodel_name = \"BAAI/bge-base-en-v1.5\"\n\n# Distill the model\nm2v_model = distill(model_name=model_name, pca_dims=256)\n\n# Save the model\nm2v_model.save_pretrained(\"m2v_model\")\n```\n\n## How it works\n\nModel2vec creates a small, fast, and powerful model that outperforms other static embedding models by a large margin on all tasks we could find, while being much faster to create than traditional static embedding models such as GloVe. Best of all, you don't need any data to distill a model using Model2Vec.\n\nIt works by passing a vocabulary through a sentence transformer model, then reducing the dimensionality of the resulting embeddings using PCA, and finally weighting the embeddings using zipf weighting. 
During inference, we simply take the mean of all token embeddings occurring in a sentence.\n\n## Additional Resources\n\n- [All Model2Vec models on the hub](https://huggingface.co/models?library=model2vec)\n- [Model2Vec Repo](https://github.com/MinishLab/model2vec)\n- [Model2Vec Results](https://github.com/MinishLab/model2vec?tab=readme-ov-file#results)\n- [Model2Vec Tutorials](https://github.com/MinishLab/model2vec/tree/main/tutorials)\n\n## Library Authors\n\nModel2Vec was developed by the [Minish Lab](https://github.com/MinishLab) team consisting of [Stephan Tulkens](https://github.com/stephantul) and [Thomas van Dongen](https://github.com/Pringled).\n\n## Citation\n\nPlease cite the [Model2Vec repository](https://github.com/MinishLab/model2vec) if you use this model in your work.\n```\n@software{minishlab2024model2vec,\n authors = {Stephan Tulkens, Thomas van Dongen},\n title = {Model2Vec: Turn any Sentence Transformer into a Small Fast Model},\n year = {2024},\n url = {https://github.com/MinishLab/model2vec},\n}\n```", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "CISCai/jina-embeddings-v3-passage-distilled", "base_model_relation": "base" }, { "model_id": "CISCai/jina-embeddings-v3-separation-distilled", "gated": "False", "card": "---\nbase_model: jinaai/jina-embeddings-v3\nlanguage:\n- multilingual\n- af\n- am\n- ar\n- as\n- az\n- be\n- bg\n- bn\n- br\n- bs\n- ca\n- cs\n- cy\n- da\n- de\n- el\n- en\n- eo\n- es\n- et\n- eu\n- fa\n- fi\n- fr\n- fy\n- ga\n- gd\n- gl\n- gu\n- ha\n- he\n- hi\n- hr\n- hu\n- hy\n- id\n- is\n- it\n- ja\n- jv\n- ka\n- kk\n- km\n- kn\n- ko\n- ku\n- ky\n- la\n- lo\n- lt\n- lv\n- mg\n- mk\n- ml\n- mn\n- mr\n- ms\n- my\n- ne\n- nl\n- 'no'\n- om\n- or\n- pa\n- pl\n- ps\n- pt\n- ro\n- ru\n- 
sa\n- sd\n- si\n- sk\n- sl\n- so\n- sq\n- sr\n- su\n- sv\n- sw\n- ta\n- te\n- th\n- tl\n- tr\n- ug\n- uk\n- ur\n- uz\n- vi\n- xh\n- yi\n- zh\nlibrary_name: model2vec\nmodel_name: jina-embeddings-v3-separation-distilled\nlicense: cc-by-nc-4.0\ntags:\n- embeddings\n- static-embeddings\n- feature-extraction\n- sentence-similarity\n- sentence-transformers\n---\n\n# jina-embeddings-v3-separation-distilled Model Card\n\nThis [Model2Vec](https://github.com/MinishLab/model2vec) model is a distilled version of the [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) Sentence Transformer with the `separation` task LoRA applied. It uses static embeddings, allowing text embeddings to be computed orders of magnitude faster on both GPU and CPU. It is designed for applications where computational resources are limited or where real-time performance is critical.\n\n\n## Installation\n\nInstall model2vec using pip:\n```\npip install model2vec\n```\n\n## Usage\nLoad this model using the `from_pretrained` method:\n```python\nfrom model2vec import StaticModel\n\n# Load a pretrained Model2Vec model\nmodel = StaticModel.from_pretrained(\"CISCai/jina-embeddings-v3-separation-distilled\")\n\n# Compute text embeddings\nembeddings = model.encode([\"Example sentence\"])\n```\n\nThe following code snippet shows how to load a Model2Vec model into a Sentence Transformer model:\n```python\nfrom sentence_transformers import SentenceTransformer\nfrom sentence_transformers.models import StaticEmbedding\n\n# Initialize a StaticEmbedding module\nstatic_embedding = StaticEmbedding.from_model2vec(\"CISCai/jina-embeddings-v3-separation-distilled\")\nmodel = SentenceTransformer(modules=[static_embedding])\nembeddings = model.encode([\"Example sentence\"])\n```\n\nAlternatively, you can distill your own model using the `distill` method:\n```python\nfrom model2vec.distill import distill\n\n# Choose a Sentence Transformer model\nmodel_name = \"BAAI/bge-base-en-v1.5\"\n\n# Distill 
the model\nm2v_model = distill(model_name=model_name, pca_dims=256)\n\n# Save the model\nm2v_model.save_pretrained(\"m2v_model\")\n```\n\n## How it works\n\nModel2vec creates a small, fast, and powerful model that outperforms other static embedding models by a large margin on all tasks we could find, while being much faster to create than traditional static embedding models such as GloVe. Best of all, you don't need any data to distill a model using Model2Vec.\n\nIt works by passing a vocabulary through a sentence transformer model, then reducing the dimensionality of the resulting embeddings using PCA, and finally weighting the embeddings using zipf weighting. During inference, we simply take the mean of all token embeddings occurring in a sentence.\n\n## Additional Resources\n\n- [All Model2Vec models on the hub](https://huggingface.co/models?library=model2vec)\n- [Model2Vec Repo](https://github.com/MinishLab/model2vec)\n- [Model2Vec Results](https://github.com/MinishLab/model2vec?tab=readme-ov-file#results)\n- [Model2Vec Tutorials](https://github.com/MinishLab/model2vec/tree/main/tutorials)\n\n## Library Authors\n\nModel2Vec was developed by the [Minish Lab](https://github.com/MinishLab) team consisting of [Stephan Tulkens](https://github.com/stephantul) and [Thomas van Dongen](https://github.com/Pringled).\n\n## Citation\n\nPlease cite the [Model2Vec repository](https://github.com/MinishLab/model2vec) if you use this model in your work.\n```\n@software{minishlab2024model2vec,\n authors = {Stephan Tulkens, Thomas van Dongen},\n title = {Model2Vec: Turn any Sentence Transformer into a Small Fast Model},\n year = {2024},\n url = {https://github.com/MinishLab/model2vec},\n}\n```", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], 
"base_model": "CISCai/jina-embeddings-v3-separation-distilled", "base_model_relation": "base" }, { "model_id": "CISCai/jina-embeddings-v3-classification-distilled", "gated": "False", "card": "---\nbase_model: jinaai/jina-embeddings-v3\nlanguage:\n- multilingual\n- af\n- am\n- ar\n- as\n- az\n- be\n- bg\n- bn\n- br\n- bs\n- ca\n- cs\n- cy\n- da\n- de\n- el\n- en\n- eo\n- es\n- et\n- eu\n- fa\n- fi\n- fr\n- fy\n- ga\n- gd\n- gl\n- gu\n- ha\n- he\n- hi\n- hr\n- hu\n- hy\n- id\n- is\n- it\n- ja\n- jv\n- ka\n- kk\n- km\n- kn\n- ko\n- ku\n- ky\n- la\n- lo\n- lt\n- lv\n- mg\n- mk\n- ml\n- mn\n- mr\n- ms\n- my\n- ne\n- nl\n- 'no'\n- om\n- or\n- pa\n- pl\n- ps\n- pt\n- ro\n- ru\n- sa\n- sd\n- si\n- sk\n- sl\n- so\n- sq\n- sr\n- su\n- sv\n- sw\n- ta\n- te\n- th\n- tl\n- tr\n- ug\n- uk\n- ur\n- uz\n- vi\n- xh\n- yi\n- zh\nlibrary_name: model2vec\nmodel_name: jina-embeddings-v3-classification-distilled\nlicense: cc-by-nc-4.0\ntags:\n- embeddings\n- static-embeddings\n- feature-extraction\n- sentence-similarity\n- sentence-transformers\n---\n\n# jina-embeddings-v3-classification-distilled Model Card\n\nThis [Model2Vec](https://github.com/MinishLab/model2vec) model is a distilled version of the [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) Sentence Transformer with the `classification` task LoRA applied. It uses static embeddings, allowing text embeddings to be computed orders of magnitude faster on both GPU and CPU. 
It is designed for applications where computational resources are limited or where real-time performance is critical.\n\n\n## Installation\n\nInstall model2vec using pip:\n```\npip install model2vec\n```\n\n## Usage\nLoad this model using the `from_pretrained` method:\n```python\nfrom model2vec import StaticModel\n\n# Load a pretrained Model2Vec model\nmodel = StaticModel.from_pretrained(\"CISCai/jina-embeddings-v3-classification-distilled\")\n\n# Compute text embeddings\nembeddings = model.encode([\"Example sentence\"])\n```\n\nThe following code snippet shows how to load a Model2Vec model into a Sentence Transformer model:\n```python\nfrom sentence_transformers import SentenceTransformer\nfrom sentence_transformers.models import StaticEmbedding\n\n# Initialize a StaticEmbedding module\nstatic_embedding = StaticEmbedding.from_model2vec(\"CISCai/jina-embeddings-v3-classification-distilled\")\nmodel = SentenceTransformer(modules=[static_embedding])\nembeddings = model.encode([\"Example sentence\"])\n```\n\nAlternatively, you can distill your own model using the `distill` method:\n```python\nfrom model2vec.distill import distill\n\n# Choose a Sentence Transformer model\nmodel_name = \"BAAI/bge-base-en-v1.5\"\n\n# Distill the model\nm2v_model = distill(model_name=model_name, pca_dims=256)\n\n# Save the model\nm2v_model.save_pretrained(\"m2v_model\")\n```\n\n## How it works\n\nModel2vec creates a small, fast, and powerful model that outperforms other static embedding models by a large margin on all tasks we could find, while being much faster to create than traditional static embedding models such as GloVe. Best of all, you don't need any data to distill a model using Model2Vec.\n\nIt works by passing a vocabulary through a sentence transformer model, then reducing the dimensionality of the resulting embeddings using PCA, and finally weighting the embeddings using zipf weighting. 
During inference, we simply take the mean of all token embeddings occurring in a sentence.\n\n## Additional Resources\n\n- [All Model2Vec models on the hub](https://huggingface.co/models?library=model2vec)\n- [Model2Vec Repo](https://github.com/MinishLab/model2vec)\n- [Model2Vec Results](https://github.com/MinishLab/model2vec?tab=readme-ov-file#results)\n- [Model2Vec Tutorials](https://github.com/MinishLab/model2vec/tree/main/tutorials)\n\n## Library Authors\n\nModel2Vec was developed by the [Minish Lab](https://github.com/MinishLab) team consisting of [Stephan Tulkens](https://github.com/stephantul) and [Thomas van Dongen](https://github.com/Pringled).\n\n## Citation\n\nPlease cite the [Model2Vec repository](https://github.com/MinishLab/model2vec) if you use this model in your work.\n```\n@software{minishlab2024model2vec,\n authors = {Stephan Tulkens, Thomas van Dongen},\n title = {Model2Vec: Turn any Sentence Transformer into a Small Fast Model},\n year = {2024},\n url = {https://github.com/MinishLab/model2vec},\n}\n```", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "CISCai/jina-embeddings-v3-classification-distilled", "base_model_relation": "base" }, { "model_id": "CISCai/jina-embeddings-v3-matching-distilled", "gated": "False", "card": "---\nbase_model: jinaai/jina-embeddings-v3\nlanguage:\n- multilingual\n- af\n- am\n- ar\n- as\n- az\n- be\n- bg\n- bn\n- br\n- bs\n- ca\n- cs\n- cy\n- da\n- de\n- el\n- en\n- eo\n- es\n- et\n- eu\n- fa\n- fi\n- fr\n- fy\n- ga\n- gd\n- gl\n- gu\n- ha\n- he\n- hi\n- hr\n- hu\n- hy\n- id\n- is\n- it\n- ja\n- jv\n- ka\n- kk\n- km\n- kn\n- ko\n- ku\n- ky\n- la\n- lo\n- lt\n- lv\n- mg\n- mk\n- ml\n- mn\n- mr\n- ms\n- my\n- ne\n- nl\n- 'no'\n- om\n- or\n- pa\n- pl\n- ps\n- pt\n- ro\n- 
ru\n- sa\n- sd\n- si\n- sk\n- sl\n- so\n- sq\n- sr\n- su\n- sv\n- sw\n- ta\n- te\n- th\n- tl\n- tr\n- ug\n- uk\n- ur\n- uz\n- vi\n- xh\n- yi\n- zh\nlibrary_name: model2vec\nmodel_name: jina-embeddings-v3-matching-distilled\nlicense: cc-by-nc-4.0\ntags:\n- embeddings\n- static-embeddings\n- feature-extraction\n- sentence-similarity\n- sentence-transformers\n---\n\n# jina-embeddings-v3-matching-distilled Model Card\n\nThis [Model2Vec](https://github.com/MinishLab/model2vec) model is a distilled version of the [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) Sentence Transformer with the `text-matching` task LoRA applied. It uses static embeddings, allowing text embeddings to be computed orders of magnitude faster on both GPU and CPU. It is designed for applications where computational resources are limited or where real-time performance is critical.\n\n\n## Installation\n\nInstall model2vec using pip:\n```\npip install model2vec\n```\n\n## Usage\nLoad this model using the `from_pretrained` method:\n```python\nfrom model2vec import StaticModel\n\n# Load a pretrained Model2Vec model\nmodel = StaticModel.from_pretrained(\"CISCai/jina-embeddings-v3-matching-distilled\")\n\n# Compute text embeddings\nembeddings = model.encode([\"Example sentence\"])\n```\n\nThe following code snippet shows how to load a Model2Vec model into a Sentence Transformer model:\n```python\nfrom sentence_transformers import SentenceTransformer\nfrom sentence_transformers.models import StaticEmbedding\n\n# Initialize a StaticEmbedding module\nstatic_embedding = StaticEmbedding.from_model2vec(\"CISCai/jina-embeddings-v3-matching-distilled\")\nmodel = SentenceTransformer(modules=[static_embedding])\nembeddings = model.encode([\"Example sentence\"])\n```\n\nAlternatively, you can distill your own model using the `distill` method:\n```python\nfrom model2vec.distill import distill\n\n# Choose a Sentence Transformer model\nmodel_name = \"BAAI/bge-base-en-v1.5\"\n\n# Distill 
the model\nm2v_model = distill(model_name=model_name, pca_dims=256)\n\n# Save the model\nm2v_model.save_pretrained(\"m2v_model\")\n```\n\n## How it works\n\nModel2vec creates a small, fast, and powerful model that outperforms other static embedding models by a large margin on all tasks we could find, while being much faster to create than traditional static embedding models such as GloVe. Best of all, you don't need any data to distill a model using Model2Vec.\n\nIt works by passing a vocabulary through a sentence transformer model, then reducing the dimensionality of the resulting embeddings using PCA, and finally weighting the embeddings using zipf weighting. During inference, we simply take the mean of all token embeddings occurring in a sentence.\n\n## Additional Resources\n\n- [All Model2Vec models on the hub](https://huggingface.co/models?library=model2vec)\n- [Model2Vec Repo](https://github.com/MinishLab/model2vec)\n- [Model2Vec Results](https://github.com/MinishLab/model2vec?tab=readme-ov-file#results)\n- [Model2Vec Tutorials](https://github.com/MinishLab/model2vec/tree/main/tutorials)\n\n## Library Authors\n\nModel2Vec was developed by the [Minish Lab](https://github.com/MinishLab) team consisting of [Stephan Tulkens](https://github.com/stephantul) and [Thomas van Dongen](https://github.com/Pringled).\n\n## Citation\n\nPlease cite the [Model2Vec repository](https://github.com/MinishLab/model2vec) if you use this model in your work.\n```\n@software{minishlab2024model2vec,\n authors = {Stephan Tulkens, Thomas van Dongen},\n title = {Model2Vec: Turn any Sentence Transformer into a Small Fast Model},\n year = {2024},\n url = {https://github.com/MinishLab/model2vec},\n}\n```", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], 
"base_model": "CISCai/jina-embeddings-v3-matching-distilled", "base_model_relation": "base" }, { "model_id": "seregadgl/t2", "gated": "False", "card": "---\nbase_model: jinaai/jina-embeddings-v3\nlibrary_name: sentence-transformers\nmetrics:\n- pearson_cosine\n- spearman_cosine\npipeline_tag: sentence-similarity\ntags:\n- sentence-transformers\n- sentence-similarity\n- feature-extraction\n- generated_from_trainer\n- dataset_size:63802\n- loss:CoSENTLoss\nwidget:\n- source_sentence: \u043c\u0430\u0448\u0438\u043d\u043a\u0430 \u0434\u0435\u0442\u0441\u043a\u0430\u044f \u0441\u0430\u043c\u043e\u0445\u043e\u0434\u043d\u0430\u044f \u0431\u0438\u0431\u0438\u043a\u0430\u0440 \u0436\u0435\u043b\u0442\u044b\u0439\n sentences:\n - '\u043c\u0430\u0448\u0438\u043d\u043a\u0430 \u0434\u0435\u0442\u0441\u043a\u0430\u044f \u043a\u0440\u0430\u0441\u043d\u0430\u044f \u0431\u0438\u0431\u0438\u043a\u0430\u0440 '\n - \u043c\u043e\u0442\u043e\u0440\u043d\u043e\u0435 \u043c\u0430\u0441\u043b\u043e alpine dx1 5w 30 5\u043b 0101662\n - '\u0441\u043f\u0438\u043d\u0431\u0430\u0439\u043a schwinn ic7 '\n- source_sentence: '\u0432\u0435\u043b\u043e\u0441\u0438\u043f\u0435\u0434 stels saber 20 \u0444\u0438\u043e\u043b\u0435\u0442\u043e\u0432\u044b\u0439 '\n sentences:\n - '\u0434\u0435\u0442\u0441\u043a\u0438\u0435 \u0441\u043f\u043e\u0440\u0442\u0438\u0432\u043d\u044b\u0435 \u043a\u043e\u043c\u043f\u043b\u0435\u043a\u0441\u044b '\n - '\u0432\u0435\u043b\u043e\u0441\u0438\u043f\u0435\u0434 bmx stels saber 20 v010 2020 '\n - 50218 \u043a\u0430\u0431\u0435\u043b\u044c ugreen hd132 hdmi zinc alloy optical fiber cable \u0447\u0435\u0440\u043d\u044b\u0439 40m\n- source_sentence: \u0433\u0438\u0434\u0440\u0430\u0432\u043b\u0438\u0447\u0435\u0441\u0441\u043a\u0438\u0435 \u043f\u0440\u0435\u0441\u0441\u044b\n sentences:\n - \u043f\u0440\u0435\u0441\u0441 \u0433\u0438\u0434\u0440\u0430\u0432\u043b\u0438\u0447\u0435\u0441\u043a\u0438\u0439 \u0440\u0443\u0447\u043d\u043e\u0439 
\u043c\u0435\u0445\u0430\u043d\u0438\u0437\u043c\u043e\u043c\n - \u0440\u0430\u043a\u0435\u0442\u043a\u0430 \u0434\u043b\u044f \u043d\u0430\u0441\u0442\u043e\u043b\u044c\u043d\u043e\u0433\u043e \u0442\u0435\u043d\u043d\u0438\u0441\u0430 fora 7\n - '\u043e\u0431\u044a\u0435\u043a\u0442\u0438\u0432 panasonic 20mm f1 7 asph ii h h020ae k '\n- source_sentence: '\u0431\u043e\u043a\u0441 \u043f\u043b\u0430\u0441\u0442\u0438\u043a\u043e\u0432\u044b\u0439 \u043c\u043e\u043d\u0442\u0430\u0436\u043d\u043e\u0439 \u043f\u043b\u0430\u0442\u043e\u0439 \u0449\u043c\u043f \u043f 300\u0445200\u0445130 \u043c\u043c ip65 proxima\n \u044f\u0449\u0438\u043a\u0438 \u0449\u0438\u0442\u043a\u0438 \u0448\u043a\u0430\u0444\u044b '\n sentences:\n - \u0431\u0430\u0442\u0430\u0440\u0435\u0439\u043d\u044b\u0439 \u043e\u0442\u0441\u0435\u043a \u0434\u043b\u044f 4x\u0430\u0430 \u043e\u0442\u043a\u0440\u044b\u0442\u044b\u0439 \u043f\u0440\u043e\u0432\u043e\u043b\u043e\u0447\u043d\u044b\u0435 \u0432\u044b\u0432\u043e\u0434\u044b \u0440\u0430\u0437\u044a\u0435\u043c dcx2 1 battery holder\n 4xaa 6v dc\n - 'bugera bc15 '\n - '\u0431\u043e\u043a\u0441 \u043f\u043b\u0430\u0441\u0442\u0438\u043a\u043e\u0432\u044b\u0439 \u043c\u043e\u043d\u0442\u0430\u0436\u043d\u043e\u0439 \u043f\u043b\u0430\u0442\u043e\u0439 \u0449\u043c\u043f \u043f 500\u0445350\u0445190 \u043c\u043c ip65 proxima \u044f\u0449\u0438\u043a\u0438 \u0449\u0438\u0442\u043a\u0438\n \u0448\u043a\u0430\u0444\u044b '\n- source_sentence: 'honor watch gs pro black '\n sentences:\n - 'honor watch gs pro white '\n - \u0442\u0440\u0430\u043d\u0441\u0444\u043e\u0440\u043c\u0435\u0440 pituso carlo hb gy 06 lemon\n - '\u044d\u043b\u0435\u043a\u0442\u0440\u043e\u0432\u0435\u043b\u043e\u0441\u0438\u043f\u0435\u0434 \u043a\u043e\u043b\u0445\u043e\u0437\u043d\u0438\u043a volten greenline 500w '\nmodel-index:\n- name: SentenceTransformer based on jinaai/jina-embeddings-v3\n results:\n - task:\n type: semantic-similarity\n name: Semantic Similarity\n 
dataset:\n name: example dev\n type: example-dev\n metrics:\n - type: pearson_cosine\n value: 0.47736782328677585\n name: Pearson Cosine\n - type: spearman_cosine\n value: 0.49693031448879005\n name: Spearman Cosine\n---\n\n# SentenceTransformer based on jinaai/jina-embeddings-v3\n\nThis is a [sentence-transformers](https://www.SBERT.net) model finetuned from [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3). It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.\n\n## Model Details\n\n### Model Description\n- **Model Type:** Sentence Transformer\n- **Base model:** [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) \n- **Maximum Sequence Length:** 8194 tokens\n- **Output Dimensionality:** 1024 dimensions\n- **Similarity Function:** Cosine Similarity\n\n\n\n\n### Model Sources\n\n- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)\n- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)\n- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)\n\n### Full Model Architecture\n\n```\nSentenceTransformer(\n (transformer): Transformer(\n (auto_model): XLMRobertaLoRA(\n (roberta): XLMRobertaModel(\n (embeddings): XLMRobertaEmbeddings(\n (word_embeddings): ParametrizedEmbedding(\n 250002, 1024, padding_idx=1\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (token_type_embeddings): ParametrizedEmbedding(\n 1, 1024\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (emb_drop): Dropout(p=0.1, inplace=False)\n (emb_ln): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n (encoder): XLMRobertaEncoder(\n (layers): 
ModuleList(\n (0-23): 24 x Block(\n (mixer): MHA(\n (rotary_emb): RotaryEmbedding()\n (Wqkv): ParametrizedLinearResidual(\n in_features=1024, out_features=3072, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (inner_attn): SelfAttention(\n (drop): Dropout(p=0.1, inplace=False)\n )\n (inner_cross_attn): CrossAttention(\n (drop): Dropout(p=0.1, inplace=False)\n )\n (out_proj): ParametrizedLinear(\n in_features=1024, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (dropout1): Dropout(p=0.1, inplace=False)\n (drop_path1): StochasticDepth(p=0.0, mode=row)\n (norm1): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n (mlp): Mlp(\n (fc1): ParametrizedLinear(\n in_features=1024, out_features=4096, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (fc2): ParametrizedLinear(\n in_features=4096, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (dropout2): Dropout(p=0.1, inplace=False)\n (drop_path2): StochasticDepth(p=0.0, mode=row)\n (norm2): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n )\n )\n )\n (pooler): XLMRobertaPooler(\n (dense): ParametrizedLinear(\n in_features=1024, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (activation): Tanh()\n )\n )\n )\n )\n (pooler): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})\n (normalizer): Normalize()\n)\n```\n\n## Usage\n\n### Direct Usage (Sentence 
Transformers)\n\nFirst install the Sentence Transformers library:\n\n```bash\npip install -U sentence-transformers\n```\n\nThen you can load this model and run inference.\n```python\nfrom sentence_transformers import SentenceTransformer\n\n# Download from the \ud83e\udd17 Hub\nmodel = SentenceTransformer(\"seregadgl/t2\")\n# Run inference\nsentences = [\n 'honor watch gs pro black ',\n 'honor watch gs pro white ',\n '\u0442\u0440\u0430\u043d\u0441\u0444\u043e\u0440\u043c\u0435\u0440 pituso carlo hb gy 06 lemon',\n]\nembeddings = model.encode(sentences)\nprint(embeddings.shape)\n# [3, 1024]\n\n# Get the similarity scores for the embeddings\nsimilarities = model.similarity(embeddings, embeddings)\nprint(similarities.shape)\n# [3, 3]\n```\n\n\n\n\n\n\n\n## Evaluation\n\n### Metrics\n\n#### Semantic Similarity\n\n* Dataset: `example-dev`\n* Evaluated with [EmbeddingSimilarityEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)\n\n| Metric | Value |\n|:--------------------|:-----------|\n| pearson_cosine | 0.4774 |\n| **spearman_cosine** | **0.4969** |\n\n\n\n\n\n## Training Details\n\n### Training Dataset\n\n#### Unnamed Dataset\n\n\n* Size: 63,802 training samples\n* Columns: doc, candidate, and label\n* Approximate statistics based on the first 1000 samples:\n | | doc | candidate | label |\n |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:------------------------------------------------|\n | type | string | string | int |\n | details |
<ul><li>min: 3 tokens</li><li>mean: 14.82 tokens</li><li>max: 55 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 14.58 tokens</li><li>max: 37 tokens</li></ul> | <ul><li>0: ~85.20%</li><li>1: ~14.80%</li></ul>
|\n* Samples:\n | doc | candidate | label |\n |:-------------------------------------------------------|:-----------------------------------------------------------------------|:---------------|\n | \u043c\u0430\u0441\u0441\u0430\u0436\u0435\u0440 xiaomi massage gun eu bhr5608eu | \u043f\u0435\u0440\u043a\u0443\u0441\u0441\u0438\u043e\u043d\u043d\u044b\u0439 \u043c\u0430\u0441\u0441\u0430\u0436\u0435\u0440 xiaomi massage gun mini bhr6083gl | 0 |\n | \u0431\u0435\u0437\u0443\u0434\u0430\u0440\u043d\u0430\u044f \u0434\u0440\u0435\u043b\u044c ingco ed50028 | \u0443\u0434\u0430\u0440\u043d\u0430\u044f \u0434\u0440\u0435\u043b\u044c ingco id211002 | 0 |\n | \u0436\u0438\u0434\u043a\u043e\u0441\u0442\u044c old smuggler 30\u043c\u043b 20\u043c\u0433 | \u0436\u0438\u0434\u043a\u043e\u0441\u0442\u044c old smuggler salt 30ml marlboro 20mg | 0 |\n* Loss: [CoSENTLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters:\n ```json\n {\n \"scale\": 20.0,\n \"similarity_fct\": \"pairwise_cos_sim\"\n }\n ```\n\n### Evaluation Dataset\n\n#### Unnamed Dataset\n\n\n* Size: 7,090 evaluation samples\n* Columns: doc, candidate, and label\n* Approximate statistics based on the first 1000 samples:\n | | doc | candidate | label |\n |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:------------------------------------------------|\n | type | string | string | int |\n | details |
<ul><li>min: 4 tokens</li><li>mean: 14.91 tokens</li><li>max: 72 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 14.56 tokens</li><li>max: 51 tokens</li></ul> | <ul><li>0: ~84.20%</li><li>1: ~15.80%</li></ul>
|\n* Samples:\n | doc | candidate | label |\n |:--------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------|:---------------|\n | \u043a\u0440\u0443\u0433\u043b\u043e\u0435 \u043f\u043b\u044f\u0436\u043d\u043e\u0435 \u043f\u0430\u0440\u0435\u043e \u0441\u0435\u043b\u0444\u0438 \u043a\u043e\u0432\u0440\u0438\u043a \u043f\u043b\u044f\u0436\u043d\u0430\u044f \u043f\u043e\u0434\u0441\u0442\u0438\u043b\u043a\u0430 \u043f\u043b\u044f\u0436\u043d\u043e\u0435 \u043f\u043e\u043a\u0440\u044b\u0432\u0430\u043b\u043e \u043f\u043b\u044f\u0436\u043d\u044b\u0439 \u043a\u043e\u0432\u0440\u0438\u043a \u043f\u0438\u0440\u043e\u0436\u0435\u043d\u043a\u043e | \u043a\u0440\u0443\u0433\u043b\u043e\u0435 \u043f\u043b\u044f\u0436\u043d\u043e\u0435 \u043f\u0430\u0440\u0435\u043e \u0441\u0435\u043b\u0444\u0438 \u043a\u043e\u0432\u0440\u0438\u043a \u043f\u043b\u044f\u0436\u043d\u0430\u044f \u043f\u043e\u0434\u0441\u0442\u0438\u043b\u043a\u0430 \u043f\u043b\u044f\u0436\u043d\u043e\u0435 \u043f\u043e\u043a\u0440\u044b\u0432\u0430\u043b\u043e \u043f\u043b\u044f\u0436\u043d\u044b\u0439 \u043a\u043e\u0432\u0440\u0438\u043a \u043a\u043b\u0443\u0431\u043d\u0438\u043a\u0430 | 0 |\n | \u0430\u043a\u043a\u0443\u043c\u0443\u043b\u044f\u0442\u043e\u0440 \u0431\u0430\u0442\u0430\u0440\u0435\u044f \u0434\u043b\u044f \u043d\u043e\u0443\u0442\u0431\u0443\u043a\u0430 asus g751 | \u0430\u043a\u043a\u0443\u043c\u0443\u043b\u044f\u0442\u043e\u0440 \u0431\u0430\u0442\u0430\u0440\u0435\u044f \u0434\u043b\u044f \u043d\u043e\u0443\u0442\u0431\u0443\u043a\u0430 asus g75 series | 0 |\n | \u043c\u0438\u043a\u0441\u0435\u0440 bosch mfq3520 mfq 3520 | \u043c\u0438\u043a\u0441\u0435\u0440 bosch mfq 4020 | 0 |\n* Loss: [CoSENTLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters:\n ```json\n {\n 
\"scale\": 20.0,\n \"similarity_fct\": \"pairwise_cos_sim\"\n }\n ```\n\n### Training Hyperparameters\n#### Non-Default Hyperparameters\n\n- `eval_strategy`: steps\n- `per_device_train_batch_size`: 16\n- `per_device_eval_batch_size`: 16\n- `num_train_epochs`: 2\n- `lr_scheduler_type`: cosine\n- `warmup_ratio`: 0.1\n- `load_best_model_at_end`: True\n- `batch_sampler`: no_duplicates\n\n#### All Hyperparameters\n
Click to expand\n\n- `overwrite_output_dir`: False\n- `do_predict`: False\n- `eval_strategy`: steps\n- `prediction_loss_only`: True\n- `per_device_train_batch_size`: 16\n- `per_device_eval_batch_size`: 16\n- `per_gpu_train_batch_size`: None\n- `per_gpu_eval_batch_size`: None\n- `gradient_accumulation_steps`: 1\n- `eval_accumulation_steps`: None\n- `torch_empty_cache_steps`: None\n- `learning_rate`: 5e-05\n- `weight_decay`: 0.0\n- `adam_beta1`: 0.9\n- `adam_beta2`: 0.999\n- `adam_epsilon`: 1e-08\n- `max_grad_norm`: 1.0\n- `num_train_epochs`: 2\n- `max_steps`: -1\n- `lr_scheduler_type`: cosine\n- `lr_scheduler_kwargs`: {}\n- `warmup_ratio`: 0.1\n- `warmup_steps`: 0\n- `log_level`: passive\n- `log_level_replica`: warning\n- `log_on_each_node`: True\n- `logging_nan_inf_filter`: True\n- `save_safetensors`: True\n- `save_on_each_node`: False\n- `save_only_model`: False\n- `restore_callback_states_from_checkpoint`: False\n- `no_cuda`: False\n- `use_cpu`: False\n- `use_mps_device`: False\n- `seed`: 42\n- `data_seed`: None\n- `jit_mode_eval`: False\n- `use_ipex`: False\n- `bf16`: False\n- `fp16`: False\n- `fp16_opt_level`: O1\n- `half_precision_backend`: auto\n- `bf16_full_eval`: False\n- `fp16_full_eval`: False\n- `tf32`: None\n- `local_rank`: 0\n- `ddp_backend`: None\n- `tpu_num_cores`: None\n- `tpu_metrics_debug`: False\n- `debug`: []\n- `dataloader_drop_last`: False\n- `dataloader_num_workers`: 0\n- `dataloader_prefetch_factor`: None\n- `past_index`: -1\n- `disable_tqdm`: False\n- `remove_unused_columns`: True\n- `label_names`: None\n- `load_best_model_at_end`: True\n- `ignore_data_skip`: False\n- `fsdp`: []\n- `fsdp_min_num_params`: 0\n- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}\n- `fsdp_transformer_layer_cls_to_wrap`: None\n- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': 
None}\n- `deepspeed`: None\n- `label_smoothing_factor`: 0.0\n- `optim`: adamw_torch\n- `optim_args`: None\n- `adafactor`: False\n- `group_by_length`: False\n- `length_column_name`: length\n- `ddp_find_unused_parameters`: None\n- `ddp_bucket_cap_mb`: None\n- `ddp_broadcast_buffers`: False\n- `dataloader_pin_memory`: True\n- `dataloader_persistent_workers`: False\n- `skip_memory_metrics`: True\n- `use_legacy_prediction_loop`: False\n- `push_to_hub`: False\n- `resume_from_checkpoint`: None\n- `hub_model_id`: None\n- `hub_strategy`: every_save\n- `hub_private_repo`: False\n- `hub_always_push`: False\n- `gradient_checkpointing`: False\n- `gradient_checkpointing_kwargs`: None\n- `include_inputs_for_metrics`: False\n- `include_for_metrics`: []\n- `eval_do_concat_batches`: True\n- `fp16_backend`: auto\n- `push_to_hub_model_id`: None\n- `push_to_hub_organization`: None\n- `mp_parameters`: \n- `auto_find_batch_size`: False\n- `full_determinism`: False\n- `torchdynamo`: None\n- `ray_scope`: last\n- `ddp_timeout`: 1800\n- `torch_compile`: False\n- `torch_compile_backend`: None\n- `torch_compile_mode`: None\n- `dispatch_batches`: None\n- `split_batches`: None\n- `include_tokens_per_second`: False\n- `include_num_input_tokens_seen`: False\n- `neftune_noise_alpha`: None\n- `optim_target_modules`: None\n- `batch_eval_metrics`: False\n- `eval_on_start`: False\n- `use_liger_kernel`: False\n- `eval_use_gather_object`: False\n- `average_tokens_across_devices`: False\n- `prompts`: None\n- `batch_sampler`: no_duplicates\n- `multi_dataset_batch_sampler`: proportional\n\n
\n\n### Training Logs\n| Epoch | Step | Training Loss | Validation Loss | example-dev_spearman_cosine |\n|:------:|:----:|:-------------:|:---------------:|:---------------------------:|\n| 0 | 0 | - | - | 0.1562 |\n| 0.1254 | 500 | 4.2363 | 3.5101 | 0.3313 |\n| 0.2508 | 1000 | 3.0049 | 2.8592 | 0.4536 |\n| 0.3761 | 1500 | 2.6306 | 2.8977 | 0.4704 |\n| 0.5015 | 2000 | 2.6472 | 2.6703 | 0.4827 |\n| 0.6269 | 2500 | 2.6626 | 2.6757 | 0.4837 |\n| 0.7523 | 3000 | 2.6137 | 2.6397 | 0.4883 |\n| 0.8776 | 3500 | 2.676 | 2.5394 | 0.4936 |\n| 1.0030 | 4000 | 2.4997 | 2.5984 | 0.4931 |\n| 1.1284 | 4500 | 2.4901 | 2.6219 | 0.4946 |\n| 1.2538 | 5000 | 2.4293 | 2.6319 | 0.4943 |\n| 1.3791 | 5500 | 2.3914 | 2.7122 | 0.4936 |\n| 1.5045 | 6000 | 2.465 | 2.6573 | 0.4970 |\n| 1.6299 | 6500 | 2.5711 | 2.6388 | 0.4965 |\n| 1.7553 | 7000 | 2.5012 | 2.6323 | 0.4967 |\n| 1.8806 | 7500 | 2.5775 | 2.6231 | 0.4969 |\n\n\n### Framework Versions\n- Python: 3.10.14\n- Sentence Transformers: 3.3.1\n- Transformers: 4.46.3\n- PyTorch: 2.4.0\n- Accelerate: 0.34.2\n- Datasets: 3.0.1\n- Tokenizers: 0.20.0\n\n## Citation\n\n### BibTeX\n\n#### Sentence Transformers\n```bibtex\n@inproceedings{reimers-2019-sentence-bert,\n title = \"Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks\",\n author = \"Reimers, Nils and Gurevych, Iryna\",\n booktitle = \"Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing\",\n month = \"11\",\n year = \"2019\",\n publisher = \"Association for Computational Linguistics\",\n url = \"https://arxiv.org/abs/1908.10084\",\n}\n```\n\n#### CoSENTLoss\n```bibtex\n@online{kexuefm-8847,\n title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT},\n author={Su Jianlin},\n year={2022},\n month={Jan},\n url={https://kexue.fm/archives/8847},\n}\n```\n\n\n\n\n\n", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], 
"merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "seregadgl/t2", "base_model_relation": "base" }, { "model_id": "ELVISIO/jina_embeddings_v3_finetuned_online_contrastive_01", "gated": "False", "card": "---\ntags:\n- sentence-transformers\n- sentence-similarity\n- feature-extraction\n- generated_from_trainer\n- dataset_size:10000\n- loss:OnlineContrastiveLoss\nbase_model: jinaai/jina-embeddings-v3\nwidget:\n- source_sentence: i be try to picture the pitch for dark angel . i be think matrix\n and i be think bladerunner and i be think that chick that play faith in angel\n and wear shiny black leather or some chick just like her and leave that one with\n u . only get this . we will do it without any plot and dialogue and character\n and decent action or budget and just some loud bang and a hot chick in shiny black\n leather straddle a big throbbing bike . fanboys dig loud bang and hot chick in\n shiny black leather straddle big throbbing bike and right . flashy and shallow\n and dreary and formulaic and passionless and tedious and dull and dumb and humourless\n and desultory and barely competent . live action anime without any action and\n or indeed any life . sf just the way joe fanboy like it and in fact . negative\n .\n sentences:\n - This is a semantically positive review.\n - This is a semantically negative review.\n - This is a semantically positive review.\n- source_sentence: despite the high rating give to this film by imdb user and this\n be nothing more than your typical girl with a bad childhood obsessively stalks\n married man film . the attractive justine priestly brief nude scene may attract\n voyeur and but the film be hackneyed tripe . 
half out of .\n sentences:\n - This is a semantically positive review.\n - This is a semantically positive review.\n - This is a semantically positive review.\n- source_sentence: this movie portray ruth a a womanizing and hard drinking and gambling\n and overeat sport figure with a little baseball thrown in . babe ruth early life\n be quite interesting and this be for all intent and purpose be omit in this film\n . also and lou gehrig be barely cover and this be a well know relationship and\n good bad or indifferent and it should have be cover well than it be . his life\n be more than all bad . he be an american hero and an icon that a lot of baseball\n great pattern their life after . i feel that i be be fair to the memory of a great\n baseball player that this film completely ignore . shame on the maker of this\n film for capitalize on his fault and not his greatness .\n sentences:\n - This is a semantically positive review.\n - This is a semantically negative review.\n - This is a semantically positive review.\n- source_sentence: the silent one panel cartoon henry come to fleischer studio and\n bill a the world funny human in this dull little cartoon . betty and long past\n her prime and thanks to the production code and be run a pet shop and leave henry\n in charge for far too long five minute . a bore .\n sentences:\n - This is a semantically positive review.\n - This is a semantically negative review.\n - This is a semantically negative review.\n- source_sentence: zu warrior most definitely should have be an animated series because\n a a movie it like watch an old anime on acid . the movie just start out of nowhere\n and people just fly around fight with metal wing and other stupid weapon until\n this princess sacrifice herself for her lover on a cloud or something . 
whether\n this princess be a god or an angel be beyond me but soon enough this fly wind\n bad guy come in and kill her while the guy with the razor wing fight some other\n mystical god or demon or wizard thing . the plot line be either not there or extremely\n hard to follow you need to be insanely intelligent to get this movie . the plot\n soon follow this chinese mortal who be call upon by this god to fight the evil\n flying and princess kill bad guy and soon we have a very badly choreograph uwe\n boll like fight scene complete with terrible martial art on a mountain or something\n . even the visuals be weird some might say they be stun and colorful but i be\n go to say they be blurry and acid trip like ( yes that a word . ) . i watch it\n both dub and with subtitle and both be equally bad and hard to understand . who\n be i kidding i do not understand it at all . it felt like i be watch episode 30\n of some 1980 anime and completely miss how the story begin or like i start read\n a comic series of 5 at number 4 because i have no clue how this thing start where\n it be go or how it would end i be lose the entire time . i can honestly say this\n be one of the bad film experience ever it be like watch inu yasha at episode 134\n drunk . yeah that right you do not know what the hell be go on . don not waste\n your brain try to figure this out .\n sentences:\n - This is a semantically positive review.\n - This is a semantically negative review.\n - This is a semantically positive review.\npipeline_tag: sentence-similarity\nlibrary_name: sentence-transformers\n---\n\n# SentenceTransformer based on jinaai/jina-embeddings-v3\n\nThis is a [sentence-transformers](https://www.SBERT.net) model finetuned from [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3). 
It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.\n\n## Model Details\n\n### Model Description\n- **Model Type:** Sentence Transformer\n- **Base model:** [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) \n- **Maximum Sequence Length:** 8194 tokens\n- **Output Dimensionality:** 1024 tokens\n- **Similarity Function:** Cosine Similarity\n\n\n\n\n### Model Sources\n\n- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)\n- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)\n- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)\n\n### Full Model Architecture\n\n```\nSentenceTransformer(\n (transformer): Transformer(\n (auto_model): XLMRobertaLoRA(\n (roberta): XLMRobertaModel(\n (embeddings): XLMRobertaEmbeddings(\n (word_embeddings): ParametrizedEmbedding(\n 250002, 1024, padding_idx=1\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (token_type_embeddings): ParametrizedEmbedding(\n 1, 1024\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (emb_drop): Dropout(p=0.1, inplace=False)\n (emb_ln): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n (encoder): XLMRobertaEncoder(\n (layers): ModuleList(\n (0-23): 24 x Block(\n (mixer): MHA(\n (rotary_emb): RotaryEmbedding()\n (Wqkv): ParametrizedLinearResidual(\n in_features=1024, out_features=3072, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (inner_attn): FlashSelfAttention(\n (drop): Dropout(p=0.1, inplace=False)\n )\n (inner_cross_attn): FlashCrossAttention(\n (drop): Dropout(p=0.1, inplace=False)\n )\n 
(out_proj): ParametrizedLinear(\n in_features=1024, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (dropout1): Dropout(p=0.1, inplace=False)\n (drop_path1): StochasticDepth(p=0.0, mode=row)\n (norm1): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n (mlp): Mlp(\n (fc1): ParametrizedLinear(\n in_features=1024, out_features=4096, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (fc2): ParametrizedLinear(\n in_features=4096, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (dropout2): Dropout(p=0.1, inplace=False)\n (drop_path2): StochasticDepth(p=0.0, mode=row)\n (norm2): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n )\n )\n )\n (pooler): XLMRobertaPooler(\n (dense): ParametrizedLinear(\n in_features=1024, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (activation): Tanh()\n )\n )\n )\n )\n (pooler): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})\n (normalizer): Normalize()\n)\n```\n\n## Usage\n\n### Direct Usage (Sentence Transformers)\n\nFirst install the Sentence Transformers library:\n\n```bash\npip install -U sentence-transformers\n```\n\nThen you can load this model and run inference.\n```python\nfrom sentence_transformers import SentenceTransformer\n\n# Download from the \ud83e\udd17 Hub\nmodel = SentenceTransformer(\"ELVISIO/jina_embeddings_v3_finetuned_online_contrastive_01\", trust_remote_code=True, model_kwargs={'default_task': 'classification'})\n# Run 
inference\nsentences = [\n 'zu warrior most definitely should have be an animated series because a a movie it like watch an old anime on acid . the movie just start out of nowhere and people just fly around fight with metal wing and other stupid weapon until this princess sacrifice herself for her lover on a cloud or something . whether this princess be a god or an angel be beyond me but soon enough this fly wind bad guy come in and kill her while the guy with the razor wing fight some other mystical god or demon or wizard thing . the plot line be either not there or extremely hard to follow you need to be insanely intelligent to get this movie . the plot soon follow this chinese mortal who be call upon by this god to fight the evil flying and princess kill bad guy and soon we have a very badly choreograph uwe boll like fight scene complete with terrible martial art on a mountain or something . even the visuals be weird some might say they be stun and colorful but i be go to say they be blurry and acid trip like ( yes that a word . ) . i watch it both dub and with subtitle and both be equally bad and hard to understand . who be i kidding i do not understand it at all . it felt like i be watch episode 30 of some 1980 anime and completely miss how the story begin or like i start read a comic series of 5 at number 4 because i have no clue how this thing start where it be go or how it would end i be lose the entire time . i can honestly say this be one of the bad film experience ever it be like watch inu yasha at episode 134 drunk . yeah that right you do not know what the hell be go on . 
don not waste your brain try to figure this out .',\n 'This is a semantically negative review.',\n 'This is a semantically positive review.',\n]\nembeddings = model.encode(sentences)\nprint(embeddings.shape)\n# [3, 1024]\n\n# Get the similarity scores for the embeddings\nsimilarities = model.similarity(embeddings, embeddings)\nprint(similarities.shape)\n# [3, 3]\n```\n\n\n\n\n\n\n\n\n\n\n\n## Training Details\n\n### Training Dataset\n\n#### Unnamed Dataset\n\n\n* Size: 10000 training samples\n* Columns: sentence1, sentence2, and label\n* Approximate statistics based on the first 1000 samples:\n | | sentence1 | sentence2 | label |\n |:--------|:--------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:--------------------------------------------------------------|\n | type | string | string | float |\n | details |
min: 19 tokens, mean: 300.92 tokens, max: 1415 tokens | min: 11 tokens, mean: 11.0 tokens, max: 11 tokens | min: 0.0, mean: 0.5, max: 1.0 
|\n* Samples:\n | sentence1 | sentence2 | label |\n |:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------|:-----------------|\n | i rent i be curious yellow from my video store because of all the controversy that surround it when it be first release in 1967. i also hear that at first it be seize by u. s. 
custom if it ever try to enter this country and therefore be a fan of film consider controversial i really have to see this for myself . the plot be center around a young swedish drama student name lena who want to learn everything she can about life . in particular she want to focus her attention to make some sort of documentary on what the average swede think about certain political issue such a the vietnam war and race issue in the united state . in between ask politician and ordinary denizen of stockholm about their opinion on politics and she have sex with her drama teacher and classmate and and marry men . what kill me about i be curious yellow be that 40 year ago and this be consider pornographic . really and the sex and nudity scene be few and far between and even then it not shot like some cheaply make porno . while my countryman mind find it shock and in reality sex and nudity be a major staple in swedish cinema . even ingmar bergman and arguably their answer to good old boy john ford and have sex scene in his film . i do commend the filmmaker for the fact that any sex show in the film be show for artistic purpose rather than just to shock people and make money to be show in pornographic theater in america . i be curious yellow be a good film for anyone want to study the meat and potato ( no pun intend ) of swedish cinema . but really and this film doesn not have much of a plot . | This is a semantically negative review. | 1.0 |\n | i rent i be curious yellow from my video store because of all the controversy that surround it when it be first release in 1967. i also hear that at first it be seize by u. s. custom if it ever try to enter this country and therefore be a fan of film consider controversial i really have to see this for myself . the plot be center around a young swedish drama student name lena who want to learn everything she can about life . 
in particular she want to focus her attention to make some sort of documentary on what the average swede think about certain political issue such a the vietnam war and race issue in the united state . in between ask politician and ordinary denizen of stockholm about their opinion on politics and she have sex with her drama teacher and classmate and and marry men . what kill me about i be curious yellow be that 40 year ago and this be consider pornographic . really and the sex and nudity scene be few and far between and even then it not shot like some cheaply make porno . while my countryman mind find it shock and in reality sex and nudity be a major staple in swedish cinema . even ingmar bergman and arguably their answer to good old boy john ford and have sex scene in his film . i do commend the filmmaker for the fact that any sex show in the film be show for artistic purpose rather than just to shock people and make money to be show in pornographic theater in america . i be curious yellow be a good film for anyone want to study the meat and potato ( no pun intend ) of swedish cinema . but really and this film doesn not have much of a plot . | This is a semantically positive review. | 0.0 |\n | i be curious represent yellow be a risible and pretentious steam pile . it doesn not matter what one political view be because this film can hardly be take seriously on any level . a for the claim that frontal male nudity be an automatic nc 17 and that isn not true . i have see r rat film with male nudity . grant and they only offer some fleeting view and but where be the r rat film with gap vulva and flap labium . nowhere and because they do not exist . the same go for those crappy cable show represent schlongs swing in the breeze but not a clitoris in sight . and those pretentious indie movie like the brown bunny and in which be treat to the site of vincent gallo throb johnson and but not a trace of pink visible on chloe sevigny . 
before cry ( or imply ) double standard in matter of nudity and the mentally obtuse should take into account one unavoidably obvious anatomical difference between men and woman represent there be no genitals on display when actresses appear nude and and the same can not be say for a man . in fact and you generally would not see female genitals in an american film in anything short of porn or explicit erotica . this allege double standard be less a double standard than an admittedly depressing ability to come to term culturally with the inside of woman body . | This is a semantically negative review. | 1.0 |\n* Loss: [OnlineContrastiveLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#onlinecontrastiveloss)\n\n### Training Hyperparameters\n#### Non-Default Hyperparameters\n\n- `per_device_train_batch_size`: 64\n- `per_device_eval_batch_size`: 64\n\n#### All Hyperparameters\n
Click to expand\n\n- `overwrite_output_dir`: False\n- `do_predict`: False\n- `eval_strategy`: no\n- `prediction_loss_only`: True\n- `per_device_train_batch_size`: 64\n- `per_device_eval_batch_size`: 64\n- `per_gpu_train_batch_size`: None\n- `per_gpu_eval_batch_size`: None\n- `gradient_accumulation_steps`: 1\n- `eval_accumulation_steps`: None\n- `torch_empty_cache_steps`: None\n- `learning_rate`: 5e-05\n- `weight_decay`: 0.0\n- `adam_beta1`: 0.9\n- `adam_beta2`: 0.999\n- `adam_epsilon`: 1e-08\n- `max_grad_norm`: 1.0\n- `num_train_epochs`: 3.0\n- `max_steps`: -1\n- `lr_scheduler_type`: linear\n- `lr_scheduler_kwargs`: {}\n- `warmup_ratio`: 0.0\n- `warmup_steps`: 0\n- `log_level`: passive\n- `log_level_replica`: warning\n- `log_on_each_node`: True\n- `logging_nan_inf_filter`: True\n- `save_safetensors`: True\n- `save_on_each_node`: False\n- `save_only_model`: False\n- `restore_callback_states_from_checkpoint`: False\n- `no_cuda`: False\n- `use_cpu`: False\n- `use_mps_device`: False\n- `seed`: 42\n- `data_seed`: None\n- `jit_mode_eval`: False\n- `use_ipex`: False\n- `bf16`: False\n- `fp16`: False\n- `fp16_opt_level`: O1\n- `half_precision_backend`: auto\n- `bf16_full_eval`: False\n- `fp16_full_eval`: False\n- `tf32`: None\n- `local_rank`: 0\n- `ddp_backend`: None\n- `tpu_num_cores`: None\n- `tpu_metrics_debug`: False\n- `debug`: []\n- `dataloader_drop_last`: False\n- `dataloader_num_workers`: 0\n- `dataloader_prefetch_factor`: None\n- `past_index`: -1\n- `disable_tqdm`: False\n- `remove_unused_columns`: True\n- `label_names`: None\n- `load_best_model_at_end`: False\n- `ignore_data_skip`: False\n- `fsdp`: []\n- `fsdp_min_num_params`: 0\n- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}\n- `fsdp_transformer_layer_cls_to_wrap`: None\n- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': 
None}\n- `deepspeed`: None\n- `label_smoothing_factor`: 0.0\n- `optim`: adamw_torch\n- `optim_args`: None\n- `adafactor`: False\n- `group_by_length`: False\n- `length_column_name`: length\n- `ddp_find_unused_parameters`: None\n- `ddp_bucket_cap_mb`: None\n- `ddp_broadcast_buffers`: False\n- `dataloader_pin_memory`: True\n- `dataloader_persistent_workers`: False\n- `skip_memory_metrics`: True\n- `use_legacy_prediction_loop`: False\n- `push_to_hub`: False\n- `resume_from_checkpoint`: None\n- `hub_model_id`: None\n- `hub_strategy`: every_save\n- `hub_private_repo`: False\n- `hub_always_push`: False\n- `gradient_checkpointing`: False\n- `gradient_checkpointing_kwargs`: None\n- `include_inputs_for_metrics`: False\n- `eval_do_concat_batches`: True\n- `fp16_backend`: auto\n- `push_to_hub_model_id`: None\n- `push_to_hub_organization`: None\n- `mp_parameters`: \n- `auto_find_batch_size`: False\n- `full_determinism`: False\n- `torchdynamo`: None\n- `ray_scope`: last\n- `ddp_timeout`: 1800\n- `torch_compile`: False\n- `torch_compile_backend`: None\n- `torch_compile_mode`: None\n- `dispatch_batches`: None\n- `split_batches`: None\n- `include_tokens_per_second`: False\n- `include_num_input_tokens_seen`: False\n- `neftune_noise_alpha`: None\n- `optim_target_modules`: None\n- `batch_eval_metrics`: False\n- `eval_on_start`: False\n- `use_liger_kernel`: False\n- `eval_use_gather_object`: False\n- `batch_sampler`: batch_sampler\n- `multi_dataset_batch_sampler`: proportional\n\n
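The usage snippet above prints a `[3, 3]` similarity matrix, and the architecture listing ends in a `Normalize()` module, so the model's cosine scores reduce to dot products of unit vectors. A standalone numpy sketch of that computation (toy 2-D vectors standing in for the 1024-dimensional embeddings; this is illustrative, not the model's own code):

```python
import numpy as np

# Because the SentenceTransformer stack ends in Normalize(), its output
# embeddings are unit-length and cosine similarity is a plain dot product.
def cosine_matrix(embeddings: np.ndarray) -> np.ndarray:
    # Normalize defensively, then compare every row against every other row.
    unit = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    return unit @ unit.T

# Toy 2-D stand-ins for the 1024-dimensional embeddings.
emb = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
sims = cosine_matrix(emb)
print(sims.shape)  # (3, 3), mirroring model.similarity(embeddings, embeddings)
```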
\n\n### Training Logs\n| Epoch | Step | Training Loss |\n|:------:|:----:|:-------------:|\n| 0.6394 | 500 | 0.9485 |\n| 1.2788 | 1000 | 0.6908 |\n| 1.9182 | 1500 | 0.7048 |\n| 2.5575 | 2000 | 0.6892 |\n\n\n### Framework Versions\n- Python: 3.10.12\n- Sentence Transformers: 3.1.1\n- Transformers: 4.45.2\n- PyTorch: 2.5.1+cu121\n- Accelerate: 1.1.1\n- Datasets: 3.1.0\n- Tokenizers: 0.20.3\n\n## Citation\n\n### BibTeX\n\n#### Sentence Transformers\n```bibtex\n@inproceedings{reimers-2019-sentence-bert,\n title = \"Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks\",\n author = \"Reimers, Nils and Gurevych, Iryna\",\n booktitle = \"Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing\",\n month = \"11\",\n year = \"2019\",\n publisher = \"Association for Computational Linguistics\",\n url = \"https://arxiv.org/abs/1908.10084\",\n}\n```\n\n\n\n\n\n", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "ELVISIO/jina_embeddings_v3_finetuned_online_contrastive_01", "base_model_relation": "base" }, { "model_id": "ELVISIO/jina_embeddings_v3_finetuned_online_contrastive_imdb", "gated": "False", "card": "---\ntags:\n- sentence-transformers\n- sentence-similarity\n- feature-extraction\n- generated_from_trainer\n- dataset_size:16000\n- loss:OnlineContrastiveLoss\nbase_model: jinaai/jina-embeddings-v3\nwidget:\n- source_sentence: This is absolutely the worst trash I have ever seen. 
When I saw\n it in the theater (arghhh!), it took 15 full minutes before I realized that what\n I was seeing was the feature, not a sick joke!\n sentences:\n - negative negative negative negative\n - negative negative negative negative\n - positive positive positive positive\n- source_sentence: I saw this movie years ago in a group tradition of Fast Forward\n Film Festivals, where we would set out to rent a bunch of B-movies and vote for\n who picked the worst.

The night we watched this, it was voted the best,\n due to semblance of plot and fun costuming.

This is certainly a silly,\n kitschy, movie, to be watched under the full understanding that you are watching\n low-budget fluff. Personally, however, I wouldn't recommend additional substances\n ... this movie will leave it's own mark on you.

It made enough of an\n impression on me that I've actually been trying to get my hands on a copy for\n a few years.

A good choice if you are setting out to watch bad movies.\n This one is fun, and I remember bouncy music ...\n sentences:\n - negative negative negative negative\n - positive positive positive positive\n - negative negative negative negative\n- source_sentence: 'Star Wars: Episode 4 .

the best Star Wars ever. its\n the first movie i ever Sean were the bad guys win and its a very good ending.\n it really had me wait hing for the next star wars because so match stuff comes\n along in this movie that you just got to find out more in the last one. whit Al\n lot of movies i always get the feeling that it could be don bedder but not whit\n this one. and i Will never ever forget the part were wader tels Luke he is his\n father.way too cool. also love the Bob feat figure a do hes a back ground player.\n if you never ever Saw a star wars movie you go to she this one.its the best.
thanks Lucas'\n sentences:\n - negative negative negative negative\n - positive positive positive positive\n - positive positive positive positive\n- source_sentence: Alain Chabat claims this movie as his original idea but the theme\n of reluctant lovers who finally get it together is as old, if not older, than\n Shakespeare.

Chabat is a \"vieux garcon\", happily single and not wanting\n any member of the opposite sex to disturb his life. He has a problem, 5 sisters\n and a matriarchal mum - the G7 - who decide he should be married. Enter the delightful,\n charming Charlotte Gainsbourg and what should be a simple plan. Charlotte has\n to pose as Chabat's girlfriend and then simply not turn up on the day of the wedding.\n No more talk of marriage from the G7. Of course the best laid plans have a habit\n of spiralling out of control.

There are very strong supporting roles\n from Lafont as the mother and Osterman as the tight-fisted brother of Gainsbourg.
There are some fantastic scenes as first Charlotte has to charm, then\n revolt the family. French farce with an English.\n sentences:\n - positive positive positive positive\n - negative negative negative negative\n - negative negative negative negative\n- source_sentence: Saw this on cable back in the early 90's and loved it. Never saw\n it again until it showed up on cable again recently. Still find it a great Vietnam\n movie. Not sure why its not higher rated. I found everything about this film compelling.\n As a vet (not from Vietnam) I can relate to the situations brought by both Harris\n and De Niro. I can only imagine this film being more poignant now with our situation\n in Iraq. I wish this would be offered on cable more often for people to see. The\n human toll on our soldiers isn't left on the battlefield. Its brought home for\n the rest of there lives. And this film is one of many that brings that home in\n a very hard way. Excellent film.\n sentences:\n - negative negative negative negative\n - positive positive positive positive\n - positive positive positive positive\npipeline_tag: sentence-similarity\nlibrary_name: sentence-transformers\n---\n\n# SentenceTransformer based on jinaai/jina-embeddings-v3\n\nThis is a [sentence-transformers](https://www.SBERT.net) model finetuned from [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3). 
It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.\n\n## Model Details\n\n### Model Description\n- **Model Type:** Sentence Transformer\n- **Base model:** [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) \n- **Maximum Sequence Length:** 8194 tokens\n- **Output Dimensionality:** 1024 tokens\n- **Similarity Function:** Cosine Similarity\n\n\n\n\n### Model Sources\n\n- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)\n- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)\n- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)\n\n### Full Model Architecture\n\n```\nSentenceTransformer(\n (transformer): Transformer(\n (auto_model): XLMRobertaLoRA(\n (roberta): XLMRobertaModel(\n (embeddings): XLMRobertaEmbeddings(\n (word_embeddings): ParametrizedEmbedding(\n 250002, 1024, padding_idx=1\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (token_type_embeddings): ParametrizedEmbedding(\n 1, 1024\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (emb_drop): Dropout(p=0.1, inplace=False)\n (emb_ln): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n (encoder): XLMRobertaEncoder(\n (layers): ModuleList(\n (0-23): 24 x Block(\n (mixer): MHA(\n (rotary_emb): RotaryEmbedding()\n (Wqkv): ParametrizedLinearResidual(\n in_features=1024, out_features=3072, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (inner_attn): FlashSelfAttention(\n (drop): Dropout(p=0.1, inplace=False)\n )\n (inner_cross_attn): FlashCrossAttention(\n (drop): Dropout(p=0.1, inplace=False)\n )\n 
(out_proj): ParametrizedLinear(\n in_features=1024, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (dropout1): Dropout(p=0.1, inplace=False)\n (drop_path1): StochasticDepth(p=0.0, mode=row)\n (norm1): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n (mlp): Mlp(\n (fc1): ParametrizedLinear(\n in_features=1024, out_features=4096, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (fc2): ParametrizedLinear(\n in_features=4096, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (dropout2): Dropout(p=0.1, inplace=False)\n (drop_path2): StochasticDepth(p=0.0, mode=row)\n (norm2): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n )\n )\n )\n (pooler): XLMRobertaPooler(\n (dense): ParametrizedLinear(\n in_features=1024, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (activation): Tanh()\n )\n )\n )\n )\n (pooler): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})\n (normalizer): Normalize()\n)\n```\n\n## Usage\n\n### Direct Usage (Sentence Transformers)\n\nFirst install the Sentence Transformers library:\n\n```bash\npip install -U sentence-transformers\n```\n\nThen you can load this model and run inference.\n```python\nfrom sentence_transformers import SentenceTransformer\n\n# Download from the \ud83e\udd17 Hub\nmodel = SentenceTransformer(\"ELVISIO/jina_embeddings_v3_finetuned_online_contrastive_imdb\")\n# Run inference\nsentences = [\n \"Saw this on cable back in the early 90's 
and loved it. Never saw it again until it showed up on cable again recently. Still find it a great Vietnam movie. Not sure why its not higher rated. I found everything about this film compelling. As a vet (not from Vietnam) I can relate to the situations brought by both Harris and De Niro. I can only imagine this film being more poignant now with our situation in Iraq. I wish this would be offered on cable more often for people to see. The human toll on our soldiers isn't left on the battlefield. Its brought home for the rest of there lives. And this film is one of many that brings that home in a very hard way. Excellent film.\",\n 'positive positive positive positive',\n 'negative negative negative negative',\n]\nembeddings = model.encode(sentences)\nprint(embeddings.shape)\n# [3, 1024]\n\n# Get the similarity scores for the embeddings\nsimilarities = model.similarity(embeddings, embeddings)\nprint(similarities.shape)\n# [3, 3]\n```\n\n\n\n\n\n\n\n\n\n\n\n## Training Details\n\n### Training Dataset\n\n#### Unnamed Dataset\n\n\n* Size: 16,000 training samples\n* Columns: sentence1, sentence2, and label\n* Approximate statistics based on the first 1000 samples:\n | | sentence1 | sentence2 | label |\n |:--------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------|:--------------------------------------------------------------|\n | type | string | string | float |\n | details |
min: 39 tokens, mean: 173.59 tokens, max: 291 tokens | min: 6 tokens, mean: 6.0 tokens, max: 6 tokens | min: 0.0, mean: 0.5, max: 1.0 
|\n* Samples:\n | sentence1 | sentence2 | label |\n |:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------|:-----------------|\n | There are two kinds of 1950s musicals. First you have the glossy MGM productions with big names and great music. And then you have the minor league with a less famous cast, less famous music and second rate directors. 'The Girl Can't Help It' belongs to the latter category. Neither Tom Ewell or Edmond O'Brien became famous and Jayne Mansfield was famous for her... well, never mind. Seems like every decade has its share of Bo Dereks or Pamela Andersons. The plot itself is thin as a razorblade and one can't help suspect that it is mostly an attempt to sell records for Fats Domino, Little Richard or others of the 1950s rock acts that appear in the movie. If that music appeals to you this is worth watching. If not, don't bother. | negative negative negative negative | 1.0 |\n | There are two kinds of 1950s musicals. First you have the glossy MGM productions with big names and great music. 
And then you have the minor league with a less famous cast, less famous music and second rate directors. 'The Girl Can't Help It' belongs to the latter category. Neither Tom Ewell or Edmond O'Brien became famous and Jayne Mansfield was famous for her... well, never mind. Seems like every decade has its share of Bo Dereks or Pamela Andersons. The plot itself is thin as a razorblade and one can't help suspect that it is mostly an attempt to sell records for Fats Domino, Little Richard or others of the 1950s rock acts that appear in the movie. If that music appeals to you this is worth watching. If not, don't bother. | positive positive positive positive | 0.0 |\n | Thankfully as a student I have been able to watch \"Diagnosis Murder\" for a number of years now. It is basically about a doctor who solves murders with the help of his LAPD son, a young doctor and a pathologist. DM provided 8 seasons of exceptional entertainment. What made it different from the many other cop shows and worth watching many times over was its cast and quality of writing. The main cast gave good performances and Dick Van Dyke's entertainer roots shone through with the use of magic, dance and humor. The best aspects of DM was the fast pace, witty scripts and of course the toe tapping score. Sadly it has been unfairly compared to \"Murder, She Wrote\". DM is far superior boasting more difficult mysteries to solve and more variety. Now it is gone TV is a worse place. Gone are the days of feelgood, family friendly cop shows. Now there is just depressing 'gritty' ones. | positive positive positive positive | 1.0 |\n* Loss: [OnlineContrastiveLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#onlinecontrastiveloss)\n\n### Training Hyperparameters\n#### Non-Default Hyperparameters\n\n- `per_device_train_batch_size`: 64\n- `per_device_eval_batch_size`: 64\n\n#### All Hyperparameters\n
Click to expand\n\n- `overwrite_output_dir`: False\n- `do_predict`: False\n- `eval_strategy`: no\n- `prediction_loss_only`: True\n- `per_device_train_batch_size`: 64\n- `per_device_eval_batch_size`: 64\n- `per_gpu_train_batch_size`: None\n- `per_gpu_eval_batch_size`: None\n- `gradient_accumulation_steps`: 1\n- `eval_accumulation_steps`: None\n- `torch_empty_cache_steps`: None\n- `learning_rate`: 5e-05\n- `weight_decay`: 0.0\n- `adam_beta1`: 0.9\n- `adam_beta2`: 0.999\n- `adam_epsilon`: 1e-08\n- `max_grad_norm`: 1.0\n- `num_train_epochs`: 3.0\n- `max_steps`: -1\n- `lr_scheduler_type`: linear\n- `lr_scheduler_kwargs`: {}\n- `warmup_ratio`: 0.0\n- `warmup_steps`: 0\n- `log_level`: passive\n- `log_level_replica`: warning\n- `log_on_each_node`: True\n- `logging_nan_inf_filter`: True\n- `save_safetensors`: True\n- `save_on_each_node`: False\n- `save_only_model`: False\n- `restore_callback_states_from_checkpoint`: False\n- `no_cuda`: False\n- `use_cpu`: False\n- `use_mps_device`: False\n- `seed`: 42\n- `data_seed`: None\n- `jit_mode_eval`: False\n- `use_ipex`: False\n- `bf16`: False\n- `fp16`: False\n- `fp16_opt_level`: O1\n- `half_precision_backend`: auto\n- `bf16_full_eval`: False\n- `fp16_full_eval`: False\n- `tf32`: None\n- `local_rank`: 0\n- `ddp_backend`: None\n- `tpu_num_cores`: None\n- `tpu_metrics_debug`: False\n- `debug`: []\n- `dataloader_drop_last`: False\n- `dataloader_num_workers`: 0\n- `dataloader_prefetch_factor`: None\n- `past_index`: -1\n- `disable_tqdm`: False\n- `remove_unused_columns`: True\n- `label_names`: None\n- `load_best_model_at_end`: False\n- `ignore_data_skip`: False\n- `fsdp`: []\n- `fsdp_min_num_params`: 0\n- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}\n- `fsdp_transformer_layer_cls_to_wrap`: None\n- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': 
None}\n- `deepspeed`: None\n- `label_smoothing_factor`: 0.0\n- `optim`: adamw_torch\n- `optim_args`: None\n- `adafactor`: False\n- `group_by_length`: False\n- `length_column_name`: length\n- `ddp_find_unused_parameters`: None\n- `ddp_bucket_cap_mb`: None\n- `ddp_broadcast_buffers`: False\n- `dataloader_pin_memory`: True\n- `dataloader_persistent_workers`: False\n- `skip_memory_metrics`: True\n- `use_legacy_prediction_loop`: False\n- `push_to_hub`: False\n- `resume_from_checkpoint`: None\n- `hub_model_id`: None\n- `hub_strategy`: every_save\n- `hub_private_repo`: False\n- `hub_always_push`: False\n- `gradient_checkpointing`: False\n- `gradient_checkpointing_kwargs`: None\n- `include_inputs_for_metrics`: False\n- `eval_do_concat_batches`: True\n- `fp16_backend`: auto\n- `push_to_hub_model_id`: None\n- `push_to_hub_organization`: None\n- `mp_parameters`: \n- `auto_find_batch_size`: False\n- `full_determinism`: False\n- `torchdynamo`: None\n- `ray_scope`: last\n- `ddp_timeout`: 1800\n- `torch_compile`: False\n- `torch_compile_backend`: None\n- `torch_compile_mode`: None\n- `dispatch_batches`: None\n- `split_batches`: None\n- `include_tokens_per_second`: False\n- `include_num_input_tokens_seen`: False\n- `neftune_noise_alpha`: None\n- `optim_target_modules`: None\n- `batch_eval_metrics`: False\n- `eval_on_start`: False\n- `use_liger_kernel`: False\n- `eval_use_gather_object`: False\n- `batch_sampler`: batch_sampler\n- `multi_dataset_batch_sampler`: proportional\n\n
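This card pairs each review with the label phrases "positive positive positive positive" and "negative negative negative negative", which turns embedding similarity into zero-shot sentiment classification. A hypothetical helper sketching that flow (`classify` and `LABELS` are illustrative names, not part of sentence-transformers; `encoder` can be any object exposing `.encode`, such as the model loaded in the usage section above):

```python
import numpy as np

# Label phrases taken from this card's training pairs.
LABELS = {
    "positive": "positive positive positive positive",
    "negative": "negative negative negative negative",
}

def classify(encoder, review: str) -> str:
    # Encode the review together with both label phrases, then return the
    # label whose (normalized) embedding is most similar to the review's.
    names = list(LABELS)
    vecs = np.asarray(encoder.encode([review] + [LABELS[n] for n in names]))
    unit = vecs / np.linalg.norm(vecs, axis=1, keepdims=True)
    scores = unit[1:] @ unit[0]
    return names[int(scores.argmax())]
```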
\n\n### Training Logs\n| Epoch | Step | Training Loss |\n|:-----:|:----:|:-------------:|\n| 2.0 | 500 | 0.9466 |\n\n\n### Framework Versions\n- Python: 3.10.12\n- Sentence Transformers: 3.1.1\n- Transformers: 4.45.2\n- PyTorch: 2.5.1+cu121\n- Accelerate: 1.1.1\n- Datasets: 2.21.0\n- Tokenizers: 0.20.3\n\n## Citation\n\n### BibTeX\n\n#### Sentence Transformers\n```bibtex\n@inproceedings{reimers-2019-sentence-bert,\n title = \"Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks\",\n author = \"Reimers, Nils and Gurevych, Iryna\",\n booktitle = \"Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing\",\n month = \"11\",\n year = \"2019\",\n publisher = \"Association for Computational Linguistics\",\n url = \"https://arxiv.org/abs/1908.10084\",\n}\n```\n\n\n\n\n\n", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "ELVISIO/jina_embeddings_v3_finetuned_online_contrastive_imdb", "base_model_relation": "base" }, { "model_id": "ELVISIO/jina_embeddings_v3_finetuned_online_contrastive_imdb_2", "gated": "False", "card": "---\ntags:\n- sentence-transformers\n- sentence-similarity\n- feature-extraction\n- generated_from_trainer\n- dataset_size:16000\n- loss:OnlineContrastiveLoss\nbase_model: jinaai/jina-embeddings-v3\nwidget:\n- source_sentence: This is absolutely the worst trash I have ever seen. 
When I saw\n it in the theater (arghhh!), it took 15 full minutes before I realized that what\n I was seeing was the feature, not a sick joke!\n sentences:\n - negative negative negative negative\n - negative negative negative negative\n - positive positive positive positive\n- source_sentence: I saw this movie years ago in a group tradition of Fast Forward\n Film Festivals, where we would set out to rent a bunch of B-movies and vote for\n who picked the worst.

The night we watched this, it was voted the best,\n due to semblance of plot and fun costuming.

This is certainly a silly,\n kitschy, movie, to be watched under the full understanding that you are watching\n low-budget fluff. Personally, however, I wouldn't recommend additional substances\n ... this movie will leave it's own mark on you.

It made enough of an\n impression on me that I've actually been trying to get my hands on a copy for\n a few years.

A good choice if you are setting out to watch bad movies.\n This one is fun, and I remember bouncy music ...\n sentences:\n - negative negative negative negative\n - positive positive positive positive\n - negative negative negative negative\n- source_sentence: 'Star Wars: Episode 4 .

the best Star Wars ever. its\n the first movie i ever Sean were the bad guys win and its a very good ending.\n it really had me wait hing for the next star wars because so match stuff comes\n along in this movie that you just got to find out more in the last one. whit Al\n lot of movies i always get the feeling that it could be don bedder but not whit\n this one. and i Will never ever forget the part were wader tels Luke he is his\n father.way too cool. also love the Bob feat figure a do hes a back ground player.\n if you never ever Saw a star wars movie you go to she this one.its the best.
thanks Lucas'\n sentences:\n - negative negative negative negative\n - positive positive positive positive\n - positive positive positive positive\n- source_sentence: Alain Chabat claims this movie as his original idea but the theme\n of reluctant lovers who finally get it together is as old, if not older, than\n Shakespeare.

Chabat is a \"vieux garcon\", happily single and not wanting\n any member of the opposite sex to disturb his life. He has a problem, 5 sisters\n and a matriarchal mum - the G7 - who decide he should be married. Enter the delightful,\n charming Charlotte Gainsbourg and what should be a simple plan. Charlotte has\n to pose as Chabat's girlfriend and then simply not turn up on the day of the wedding.\n No more talk of marriage from the G7. Of course the best laid plans have a habit\n of spiralling out of control.

There are very strong supporting roles\n from Lafont as the mother and Osterman as the tight-fisted brother of Gainsbourg.
There are some fantastic scenes as first Charlotte has to charm, then\n revolt the family. French farce with an English.\n sentences:\n - positive positive positive positive\n - negative negative negative negative\n - negative negative negative negative\n- source_sentence: Saw this on cable back in the early 90's and loved it. Never saw\n it again until it showed up on cable again recently. Still find it a great Vietnam\n movie. Not sure why its not higher rated. I found everything about this film compelling.\n As a vet (not from Vietnam) I can relate to the situations brought by both Harris\n and De Niro. I can only imagine this film being more poignant now with our situation\n in Iraq. I wish this would be offered on cable more often for people to see. The\n human toll on our soldiers isn't left on the battlefield. Its brought home for\n the rest of there lives. And this film is one of many that brings that home in\n a very hard way. Excellent film.\n sentences:\n - negative negative negative negative\n - positive positive positive positive\n - positive positive positive positive\npipeline_tag: sentence-similarity\nlibrary_name: sentence-transformers\n---\n\n# SentenceTransformer based on jinaai/jina-embeddings-v3\n\nThis is a [sentence-transformers](https://www.SBERT.net) model finetuned from [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3). 
It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.\n\n## Model Details\n\n### Model Description\n- **Model Type:** Sentence Transformer\n- **Base model:** [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) \n- **Maximum Sequence Length:** 8194 tokens\n- **Output Dimensionality:** 1024 tokens\n- **Similarity Function:** Cosine Similarity\n\n\n\n\n### Model Sources\n\n- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)\n- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)\n- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)\n\n### Full Model Architecture\n\n```\nSentenceTransformer(\n (transformer): Transformer(\n (auto_model): XLMRobertaLoRA(\n (roberta): XLMRobertaModel(\n (embeddings): XLMRobertaEmbeddings(\n (word_embeddings): ParametrizedEmbedding(\n 250002, 1024, padding_idx=1\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (token_type_embeddings): ParametrizedEmbedding(\n 1, 1024\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (emb_drop): Dropout(p=0.1, inplace=False)\n (emb_ln): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n (encoder): XLMRobertaEncoder(\n (layers): ModuleList(\n (0-23): 24 x Block(\n (mixer): MHA(\n (rotary_emb): RotaryEmbedding()\n (Wqkv): ParametrizedLinearResidual(\n in_features=1024, out_features=3072, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (inner_attn): FlashSelfAttention(\n (drop): Dropout(p=0.1, inplace=False)\n )\n (inner_cross_attn): FlashCrossAttention(\n (drop): Dropout(p=0.1, inplace=False)\n )\n 
(out_proj): ParametrizedLinear(\n in_features=1024, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (dropout1): Dropout(p=0.1, inplace=False)\n (drop_path1): StochasticDepth(p=0.0, mode=row)\n (norm1): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n (mlp): Mlp(\n (fc1): ParametrizedLinear(\n in_features=1024, out_features=4096, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (fc2): ParametrizedLinear(\n in_features=4096, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (dropout2): Dropout(p=0.1, inplace=False)\n (drop_path2): StochasticDepth(p=0.0, mode=row)\n (norm2): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n )\n )\n )\n (pooler): XLMRobertaPooler(\n (dense): ParametrizedLinear(\n in_features=1024, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (activation): Tanh()\n )\n )\n )\n )\n (pooler): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})\n (normalizer): Normalize()\n)\n```\n\n## Usage\n\n### Direct Usage (Sentence Transformers)\n\nFirst install the Sentence Transformers library:\n\n```bash\npip install -U sentence-transformers\n```\n\nThen you can load this model and run inference.\n```python\nfrom sentence_transformers import SentenceTransformer\n\n# Download from the \ud83e\udd17 Hub\nmodel = SentenceTransformer(\"ELVISIO/jina_embeddings_v3_finetuned_online_contrastive_imdb_2\")\n# Run inference\nsentences = [\n \"Saw this on cable back in the early 90's 
and loved it. Never saw it again until it showed up on cable again recently. Still find it a great Vietnam movie. Not sure why its not higher rated. I found everything about this film compelling. As a vet (not from Vietnam) I can relate to the situations brought by both Harris and De Niro. I can only imagine this film being more poignant now with our situation in Iraq. I wish this would be offered on cable more often for people to see. The human toll on our soldiers isn't left on the battlefield. Its brought home for the rest of there lives. And this film is one of many that brings that home in a very hard way. Excellent film.\",\n 'positive positive positive positive',\n 'negative negative negative negative',\n]\nembeddings = model.encode(sentences)\nprint(embeddings.shape)\n# [3, 1024]\n\n# Get the similarity scores for the embeddings\nsimilarities = model.similarity(embeddings, embeddings)\nprint(similarities.shape)\n# [3, 3]\n```\n\n\n\n\n\n\n\n\n\n\n\n## Training Details\n\n### Training Dataset\n\n#### Unnamed Dataset\n\n\n* Size: 16,000 training samples\n* Columns: sentence1, sentence2, and label\n* Approximate statistics based on the first 1000 samples:\n | | sentence1 | sentence2 | label |\n |:--------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------|:--------------------------------------------------------------|\n | type | string | string | float |\n | details |
  • min: 39 tokens, mean: 173.59 tokens, max: 291 tokens | • min: 6 tokens, mean: 6.0 tokens, max: 6 tokens | • min: 0.0, mean: 0.5, max: 1.0
|\n* Samples:\n | sentence1 | sentence2 | label |\n |:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------|:-----------------|\n | There are two kinds of 1950s musicals. First you have the glossy MGM productions with big names and great music. And then you have the minor league with a less famous cast, less famous music and second rate directors. 'The Girl Can't Help It' belongs to the latter category. Neither Tom Ewell or Edmond O'Brien became famous and Jayne Mansfield was famous for her... well, never mind. Seems like every decade has its share of Bo Dereks or Pamela Andersons. The plot itself is thin as a razorblade and one can't help suspect that it is mostly an attempt to sell records for Fats Domino, Little Richard or others of the 1950s rock acts that appear in the movie. If that music appeals to you this is worth watching. If not, don't bother. | negative negative negative negative | 1.0 |\n | There are two kinds of 1950s musicals. First you have the glossy MGM productions with big names and great music. 
And then you have the minor league with a less famous cast, less famous music and second rate directors. 'The Girl Can't Help It' belongs to the latter category. Neither Tom Ewell or Edmond O'Brien became famous and Jayne Mansfield was famous for her... well, never mind. Seems like every decade has its share of Bo Dereks or Pamela Andersons. The plot itself is thin as a razorblade and one can't help suspect that it is mostly an attempt to sell records for Fats Domino, Little Richard or others of the 1950s rock acts that appear in the movie. If that music appeals to you this is worth watching. If not, don't bother. | positive positive positive positive | 0.0 |\n | Thankfully as a student I have been able to watch \"Diagnosis Murder\" for a number of years now. It is basically about a doctor who solves murders with the help of his LAPD son, a young doctor and a pathologist. DM provided 8 seasons of exceptional entertainment. What made it different from the many other cop shows and worth watching many times over was its cast and quality of writing. The main cast gave good performances and Dick Van Dyke's entertainer roots shone through with the use of magic, dance and humor. The best aspects of DM was the fast pace, witty scripts and of course the toe tapping score. Sadly it has been unfairly compared to \"Murder, She Wrote\". DM is far superior boasting more difficult mysteries to solve and more variety. Now it is gone TV is a worse place. Gone are the days of feelgood, family friendly cop shows. Now there is just depressing 'gritty' ones. | positive positive positive positive | 1.0 |\n* Loss: [OnlineContrastiveLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#onlinecontrastiveloss)\n\n### Training Hyperparameters\n#### Non-Default Hyperparameters\n\n- `per_device_train_batch_size`: 64\n- `per_device_eval_batch_size`: 64\n\n#### All Hyperparameters\n
Click to expand\n\n- `overwrite_output_dir`: False\n- `do_predict`: False\n- `eval_strategy`: no\n- `prediction_loss_only`: True\n- `per_device_train_batch_size`: 64\n- `per_device_eval_batch_size`: 64\n- `per_gpu_train_batch_size`: None\n- `per_gpu_eval_batch_size`: None\n- `gradient_accumulation_steps`: 1\n- `eval_accumulation_steps`: None\n- `torch_empty_cache_steps`: None\n- `learning_rate`: 5e-05\n- `weight_decay`: 0.0\n- `adam_beta1`: 0.9\n- `adam_beta2`: 0.999\n- `adam_epsilon`: 1e-08\n- `max_grad_norm`: 1.0\n- `num_train_epochs`: 3.0\n- `max_steps`: -1\n- `lr_scheduler_type`: linear\n- `lr_scheduler_kwargs`: {}\n- `warmup_ratio`: 0.0\n- `warmup_steps`: 0\n- `log_level`: passive\n- `log_level_replica`: warning\n- `log_on_each_node`: True\n- `logging_nan_inf_filter`: True\n- `save_safetensors`: True\n- `save_on_each_node`: False\n- `save_only_model`: False\n- `restore_callback_states_from_checkpoint`: False\n- `no_cuda`: False\n- `use_cpu`: False\n- `use_mps_device`: False\n- `seed`: 42\n- `data_seed`: None\n- `jit_mode_eval`: False\n- `use_ipex`: False\n- `bf16`: False\n- `fp16`: False\n- `fp16_opt_level`: O1\n- `half_precision_backend`: auto\n- `bf16_full_eval`: False\n- `fp16_full_eval`: False\n- `tf32`: None\n- `local_rank`: 0\n- `ddp_backend`: None\n- `tpu_num_cores`: None\n- `tpu_metrics_debug`: False\n- `debug`: []\n- `dataloader_drop_last`: False\n- `dataloader_num_workers`: 0\n- `dataloader_prefetch_factor`: None\n- `past_index`: -1\n- `disable_tqdm`: False\n- `remove_unused_columns`: True\n- `label_names`: None\n- `load_best_model_at_end`: False\n- `ignore_data_skip`: False\n- `fsdp`: []\n- `fsdp_min_num_params`: 0\n- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}\n- `fsdp_transformer_layer_cls_to_wrap`: None\n- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': 
None}\n- `deepspeed`: None\n- `label_smoothing_factor`: 0.0\n- `optim`: adamw_torch\n- `optim_args`: None\n- `adafactor`: False\n- `group_by_length`: False\n- `length_column_name`: length\n- `ddp_find_unused_parameters`: None\n- `ddp_bucket_cap_mb`: None\n- `ddp_broadcast_buffers`: False\n- `dataloader_pin_memory`: True\n- `dataloader_persistent_workers`: False\n- `skip_memory_metrics`: True\n- `use_legacy_prediction_loop`: False\n- `push_to_hub`: False\n- `resume_from_checkpoint`: None\n- `hub_model_id`: None\n- `hub_strategy`: every_save\n- `hub_private_repo`: False\n- `hub_always_push`: False\n- `gradient_checkpointing`: False\n- `gradient_checkpointing_kwargs`: None\n- `include_inputs_for_metrics`: False\n- `eval_do_concat_batches`: True\n- `fp16_backend`: auto\n- `push_to_hub_model_id`: None\n- `push_to_hub_organization`: None\n- `mp_parameters`: \n- `auto_find_batch_size`: False\n- `full_determinism`: False\n- `torchdynamo`: None\n- `ray_scope`: last\n- `ddp_timeout`: 1800\n- `torch_compile`: False\n- `torch_compile_backend`: None\n- `torch_compile_mode`: None\n- `dispatch_batches`: None\n- `split_batches`: None\n- `include_tokens_per_second`: False\n- `include_num_input_tokens_seen`: False\n- `neftune_noise_alpha`: None\n- `optim_target_modules`: None\n- `batch_eval_metrics`: False\n- `eval_on_start`: False\n- `use_liger_kernel`: False\n- `eval_use_gather_object`: False\n- `batch_sampler`: batch_sampler\n- `multi_dataset_batch_sampler`: proportional\n\n
\n\n### Training Logs\n| Epoch | Step | Training Loss |\n|:-----:|:----:|:-------------:|\n| 0.2 | 50 | 2.9875 |\n| 0.4 | 100 | 0.9284 |\n| 0.6 | 150 | 0.7744 |\n| 0.8 | 200 | 0.7551 |\n| 1.0 | 250 | 0.6899 |\n| 1.2 | 300 | 0.6892 |\n| 1.4 | 350 | 0.6208 |\n| 1.6 | 400 | 0.6831 |\n| 1.8 | 450 | 0.6417 |\n| 2.0 | 500 | 0.7181 |\n| 2.2 | 550 | 0.7638 |\n| 2.4 | 600 | 0.7152 |\n| 2.6 | 650 | 0.6103 |\n| 2.8 | 700 | 0.6801 |\n| 3.0 | 750 | 0.5981 |\n\n\n### Framework Versions\n- Python: 3.10.12\n- Sentence Transformers: 3.1.1\n- Transformers: 4.45.2\n- PyTorch: 2.5.1+cu121\n- Accelerate: 1.1.1\n- Datasets: 2.21.0\n- Tokenizers: 0.20.3\n\n## Citation\n\n### BibTeX\n\n#### Sentence Transformers\n```bibtex\n@inproceedings{reimers-2019-sentence-bert,\n title = \"Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks\",\n author = \"Reimers, Nils and Gurevych, Iryna\",\n booktitle = \"Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing\",\n month = \"11\",\n year = \"2019\",\n publisher = \"Association for Computational Linguistics\",\n url = \"https://arxiv.org/abs/1908.10084\",\n}\n```\n\n\n\n\n\n", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "ELVISIO/jina_embeddings_v3_finetuned_online_contrastive_imdb_2", "base_model_relation": "base" }, { "model_id": "angelitasr/jina-embeddings-v3_eeid", "gated": "False", "card": "---\ntags:\n- sentence-transformers\n- sentence-similarity\n- feature-extraction\n- generated_from_trainer\n- dataset_size:3503\n- loss:MultipleNegativesRankingLoss\nbase_model: jinaai/jina-embeddings-v3\nwidget:\n- source_sentence: '###Question###:Factorising into a Double Bracket-Factorise a quadratic\n expression in the form x\u00b2 + bx - c-If\n\n \\(\n\n m^{2}+5 
m-14 \\equiv(m+a)(m+b)\n\n \\)\n\n then \\( a \\times b= \\)\n\n ###Correct Answer###:\\( -14 \\)\n\n ###Misconcepted Incorrect answer###:\\( 5 \\)'\n sentences:\n - Does not know that units of volume are usually cubed\n - Believes the coefficent of x in an expanded quadratic comes from multiplying the\n two numbers in the brackets\n - Does not copy a given method accurately\n- source_sentence: '###Question###:Rounding to the Nearest Whole (10, 100, etc)-Round\n non-integers to the nearest 10-What is \\( \\mathbf{8 6 9 8 . 9} \\) rounded to the\n nearest ten?\n\n ###Correct Answer###:\\( 8700 \\)\n\n ###Misconcepted Incorrect answer###:\\( 8699 \\)'\n sentences:\n - Rounds to the wrong degree of accuracy (rounds too much)\n - 'Believes division is commutative '\n - Believes that a number divided by itself equals 0\n- source_sentence: '###Question###:Simultaneous Equations-Solve linear simultaneous\n equations requiring a scaling of both expressions-If five cups of tea and two\n cups of coffee cost \\( \u00a3 3.70 \\), and two cups of tea and five cups of coffee\n cost \\( \u00a3 4.00 \\), what is the cost of a cup of tea and a cup of coffee?\n\n ###Correct Answer###:Tea \\( =50 \\mathrm{p} \\) coffee \\( =60 p \\)\n\n ###Misconcepted Incorrect answer###:\\( \\begin{array}{l}\\text { Tea }=0.5 \\\\ \\text\n { coffee }=0.6\\end{array} \\)'\n sentences:\n - Misinterprets the meaning of angles on a straight line angle fact\n - Does not include units in answer.\n - Believes midpoint calculation is just half of the difference\n- source_sentence: '###Question###:Quadratic Sequences-Find the nth term rule for\n ascending quadratic sequences in the form ax\u00b2 + bx + c-\\(\n\n 6,14,28,48,74, \\ldots\n\n \\)\n\n\n When calculating the nth-term rule of this sequence, what should replace the triangle?\n\n\n nth-term rule: \\( 3 n^{2} \\)\\( \\color{red}\\triangle \\) \\(n\\) \\( \\color{purple}\\square\n \\)\n\n\n ###Correct Answer###:\\( -1 \\)\n\n (or just a - sign)\n\n 
###Misconcepted Incorrect answer###:\\[\n\n +1\n\n \\]\n\n (or just a + sign)'\n sentences:\n - 'When finding the differences between terms in a sequence, believes they can do\n so from right to left '\n - When solving an equation forgets to eliminate the coefficient in front of the\n variable in the last step\n - Believes parallelogram is the term used to describe two lines at right angles\n- source_sentence: '###Question###:Written Multiplication-Multiply 2 digit integers\n by 2 digit integers using long multiplication-Which working out is correct for\n $72 \\times 36$?\n\n ###Correct Answer###:![ Long multiplication for 72 multiplied by 36 with correct\n working and correct final answer. First row of working is correct: 4 3 2. Second\n row of working is correct: 2 1 6 0. Final answer is correct: 2 5 9 2.]()\n\n ###Misconcepted Incorrect answer###:![ Long multiplication for 72 multiplied by\n 36 with incorrect working and incorrect final answer. First row of working is\n incorrect: 4 2 2. Second row of working is incorrect: 2 7. Final answer is incorrect:\n 4 4 9.]()'\n sentences:\n - When solving an equation forgets to eliminate the coefficient in front of the\n variable in the last step\n - Thinks a variable next to a number means addition rather than multiplication\n - When two digits multiply to 10 or more during a multiplication problem, does not\n add carried value to the preceding digit\npipeline_tag: sentence-similarity\nlibrary_name: sentence-transformers\n---\n\n# SentenceTransformer based on jinaai/jina-embeddings-v3\n\nThis is a [sentence-transformers](https://www.SBERT.net) model finetuned from [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3). 
It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.\n\n## Model Details\n\n### Model Description\n- **Model Type:** Sentence Transformer\n- **Base model:** [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) \n- **Maximum Sequence Length:** 8194 tokens\n- **Output Dimensionality:** 1024 tokens\n- **Similarity Function:** Cosine Similarity\n\n\n\n\n### Model Sources\n\n- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)\n- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)\n- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)\n\n### Full Model Architecture\n\n```\nSentenceTransformer(\n (transformer): Transformer(\n (auto_model): XLMRobertaLoRA(\n (roberta): XLMRobertaModel(\n (embeddings): XLMRobertaEmbeddings(\n (word_embeddings): ParametrizedEmbedding(\n 250002, 1024, padding_idx=1\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (token_type_embeddings): ParametrizedEmbedding(\n 1, 1024\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (emb_drop): Dropout(p=0.1, inplace=False)\n (emb_ln): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n (encoder): XLMRobertaEncoder(\n (layers): ModuleList(\n (0-23): 24 x Block(\n (mixer): MHA(\n (rotary_emb): RotaryEmbedding()\n (Wqkv): ParametrizedLinearResidual(\n in_features=1024, out_features=3072, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (inner_attn): FlashSelfAttention(\n (drop): Dropout(p=0.1, inplace=False)\n )\n (inner_cross_attn): FlashCrossAttention(\n (drop): Dropout(p=0.1, inplace=False)\n )\n 
(out_proj): ParametrizedLinear(\n in_features=1024, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (dropout1): Dropout(p=0.1, inplace=False)\n (drop_path1): StochasticDepth(p=0.0, mode=row)\n (norm1): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n (mlp): Mlp(\n (fc1): ParametrizedLinear(\n in_features=1024, out_features=4096, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (fc2): ParametrizedLinear(\n in_features=4096, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (dropout2): Dropout(p=0.1, inplace=False)\n (drop_path2): StochasticDepth(p=0.0, mode=row)\n (norm2): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n )\n )\n )\n (pooler): XLMRobertaPooler(\n (dense): ParametrizedLinear(\n in_features=1024, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (activation): Tanh()\n )\n )\n )\n )\n (pooler): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})\n (normalizer): Normalize()\n)\n```\n\n## Usage\n\n### Direct Usage (Sentence Transformers)\n\nFirst install the Sentence Transformers library:\n\n```bash\npip install -U sentence-transformers\n```\n\nThen you can load this model and run inference.\n```python\nfrom sentence_transformers import SentenceTransformer\n\n# Download from the \ud83e\udd17 Hub\nmodel = SentenceTransformer(\"sentence_transformers_model_id\")\n# Run inference\nsentences = [\n '###Question###:Written Multiplication-Multiply 2 digit integers by 2 
digit integers using long multiplication-Which working out is correct for $72 \\\\times 36$?\\n###Correct Answer###:![ Long multiplication for 72 multiplied by 36 with correct working and correct final answer. First row of working is correct: 4 3 2. Second row of working is correct: 2 1 6 0. Final answer is correct: 2 5 9 2.]()\\n###Misconcepted Incorrect answer###:![ Long multiplication for 72 multiplied by 36 with incorrect working and incorrect final answer. First row of working is incorrect: 4 2 2. Second row of working is incorrect: 2 7. Final answer is incorrect: 4 4 9.]()',\n 'When two digits multiply to 10 or more during a multiplication problem, does not add carried value to the preceding digit',\n 'Thinks a variable next to a number means addition rather than multiplication',\n]\nembeddings = model.encode(sentences)\nprint(embeddings.shape)\n# [3, 1024]\n\n# Get the similarity scores for the embeddings\nsimilarities = model.similarity(embeddings, embeddings)\nprint(similarities.shape)\n# [3, 3]\n```\n\n\n\n\n\n\n\n\n\n\n\n## Training Details\n\n### Training Dataset\n\n#### Unnamed Dataset\n\n\n* Size: 3,503 training samples\n* Columns: anchor and positive\n* Approximate statistics based on the first 1000 samples:\n | | anchor | positive |\n |:--------|:-------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|\n | type | string | string |\n | details |
  • min: 59 tokens, mean: 131.26 tokens, max: 449 tokens | • min: 6 tokens, mean: 17.43 tokens, max: 46 tokens
|\n* Samples:\n | anchor | positive |\n |:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------|\n | ###Question###:Area of Simple Shapes-Calculate the area of a parallelogram where the dimensions are given in the same units-What is the area of this shape? ![A parallelogram drawn on a square grid in purple with an area of 9 square units. The base is length 3 squares and the perpendicular height is also length 3 squares.]()
###Correct Answer###:\\( 9 \\) ###Misconcepted Incorrect answer###:\\( 12 \\)
| Counts half-squares as full squares when calculating area on a square grid |\n | ###Question###:Substitution into Formula-Substitute into simple formulae given in words-A theme park charges \\( \u00a3 8 \\) entry fee and then \\( \u00a3 3 \\) for every ride you go on.
Heena goes on \\( 5 \\) rides. How much does she pay in total? ###Correct Answer###:\\( \u00a3 23 \\) ###Misconcepted Incorrect answer###:\\( \u00a3 55 \\)
| Combines variables with constants when writing a formula from a given situation |\n | ###Question###:Trial and Improvement and Iterative Methods-Use area to write algebraic expressions-The area of the rectangle on the right is \\( 8 \\mathrm{~cm}^{2} \\).

Which of the following equations can we write from the information given? ![A rectangle with the short side labelled \\(x\\) and the opposite side labelled \\(x^2 + 9\\).]() ###Correct Answer###:\\( x^{3}+9 x=8 \\) ###Misconcepted Incorrect answer###:\\( x^{3}+9=8 \\)
| Only multiplies the first term in the expansion of a bracket |\n* Loss: [MultipleNegativesRankingLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:\n ```json\n {\n \"scale\": 20.0,\n \"similarity_fct\": \"cos_sim\"\n }\n ```\n\n### Training Hyperparameters\n#### Non-Default Hyperparameters\n\n- `num_train_epochs`: 10\n- `push_to_hub`: True\n- `batch_sampler`: no_duplicates\n\n#### All Hyperparameters\n
Click to expand\n\n- `overwrite_output_dir`: False\n- `do_predict`: False\n- `eval_strategy`: no\n- `prediction_loss_only`: True\n- `per_device_train_batch_size`: 8\n- `per_device_eval_batch_size`: 8\n- `per_gpu_train_batch_size`: None\n- `per_gpu_eval_batch_size`: None\n- `gradient_accumulation_steps`: 1\n- `eval_accumulation_steps`: None\n- `torch_empty_cache_steps`: None\n- `learning_rate`: 5e-05\n- `weight_decay`: 0.0\n- `adam_beta1`: 0.9\n- `adam_beta2`: 0.999\n- `adam_epsilon`: 1e-08\n- `max_grad_norm`: 1.0\n- `num_train_epochs`: 10\n- `max_steps`: -1\n- `lr_scheduler_type`: linear\n- `lr_scheduler_kwargs`: {}\n- `warmup_ratio`: 0.0\n- `warmup_steps`: 0\n- `log_level`: passive\n- `log_level_replica`: warning\n- `log_on_each_node`: True\n- `logging_nan_inf_filter`: True\n- `save_safetensors`: True\n- `save_on_each_node`: False\n- `save_only_model`: False\n- `restore_callback_states_from_checkpoint`: False\n- `no_cuda`: False\n- `use_cpu`: False\n- `use_mps_device`: False\n- `seed`: 42\n- `data_seed`: None\n- `jit_mode_eval`: False\n- `use_ipex`: False\n- `bf16`: False\n- `fp16`: False\n- `fp16_opt_level`: O1\n- `half_precision_backend`: auto\n- `bf16_full_eval`: False\n- `fp16_full_eval`: False\n- `tf32`: None\n- `local_rank`: 0\n- `ddp_backend`: None\n- `tpu_num_cores`: None\n- `tpu_metrics_debug`: False\n- `debug`: []\n- `dataloader_drop_last`: False\n- `dataloader_num_workers`: 0\n- `dataloader_prefetch_factor`: None\n- `past_index`: -1\n- `disable_tqdm`: False\n- `remove_unused_columns`: True\n- `label_names`: None\n- `load_best_model_at_end`: False\n- `ignore_data_skip`: False\n- `fsdp`: []\n- `fsdp_min_num_params`: 0\n- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}\n- `fsdp_transformer_layer_cls_to_wrap`: None\n- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}\n- 
`deepspeed`: None\n- `label_smoothing_factor`: 0.0\n- `optim`: adamw_torch\n- `optim_args`: None\n- `adafactor`: False\n- `group_by_length`: False\n- `length_column_name`: length\n- `ddp_find_unused_parameters`: None\n- `ddp_bucket_cap_mb`: None\n- `ddp_broadcast_buffers`: False\n- `dataloader_pin_memory`: True\n- `dataloader_persistent_workers`: False\n- `skip_memory_metrics`: True\n- `use_legacy_prediction_loop`: False\n- `push_to_hub`: True\n- `resume_from_checkpoint`: None\n- `hub_model_id`: None\n- `hub_strategy`: every_save\n- `hub_private_repo`: False\n- `hub_always_push`: False\n- `gradient_checkpointing`: False\n- `gradient_checkpointing_kwargs`: None\n- `include_inputs_for_metrics`: False\n- `eval_do_concat_batches`: True\n- `fp16_backend`: auto\n- `push_to_hub_model_id`: None\n- `push_to_hub_organization`: None\n- `mp_parameters`: \n- `auto_find_batch_size`: False\n- `full_determinism`: False\n- `torchdynamo`: None\n- `ray_scope`: last\n- `ddp_timeout`: 1800\n- `torch_compile`: False\n- `torch_compile_backend`: None\n- `torch_compile_mode`: None\n- `dispatch_batches`: None\n- `split_batches`: None\n- `include_tokens_per_second`: False\n- `include_num_input_tokens_seen`: False\n- `neftune_noise_alpha`: None\n- `optim_target_modules`: None\n- `batch_eval_metrics`: False\n- `eval_on_start`: False\n- `use_liger_kernel`: False\n- `eval_use_gather_object`: False\n- `batch_sampler`: no_duplicates\n- `multi_dataset_batch_sampler`: proportional\n\n
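The handful of non-default values in the list above can be collected into a training-arguments sketch. This is a minimal configuration fragment, assuming sentence-transformers v3; the output directory is a placeholder, and the defaults shown are included only for clarity:

```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers, MultiDatasetBatchSamplers

# Sketch of the settings listed above; output_dir is a placeholder.
args = SentenceTransformerTrainingArguments(
    output_dir="output/finetuned-model",  # placeholder path
    num_train_epochs=10,
    per_device_train_batch_size=8,        # default, shown for clarity
    learning_rate=5e-5,                   # default, shown for clarity
    push_to_hub=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
    multi_dataset_batch_sampler=MultiDatasetBatchSamplers.PROPORTIONAL,
)
```

All remaining arguments keep the default values shown in the list.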
\n\n### Training Logs\n| Epoch | Step | Training Loss |\n|:------:|:----:|:-------------:|\n| 1.1416 | 500 | 0.3244 |\n| 2.2831 | 1000 | 0.1048 |\n| 3.4247 | 1500 | 0.0394 |\n| 4.5662 | 2000 | 0.0211 |\n| 5.7078 | 2500 | 0.0145 |\n| 6.8493 | 3000 | 0.0114 |\n| 7.9909 | 3500 | 0.0106 |\n| 9.1324 | 4000 | 0.0092 |\n\n\n### Framework Versions\n- Python: 3.10.12\n- Sentence Transformers: 3.1.1\n- Transformers: 4.45.2\n- PyTorch: 2.5.1+cu121\n- Accelerate: 1.1.1\n- Datasets: 3.1.0\n- Tokenizers: 0.20.3\n\n## Citation\n\n### BibTeX\n\n#### Sentence Transformers\n```bibtex\n@inproceedings{reimers-2019-sentence-bert,\n title = \"Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks\",\n author = \"Reimers, Nils and Gurevych, Iryna\",\n booktitle = \"Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing\",\n month = \"11\",\n year = \"2019\",\n publisher = \"Association for Computational Linguistics\",\n url = \"https://arxiv.org/abs/1908.10084\",\n}\n```\n\n#### MultipleNegativesRankingLoss\n```bibtex\n@misc{henderson2017efficient,\n title={Efficient Natural Language Response Suggestion for Smart Reply},\n author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},\n year={2017},\n eprint={1705.00652},\n archivePrefix={arXiv},\n primaryClass={cs.CL}\n}\n```\n\n\n\n\n\n", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "angelitasr/jina-embeddings-v3_eeid", "base_model_relation": "base" }, { "model_id": "tboquet/m2v-jina-embeddings-v3-pca-256", "gated": "False", "card": "---\nbase_model: jinaai/jina-embeddings-v3\nlanguage:\n- multilingual\n- af\n- am\n- ar\n- as\n- az\n- be\n- bg\n- 
bn\n- br\n- bs\n- ca\n- cs\n- cy\n- da\n- de\n- el\n- en\n- eo\n- es\n- et\n- eu\n- fa\n- fi\n- fr\n- fy\n- ga\n- gd\n- gl\n- gu\n- ha\n- he\n- hi\n- hr\n- hu\n- hy\n- id\n- is\n- it\n- ja\n- jv\n- ka\n- kk\n- km\n- kn\n- ko\n- ku\n- ky\n- la\n- lo\n- lt\n- lv\n- mg\n- mk\n- ml\n- mn\n- mr\n- ms\n- my\n- ne\n- nl\n- 'no'\n- om\n- or\n- pa\n- pl\n- ps\n- pt\n- ro\n- ru\n- sa\n- sd\n- si\n- sk\n- sl\n- so\n- sq\n- sr\n- su\n- sv\n- sw\n- ta\n- te\n- th\n- tl\n- tr\n- ug\n- uk\n- ur\n- uz\n- vi\n- xh\n- yi\n- zh\nlibrary_name: model2vec\nlicense: mit\nmodel_name: tboquet/m2v-jina-embeddings-v3-pca-256\ntags:\n- embeddings\n- static-embeddings\n- sentence-transformers\n---\n\n# tboquet/m2v-jina-embeddings-v3-pca-256 Model Card\n\nThis [Model2Vec](https://github.com/MinishLab/model2vec) model is a distilled version of the [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) Sentence Transformer. It uses static embeddings, allowing text embeddings to be computed orders of magnitude faster on both GPU and CPU. 
It is designed for applications where computational resources are limited or where real-time performance is critical.\n\n\n## Installation\n\nInstall model2vec using pip:\n```\npip install model2vec\n```\n\n## Usage\nLoad this model using the `from_pretrained` method:\n```python\nfrom model2vec import StaticModel\n\n# Load a pretrained Model2Vec model\nmodel = StaticModel.from_pretrained(\"tboquet/m2v-jina-embeddings-v3-pca-256\")\n\n# Compute text embeddings\nembeddings = model.encode([\"Example sentence\"])\n```\n\nAlternatively, you can distill your own model using the `distill` method:\n```python\nfrom model2vec.distill import distill\n\n# Choose a Sentence Transformer model\nmodel_name = \"BAAI/bge-base-en-v1.5\"\n\n# Distill the model\nm2v_model = distill(model_name=model_name, pca_dims=256)\n\n# Save the model\nm2v_model.save_pretrained(\"m2v_model\")\n```\n\n## How it works\n\nModel2vec creates a small, fast, and powerful model that outperforms other static embedding models by a large margin on all tasks we could find, while being much faster to create than traditional static embedding models such as GloVe. Best of all, you don't need any data to distill a model using Model2Vec.\n\nIt works by passing a vocabulary through a sentence transformer model, then reducing the dimensionality of the resulting embeddings using PCA, and finally weighting the embeddings using zipf weighting. 
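As an illustration of the pipeline just described, here is a schematic numpy sketch of the PCA and Zipf-weighting steps. The random embeddings, the shapes, and the rank-based weight formula are illustrative assumptions, not Model2Vec's actual implementation:

```python
import numpy as np

# Toy stand-in for "a vocabulary passed through a sentence transformer":
# one embedding per vocabulary token (random here for illustration).
rng = np.random.default_rng(0)
vocab_size, dim, pca_dims = 100, 8, 4
token_embeddings = rng.normal(size=(vocab_size, dim))

# 1. PCA: project centered embeddings onto the top principal components.
centered = token_embeddings - token_embeddings.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
reduced = centered @ vt[:pca_dims].T          # shape: (vocab_size, pca_dims)

# 2. Zipf weighting: down-weight frequent tokens, assuming token ids are
# ordered from most to least frequent (illustrative rank-based formula).
weights = np.log(1 + np.arange(1, vocab_size + 1))
static_embeddings = reduced * weights[:, None]

# 3. Inference: a sentence embedding is the mean of its token embeddings.
token_ids = [3, 17, 42]                        # hypothetical tokenized sentence
sentence_embedding = static_embeddings[token_ids].mean(axis=0)
print(sentence_embedding.shape)                # (pca_dims,)
```

Step 3 is the whole inference path, which is why the resulting static model is so fast: no transformer forward pass is needed at query time.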
During inference, we simply take the mean of all token embeddings occurring in a sentence.\n\n## Additional Resources\n\n- [All Model2Vec models on the hub](https://huggingface.co/models?library=model2vec)\n- [Model2Vec Repo](https://github.com/MinishLab/model2vec)\n- [Model2Vec Results](https://github.com/MinishLab/model2vec?tab=readme-ov-file#results)\n- [Model2Vec Tutorials](https://github.com/MinishLab/model2vec/tree/main/tutorials)\n\n## Library Authors\n\nModel2Vec was developed by the [Minish Lab](https://github.com/MinishLab) team consisting of [Stephan Tulkens](https://github.com/stephantul) and [Thomas van Dongen](https://github.com/Pringled).\n\n## Citation\n\nPlease cite the [Model2Vec repository](https://github.com/MinishLab/model2vec) if you use this model in your work.\n```\n@software{minishlab2024model2vec,\n authors = {Stephan Tulkens, Thomas van Dongen},\n title = {Model2Vec: Turn any Sentence Transformer into a Small Fast Model},\n year = {2024},\n url = {https://github.com/MinishLab/model2vec},\n}\n```", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "tboquet/m2v-jina-embeddings-v3-pca", "base_model_relation": "finetune" }, { "model_id": "Jrinky/jina_final_temp", "gated": "False", "card": "---\ntags:\n- sentence-transformers\n- sentence-similarity\n- feature-extraction\n- generated_from_trainer\n- dataset_size:589508\n- loss:CachedInfonce\nbase_model: jinaai/jina-embeddings-v3\nwidget:\n- source_sentence: What are some examples of postgraduate fellowships in the United\n States and Canada\n sentences:\n - \"Fellowships as a training program\\nFellowships may involve a short placement\\\n \\ for capacity building, e.g., to get more experience in government, such as the\\\n \\ American Association for 
the Advancement of Science's fellowships and the American\\\n \\ Academy of Arts and Sciences Fellowship programs. Some institutions offer fellowships\\\n \\ as a professional training program as well as a financial grant, such as the\\\n \\ Balsillie School of International Affairs, where tuition and other fees are\\\n \\ paid by the fellowship. Fellowships as a special membership grade\\n\\nFellows\\\n \\ are often the highest grade of membership of many professional associations\\\n \\ or learned societies, for example, the Chartered Institute of Arbitrators, the\\\n \\ Chartered Governance Institute or Royal College of Surgeons. Lower grades are\\\n \\ referred to as members (who typically share voting rights with the fellows),\\\n \\ or associates (who may or may not, depending on whether \\\"associate\\\" status\\\n \\ is a form of full membership). Additional grades of membership exist in, for\\\n \\ example, the IEEE and the ACM. Fellowships of this type can be awarded as a\\\n \\ title of honor in their own right, e.g. the Fellowship of the Royal Society\\\n \\ (FRS). Exclusive learned societies such as the Royal Society have Fellow as\\\n \\ the only grade of membership. Appointment as an honorary fellow in a learned\\\n \\ or professional society can be either to honour exceptional achievement or service\\\n \\ within the professional domain of the awarding body or to honour contributions\\\n \\ related to the domain from someone who is professionally outside it. Membership\\\n \\ of the awarding body may or may not be a requirement. 
How a fellowship is awarded\\\n \\ varies for each society, but may typically involve some or all of these:\\n A\\\n \\ qualifying period in a lower grade\\n Passing a series of examinations\\n Nomination\\\n \\ by two existing fellows who know the applicant professionally\\n Evidence of\\\n \\ continued formal training post-qualification\\n Evidence of substantial achievement\\\n \\ in the subject area\\n Submission of a thesis or portfolio of works which will\\\n \\ be examined\\n Election by a vote of the fellowship\\n\\nIn ancient universities\\n\\\n \\nAt the ancient universities of the University of Oxford, the University of Cambridge,\\\n \\ and Trinity College, Dublin, members of the teaching staff typically have two\\\n \\ affiliations: one as a reader, lecturer, or other academic rank within a department\\\n \\ of the university, as at other universities, and a second affiliation as a fellow\\\n \\ of one of the colleges of the university. The fellows, sometimes referred to\\\n \\ as university dons, form the governing body of the college. They may elect a\\\n \\ council to handle day-to-day management.\"\n - If you are an enrolled domestic or international student studying a full degree\n program, you may be eligible to study overseas! We have over 70 partner institutions\n worldwide and the opportunities are endless. Visit our USC International and Study\n Overseas blog to learn more about the amazing experiences our students are having\n abroad.\n - 'The title (senior) fellow can also be bestowed to an academic member of staff\n upon retirement who continues to be affiliated to a university in the United Kingdom.\n The term teaching fellow or teaching assistant is used, in the United States and\n United Kingdom, in secondary school, high school and middle school setting for\n students or adults that assist a teacher with one or more classes. 
Medical fellowships\n\n\n In US medical institutions, a fellow refers to someone who has completed residency\n training (e.g. in internal medicine, pediatrics, general surgery, etc.) and is\n currently in a 1 to 3 year subspecialty training program (e.g. cardiology, pediatric\n nephrology, transplant surgery, etc.). Research fellowships\n\n\n As an academic position\n\n\n The title of research fellow may be used to denote an academic position at a university\n or a similar institution; it is roughly equivalent to the title of lecturer in\n the Commonwealth teaching career pathway. As a financial grant\n\n Research fellow may also refer to the recipient of academic financial grant or\n scholarship. For example, in Germany, institutions such as the Alexander von Humboldt\n Foundation offer research fellowship for postdoctoral research and refer to the\n holder as research fellows, while the award holder may formally hold a specific\n academic title at their home institution (e.g., Privatdozent). These are often\n shortened to the name of the programme or organization, e.g. Dorothy Hodgkin Fellow\n rather than Dorothy Hodgkin Research Fellow, except where this might cause confusion\n with another fellowship, (e.g. Royal Society University Research Fellowship.)'\n - \"In the context of graduate school in the United States and Canada, a fellow is\\\n \\ a recipient of a postgraduate fellowship. Examples include the NSF Graduate\\\n \\ Research Fellowship, the DoD National Defense Science and Engineering Graduate\\\n \\ Fellowship, the DOE Computational Science Graduate Fellowship, the Guggenheim\\\n \\ Fellowship, the Rosenthal Fellowship, the Frank Knox Memorial Fellowship, the\\\n \\ Woodrow Wilson Teaching Fellowship and the Presidential Management Fellowship.\\\n \\ It is granted to prospective or current students, on the basis of their academic\\\n \\ or research achievements. 
In the UK, research fellowships are awarded to support\\\n \\ postdoctoral researchers such as those funded by the Wellcome Trust and the\\\n \\ Biotechnology and Biological Sciences Research Council (BBSRC). At ETH Zurich,\\\n \\ postdoctoral fellowships support incoming researchers. The MacArthur Fellows\\\n \\ Program (aka \\\"genius grant\\\") as prestigious research fellowship awarded in\\\n \\ the United States. Fellowships as a training program\\nFellowships may involve\\\n \\ a short placement for capacity building, e.g., to get more experience in government,\\\n \\ such as the American Association for the Advancement of Science's fellowships\\\n \\ and the American Academy of Arts and Sciences Fellowship programs. Some institutions\\\n \\ offer fellowships as a professional training program as well as a financial\\\n \\ grant, such as the Balsillie School of International Affairs, where tuition\\\n \\ and other fees are paid by the fellowship. Fellowships as a special membership\\\n \\ grade\\n\\nFellows are often the highest grade of membership of many professional\\\n \\ associations or learned societies, for example, the Chartered Institute of Arbitrators,\\\n \\ the Chartered Governance Institute or Royal College of Surgeons. Lower grades\\\n \\ are referred to as members (who typically share voting rights with the fellows),\\\n \\ or associates (who may or may not, depending on whether \\\"associate\\\" status\\\n \\ is a form of full membership). Additional grades of membership exist in, for\\\n \\ example, the IEEE and the ACM. Fellowships of this type can be awarded as a\\\n \\ title of honor in their own right, e.g. the Fellowship of the Royal Society\\\n \\ (FRS). Exclusive learned societies such as the Royal Society have Fellow as\\\n \\ the only grade of membership. 
Appointment as an honorary fellow in a learned\\\n \\ or professional society can be either to honour exceptional achievement or service\\\n \\ within the professional domain of the awarding body or to honour contributions\\\n \\ related to the domain from someone who is professionally outside it. Membership\\\n \\ of the awarding body may or may not be a requirement. How a fellowship is awarded\\\n \\ varies for each society, but may typically involve some or all of these:\\n A\\\n \\ qualifying period in a lower grade\\n Passing a series of examinations\\n Nomination\\\n \\ by two existing fellows who know the applicant professionally\\n Evidence of\\\n \\ continued formal training post-qualification\\n Evidence of substantial achievement\\\n \\ in the subject area\\n Submission of a thesis or portfolio of works which will\\\n \\ be examined\\n Election by a vote of the fellowship\\n\\nIn ancient universities\\n\\\n \\nAt the ancient universities of the University of Oxford, the University of Cambridge,\\\n \\ and Trinity College, Dublin, members of the teaching staff typically have two\\\n \\ affiliations: one as a reader, lecturer, or other academic rank within a department\\\n \\ of the university, as at other universities, and a second affiliation as a fellow\\\n \\ of one of the colleges of the university. The fellows, sometimes referred to\\\n \\ as university dons, form the governing body of the college. They may elect a\\\n \\ council to handle day-to-day management. All fellows are entitled to certain\\\n \\ privileges within their colleges, which may include dining at High Table (free\\\n \\ of charge) and possibly the right to a room in college (free of charge). At\\\n \\ Cambridge, retired academics may remain fellows. At Oxford, however, a Governing\\\n \\ Body fellow would normally be elected a fellow emeritus and would leave the\\\n \\ Governing Body upon his or her retirement. 
Distinguished old members of the\\\n \\ college, or its benefactors and friends, might also be elected 'Honorary Fellow',\\\n \\ normally for life; but beyond limited dining rights this is merely an honour.\\\n \\ Most Oxford colleges have 'Fellows by Special Election' or 'Supernumerary Fellows',\\\n \\ who may be members of the teaching staff, but not necessarily members of the\\\n \\ Governing Body. Some senior administrators of a college such as bursars are\\\n \\ made fellows, and thereby become members of the governing body, because of their\\\n \\ importance to the running of a college.\"\n- source_sentence: What kind of plants or decorations are described as popular, fresh,\n and plentiful in the garden at this time of year\n sentences:\n - Enjoy the beautiful scent of gardenia, rosemary, and lavender from your garden.\n Hurry this will not last.\n - Things have been given the opportunity to grow whichever they want but not out\n of neglect per se. Somehow it adds whimsy and mystery to the courtyard.\n - I thought it might be fun to show how this garden goes though the season. The\n perennials will be the fastest to clean up.. clear out the pathways, and bed them\n down well.\n - I'm not surprised that they are so popular, they are fresh, green, with the jolly\n berries AND plentiful in the garden, this time of year, and also, so very decorative.\n I am trying to use as little light and dof as possible, a challenge that I love.\n- source_sentence: When was the Santa Venera church in Avola constructed\n sentences:\n - The dome collapsed in the earthquake of 1848, and was not reconstructed until\n 1962 by the engineer Pietro Lojacono. The decorated three story facade, flanked\n by volutes and obelisks, houses a statue of Saint Venera, patron of Avola, above\n the central portal.\n - 'Santa Venera is a Baroque style church located on Piazza Teatro in the town of\n Avola, province of Siracusa, region of Sicily, Italy. 
History and description\n\n Construction of a church at the site took place from 1713-1715 using designs attributed\n to Michelangelo Alessi.'\n - 'The Saint Bavo Church (Dutch: Sint-Bavokerk, Sint-Baafskerk) is a Dutch Reformed\n church building in Aardenburg, Netherlands. The church was founded in 959 by monks\n of the Saint Bavo''s Abbey in Ghent. Due to a rise in population this small church\n was replaced by a Romanesque church which burned down in 1202. In 1220 the current\n tower, nave and transept were built.'\n - The decorated three story facade, flanked by volutes and obelisks, houses a statue\n of Saint Venera, patron of Avola, above the central portal. The interior has three\n naves.\n- source_sentence: What is the last dream the speaker mentions\n sentences:\n - Have you ever felt like the dreams you had have never become reality? Have you\n ever felt like you need someone to spark the flame for you\n - I'm very new to this, so I'm not sure what I'm doing with the technical side of\n things. Please bear with me if I've got anything wrong. \"Night Thoughts And Dreams\"\n is the first thing I've written in about two years. I used to write all the time,\n but then I just stopped, however \"Sherlock\" and Benedict Cumberbatch have inspired\n me to have another go.\n - I had a fantastic phone conversation with my brother today. 
I also had a nightmare\n where a man pulled off his skin like a shirt.\n - They don't bury me without my uniform.\" \"My last dream is to be in Cooperstown-to\n be with those guys.\n- source_sentence: What is the description of the Myrmecoleon and what are its two\n interpretations\n sentences:\n - 'The stone lies at the bottom of the sea and comes to life early in the morning.\n When it rises from its resting-place to the surface of the sea, it opens its mouth\n and takes in some heavenly dew, and the rays of the sun shine around it; thus\n there grows within the stone a most precious, shining pearl indeed, conceived\n from the heavenly dew and given lustre by the rays of the sun.\" Interpretations\n\n\n There are two interpretations of what a Myrmecoleon is. In one version, the antlion\n is so called because it is the \"lion of ants\", a large ant or small animal that\n hides in the dust and kills ants. In the other version, it is a beast that is\n the result of a mating between a lion and an ant. It has the face of a lion and\n the body of an ant, with each part having its appropriate nature. Because the\n lion part will only eat meat and the ant part can only digest grain, the ant-lion\n starves.'\n - It is found in Medieval bestiaries such as the Hortus Sanitatis of Jacob Meydenbach.\n It is also referenced in some sources as a Formicaleon (Antlion), Formicaleun\n or Mirmicioleon.\n - Microdiprion is a genus of sawflies belonging to the family Diprionidae.\n - Macrodon is a genus of marine ray-finned fishes belonging to the family Sciaenidae,\n the drums and croakers.\npipeline_tag: sentence-similarity\nlibrary_name: sentence-transformers\n---\n\n# SentenceTransformer based on jinaai/jina-embeddings-v3\n\nThis is a [sentence-transformers](https://www.SBERT.net) model finetuned from [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) on the hard_negative_merged dataset. 
It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.\n\n## Model Details\n\n### Model Description\n- **Model Type:** Sentence Transformer\n- **Base model:** [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) \n- **Maximum Sequence Length:** 2048 tokens\n- **Output Dimensionality:** 1024 dimensions\n- **Similarity Function:** Cosine Similarity\n- **Training Dataset:**\n - hard_negative_merged\n\n\n\n### Model Sources\n\n- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)\n- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)\n- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)\n\n### Full Model Architecture\n\n```\nSentenceTransformer(\n (transformer): Transformer(\n (auto_model): XLMRobertaLoRA(\n (roberta): XLMRobertaModel(\n (embeddings): XLMRobertaEmbeddings(\n (word_embeddings): ParametrizedEmbedding(\n 250002, 1024, padding_idx=1\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (token_type_embeddings): ParametrizedEmbedding(\n 1, 1024\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (emb_drop): Dropout(p=0.1, inplace=False)\n (emb_ln): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n (encoder): XLMRobertaEncoder(\n (layers): ModuleList(\n (0-23): 24 x Block(\n (mixer): MHA(\n (rotary_emb): RotaryEmbedding()\n (Wqkv): ParametrizedLinearResidual(\n in_features=1024, out_features=3072, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (inner_attn): FlashSelfAttention(\n (drop): Dropout(p=0.1, inplace=False)\n )\n (inner_cross_attn): 
FlashCrossAttention(\n (drop): Dropout(p=0.1, inplace=False)\n )\n (out_proj): ParametrizedLinear(\n in_features=1024, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (dropout1): Dropout(p=0.1, inplace=False)\n (drop_path1): StochasticDepth(p=0.0, mode=row)\n (norm1): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n (mlp): Mlp(\n (fc1): ParametrizedLinear(\n in_features=1024, out_features=4096, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (fc2): ParametrizedLinear(\n in_features=4096, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (dropout2): Dropout(p=0.1, inplace=False)\n (drop_path2): StochasticDepth(p=0.0, mode=row)\n (norm2): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n )\n )\n )\n (pooler): XLMRobertaPooler(\n (dense): ParametrizedLinear(\n in_features=1024, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (activation): Tanh()\n )\n )\n )\n )\n (pooler): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})\n (normalizer): Normalize()\n)\n```\n\n## Usage\n\n### Direct Usage (Sentence Transformers)\n\nFirst install the Sentence Transformers library:\n\n```bash\npip install -U sentence-transformers\n```\n\nThen you can load this model and run inference.\n```python\nfrom sentence_transformers import SentenceTransformer\n\n# Download from the \ud83e\udd17 Hub\nmodel = SentenceTransformer(\"Jrinky/jina_final_temp\")\n# Run inference\nsentences = [\n 'What is the 
description of the Myrmecoleon and what are its two interpretations',\n 'The stone lies at the bottom of the sea and comes to life early in the morning. When it rises from its resting-place to the surface of the sea, it opens its mouth and takes in some heavenly dew, and the rays of the sun shine around it; thus there grows within the stone a most precious, shining pearl indeed, conceived from the heavenly dew and given lustre by the rays of the sun.\" Interpretations\\n\\nThere are two interpretations of what a Myrmecoleon is. In one version, the antlion is so called because it is the \"lion of ants\", a large ant or small animal that hides in the dust and kills ants. In the other version, it is a beast that is the result of a mating between a lion and an ant. It has the face of a lion and the body of an ant, with each part having its appropriate nature. Because the lion part will only eat meat and the ant part can only digest grain, the ant-lion starves.',\n 'It is found in Medieval bestiaries such as the Hortus Sanitatis of Jacob Meydenbach. 
It is also referenced in some sources as a Formicaleon (Antlion), Formicaleun or Mirmicioleon.',\n]\nembeddings = model.encode(sentences)\nprint(embeddings.shape)\n# [3, 1024]\n\n# Get the similarity scores for the embeddings\nsimilarities = model.similarity(embeddings, embeddings)\nprint(similarities.shape)\n# [3, 3]\n```\n\n\n\n\n\n\n\n\n\n\n\n## Training Details\n\n### Training Dataset\n\n#### hard_negative_merged\n\n* Dataset: hard_negative_merged\n* Size: 589,508 training samples\n* Columns: anchor, positive, negative_1, negative_2, and negative_3\n* Approximate statistics based on the first 1000 samples:\n | | anchor | positive | negative_1 | negative_2 | negative_3 |\n |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|\n | type | string | string | string | string | string |\n | details |
<ul><li>min: 6 tokens</li><li>mean: 17.37 tokens</li><li>max: 37 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 122.81 tokens</li><li>max: 2048 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 128.36 tokens</li><li>max: 2048 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 110.47 tokens</li><li>max: 1920 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 103.93 tokens</li><li>max: 2048 tokens</li></ul>
|\n* Samples:\n | anchor | positive | negative_1 | negative_2 | negative_3 |\n |:------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|\n | What does the plot of the story revolve around | Respawn points are created when the player accumulates enough blood collected from slain enemies or in-level blood pickups, and idles a certain distance away from immediate level hazards. Plot
The plot follows the events of an unnamed young girl's arrival at the Lafcadio Academy for Troubled Young Ladies.
| An really interesting idea behind the story and one that had me unable to put it down some nights! View all my reviews | And everything has such meaning and depth behind it. Nothing is just said casually, and it is all so thoughfully laced with emotion and words to draw you in to the story itself. | It has a terribly implication that this flashback may be lasting more than a chapter. It's not as if we aren't learning anything of importance. I'm just not curious where this is going. I'm wondering when it'll finally be over. Not something you want from your audience as a story teller. In no simple terms. |\n | What type of warranty is offered with the Zhumell Signature 10x42 binoculars | The Signature is also backed by Zhumell's full, 25-year, no-fault warranty, ensuring a lifetime of worry-free viewing. The Zhumell Signature 10x42 binoculars will give you plenty of power - whenever you need it, for as long as you need it! | This item is backed by a Limited Lifetime Warranty. In the event this item should fail due to manufacturing defects during intended use, we will exchange the part free of charge (excludes shipping charges) for the original purchaser. | if you have different ideas or better suggestion ,be free to leave message . Warranty and terms:
-Warranty year is 1 year under normal use,the warranty period is a year from the date of original purchase.
| We have more than 55 years of experience designing, manufacturing and refining custom optical lenses for use in a range of industries. Our production staff follows strict ISO 9001 standards and uses state-of-the-art metrology equipment to test finished lenses for quality and performance. |\n | When did he announce his retirement from all professional rugby | He was named in the Pro12 Dream Teams at the end of the 2014/15 and 2016/17 seasons. In April 2021 he announced his retirement from all professional rugby. International career

Qualifying to play internationally for Scotland through his Glasgow-born mother, on 24 October 2012 he was named in the full Scottish national team for the 2012 end-of-year rugby union tests.
| After retiring from full-time professional football, he worked as a production controller before becoming a sales administrator for International Computers Limited. He lived in Southampton for the rest of his life and died on 28 January 2014. | On December 15 2018, it was announced that he had left WWE voluntarily. Professional boxing record
{| class=\"wikitable\" style=\"text-align:center;\"
| style=\"text-align:center;\" colspan=\"8\" | 6 Wins (3 knockouts, 3 decisions), 0 Losses, 0 Draws
|- style=\"text-align:center; background:#e3e3e3;\"
| style=\"border-style:none none solid solid;\" | Res.
| Since retiring from football he has worked as a journalist for the Professional Footballers' Association. References

English men's footballers
Bristol City F.C. players
Kidderminster Harriers F.C. players
Yeovil Town F.C.
|\n* Loss: cachedselfloss2.CachedInfonce with these parameters:\n ```json\n {\n \"scale\": 20.0,\n \"similarity_fct\": \"cos_sim\"\n }\n ```\n\n### Evaluation Dataset\n\n#### hard_negative_merged\n\n* Dataset: hard_negative_merged\n* Size: 589,508 evaluation samples\n* Columns: anchor, positive, negative_1, negative_2, and negative_3\n* Approximate statistics based on the first 1000 samples:\n | | anchor | positive | negative_1 | negative_2 | negative_3 |\n |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|\n | type | string | string | string | string | string |\n | details |
<ul><li>min: 4 tokens</li><li>mean: 17.27 tokens</li><li>max: 39 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 120.45 tokens</li><li>max: 2031 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 123.54 tokens</li><li>max: 2018 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 114.85 tokens</li><li>max: 1860 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 115.74 tokens</li><li>max: 1605 tokens</li></ul>
|\n* Samples:\n | anchor | positive | negative_1 | negative_2 | negative_3 |\n |:----------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|\n | What could the term 'Golia' refer to | Golia may refer to:

Golia (surname)
Golia, Ganjam
Golia Monastery
1226 Golia
| Gouka may refer to:

9708 Gouka, a main-belt asteroid after the Dutch astronomer Adriaan Gouka
Eric Gouka (born 1970), Dutch cricketer
Gouka, Benin, a town and arrondissement
| Gottschelia is a genus of liverworts belonging to the family Cephaloziellaceae. | Agila may refer to:

Agila I (died 554), Visigothic king
Agila II (died 714), Visigothic king
Agila 2, the first Filipino satellite
Agila (album), a 1996 album by Spanish rock band Extremoduro
Agila (film), a 1980 Philippine film directed by Eddie Romero
Agila (TV series), a 1987 Philippine teledrama series
Agila Town, Benue State, Nigeria
Opel Agila or Vauxhall Agila, a city car

See also
Agila division, the 10th Infantry Division of the Philippine Army
Aguila (disambiguation)
|\n | What is the timeframe in which Itera plans to potentially make an agreement with a financial institution | As Itera's President Igor Makarov reported at today's meeting of the Russian Gas Society in Moscow, the gas company could make an agreement with a financial institution, which would make the most profitable and optimum offer, in the next two to three months. According to him, they are currently holding negotiations with several financial enterprises, which specialize in introducing companies to the financial market. | The process from receipt of the funding proposal to completion of due diligence is incredibly quick, with a goal of 30 days. After initial evaluation of their proposals, a selected number of start-ups, usually 6 to 8, are asked to make preliminary presentations to the steering committee. | Coinexchange, Cryptopia, YoBit, HitBtc, Binance, Bittrex
Q1 2018 : Partners announced (Debit card & Merchants) We are currently in negotiation with major payment providers to offer you a worldwide usable card. Q1/2 2018 : ETHX Beta Wallet release (Android, Windows, iOS) and debit cart pre-order
Q3 201 : More partnerships Wider range of companies accepting ETHX. First targets are the biggest e-commerce websites. We will release a beta application to collect user reviews and answer to the community. The app is expected to come out in Q1 2018 on Android and later on iOS. We are very sensitive about our community welfare, so we try to do our best to keep our members informed about the latest news. The app will also help us to inform and get suggestions. Ethereum X is community driven. If you are also a cryptography and distributed ledger tech-nology enthusiast and want to support the project, please feel free to contact us. Additional developers as well as community managers for our social...
| The project will be floated in the market for solicitation of expression of interest from the potential investors in June 2017. The land slots will be awarded to the successful bidders based on evaluation by the end of August, 2017. The Monitoring and Evaluation (M&E) of forest sites, awarded to successful bidders, will be done in collaboration with the Forestry, Wildlife & Fisheries Department, Government of the Punjab, as per the provisions of PPP Act, 2014, and The Punjab Forest (Amendment) Act, 2016. Revenue sharing will be done in this initiative. The Company in order to effectively reach out to the business community is organizing seminars in collaboration with various Chambers of Commerce & Industry to sensitize business groups to invest in the opportunity. |\n | What role does File History play in the issue being discussed | What has File History got to do with the problem
I don't know but maybe someone at DC does
I post the question..... get lots of ideas and methods to remove the naughty files, but I still don't know why deleting file history worked unless the file history is tacked onto the file somehow
Since then I've been checking more of the \"includes folders\" for more over-long files and trying to figure what to do with them. The files are easy to find once you start paying attention
Open a folder and if it contains extra long files a scroll bar appears at the bottom of the page
Found some more files and started playing.
| Newspapers feature stories about lost computers and memory sticks but a more common and longstanding problem is about staff accessing records that they have no right to see. It has always been possible for staff to look at paper records, and in most cases, there is no track of record. | In data vault it is referred to as the record source. Background
The need to identify systems of record can become acute in organizations where management information systems have been built by taking output data from multiple source systems, re-processing this data, and then re-presenting the result for a new business use.
| The idea of preservation, in the sense of both immortalization and protection is addressed. How do we decide what to remember from history, and what do we leave out |\n* Loss: cachedselfloss2.CachedInfonce with these parameters:\n ```json\n {\n \"scale\": 20.0,\n \"similarity_fct\": \"cos_sim\"\n }\n ```\n\n### Training Hyperparameters\n#### Non-Default Hyperparameters\n\n- `eval_strategy`: steps\n- `per_device_train_batch_size`: 500\n- `per_device_eval_batch_size`: 500\n- `learning_rate`: 2e-05\n- `num_train_epochs`: 10\n- `warmup_ratio`: 0.1\n- `bf16`: True\n- `batch_sampler`: no_duplicates\n\n#### All Hyperparameters\n
Click to expand\n\n- `overwrite_output_dir`: False\n- `do_predict`: False\n- `eval_strategy`: steps\n- `prediction_loss_only`: True\n- `per_device_train_batch_size`: 500\n- `per_device_eval_batch_size`: 500\n- `per_gpu_train_batch_size`: None\n- `per_gpu_eval_batch_size`: None\n- `gradient_accumulation_steps`: 1\n- `eval_accumulation_steps`: None\n- `torch_empty_cache_steps`: None\n- `learning_rate`: 2e-05\n- `weight_decay`: 0.0\n- `adam_beta1`: 0.9\n- `adam_beta2`: 0.999\n- `adam_epsilon`: 1e-08\n- `max_grad_norm`: 1.0\n- `num_train_epochs`: 10\n- `max_steps`: -1\n- `lr_scheduler_type`: linear\n- `lr_scheduler_kwargs`: {}\n- `warmup_ratio`: 0.1\n- `warmup_steps`: 0\n- `log_level`: passive\n- `log_level_replica`: warning\n- `log_on_each_node`: True\n- `logging_nan_inf_filter`: True\n- `save_safetensors`: True\n- `save_on_each_node`: False\n- `save_only_model`: False\n- `restore_callback_states_from_checkpoint`: False\n- `no_cuda`: False\n- `use_cpu`: False\n- `use_mps_device`: False\n- `seed`: 42\n- `data_seed`: None\n- `jit_mode_eval`: False\n- `use_ipex`: False\n- `bf16`: True\n- `fp16`: False\n- `fp16_opt_level`: O1\n- `half_precision_backend`: auto\n- `bf16_full_eval`: False\n- `fp16_full_eval`: False\n- `tf32`: None\n- `local_rank`: 0\n- `ddp_backend`: None\n- `tpu_num_cores`: None\n- `tpu_metrics_debug`: False\n- `debug`: []\n- `dataloader_drop_last`: True\n- `dataloader_num_workers`: 0\n- `dataloader_prefetch_factor`: None\n- `past_index`: -1\n- `disable_tqdm`: False\n- `remove_unused_columns`: True\n- `label_names`: None\n- `load_best_model_at_end`: False\n- `ignore_data_skip`: False\n- `fsdp`: []\n- `fsdp_min_num_params`: 0\n- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}\n- `tp_size`: 0\n- `fsdp_transformer_layer_cls_to_wrap`: None\n- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 
'gradient_accumulation_kwargs': None}\n- `deepspeed`: None\n- `label_smoothing_factor`: 0.0\n- `optim`: adamw_torch\n- `optim_args`: None\n- `adafactor`: False\n- `group_by_length`: False\n- `length_column_name`: length\n- `ddp_find_unused_parameters`: None\n- `ddp_bucket_cap_mb`: None\n- `ddp_broadcast_buffers`: False\n- `dataloader_pin_memory`: True\n- `dataloader_persistent_workers`: False\n- `skip_memory_metrics`: True\n- `use_legacy_prediction_loop`: False\n- `push_to_hub`: False\n- `resume_from_checkpoint`: None\n- `hub_model_id`: None\n- `hub_strategy`: every_save\n- `hub_private_repo`: None\n- `hub_always_push`: False\n- `gradient_checkpointing`: False\n- `gradient_checkpointing_kwargs`: None\n- `include_inputs_for_metrics`: False\n- `include_for_metrics`: []\n- `eval_do_concat_batches`: True\n- `fp16_backend`: auto\n- `push_to_hub_model_id`: None\n- `push_to_hub_organization`: None\n- `mp_parameters`: \n- `auto_find_batch_size`: False\n- `full_determinism`: False\n- `torchdynamo`: None\n- `ray_scope`: last\n- `ddp_timeout`: 1800\n- `torch_compile`: False\n- `torch_compile_backend`: None\n- `torch_compile_mode`: None\n- `dispatch_batches`: None\n- `split_batches`: None\n- `include_tokens_per_second`: False\n- `include_num_input_tokens_seen`: False\n- `neftune_noise_alpha`: None\n- `optim_target_modules`: None\n- `batch_eval_metrics`: False\n- `eval_on_start`: False\n- `use_liger_kernel`: False\n- `eval_use_gather_object`: False\n- `average_tokens_across_devices`: False\n- `prompts`: None\n- `batch_sampler`: no_duplicates\n- `multi_dataset_batch_sampler`: proportional\n\n
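The `CachedInfonce` loss used above (scale 20.0, cosine similarity) follows the standard InfoNCE objective with in-batch negatives: each anchor's positive is scored against every other positive in the batch. A minimal NumPy sketch of that objective, assuming L2-normalized embeddings (this illustrates the math only; it is not the cached, memory-efficient implementation used in training):

```python
import numpy as np

def info_nce_loss(anchors, positives, scale=20.0):
    """InfoNCE with in-batch negatives, matching scale=20.0 / cos_sim above.
    Illustrative sketch only; the training run uses a cached implementation."""
    # L2-normalize so the dot product equals cosine similarity.
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = scale * (a @ p.T)                   # (batch, batch) similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # The correct positive for anchor i sits on the diagonal.
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
emb = rng.normal(size=(4, 8))
loss = info_nce_loss(emb, emb)  # anchor == positive, so the loss is near zero
```

The scale factor sharpens the softmax over in-batch candidates; hard negatives (the `negative_1..3` columns) simply extend the candidate pool in the real loss.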
\n\n### Training Logs\n| Epoch | Step | Training Loss | Validation Loss |\n|:------:|:----:|:-------------:|:---------------:|\n| 0.1786 | 40 | 8.7768 | 8.5959 |\n| 0.3571 | 80 | 8.8187 | 8.5129 |\n| 0.5357 | 120 | 8.6175 | 8.2742 |\n| 0.7143 | 160 | 8.0868 | 7.8954 |\n| 0.8929 | 200 | 7.5681 | 7.3531 |\n| 1.0714 | 240 | 7.0288 | 6.5431 |\n| 1.25 | 280 | 6.2266 | 5.8462 |\n| 1.4286 | 320 | 5.4682 | 5.2924 |\n| 1.6071 | 360 | 5.0398 | 4.8148 |\n| 1.7857 | 400 | 4.5158 | 4.4110 |\n| 1.9643 | 440 | 4.184 | 4.0419 |\n| 2.1429 | 480 | 3.7868 | 3.7165 |\n| 2.3214 | 520 | 3.6258 | 3.4216 |\n| 2.5 | 560 | 3.2262 | 3.1530 |\n| 2.6786 | 600 | 3.0175 | 2.9128 |\n| 2.8571 | 640 | 2.75 | 2.6999 |\n| 3.0357 | 680 | 2.4915 | 2.5085 |\n\n\n### Framework Versions\n- Python: 3.10.14\n- Sentence Transformers: 3.4.1\n- Transformers: 4.50.0\n- PyTorch: 2.3.1+cu121\n- Accelerate: 1.5.2\n- Datasets: 3.4.1\n- Tokenizers: 0.21.1\n\n## Citation\n\n### BibTeX\n\n#### Sentence Transformers\n```bibtex\n@inproceedings{reimers-2019-sentence-bert,\n title = \"Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks\",\n author = \"Reimers, Nils and Gurevych, Iryna\",\n booktitle = \"Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing\",\n month = \"11\",\n year = \"2019\",\n publisher = \"Association for Computational Linguistics\",\n url = \"https://arxiv.org/abs/1908.10084\",\n}\n```\n\n#### CachedInfonce\n```bibtex\n@misc{gao2021scaling,\n title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup},\n author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan},\n year={2021},\n eprint={2101.06983},\n archivePrefix={arXiv},\n primaryClass={cs.LG}\n}\n```\n\n\n\n\n\n", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ 
"jinaai/jina-embeddings-v3" ], "base_model": "Jrinky/jina_final_temp", "base_model_relation": "base" }, { "model_id": "Abdelkareem/jina_v3_distilled", "gated": "False", "card": "---\nbase_model: jinaai/jina-embeddings-v3\nlanguage:\n- multilingual\n- af\n- am\n- ar\n- as\n- az\n- be\n- bg\n- bn\n- br\n- bs\n- ca\n- cs\n- cy\n- da\n- de\n- el\n- en\n- eo\n- es\n- et\n- eu\n- fa\n- fi\n- fr\n- fy\n- ga\n- gd\n- gl\n- gu\n- ha\n- he\n- hi\n- hr\n- hu\n- hy\n- id\n- is\n- it\n- ja\n- jv\n- ka\n- kk\n- km\n- kn\n- ko\n- ku\n- ky\n- la\n- lo\n- lt\n- lv\n- mg\n- mk\n- ml\n- mn\n- mr\n- ms\n- my\n- ne\n- nl\n- 'no'\n- om\n- or\n- pa\n- pl\n- ps\n- pt\n- ro\n- ru\n- sa\n- sd\n- si\n- sk\n- sl\n- so\n- sq\n- sr\n- su\n- sv\n- sw\n- ta\n- te\n- th\n- tl\n- tr\n- ug\n- uk\n- ur\n- uz\n- vi\n- xh\n- yi\n- zh\nlibrary_name: model2vec\nlicense: mit\nmodel_name: Abdelkareem/jina_v3_distilled\ntags:\n- embeddings\n- static-embeddings\n- sentence-transformers\n---\n\n# Abdelkareem/jina_v3_distilled Model Card\n\nThis [Model2Vec](https://github.com/MinishLab/model2vec) model is a distilled version of the jinaai/jina-embeddings-v3(https://huggingface.co/jinaai/jina-embeddings-v3) Sentence Transformer. It uses static embeddings, allowing text embeddings to be computed orders of magnitude faster on both GPU and CPU. It is designed for applications where computational resources are limited or where real-time performance is critical. Model2Vec models are the smallest, fastest, and most performant static embedders available. 
The distilled models are up to 50 times smaller and 500 times faster than traditional Sentence Transformers.\n\n\n## Installation\n\nInstall model2vec using pip:\n```\npip install model2vec\n```\n\n## Usage\n\n### Using Model2Vec\n\nThe [Model2Vec library](https://github.com/MinishLab/model2vec) is the fastest and most lightweight way to run Model2Vec models.\n\nLoad this model using the `from_pretrained` method:\n```python\nfrom model2vec import StaticModel\n\n# Load a pretrained Model2Vec model\nmodel = StaticModel.from_pretrained(\"Abdelkareem/jina_v3_distilled\")\n\n# Compute text embeddings\nembeddings = model.encode([\"Example sentence\"])\n```\n\n### Using Sentence Transformers\n\nYou can also use the [Sentence Transformers library](https://github.com/UKPLab/sentence-transformers) to load and use the model:\n\n```python\nfrom sentence_transformers import SentenceTransformer\n\n# Load a pretrained Sentence Transformer model\nmodel = SentenceTransformer(\"Abdelkareem/jina_v3_distilled\")\n\n# Compute text embeddings\nembeddings = model.encode([\"Example sentence\"])\n```\n\n### Distilling a Model2Vec model\n\nYou can distill a Model2Vec model from a Sentence Transformer model using the `distill` method. First, install the `distill` extra with `pip install model2vec[distill]`. Then, run the following code:\n\n```python\nfrom model2vec.distill import distill\n\n# Distill a Sentence Transformer model, in this case the BAAI/bge-base-en-v1.5 model\nm2v_model = distill(model_name=\"BAAI/bge-base-en-v1.5\", pca_dims=256)\n\n# Save the model\nm2v_model.save_pretrained(\"m2v_model\")\n```\n\n## How it works\n\nModel2vec creates a small, fast, and powerful model that outperforms other static embedding models by a large margin on all tasks we could find, while being much faster to create than traditional static embedding models such as GloVe. 
Best of all, you don't need any data to distill a model using Model2Vec.\n\nIt works by passing a vocabulary through a sentence transformer model, then reducing the dimensionality of the resulting embeddings using PCA, and finally weighting the embeddings using [SIF weighting](https://openreview.net/pdf?id=SyK00v5xx). During inference, we simply take the mean of all token embeddings occurring in a sentence.\n\n## Additional Resources\n\n- [Model2Vec Repo](https://github.com/MinishLab/model2vec)\n- [Model2Vec Base Models](https://huggingface.co/collections/minishlab/model2vec-base-models-66fd9dd9b7c3b3c0f25ca90e)\n- [Model2Vec Results](https://github.com/MinishLab/model2vec/tree/main/results)\n- [Model2Vec Tutorials](https://github.com/MinishLab/model2vec/tree/main/tutorials)\n- [Website](https://minishlab.github.io/)\n\n\n## Library Authors\n\nModel2Vec was developed by the [Minish Lab](https://github.com/MinishLab) team consisting of [Stephan Tulkens](https://github.com/stephantul) and [Thomas van Dongen](https://github.com/Pringled).\n\n## Citation\n\nPlease cite the [Model2Vec repository](https://github.com/MinishLab/model2vec) if you use this model in your work.\n```\n@software{minishlab2024model2vec,\n authors = {Stephan Tulkens and Thomas van Dongen},\n title = {Model2Vec: Fast State-of-the-Art Static Embeddings},\n year = {2024},\n url = {https://github.com/MinishLab/model2vec}\n}\n```", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "Abdelkareem/jina_v3_distilled", "base_model_relation": "base" }, { "model_id": "johnpaulbin/jina-embeddings-v3-128", "gated": "False", "card": "---\nbase_model: jinaai/jina-embeddings-v3\nlanguage:\n- multilingual\n- af\n- am\n- ar\n- as\n- az\n- be\n- bg\n- bn\n- br\n- bs\n- ca\n- 
cs\n- cy\n- da\n- de\n- el\n- en\n- eo\n- es\n- et\n- eu\n- fa\n- fi\n- fr\n- fy\n- ga\n- gd\n- gl\n- gu\n- ha\n- he\n- hi\n- hr\n- hu\n- hy\n- id\n- is\n- it\n- ja\n- jv\n- ka\n- kk\n- km\n- kn\n- ko\n- ku\n- ky\n- la\n- lo\n- lt\n- lv\n- mg\n- mk\n- ml\n- mn\n- mr\n- ms\n- my\n- ne\n- nl\n- 'no'\n- om\n- or\n- pa\n- pl\n- ps\n- pt\n- ro\n- ru\n- sa\n- sd\n- si\n- sk\n- sl\n- so\n- sq\n- sr\n- su\n- sv\n- sw\n- ta\n- te\n- th\n- tl\n- tr\n- ug\n- uk\n- ur\n- uz\n- vi\n- xh\n- yi\n- zh\nlibrary_name: model2vec\nlicense: mit\nmodel_name: jina-embeddings-v3-128\ntags:\n- embeddings\n- static-embeddings\n- sentence-transformers\n---\n\n# jina-embeddings-v3-128 Model Card\n\nThis [Model2Vec](https://github.com/MinishLab/model2vec) model is a distilled version of the jinaai/jina-embeddings-v3(https://huggingface.co/jinaai/jina-embeddings-v3) Sentence Transformer. It uses static embeddings, allowing text embeddings to be computed orders of magnitude faster on both GPU and CPU. It is designed for applications where computational resources are limited or where real-time performance is critical. Model2Vec models are the smallest, fastest, and most performant static embedders available. 
The distilled models are up to 50 times smaller and 500 times faster than traditional Sentence Transformers.\n\n\n## Installation\n\nInstall model2vec using pip:\n```\npip install model2vec\n```\n\n## Usage\n\n### Using Model2Vec\n\nThe [Model2Vec library](https://github.com/MinishLab/model2vec) is the fastest and most lightweight way to run Model2Vec models.\n\nLoad this model using the `from_pretrained` method:\n```python\nfrom model2vec import StaticModel\n\n# Load a pretrained Model2Vec model\nmodel = StaticModel.from_pretrained(\"jina-embeddings-v3-128\")\n\n# Compute text embeddings\nembeddings = model.encode([\"Example sentence\"])\n```\n\n### Using Sentence Transformers\n\nYou can also use the [Sentence Transformers library](https://github.com/UKPLab/sentence-transformers) to load and use the model:\n\n```python\nfrom sentence_transformers import SentenceTransformer\n\n# Load a pretrained Sentence Transformer model\nmodel = SentenceTransformer(\"jina-embeddings-v3-128\")\n\n# Compute text embeddings\nembeddings = model.encode([\"Example sentence\"])\n```\n\n### Distilling a Model2Vec model\n\nYou can distill a Model2Vec model from a Sentence Transformer model using the `distill` method. First, install the `distill` extra with `pip install model2vec[distill]`. Then, run the following code:\n\n```python\nfrom model2vec.distill import distill\n\n# Distill a Sentence Transformer model, in this case the BAAI/bge-base-en-v1.5 model\nm2v_model = distill(model_name=\"BAAI/bge-base-en-v1.5\", pca_dims=256)\n\n# Save the model\nm2v_model.save_pretrained(\"m2v_model\")\n```\n\n## How it works\n\nModel2vec creates a small, fast, and powerful model that outperforms other static embedding models by a large margin on all tasks we could find, while being much faster to create than traditional static embedding models such as GloVe. 
Best of all, you don't need any data to distill a model using Model2Vec.\n\nIt works by passing a vocabulary through a sentence transformer model, then reducing the dimensionality of the resulting embeddings using PCA, and finally weighting the embeddings using [SIF weighting](https://openreview.net/pdf?id=SyK00v5xx). During inference, we simply take the mean of all token embeddings occurring in a sentence.\n\n## Additional Resources\n\n- [Model2Vec Repo](https://github.com/MinishLab/model2vec)\n- [Model2Vec Base Models](https://huggingface.co/collections/minishlab/model2vec-base-models-66fd9dd9b7c3b3c0f25ca90e)\n- [Model2Vec Results](https://github.com/MinishLab/model2vec/tree/main/results)\n- [Model2Vec Tutorials](https://github.com/MinishLab/model2vec/tree/main/tutorials)\n- [Website](https://minishlab.github.io/)\n\n\n## Library Authors\n\nModel2Vec was developed by the [Minish Lab](https://github.com/MinishLab) team consisting of [Stephan Tulkens](https://github.com/stephantul) and [Thomas van Dongen](https://github.com/Pringled).\n\n## Citation\n\nPlease cite the [Model2Vec repository](https://github.com/MinishLab/model2vec) if you use this model in your work.\n```\n@software{minishlab2024model2vec,\n authors = {Stephan Tulkens and Thomas van Dongen},\n title = {Model2Vec: Fast State-of-the-Art Static Embeddings},\n year = {2024},\n url = {https://github.com/MinishLab/model2vec}\n}\n```", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "johnpaulbin/jina-embeddings-v3", "base_model_relation": "finetune" }, { "model_id": "Sajjad313/jina-embedding-v3", "gated": "False", "card": "---\nlicense: apache-2.0\nlanguage:\n- en\n- fa\n- ar\ninference: true\nbase_model:\n- jinaai/jina-embeddings-v3\npipeline_tag: 
feature-extraction\ntags:\n- Embedding\nlibrary_name: transformers\n---\n\n# Jina-embedding-v3\n\n\n## Overview\n\nThis repository uses the original [jinai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) embedding model, without modification or fine-tuning.\nA versatile sentence embedding model suitable for a wide range of natural language processing tasks\nyou can find the specific tasks in the *Tasks* section\n\n*The only difference is that you can use HF inference from this page, which is unavailable in the official page of jinaai.*\n\n\n---\n\n## Inference Access\n\nYou can use this model for inference directly from this page using Hugging Face\u2019s serverless inference API. \nThis is not possible via the official Jina AI model page. Here\u2019s an example of how to get sentence embeddings using the Hugging Face Inference API:\n\nyou can use the function below for inference (it applies mean-pooling and normalization too and returns embeddings ready to store in your vector databases)\n```python\nfrom huggingface_hub import InferenceClient\nimport numpy as np\n\ndef get_HF_embeddings(text, api_key, model, mean_pool=True, l2_normalize=True):\n \"\"\"\n Fetches embeddings from HuggingFace serverless Inference API for a single string or a list of strings.\n Args:\n text (str or list): Input text or list of texts.\n api_key (str): HuggingFace API key.\n model (str): Model repo.\n mean_pool (bool): If True, mean-pool the output.\n l2_normalize (bool): If True, L2 normalize the output.\n Returns:\n np.ndarray: Embedding(s) as numpy array(s).\n \"\"\"\n client = InferenceClient(api_key=api_key)\n if isinstance(text, str):\n texts = [text]\n single_input = True\n else:\n texts = text\n single_input = False\n\n result = client.feature_extraction(\n text=texts,\n model=model\n )\n\n if mean_pool:\n embeddings = [np.mean(r, axis=0) for r in result]\n if l2_normalize:\n embeddings = [\n e / np.linalg.norm(e) if np.linalg.norm(e) > 0 else e for e in 
embeddings]\n else:\n embeddings = [r for r in result]\n\n if single_input:\n return embeddings[0]\n return np.array(embeddings)\n```\n\n---\n\n\n\n## Tasks\n\n- **Extended Sequence Length:** Supports up to 8192 tokens with RoPE positional encoding.\n- **Task-Specific Embeddings:** Choose the `task` parameter for different application needs:\n - `retrieval.query` \u2013 For query embeddings in asymmetric retrieval \n - `retrieval.passage` \u2013 For passage embeddings in asymmetric retrieval \n - `separation` \u2013 For clustering and re-ranking \n - `classification` \u2013 For classification tasks \n - `text-matching` \u2013 For symmetric similarity tasks (e.g., STS)\n- **Matryoshka Embeddings:** Flexible embedding sizes (32, 64, 128, 256, 512, 768, 1024 dimensions).\n\n---\n\n## How to Use\n\nYou can use this model with either the `transformers` or the `sentence-transformers` library. \nFor **full feature support** (using task-specific LoRA heads and flexible embedding sizes), it is recommended to use the `sentence-transformers` library.\n\n### Using sentence-transformers (Recommended)\n\n```python\nfrom sentence_transformers import SentenceTransformer\n\nmodel = SentenceTransformer(\"Sajjad313/jina-embedding-v3\", trust_remote_code=True)\n\n# Task-specific usage: Retrieval query embedding\nquery_embedding = model.encode(\n [\"What is the weather like in Berlin today?\"],\n task=\"retrieval.query\"\n)\n```\n\n### Using transformers\n\n```python\nfrom transformers import AutoTokenizer, AutoModel\n\ntokenizer = AutoTokenizer.from_pretrained(\"jinaai/jina-embeddings-v3\")\nmodel = AutoModel.from_pretrained(\"jinaai/jina-embeddings-v3\")\n\ninputs = tokenizer(\n \"What is the weather like in Berlin today?\",\n return_tensors=\"pt\", padding=True, truncation=True\n)\noutputs = model(**inputs)\nembedding = outputs.last_hidden_state[:, 0] # CLS token embedding\n```\n> Note: Using the `transformers` library gives you basic access to the model\u2019s output, but for full 
task-specific capabilities, use `sentence-transformers`.\n\n---\n", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "Sajjad313/jina-embedding-v3", "base_model_relation": "base" }, { "model_id": "csanz91/lampistero_rag_embeddings", "gated": "False", "card": "---\nlanguage:\n- es\nlicense: apache-2.0\ntags:\n- sentence-transformers\n- sentence-similarity\n- feature-extraction\n- generated_from_trainer\n- dataset_size:14907\n- loss:MatryoshkaLoss\n- loss:MultipleNegativesRankingLoss\nbase_model: jinaai/jina-embeddings-v3\nwidget:\n- source_sentence: Describe la tradici\u00f3n del 'rosario de candiles' en el contexto\n de la miner\u00eda.\n sentences:\n - Un mechazo es la combusti\u00f3n de la mecha sin que se llegue a inflamar el barreno.\n - La siega tradicional en Escucha comenzaba antes de San Juan con las cebadas.\n - El 'rosario de candiles' es una tradici\u00f3n religiosa celebrada en la festividad\n de San Juan, en la que los mineros escuchan y acompa\u00f1an con sus candiles de carburo,\n rezando a dos coros y cantando en parte.\n- source_sentence: \u00bfQu\u00e9 significa la expresi\u00f3n 'pillar una mojadina'?\n sentences:\n - En el campeonato provincial de atletismo en Alcorisa en mayo, Pilar Brumos de\n Escucha logr\u00f3 la 3\u00aa posici\u00f3n en 600 metros y el subcampeonato en peso.\n - Los empresarios de Escucha se hab\u00edan unido para poder participar en las elecciones\n a CC.PP. 
ya que era necesario que la plantilla de la empresa superase el n\u00famero\n de 50 trabajadores..\n - '''Pillar una mojadina'' significa empaparse, quedar empapado.'\n- source_sentence: \u00bfEn qu\u00e9 a\u00f1o Carbones de Teruel registra la mina 'pablo' en Escucha?\n sentences:\n - Puede referirse a un calcet\u00edn para beb\u00e9s o a un calcet\u00edn gordo.\n - Carbones de Teruel registra la mina 'pablo' en Escucha en 1900.\n - 'Jes\u00fas Conesa explic\u00f3 a la Junta de Espect\u00e1culos que el anterior propietario,\n Sr. Latorre Galindo, ten\u00eda otro cine en Utrillas, lo que causaba continuos equ\u00edvocos\n en env\u00edos de material y pagos, al creerse que ambos cines le pertenec\u00edan o eran\n la misma empresa. '\n- source_sentence: \u00bfQui\u00e9n regentaba el Cine Avenida de Escucha en el momento de su\n cierre?\n sentences:\n - Se usa con el significado de 'cuando'.\n - El CD Escucha aline\u00f3 a Castillo, Romero, Bobadilla, Moraleda, Luis, Gonz\u00e1lez,\n Higinio, Torres, Calomarde I, Calomarde II y Navarro en el partido de Copa contra\n el Alcorisa.\n - Antonio Malpica regentaba el Cine Avenida de Escucha en el momento de su cierre.\n- source_sentence: \u00bfQu\u00e9 porcentaje de aumento salarial reclamaba el Sindicato Minero\n en el conflicto de Utrillas que llev\u00f3 a plantear la huelga del 12 de octubre de\n 1930?\n sentences:\n - Antonio Gargallo.\n - Una publicaci\u00f3n con una fotograf\u00eda para el recuerdo de la locomotora llamada 'Escucha'.\n - El Sindicato Minero reclamaba un aumento del 20% los sueldos en el conflicto de\n Utrillas.\npipeline_tag: sentence-similarity\nlibrary_name: sentence-transformers\nmetrics:\n- cosine_accuracy@1\n- cosine_accuracy@3\n- cosine_accuracy@5\n- cosine_accuracy@10\n- cosine_precision@1\n- cosine_precision@3\n- cosine_precision@5\n- cosine_precision@10\n- cosine_recall@1\n- cosine_recall@3\n- cosine_recall@5\n- cosine_recall@10\n- cosine_ndcg@10\n- cosine_mrr@10\n- 
cosine_map@100\nmodel-index:\n- name: Lampistero\n results:\n - task:\n type: information-retrieval\n name: Information Retrieval\n dataset:\n name: dim 1024\n type: dim_1024\n metrics:\n - type: cosine_accuracy@1\n value: 0.7803258901629451\n name: Cosine Accuracy@1\n - type: cosine_accuracy@3\n value: 0.8883524441762221\n name: Cosine Accuracy@3\n - type: cosine_accuracy@5\n value: 0.904043452021726\n name: Cosine Accuracy@5\n - type: cosine_accuracy@10\n value: 0.9233554616777309\n name: Cosine Accuracy@10\n - type: cosine_precision@1\n value: 0.7803258901629451\n name: Cosine Precision@1\n - type: cosine_precision@3\n value: 0.29611748139207406\n name: Cosine Precision@3\n - type: cosine_precision@5\n value: 0.18080869040434522\n name: Cosine Precision@5\n - type: cosine_precision@10\n value: 0.09233554616777308\n name: Cosine Precision@10\n - type: cosine_recall@1\n value: 0.7803258901629451\n name: Cosine Recall@1\n - type: cosine_recall@3\n value: 0.8883524441762221\n name: Cosine Recall@3\n - type: cosine_recall@5\n value: 0.904043452021726\n name: Cosine Recall@5\n - type: cosine_recall@10\n value: 0.9233554616777309\n name: Cosine Recall@10\n - type: cosine_ndcg@10\n value: 0.8576141434466037\n name: Cosine Ndcg@10\n - type: cosine_mrr@10\n value: 0.8359425142014155\n name: Cosine Mrr@10\n - type: cosine_map@100\n value: 0.8374344979701236\n name: Cosine Map@100\n - task:\n type: information-retrieval\n name: Information Retrieval\n dataset:\n name: dim 768\n type: dim_768\n metrics:\n - type: cosine_accuracy@1\n value: 0.7827398913699457\n name: Cosine Accuracy@1\n - type: cosine_accuracy@3\n value: 0.8877489438744719\n name: Cosine Accuracy@3\n - type: cosine_accuracy@5\n value: 0.9034399517199758\n name: Cosine Accuracy@5\n - type: cosine_accuracy@10\n value: 0.9245624622812312\n name: Cosine Accuracy@10\n - type: cosine_precision@1\n value: 0.7827398913699457\n name: Cosine Precision@1\n - type: cosine_precision@3\n value: 0.295916314624824\n name: 
Cosine Precision@3\n - type: cosine_precision@5\n value: 0.18068799034399516\n name: Cosine Precision@5\n - type: cosine_precision@10\n value: 0.09245624622812311\n name: Cosine Precision@10\n - type: cosine_recall@1\n value: 0.7827398913699457\n name: Cosine Recall@1\n - type: cosine_recall@3\n value: 0.8877489438744719\n name: Cosine Recall@3\n - type: cosine_recall@5\n value: 0.9034399517199758\n name: Cosine Recall@5\n - type: cosine_recall@10\n value: 0.9245624622812312\n name: Cosine Recall@10\n - type: cosine_ndcg@10\n value: 0.858770916125463\n name: Cosine Ndcg@10\n - type: cosine_mrr@10\n value: 0.8371705894186279\n name: Cosine Mrr@10\n - type: cosine_map@100\n value: 0.8385437636605255\n name: Cosine Map@100\n - task:\n type: information-retrieval\n name: Information Retrieval\n dataset:\n name: dim 512\n type: dim_512\n metrics:\n - type: cosine_accuracy@1\n value: 0.7797223898611949\n name: Cosine Accuracy@1\n - type: cosine_accuracy@3\n value: 0.8859384429692215\n name: Cosine Accuracy@3\n - type: cosine_accuracy@5\n value: 0.9010259505129753\n name: Cosine Accuracy@5\n - type: cosine_accuracy@10\n value: 0.9227519613759807\n name: Cosine Accuracy@10\n - type: cosine_precision@1\n value: 0.7797223898611949\n name: Cosine Precision@1\n - type: cosine_precision@3\n value: 0.2953128143230738\n name: Cosine Precision@3\n - type: cosine_precision@5\n value: 0.18020519010259503\n name: Cosine Precision@5\n - type: cosine_precision@10\n value: 0.09227519613759806\n name: Cosine Precision@10\n - type: cosine_recall@1\n value: 0.7797223898611949\n name: Cosine Recall@1\n - type: cosine_recall@3\n value: 0.8859384429692215\n name: Cosine Recall@3\n - type: cosine_recall@5\n value: 0.9010259505129753\n name: Cosine Recall@5\n - type: cosine_recall@10\n value: 0.9227519613759807\n name: Cosine Recall@10\n - type: cosine_ndcg@10\n value: 0.8564496755344808\n name: Cosine Ndcg@10\n - type: cosine_mrr@10\n value: 0.8346785163471941\n name: Cosine Mrr@10\n - type: 
cosine_map@100\n value: 0.8361853082918266\n name: Cosine Map@100\n - task:\n type: information-retrieval\n name: Information Retrieval\n dataset:\n name: dim 256\n type: dim_256\n metrics:\n - type: cosine_accuracy@1\n value: 0.7706698853349426\n name: Cosine Accuracy@1\n - type: cosine_accuracy@3\n value: 0.8823174411587206\n name: Cosine Accuracy@3\n - type: cosine_accuracy@5\n value: 0.9016294508147255\n name: Cosine Accuracy@5\n - type: cosine_accuracy@10\n value: 0.9191309595654797\n name: Cosine Accuracy@10\n - type: cosine_precision@1\n value: 0.7706698853349426\n name: Cosine Precision@1\n - type: cosine_precision@3\n value: 0.2941058137195735\n name: Cosine Precision@3\n - type: cosine_precision@5\n value: 0.18032589016294506\n name: Cosine Precision@5\n - type: cosine_precision@10\n value: 0.09191309595654798\n name: Cosine Precision@10\n - type: cosine_recall@1\n value: 0.7706698853349426\n name: Cosine Recall@1\n - type: cosine_recall@3\n value: 0.8823174411587206\n name: Cosine Recall@3\n - type: cosine_recall@5\n value: 0.9016294508147255\n name: Cosine Recall@5\n - type: cosine_recall@10\n value: 0.9191309595654797\n name: Cosine Recall@10\n - type: cosine_ndcg@10\n value: 0.851155539622205\n name: Cosine Ndcg@10\n - type: cosine_mrr@10\n value: 0.8286940445057519\n name: Cosine Mrr@10\n - type: cosine_map@100\n value: 0.8302805177061129\n name: Cosine Map@100\n - task:\n type: information-retrieval\n name: Information Retrieval\n dataset:\n name: dim 128\n type: dim_128\n metrics:\n - type: cosine_accuracy@1\n value: 0.7604103802051901\n name: Cosine Accuracy@1\n - type: cosine_accuracy@3\n value: 0.8690404345202173\n name: Cosine Accuracy@3\n - type: cosine_accuracy@5\n value: 0.8901629450814725\n name: Cosine Accuracy@5\n - type: cosine_accuracy@10\n value: 0.9130959565479783\n name: Cosine Accuracy@10\n - type: cosine_precision@1\n value: 0.7604103802051901\n name: Cosine Precision@1\n - type: cosine_precision@3\n value: 0.28968014484007243\n 
name: Cosine Precision@3\n - type: cosine_precision@5\n value: 0.1780325890162945\n name: Cosine Precision@5\n - type: cosine_precision@10\n value: 0.09130959565479783\n name: Cosine Precision@10\n - type: cosine_recall@1\n value: 0.7604103802051901\n name: Cosine Recall@1\n - type: cosine_recall@3\n value: 0.8690404345202173\n name: Cosine Recall@3\n - type: cosine_recall@5\n value: 0.8901629450814725\n name: Cosine Recall@5\n - type: cosine_recall@10\n value: 0.9130959565479783\n name: Cosine Recall@10\n - type: cosine_ndcg@10\n value: 0.8415141158022221\n name: Cosine Ndcg@10\n - type: cosine_mrr@10\n value: 0.8181217729497756\n name: Cosine Mrr@10\n - type: cosine_map@100\n value: 0.8199539602494803\n name: Cosine Map@100\n - task:\n type: information-retrieval\n name: Information Retrieval\n dataset:\n name: dim 64\n type: dim_64\n metrics:\n - type: cosine_accuracy@1\n value: 0.7248038624019312\n name: Cosine Accuracy@1\n - type: cosine_accuracy@3\n value: 0.852142426071213\n name: Cosine Accuracy@3\n - type: cosine_accuracy@5\n value: 0.8750754375377188\n name: Cosine Accuracy@5\n - type: cosine_accuracy@10\n value: 0.8974049487024743\n name: Cosine Accuracy@10\n - type: cosine_precision@1\n value: 0.7248038624019312\n name: Cosine Precision@1\n - type: cosine_precision@3\n value: 0.28404747535707103\n name: Cosine Precision@3\n - type: cosine_precision@5\n value: 0.17501508750754374\n name: Cosine Precision@5\n - type: cosine_precision@10\n value: 0.08974049487024743\n name: Cosine Precision@10\n - type: cosine_recall@1\n value: 0.7248038624019312\n name: Cosine Recall@1\n - type: cosine_recall@3\n value: 0.852142426071213\n name: Cosine Recall@3\n - type: cosine_recall@5\n value: 0.8750754375377188\n name: Cosine Recall@5\n - type: cosine_recall@10\n value: 0.8974049487024743\n name: Cosine Recall@10\n - type: cosine_ndcg@10\n value: 0.8181789750224895\n name: Cosine Ndcg@10\n - type: cosine_mrr@10\n value: 0.7920167926353802\n name: Cosine Mrr@10\n - 
type: cosine_map@100\n value: 0.793825252598125\n name: Cosine Map@100\n---\n\n# Lampistero\n\nThis is a [sentence-transformers](https://www.SBERT.net) model finetuned from [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) on the json dataset. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.\n\n## Model Details\n\n### Model Description\n- **Model Type:** Sentence Transformer\n- **Base model:** [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) \n- **Maximum Sequence Length:** 8194 tokens\n- **Output Dimensionality:** 1024 dimensions\n- **Similarity Function:** Cosine Similarity\n- **Training Dataset:**\n - json\n- **Language:** es\n- **License:** apache-2.0\n\n### Model Sources\n\n- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)\n- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)\n- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)\n\n### Full Model Architecture\n\n```\nSentenceTransformer(\n (transformer): Transformer(\n (auto_model): XLMRobertaLoRA(\n (roberta): XLMRobertaModel(\n (embeddings): XLMRobertaEmbeddings(\n (word_embeddings): ParametrizedEmbedding(\n 250002, 1024, padding_idx=1\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (token_type_embeddings): ParametrizedEmbedding(\n 1, 1024\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (emb_drop): Dropout(p=0.1, inplace=False)\n (emb_ln): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n (encoder): XLMRobertaEncoder(\n (layers): ModuleList(\n (0-23): 24 x Block(\n (mixer): MHA(\n (rotary_emb): RotaryEmbedding()\n (Wqkv): 
ParametrizedLinearResidual(\n in_features=1024, out_features=3072, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (inner_attn): FlashSelfAttention(\n (drop): Dropout(p=0.1, inplace=False)\n )\n (inner_cross_attn): FlashCrossAttention(\n (drop): Dropout(p=0.1, inplace=False)\n )\n (out_proj): ParametrizedLinear(\n in_features=1024, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (dropout1): Dropout(p=0.1, inplace=False)\n (drop_path1): StochasticDepth(p=0.0, mode=row)\n (norm1): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n (mlp): Mlp(\n (fc1): ParametrizedLinear(\n in_features=1024, out_features=4096, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (fc2): ParametrizedLinear(\n in_features=4096, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (dropout2): Dropout(p=0.1, inplace=False)\n (drop_path2): StochasticDepth(p=0.0, mode=row)\n (norm2): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n )\n )\n )\n (pooler): XLMRobertaPooler(\n (dense): ParametrizedLinear(\n in_features=1024, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (activation): Tanh()\n )\n )\n )\n )\n (pooler): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})\n (normalizer): Normalize()\n)\n```\n\n## Usage\n\n### Direct Usage (Sentence Transformers)\n\nFirst install the Sentence Transformers library:\n\n```bash\npip install -U 
sentence-transformers\n```\n\nThen you can load this model and run inference.\n```python\nfrom sentence_transformers import SentenceTransformer\n\n# Download from the \ud83e\udd17 Hub\nmodel = SentenceTransformer(\"csanz91/lampistero_rag_embeddings\")\n# Run inference\nsentences = [\n '\u00bfQu\u00e9 porcentaje de aumento salarial reclamaba el Sindicato Minero en el conflicto de Utrillas que llev\u00f3 a plantear la huelga del 12 de octubre de 1930?',\n 'El Sindicato Minero reclamaba un aumento del 20% los sueldos en el conflicto de Utrillas.',\n 'Antonio Gargallo.',\n]\nembeddings = model.encode(sentences)\nprint(embeddings.shape)\n# [3, 1024]\n\n# Get the similarity scores for the embeddings\nsimilarities = model.similarity(embeddings, embeddings)\nprint(similarities.shape)\n# [3, 3]\n```\n\n\n\n\n\n\n\n## Evaluation\n\n### Metrics\n\n#### Information Retrieval\n\n* Dataset: `dim_1024`\n* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:\n ```json\n {\n \"truncate_dim\": 1024\n }\n ```\n\n| Metric | Value |\n|:--------------------|:-----------|\n| cosine_accuracy@1 | 0.7803 |\n| cosine_accuracy@3 | 0.8884 |\n| cosine_accuracy@5 | 0.904 |\n| cosine_accuracy@10 | 0.9234 |\n| cosine_precision@1 | 0.7803 |\n| cosine_precision@3 | 0.2961 |\n| cosine_precision@5 | 0.1808 |\n| cosine_precision@10 | 0.0923 |\n| cosine_recall@1 | 0.7803 |\n| cosine_recall@3 | 0.8884 |\n| cosine_recall@5 | 0.904 |\n| cosine_recall@10 | 0.9234 |\n| **cosine_ndcg@10** | **0.8576** |\n| cosine_mrr@10 | 0.8359 |\n| cosine_map@100 | 0.8374 |\n\n#### Information Retrieval\n\n* Dataset: `dim_768`\n* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:\n ```json\n {\n 
\"truncate_dim\": 768\n }\n ```\n\n| Metric | Value |\n|:--------------------|:-----------|\n| cosine_accuracy@1 | 0.7827 |\n| cosine_accuracy@3 | 0.8877 |\n| cosine_accuracy@5 | 0.9034 |\n| cosine_accuracy@10 | 0.9246 |\n| cosine_precision@1 | 0.7827 |\n| cosine_precision@3 | 0.2959 |\n| cosine_precision@5 | 0.1807 |\n| cosine_precision@10 | 0.0925 |\n| cosine_recall@1 | 0.7827 |\n| cosine_recall@3 | 0.8877 |\n| cosine_recall@5 | 0.9034 |\n| cosine_recall@10 | 0.9246 |\n| **cosine_ndcg@10** | **0.8588** |\n| cosine_mrr@10 | 0.8372 |\n| cosine_map@100 | 0.8385 |\n\n#### Information Retrieval\n\n* Dataset: `dim_512`\n* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:\n ```json\n {\n \"truncate_dim\": 512\n }\n ```\n\n| Metric | Value |\n|:--------------------|:-----------|\n| cosine_accuracy@1 | 0.7797 |\n| cosine_accuracy@3 | 0.8859 |\n| cosine_accuracy@5 | 0.901 |\n| cosine_accuracy@10 | 0.9228 |\n| cosine_precision@1 | 0.7797 |\n| cosine_precision@3 | 0.2953 |\n| cosine_precision@5 | 0.1802 |\n| cosine_precision@10 | 0.0923 |\n| cosine_recall@1 | 0.7797 |\n| cosine_recall@3 | 0.8859 |\n| cosine_recall@5 | 0.901 |\n| cosine_recall@10 | 0.9228 |\n| **cosine_ndcg@10** | **0.8564** |\n| cosine_mrr@10 | 0.8347 |\n| cosine_map@100 | 0.8362 |\n\n#### Information Retrieval\n\n* Dataset: `dim_256`\n* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:\n ```json\n {\n \"truncate_dim\": 256\n }\n ```\n\n| Metric | Value |\n|:--------------------|:-----------|\n| cosine_accuracy@1 | 0.7707 |\n| cosine_accuracy@3 | 0.8823 |\n| cosine_accuracy@5 | 0.9016 |\n| cosine_accuracy@10 | 0.9191 |\n| cosine_precision@1 | 0.7707 |\n| cosine_precision@3 | 0.2941 |\n| 
cosine_precision@5 | 0.1803 |\n| cosine_precision@10 | 0.0919 |\n| cosine_recall@1 | 0.7707 |\n| cosine_recall@3 | 0.8823 |\n| cosine_recall@5 | 0.9016 |\n| cosine_recall@10 | 0.9191 |\n| **cosine_ndcg@10** | **0.8512** |\n| cosine_mrr@10 | 0.8287 |\n| cosine_map@100 | 0.8303 |\n\n#### Information Retrieval\n\n* Dataset: `dim_128`\n* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:\n ```json\n {\n \"truncate_dim\": 128\n }\n ```\n\n| Metric | Value |\n|:--------------------|:-----------|\n| cosine_accuracy@1 | 0.7604 |\n| cosine_accuracy@3 | 0.869 |\n| cosine_accuracy@5 | 0.8902 |\n| cosine_accuracy@10 | 0.9131 |\n| cosine_precision@1 | 0.7604 |\n| cosine_precision@3 | 0.2897 |\n| cosine_precision@5 | 0.178 |\n| cosine_precision@10 | 0.0913 |\n| cosine_recall@1 | 0.7604 |\n| cosine_recall@3 | 0.869 |\n| cosine_recall@5 | 0.8902 |\n| cosine_recall@10 | 0.9131 |\n| **cosine_ndcg@10** | **0.8415** |\n| cosine_mrr@10 | 0.8181 |\n| cosine_map@100 | 0.82 |\n\n#### Information Retrieval\n\n* Dataset: `dim_64`\n* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:\n ```json\n {\n \"truncate_dim\": 64\n }\n ```\n\n| Metric | Value |\n|:--------------------|:-----------|\n| cosine_accuracy@1 | 0.7248 |\n| cosine_accuracy@3 | 0.8521 |\n| cosine_accuracy@5 | 0.8751 |\n| cosine_accuracy@10 | 0.8974 |\n| cosine_precision@1 | 0.7248 |\n| cosine_precision@3 | 0.284 |\n| cosine_precision@5 | 0.175 |\n| cosine_precision@10 | 0.0897 |\n| cosine_recall@1 | 0.7248 |\n| cosine_recall@3 | 0.8521 |\n| cosine_recall@5 | 0.8751 |\n| cosine_recall@10 | 0.8974 |\n| **cosine_ndcg@10** | **0.8182** |\n| cosine_mrr@10 | 0.792 |\n| cosine_map@100 | 0.7938 |\n\n\n\n\n\n## Training 
Details\n\n### Training Dataset\n\n#### json\n\n* Dataset: json\n* Size: 14,907 training samples\n* Columns: query and answer\n* Approximate statistics based on the first 1000 samples:\n | | query | answer |\n |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|\n | type | string | string |\n | details |
<ul><li>min: 9 tokens</li><li>mean: 25.88 tokens</li><li>max: 63 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 34.09 tokens</li><li>max: 340 tokens</li></ul> 
|\n* Samples:\n | query | answer |\n |:--------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------|\n | En Valdeconejos, \u00bfcu\u00e1l era la sociedad de agricultores en 1952? | En Valdeconejos, la sociedad de agricultores en 1952 era el P\u00f3sito de Agricultores. |\n | \u00bfQu\u00e9 nombres de capataces se registran en el pueblo de Escucha en el a\u00f1o 1952? | En Escucha, en 1952, los capataces registrados son Peralta (Manuel) y Rodriguez (Gonzalo). |\n | En el contexto de la miner\u00eda, \u00bfqu\u00e9 implica 'despajar'? | 'Despajar' se refiere a cribar a mano material y desechos para obtener las partes de carb\u00f3n que hay en ellos. |\n* Loss: [MatryoshkaLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:\n ```json\n {\n \"loss\": \"MultipleNegativesRankingLoss\",\n \"matryoshka_dims\": [\n 1024,\n 768,\n 512,\n 256,\n 128,\n 64\n ],\n \"matryoshka_weights\": [\n 1,\n 1,\n 1,\n 1,\n 1,\n 1\n ],\n \"n_dims_per_step\": -1\n }\n ```\n\n### Training Hyperparameters\n#### Non-Default Hyperparameters\n\n- `eval_strategy`: epoch\n- `per_device_train_batch_size`: 64\n- `per_device_eval_batch_size`: 16\n- `gradient_accumulation_steps`: 32\n- `learning_rate`: 2e-05\n- `num_train_epochs`: 12\n- `lr_scheduler_type`: cosine\n- `warmup_ratio`: 0.1\n- `tf32`: True\n- `load_best_model_at_end`: True\n- `optim`: adamw_torch_fused\n- `batch_sampler`: no_duplicates\n\n#### All Hyperparameters\n
Click to expand\n\n- `overwrite_output_dir`: False\n- `do_predict`: False\n- `eval_strategy`: epoch\n- `prediction_loss_only`: True\n- `per_device_train_batch_size`: 64\n- `per_device_eval_batch_size`: 16\n- `per_gpu_train_batch_size`: None\n- `per_gpu_eval_batch_size`: None\n- `gradient_accumulation_steps`: 32\n- `eval_accumulation_steps`: None\n- `torch_empty_cache_steps`: None\n- `learning_rate`: 2e-05\n- `weight_decay`: 0.0\n- `adam_beta1`: 0.9\n- `adam_beta2`: 0.999\n- `adam_epsilon`: 1e-08\n- `max_grad_norm`: 1.0\n- `num_train_epochs`: 12\n- `max_steps`: -1\n- `lr_scheduler_type`: cosine\n- `lr_scheduler_kwargs`: {}\n- `warmup_ratio`: 0.1\n- `warmup_steps`: 0\n- `log_level`: passive\n- `log_level_replica`: warning\n- `log_on_each_node`: True\n- `logging_nan_inf_filter`: True\n- `save_safetensors`: True\n- `save_on_each_node`: False\n- `save_only_model`: False\n- `restore_callback_states_from_checkpoint`: False\n- `no_cuda`: False\n- `use_cpu`: False\n- `use_mps_device`: False\n- `seed`: 42\n- `data_seed`: None\n- `jit_mode_eval`: False\n- `use_ipex`: False\n- `bf16`: False\n- `fp16`: False\n- `fp16_opt_level`: O1\n- `half_precision_backend`: auto\n- `bf16_full_eval`: False\n- `fp16_full_eval`: False\n- `tf32`: True\n- `local_rank`: 0\n- `ddp_backend`: None\n- `tpu_num_cores`: None\n- `tpu_metrics_debug`: False\n- `debug`: []\n- `dataloader_drop_last`: False\n- `dataloader_num_workers`: 0\n- `dataloader_prefetch_factor`: None\n- `past_index`: -1\n- `disable_tqdm`: False\n- `remove_unused_columns`: True\n- `label_names`: None\n- `load_best_model_at_end`: True\n- `ignore_data_skip`: False\n- `fsdp`: []\n- `fsdp_min_num_params`: 0\n- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}\n- `tp_size`: 0\n- `fsdp_transformer_layer_cls_to_wrap`: None\n- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 
'gradient_accumulation_kwargs': None}\n- `deepspeed`: None\n- `label_smoothing_factor`: 0.0\n- `optim`: adamw_torch_fused\n- `optim_args`: None\n- `adafactor`: False\n- `group_by_length`: False\n- `length_column_name`: length\n- `ddp_find_unused_parameters`: None\n- `ddp_bucket_cap_mb`: None\n- `ddp_broadcast_buffers`: False\n- `dataloader_pin_memory`: True\n- `dataloader_persistent_workers`: False\n- `skip_memory_metrics`: True\n- `use_legacy_prediction_loop`: False\n- `push_to_hub`: False\n- `resume_from_checkpoint`: None\n- `hub_model_id`: None\n- `hub_strategy`: every_save\n- `hub_private_repo`: None\n- `hub_always_push`: False\n- `gradient_checkpointing`: False\n- `gradient_checkpointing_kwargs`: None\n- `include_inputs_for_metrics`: False\n- `include_for_metrics`: []\n- `eval_do_concat_batches`: True\n- `fp16_backend`: auto\n- `push_to_hub_model_id`: None\n- `push_to_hub_organization`: None\n- `mp_parameters`: \n- `auto_find_batch_size`: False\n- `full_determinism`: False\n- `torchdynamo`: None\n- `ray_scope`: last\n- `ddp_timeout`: 1800\n- `torch_compile`: False\n- `torch_compile_backend`: None\n- `torch_compile_mode`: None\n- `include_tokens_per_second`: False\n- `include_num_input_tokens_seen`: False\n- `neftune_noise_alpha`: None\n- `optim_target_modules`: None\n- `batch_eval_metrics`: False\n- `eval_on_start`: False\n- `use_liger_kernel`: False\n- `eval_use_gather_object`: False\n- `average_tokens_across_devices`: False\n- `prompts`: None\n- `batch_sampler`: no_duplicates\n- `multi_dataset_batch_sampler`: proportional\n\n
\n\n### Training Logs\n| Epoch | Step | Training Loss | dim_1024_cosine_ndcg@10 | dim_768_cosine_ndcg@10 | dim_512_cosine_ndcg@10 | dim_256_cosine_ndcg@10 | dim_128_cosine_ndcg@10 | dim_64_cosine_ndcg@10 |\n|:-------:|:----:|:-------------:|:-----------------------:|:----------------------:|:----------------------:|:----------------------:|:----------------------:|:---------------------:|\n| 1.0 | 8 | - | 0.7663 | 0.7676 | 0.7656 | 0.7626 | 0.7393 | 0.6969 |\n| 1.2747 | 10 | 127.0406 | - | - | - | - | - | - |\n| 2.0 | 16 | - | 0.8244 | 0.8240 | 0.8226 | 0.8172 | 0.8060 | 0.7775 |\n| 2.5494 | 20 | 38.8995 | - | - | - | - | - | - |\n| 3.0 | 24 | - | 0.8425 | 0.8426 | 0.8444 | 0.8373 | 0.8252 | 0.7996 |\n| 3.8240 | 30 | 20.1528 | - | - | - | - | - | - |\n| 4.0 | 32 | - | 0.8526 | 0.8520 | 0.8498 | 0.8456 | 0.8289 | 0.8037 |\n| 5.0 | 40 | 14.0513 | 0.8550 | 0.8543 | 0.8517 | 0.8490 | 0.8368 | 0.8139 |\n| 6.0 | 48 | - | 0.8572 | 0.8565 | 0.8557 | 0.8520 | 0.8404 | 0.8170 |\n| 6.2747 | 50 | 13.364 | - | - | - | - | - | - |\n| 7.0 | 56 | - | 0.8579 | 0.8576 | 0.8553 | 0.8514 | 0.8422 | 0.8180 |\n| 7.5494 | 60 | 12.7986 | - | - | - | - | - | - |\n| 8.0 | 64 | - | 0.8573 | 0.8580 | 0.8560 | 0.8523 | 0.8414 | 0.8178 |\n| 8.8240 | 70 | 12.0091 | - | - | - | - | - | - |\n| 9.0 | 72 | - | 0.8578 | 0.8586 | 0.8562 | 0.8519 | 0.8423 | 0.8184 |\n| 10.0 | 80 | 10.9468 | 0.8583 | 0.8589 | 0.8565 | 0.8530 | 0.8413 | 0.8191 |\n| 10.5494 | 84 | - | 0.8576 | 0.8588 | 0.8564 | 0.8512 | 0.8415 | 0.8182 |\n\n\n### Framework Versions\n- Python: 3.12.10\n- Sentence Transformers: 4.1.0\n- Transformers: 4.51.3\n- PyTorch: 2.7.0+cu126\n- Accelerate: 1.7.0\n- Datasets: 3.6.0\n- Tokenizers: 0.21.1\n\n## Citation\n\n### BibTeX\n\n#### Sentence Transformers\n```bibtex\n@inproceedings{reimers-2019-sentence-bert,\n title = \"Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks\",\n author = \"Reimers, Nils and Gurevych, Iryna\",\n booktitle = \"Proceedings of the 2019 Conference on 
Empirical Methods in Natural Language Processing\",\n month = \"11\",\n year = \"2019\",\n publisher = \"Association for Computational Linguistics\",\n url = \"https://arxiv.org/abs/1908.10084\",\n}\n```\n\n#### MatryoshkaLoss\n```bibtex\n@misc{kusupati2024matryoshka,\n title={Matryoshka Representation Learning},\n author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},\n year={2024},\n eprint={2205.13147},\n archivePrefix={arXiv},\n primaryClass={cs.LG}\n}\n```\n\n#### MultipleNegativesRankingLoss\n```bibtex\n@misc{henderson2017efficient,\n title={Efficient Natural Language Response Suggestion for Smart Reply},\n author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},\n year={2017},\n eprint={1705.00652},\n archivePrefix={arXiv},\n primaryClass={cs.CL}\n}\n```\n\n\n\n\n\n", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "csanz91/lampistero_rag_embeddings", "base_model_relation": "base" }, { "model_id": "csanz91/lampistero_rag_embeddings_2", "gated": "False", "card": "---\nlanguage:\n- es\nlicense: apache-2.0\ntags:\n- sentence-transformers\n- sentence-similarity\n- feature-extraction\n- generated_from_trainer\n- dataset_size:14907\n- loss:MatryoshkaLoss\n- loss:MultipleNegativesRankingLoss\nbase_model: jinaai/jina-embeddings-v3\nwidget:\n- source_sentence: \u00bfQu\u00e9 caracter\u00edstica especial ten\u00eda la escultura del 'Torico' creada\n por Pedro Blesa?\n sentences:\n - 'Despu\u00e9s de dorar el conejo en la receta de Conejo escabechado, en la misma 
sart\u00e9n\n se rehogan los ajos, con el laurel y la pimienta.\n\n '\n - Rafael Barcel\u00f3n se encargaba del servicio de electricidad en Valdeconejos en 1951.\n - La escultura del 'Torico' creada por Pedro Blesa era un anaglifo, visible en 3D\n con gafas especiales.\n- source_sentence: \u00bfPor qu\u00e9 cantidad adquiri\u00f3 Francisco Santacruz la mina Escuadra\n en la subasta p\u00fablica?\n sentences:\n - Despu\u00e9s de la temporada 1986-87, el equipo descendi\u00f3, lo que provoc\u00f3 su desaparici\u00f3n\n del campeonato en la temporada 1987-88.\n - '''Al bies'' significa en diagonal.'\n - Francisco Santacruz adquiri\u00f3 la mina Escuadra por la cantidad de 931 pesetas.\n- source_sentence: \u00bfQui\u00e9n se desempe\u00f1aba como fiscal en el ayuntamiento de Escucha\n en el a\u00f1o 1916?\n sentences:\n - El autor mencionado para la receta Sopas de ajo es Teo Martin Lafuente.\n - En Escucha en 1916, D. Joaqu\u00edn Latorre del R\u00edo se desempe\u00f1aba como fiscal.\n - Felipe Mall\u00e9n era el farmac\u00e9utico en Valdeconejos en 1928.\n- source_sentence: \u00bfQu\u00e9 informaci\u00f3n transmiten los 'toques' en la ca\u00f1a de un pozo\n durante las operaciones mineras?\n sentences:\n - Juan Pedro Mart\u00edn encontr\u00f3 fragmentos de carb\u00f3n de piedra en el paraje de El Horcajo.\n - Se public\u00f3 en 1970 por Ediciones Cultura y Acci\u00f3n. 
CNT.\n - 'Los ''toques'' son se\u00f1ales que se hacen en la ca\u00f1a del pozo para las distintas\n operaciones 1: alto 2: arriba 3: abajo 1+2: despacio arriba 1+3: despacio abajo\n 4+2: personal arriba 4+3: personal abajo 4+1+2: se\u00f1alista en jaula arriba 4+1+3:\n se\u00f1alista en jaula abajo 5: jaula libre 6: maniobra'\n- source_sentence: \u00bfEn qu\u00e9 a\u00f1o se demarc\u00f3 y reconoci\u00f3 la mina 'El Pilar'?\n sentences:\n - Seg\u00fan la quinta demanda del SOMM, todas compa\u00f1\u00edas mineras deb\u00edan entregar a todos\n sus obreros un libramiento de liquidaci\u00f3n mensual\n - '''Tontiar'' significa cuando dos j\u00f3venes empiezan con un noviazgo.'\n - La mina 'El Pilar' se demarc\u00f3 y reconoci\u00f3 en 1857.\npipeline_tag: sentence-similarity\nlibrary_name: sentence-transformers\nmetrics:\n- cosine_accuracy@1\n- cosine_accuracy@3\n- cosine_accuracy@5\n- cosine_accuracy@10\n- cosine_precision@1\n- cosine_precision@3\n- cosine_precision@5\n- cosine_precision@10\n- cosine_recall@1\n- cosine_recall@3\n- cosine_recall@5\n- cosine_recall@10\n- cosine_ndcg@10\n- cosine_mrr@10\n- cosine_map@100\nmodel-index:\n- name: Lampistero\n results:\n - task:\n type: information-retrieval\n name: Information Retrieval\n dataset:\n name: dim 1024\n type: dim_1024\n metrics:\n - type: cosine_accuracy@1\n value: 0.7700663850331925\n name: Cosine Accuracy@1\n - type: cosine_accuracy@3\n value: 0.8925769462884732\n name: Cosine Accuracy@3\n - type: cosine_accuracy@5\n value: 0.9155099577549789\n name: Cosine Accuracy@5\n - type: cosine_accuracy@10\n value: 0.9330114665057333\n name: Cosine Accuracy@10\n - type: cosine_precision@1\n value: 0.7700663850331925\n name: Cosine Precision@1\n - type: cosine_precision@3\n value: 0.2975256487628244\n name: Cosine Precision@3\n - type: cosine_precision@5\n value: 0.18310199155099577\n name: Cosine Precision@5\n - type: cosine_precision@10\n value: 0.09330114665057333\n name: Cosine Precision@10\n - type: 
cosine_recall@1\n value: 0.7700663850331925\n name: Cosine Recall@1\n - type: cosine_recall@3\n value: 0.8925769462884732\n name: Cosine Recall@3\n - type: cosine_recall@5\n value: 0.9155099577549789\n name: Cosine Recall@5\n - type: cosine_recall@10\n value: 0.9330114665057333\n name: Cosine Recall@10\n - type: cosine_ndcg@10\n value: 0.8578914781807897\n name: Cosine Ndcg@10\n - type: cosine_mrr@10\n value: 0.8330619976817926\n name: Cosine Mrr@10\n - type: cosine_map@100\n value: 0.8343424106284848\n name: Cosine Map@100\n - task:\n type: information-retrieval\n name: Information Retrieval\n dataset:\n name: dim 768\n type: dim_768\n metrics:\n - type: cosine_accuracy@1\n value: 0.7694628847314424\n name: Cosine Accuracy@1\n - type: cosine_accuracy@3\n value: 0.8889559444779722\n name: Cosine Accuracy@3\n - type: cosine_accuracy@5\n value: 0.9124924562462281\n name: Cosine Accuracy@5\n - type: cosine_accuracy@10\n value: 0.9330114665057333\n name: Cosine Accuracy@10\n - type: cosine_precision@1\n value: 0.7694628847314424\n name: Cosine Precision@1\n - type: cosine_precision@3\n value: 0.29631864815932407\n name: Cosine Precision@3\n - type: cosine_precision@5\n value: 0.1824984912492456\n name: Cosine Precision@5\n - type: cosine_precision@10\n value: 0.09330114665057332\n name: Cosine Precision@10\n - type: cosine_recall@1\n value: 0.7694628847314424\n name: Cosine Recall@1\n - type: cosine_recall@3\n value: 0.8889559444779722\n name: Cosine Recall@3\n - type: cosine_recall@5\n value: 0.9124924562462281\n name: Cosine Recall@5\n - type: cosine_recall@10\n value: 0.9330114665057333\n name: Cosine Recall@10\n - type: cosine_ndcg@10\n value: 0.8571049923900239\n name: Cosine Ndcg@10\n - type: cosine_mrr@10\n value: 0.8320899311243306\n name: Cosine Mrr@10\n - type: cosine_map@100\n value: 0.8333457816447034\n name: Cosine Map@100\n - task:\n type: information-retrieval\n name: Information Retrieval\n dataset:\n name: dim 512\n type: dim_512\n metrics:\n - type: 
cosine_accuracy@1\n value: 0.7682558841279421\n name: Cosine Accuracy@1\n - type: cosine_accuracy@3\n value: 0.8865419432709717\n name: Cosine Accuracy@3\n - type: cosine_accuracy@5\n value: 0.9112854556427278\n name: Cosine Accuracy@5\n - type: cosine_accuracy@10\n value: 0.9305974652987327\n name: Cosine Accuracy@10\n - type: cosine_precision@1\n value: 0.7682558841279421\n name: Cosine Precision@1\n - type: cosine_precision@3\n value: 0.2955139810903239\n name: Cosine Precision@3\n - type: cosine_precision@5\n value: 0.18225709112854557\n name: Cosine Precision@5\n - type: cosine_precision@10\n value: 0.09305974652987326\n name: Cosine Precision@10\n - type: cosine_recall@1\n value: 0.7682558841279421\n name: Cosine Recall@1\n - type: cosine_recall@3\n value: 0.8865419432709717\n name: Cosine Recall@3\n - type: cosine_recall@5\n value: 0.9112854556427278\n name: Cosine Recall@5\n - type: cosine_recall@10\n value: 0.9305974652987327\n name: Cosine Recall@10\n - type: cosine_ndcg@10\n value: 0.8555277012951626\n name: Cosine Ndcg@10\n - type: cosine_mrr@10\n value: 0.8307227155597702\n name: Cosine Mrr@10\n - type: cosine_map@100\n value: 0.8321030396467847\n name: Cosine Map@100\n - task:\n type: information-retrieval\n name: Information Retrieval\n dataset:\n name: dim 256\n type: dim_256\n metrics:\n - type: cosine_accuracy@1\n value: 0.764031382015691\n name: Cosine Accuracy@1\n - type: cosine_accuracy@3\n value: 0.8901629450814725\n name: Cosine Accuracy@3\n - type: cosine_accuracy@5\n value: 0.9082679541339771\n name: Cosine Accuracy@5\n - type: cosine_accuracy@10\n value: 0.9299939649969825\n name: Cosine Accuracy@10\n - type: cosine_precision@1\n value: 0.764031382015691\n name: Cosine Precision@1\n - type: cosine_precision@3\n value: 0.2967209816938242\n name: Cosine Precision@3\n - type: cosine_precision@5\n value: 0.1816535908267954\n name: Cosine Precision@5\n - type: cosine_precision@10\n value: 0.09299939649969825\n name: Cosine Precision@10\n - 
type: cosine_recall@1\n value: 0.764031382015691\n name: Cosine Recall@1\n - type: cosine_recall@3\n value: 0.8901629450814725\n name: Cosine Recall@3\n - type: cosine_recall@5\n value: 0.9082679541339771\n name: Cosine Recall@5\n - type: cosine_recall@10\n value: 0.9299939649969825\n name: Cosine Recall@10\n - type: cosine_ndcg@10\n value: 0.8535167149096011\n name: Cosine Ndcg@10\n - type: cosine_mrr@10\n value: 0.8282907530342651\n name: Cosine Mrr@10\n - type: cosine_map@100\n value: 0.8296119986031772\n name: Cosine Map@100\n - task:\n type: information-retrieval\n name: Information Retrieval\n dataset:\n name: dim 128\n type: dim_128\n metrics:\n - type: cosine_accuracy@1\n value: 0.7447193723596862\n name: Cosine Accuracy@1\n - type: cosine_accuracy@3\n value: 0.8768859384429692\n name: Cosine Accuracy@3\n - type: cosine_accuracy@5\n value: 0.9028364514182257\n name: Cosine Accuracy@5\n - type: cosine_accuracy@10\n value: 0.9215449607724804\n name: Cosine Accuracy@10\n - type: cosine_precision@1\n value: 0.7447193723596862\n name: Cosine Precision@1\n - type: cosine_precision@3\n value: 0.2922953128143231\n name: Cosine Precision@3\n - type: cosine_precision@5\n value: 0.1805672902836451\n name: Cosine Precision@5\n - type: cosine_precision@10\n value: 0.09215449607724803\n name: Cosine Precision@10\n - type: cosine_recall@1\n value: 0.7447193723596862\n name: Cosine Recall@1\n - type: cosine_recall@3\n value: 0.8768859384429692\n name: Cosine Recall@3\n - type: cosine_recall@5\n value: 0.9028364514182257\n name: Cosine Recall@5\n - type: cosine_recall@10\n value: 0.9215449607724804\n name: Cosine Recall@10\n - type: cosine_ndcg@10\n value: 0.8402664516336745\n name: Cosine Ndcg@10\n - type: cosine_mrr@10\n value: 0.8133905221714518\n name: Cosine Mrr@10\n - type: cosine_map@100\n value: 0.8148588407289652\n name: Cosine Map@100\n - task:\n type: information-retrieval\n name: Information Retrieval\n dataset:\n name: dim 64\n type: dim_64\n metrics:\n - type: 
cosine_accuracy@1\n value: 0.7103198551599276\n name: Cosine Accuracy@1\n - type: cosine_accuracy@3\n value: 0.8491249245624622\n name: Cosine Accuracy@3\n - type: cosine_accuracy@5\n value: 0.8780929390464696\n name: Cosine Accuracy@5\n - type: cosine_accuracy@10\n value: 0.899818949909475\n name: Cosine Accuracy@10\n - type: cosine_precision@1\n value: 0.7103198551599276\n name: Cosine Precision@1\n - type: cosine_precision@3\n value: 0.2830416415208208\n name: Cosine Precision@3\n - type: cosine_precision@5\n value: 0.1756185878092939\n name: Cosine Precision@5\n - type: cosine_precision@10\n value: 0.08998189499094747\n name: Cosine Precision@10\n - type: cosine_recall@1\n value: 0.7103198551599276\n name: Cosine Recall@1\n - type: cosine_recall@3\n value: 0.8491249245624622\n name: Cosine Recall@3\n - type: cosine_recall@5\n value: 0.8780929390464696\n name: Cosine Recall@5\n - type: cosine_recall@10\n value: 0.899818949909475\n name: Cosine Recall@10\n - type: cosine_ndcg@10\n value: 0.8119294706592789\n name: Cosine Ndcg@10\n - type: cosine_mrr@10\n value: 0.7829293234091058\n name: Cosine Mrr@10\n - type: cosine_map@100\n value: 0.7850878407159746\n name: Cosine Map@100\n---\n\n# Lampistero\n\nThis is a [sentence-transformers](https://www.SBERT.net) model finetuned from [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) on the json dataset. 
It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.\n\n## Model Details\n\n### Model Description\n- **Model Type:** Sentence Transformer\n- **Base model:** [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) \n- **Maximum Sequence Length:** 8194 tokens\n- **Output Dimensionality:** 1024 dimensions\n- **Similarity Function:** Cosine Similarity\n- **Training Dataset:**\n - json\n- **Language:** es\n- **License:** apache-2.0\n\n### Model Sources\n\n- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)\n- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)\n- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)\n\n### Full Model Architecture\n\n```\nSentenceTransformer(\n (transformer): Transformer(\n (auto_model): XLMRobertaLoRA(\n (roberta): XLMRobertaModel(\n (embeddings): XLMRobertaEmbeddings(\n (word_embeddings): ParametrizedEmbedding(\n 250002, 1024, padding_idx=1\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (token_type_embeddings): ParametrizedEmbedding(\n 1, 1024\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (emb_drop): Dropout(p=0.1, inplace=False)\n (emb_ln): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n (encoder): XLMRobertaEncoder(\n (layers): ModuleList(\n (0-23): 24 x Block(\n (mixer): MHA(\n (rotary_emb): RotaryEmbedding()\n (Wqkv): ParametrizedLinearResidual(\n in_features=1024, out_features=3072, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (inner_attn): FlashSelfAttention(\n (drop): Dropout(p=0.1, inplace=False)\n )\n 
(inner_cross_attn): FlashCrossAttention(\n (drop): Dropout(p=0.1, inplace=False)\n )\n (out_proj): ParametrizedLinear(\n in_features=1024, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (dropout1): Dropout(p=0.1, inplace=False)\n (drop_path1): StochasticDepth(p=0.0, mode=row)\n (norm1): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n (mlp): Mlp(\n (fc1): ParametrizedLinear(\n in_features=1024, out_features=4096, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (fc2): ParametrizedLinear(\n in_features=4096, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n )\n (dropout2): Dropout(p=0.1, inplace=False)\n (drop_path2): StochasticDepth(p=0.0, mode=row)\n (norm2): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)\n )\n )\n )\n (pooler): XLMRobertaPooler(\n (dense): ParametrizedLinear(\n in_features=1024, out_features=1024, bias=True\n (parametrizations): ModuleDict(\n (weight): ParametrizationList(\n (0): LoRAParametrization()\n )\n )\n )\n (activation): Tanh()\n )\n )\n )\n )\n (pooler): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})\n (normalizer): Normalize()\n)\n```\n\n## Usage\n\n### Direct Usage (Sentence Transformers)\n\nFirst install the Sentence Transformers library:\n\n```bash\npip install -U sentence-transformers\n```\n\nThen you can load this model and run inference.\n```python\nfrom sentence_transformers import SentenceTransformer\n\n# Download from the \ud83e\udd17 Hub\nmodel = SentenceTransformer(\"csanz91/lampistero_rag_embeddings_2\")\n# Run 
inference\nsentences = [\n \"\u00bfEn qu\u00e9 a\u00f1o se demarc\u00f3 y reconoci\u00f3 la mina 'El Pilar'?\",\n \"La mina 'El Pilar' se demarc\u00f3 y reconoci\u00f3 en 1857.\",\n 'Seg\u00fan la quinta demanda del SOMM, todas compa\u00f1\u00edas mineras deb\u00edan entregar a todos sus obreros un libramiento de liquidaci\u00f3n mensual',\n]\nembeddings = model.encode(sentences)\nprint(embeddings.shape)\n# [3, 1024]\n\n# Get the similarity scores for the embeddings\nsimilarities = model.similarity(embeddings, embeddings)\nprint(similarities.shape)\n# [3, 3]\n```\n\n\n\n\n\n\n\n## Evaluation\n\n### Metrics\n\n#### Information Retrieval\n\n* Dataset: `dim_1024`\n* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:\n ```json\n {\n \"truncate_dim\": 1024\n }\n ```\n\n| Metric | Value |\n|:--------------------|:-----------|\n| cosine_accuracy@1 | 0.7701 |\n| cosine_accuracy@3 | 0.8926 |\n| cosine_accuracy@5 | 0.9155 |\n| cosine_accuracy@10 | 0.933 |\n| cosine_precision@1 | 0.7701 |\n| cosine_precision@3 | 0.2975 |\n| cosine_precision@5 | 0.1831 |\n| cosine_precision@10 | 0.0933 |\n| cosine_recall@1 | 0.7701 |\n| cosine_recall@3 | 0.8926 |\n| cosine_recall@5 | 0.9155 |\n| cosine_recall@10 | 0.933 |\n| **cosine_ndcg@10** | **0.8579** |\n| cosine_mrr@10 | 0.8331 |\n| cosine_map@100 | 0.8343 |\n\n#### Information Retrieval\n\n* Dataset: `dim_768`\n* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:\n ```json\n {\n \"truncate_dim\": 768\n }\n ```\n\n| Metric | Value |\n|:--------------------|:-----------|\n| cosine_accuracy@1 | 0.7695 |\n| cosine_accuracy@3 | 0.889 |\n| cosine_accuracy@5 | 0.9125 |\n| cosine_accuracy@10 | 0.933 |\n| cosine_precision@1 | 
0.7695 |\n| cosine_precision@3 | 0.2963 |\n| cosine_precision@5 | 0.1825 |\n| cosine_precision@10 | 0.0933 |\n| cosine_recall@1 | 0.7695 |\n| cosine_recall@3 | 0.889 |\n| cosine_recall@5 | 0.9125 |\n| cosine_recall@10 | 0.933 |\n| **cosine_ndcg@10** | **0.8571** |\n| cosine_mrr@10 | 0.8321 |\n| cosine_map@100 | 0.8333 |\n\n#### Information Retrieval\n\n* Dataset: `dim_512`\n* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:\n ```json\n {\n \"truncate_dim\": 512\n }\n ```\n\n| Metric | Value |\n|:--------------------|:-----------|\n| cosine_accuracy@1 | 0.7683 |\n| cosine_accuracy@3 | 0.8865 |\n| cosine_accuracy@5 | 0.9113 |\n| cosine_accuracy@10 | 0.9306 |\n| cosine_precision@1 | 0.7683 |\n| cosine_precision@3 | 0.2955 |\n| cosine_precision@5 | 0.1823 |\n| cosine_precision@10 | 0.0931 |\n| cosine_recall@1 | 0.7683 |\n| cosine_recall@3 | 0.8865 |\n| cosine_recall@5 | 0.9113 |\n| cosine_recall@10 | 0.9306 |\n| **cosine_ndcg@10** | **0.8555** |\n| cosine_mrr@10 | 0.8307 |\n| cosine_map@100 | 0.8321 |\n\n#### Information Retrieval\n\n* Dataset: `dim_256`\n* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:\n ```json\n {\n \"truncate_dim\": 256\n }\n ```\n\n| Metric | Value |\n|:--------------------|:-----------|\n| cosine_accuracy@1 | 0.764 |\n| cosine_accuracy@3 | 0.8902 |\n| cosine_accuracy@5 | 0.9083 |\n| cosine_accuracy@10 | 0.93 |\n| cosine_precision@1 | 0.764 |\n| cosine_precision@3 | 0.2967 |\n| cosine_precision@5 | 0.1817 |\n| cosine_precision@10 | 0.093 |\n| cosine_recall@1 | 0.764 |\n| cosine_recall@3 | 0.8902 |\n| cosine_recall@5 | 0.9083 |\n| cosine_recall@10 | 0.93 |\n| **cosine_ndcg@10** | **0.8535** |\n| cosine_mrr@10 | 0.8283 |\n| 
cosine_map@100 | 0.8296 |\n\n#### Information Retrieval\n\n* Dataset: `dim_128`\n* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:\n ```json\n {\n \"truncate_dim\": 128\n }\n ```\n\n| Metric | Value |\n|:--------------------|:-----------|\n| cosine_accuracy@1 | 0.7447 |\n| cosine_accuracy@3 | 0.8769 |\n| cosine_accuracy@5 | 0.9028 |\n| cosine_accuracy@10 | 0.9215 |\n| cosine_precision@1 | 0.7447 |\n| cosine_precision@3 | 0.2923 |\n| cosine_precision@5 | 0.1806 |\n| cosine_precision@10 | 0.0922 |\n| cosine_recall@1 | 0.7447 |\n| cosine_recall@3 | 0.8769 |\n| cosine_recall@5 | 0.9028 |\n| cosine_recall@10 | 0.9215 |\n| **cosine_ndcg@10** | **0.8403** |\n| cosine_mrr@10 | 0.8134 |\n| cosine_map@100 | 0.8149 |\n\n#### Information Retrieval\n\n* Dataset: `dim_64`\n* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:\n ```json\n {\n \"truncate_dim\": 64\n }\n ```\n\n| Metric | Value |\n|:--------------------|:-----------|\n| cosine_accuracy@1 | 0.7103 |\n| cosine_accuracy@3 | 0.8491 |\n| cosine_accuracy@5 | 0.8781 |\n| cosine_accuracy@10 | 0.8998 |\n| cosine_precision@1 | 0.7103 |\n| cosine_precision@3 | 0.283 |\n| cosine_precision@5 | 0.1756 |\n| cosine_precision@10 | 0.09 |\n| cosine_recall@1 | 0.7103 |\n| cosine_recall@3 | 0.8491 |\n| cosine_recall@5 | 0.8781 |\n| cosine_recall@10 | 0.8998 |\n| **cosine_ndcg@10** | **0.8119** |\n| cosine_mrr@10 | 0.7829 |\n| cosine_map@100 | 0.7851 |\n\n\n\n\n\n## Training Details\n\n### Training Dataset\n\n#### json\n\n* Dataset: json\n* Size: 14,907 training samples\n* Columns: query and answer\n* Approximate statistics based on the first 1000 samples:\n | | query | answer |\n 
|:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|\n | type | string | string |\n | details |
<ul><li>min: 9 tokens</li><li>mean: 26.09 tokens</li><li>max: 66 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 34.02 tokens</li><li>max: 405 tokens</li></ul>
|\n* Samples:\n | query | answer |\n |:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------|\n | \u00bfQu\u00e9 tipos de palas se utilizan para cargar el carb\u00f3n y el mineral? | Se utiliza una pala convencional y una pala hidr\u00e1ulica, esta \u00faltima descarga sobre un p\u00e1ncer, puede hacerlo lateralmente y se desplaza sobre ruedas u oruga. |\n | Tras el cierre de la tejer\u00eda de Florencio Salvador, \u00bfde d\u00f3nde procedieron finalmente los ladrillos para las doscientas diez viviendas construidas en Utrillas? | Los ladrillos y material para las doscientas diez viviendas construidas en Utrillas procedieron finalmente de Letux, Zaragoza . |\n | \u00bfCu\u00e1l es el formato de los juegos infantiles que se est\u00e1n preparando para el verano en Escucha en 2021? | Los juegos infantiles que se est\u00e1n preparando para el verano en Escucha en 2021 est\u00e1n en formato revista. 
|\n* Loss: [MatryoshkaLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:\n ```json\n {\n \"loss\": \"MultipleNegativesRankingLoss\",\n \"matryoshka_dims\": [\n 1024,\n 768,\n 512,\n 256,\n 128,\n 64\n ],\n \"matryoshka_weights\": [\n 1,\n 1,\n 1,\n 1,\n 1,\n 1\n ],\n \"n_dims_per_step\": -1\n }\n ```\n\n### Training Hyperparameters\n#### Non-Default Hyperparameters\n\n- `eval_strategy`: epoch\n- `per_device_train_batch_size`: 64\n- `per_device_eval_batch_size`: 16\n- `gradient_accumulation_steps`: 32\n- `learning_rate`: 2e-05\n- `num_train_epochs`: 8\n- `lr_scheduler_type`: cosine\n- `warmup_ratio`: 0.1\n- `tf32`: True\n- `load_best_model_at_end`: True\n- `optim`: adamw_torch_fused\n- `batch_sampler`: no_duplicates\n\n#### All Hyperparameters\n
Click to expand\n\n- `overwrite_output_dir`: False\n- `do_predict`: False\n- `eval_strategy`: epoch\n- `prediction_loss_only`: True\n- `per_device_train_batch_size`: 64\n- `per_device_eval_batch_size`: 16\n- `per_gpu_train_batch_size`: None\n- `per_gpu_eval_batch_size`: None\n- `gradient_accumulation_steps`: 32\n- `eval_accumulation_steps`: None\n- `torch_empty_cache_steps`: None\n- `learning_rate`: 2e-05\n- `weight_decay`: 0.0\n- `adam_beta1`: 0.9\n- `adam_beta2`: 0.999\n- `adam_epsilon`: 1e-08\n- `max_grad_norm`: 1.0\n- `num_train_epochs`: 8\n- `max_steps`: -1\n- `lr_scheduler_type`: cosine\n- `lr_scheduler_kwargs`: {}\n- `warmup_ratio`: 0.1\n- `warmup_steps`: 0\n- `log_level`: passive\n- `log_level_replica`: warning\n- `log_on_each_node`: True\n- `logging_nan_inf_filter`: True\n- `save_safetensors`: True\n- `save_on_each_node`: False\n- `save_only_model`: False\n- `restore_callback_states_from_checkpoint`: False\n- `no_cuda`: False\n- `use_cpu`: False\n- `use_mps_device`: False\n- `seed`: 42\n- `data_seed`: None\n- `jit_mode_eval`: False\n- `use_ipex`: False\n- `bf16`: False\n- `fp16`: False\n- `fp16_opt_level`: O1\n- `half_precision_backend`: auto\n- `bf16_full_eval`: False\n- `fp16_full_eval`: False\n- `tf32`: True\n- `local_rank`: 0\n- `ddp_backend`: None\n- `tpu_num_cores`: None\n- `tpu_metrics_debug`: False\n- `debug`: []\n- `dataloader_drop_last`: False\n- `dataloader_num_workers`: 0\n- `dataloader_prefetch_factor`: None\n- `past_index`: -1\n- `disable_tqdm`: False\n- `remove_unused_columns`: True\n- `label_names`: None\n- `load_best_model_at_end`: True\n- `ignore_data_skip`: False\n- `fsdp`: []\n- `fsdp_min_num_params`: 0\n- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}\n- `tp_size`: 0\n- `fsdp_transformer_layer_cls_to_wrap`: None\n- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 
'gradient_accumulation_kwargs': None}\n- `deepspeed`: None\n- `label_smoothing_factor`: 0.0\n- `optim`: adamw_torch_fused\n- `optim_args`: None\n- `adafactor`: False\n- `group_by_length`: False\n- `length_column_name`: length\n- `ddp_find_unused_parameters`: None\n- `ddp_bucket_cap_mb`: None\n- `ddp_broadcast_buffers`: False\n- `dataloader_pin_memory`: True\n- `dataloader_persistent_workers`: False\n- `skip_memory_metrics`: True\n- `use_legacy_prediction_loop`: False\n- `push_to_hub`: False\n- `resume_from_checkpoint`: None\n- `hub_model_id`: None\n- `hub_strategy`: every_save\n- `hub_private_repo`: None\n- `hub_always_push`: False\n- `gradient_checkpointing`: False\n- `gradient_checkpointing_kwargs`: None\n- `include_inputs_for_metrics`: False\n- `include_for_metrics`: []\n- `eval_do_concat_batches`: True\n- `fp16_backend`: auto\n- `push_to_hub_model_id`: None\n- `push_to_hub_organization`: None\n- `mp_parameters`: \n- `auto_find_batch_size`: False\n- `full_determinism`: False\n- `torchdynamo`: None\n- `ray_scope`: last\n- `ddp_timeout`: 1800\n- `torch_compile`: False\n- `torch_compile_backend`: None\n- `torch_compile_mode`: None\n- `include_tokens_per_second`: False\n- `include_num_input_tokens_seen`: False\n- `neftune_noise_alpha`: None\n- `optim_target_modules`: None\n- `batch_eval_metrics`: False\n- `eval_on_start`: False\n- `use_liger_kernel`: False\n- `eval_use_gather_object`: False\n- `average_tokens_across_devices`: False\n- `prompts`: None\n- `batch_sampler`: no_duplicates\n- `multi_dataset_batch_sampler`: proportional\n\n
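The `matryoshka_dims` list in the MatryoshkaLoss configuration is what makes the truncated-dimension evaluations (`dim_768` through `dim_64`) meaningful: a prefix of the full 1024-d vector is kept and re-normalized before cosine similarity is computed. A minimal sketch of that truncation step, using random stand-in vectors (the `truncate_and_normalize` helper is illustrative, not a library API):

```python
import numpy as np

def truncate_and_normalize(emb: np.ndarray, dim: int) -> np.ndarray:
    """Keep the first `dim` components and re-normalize to unit length.

    Mirrors how the dim_768 ... dim_64 evaluations in this card are produced:
    cosine similarity on a truncated prefix of the 1024-d embedding.
    """
    t = np.asarray(emb)[..., :dim]
    return t / np.linalg.norm(t, axis=-1, keepdims=True)

# Stand-in "embeddings"; a real model would produce these via model.encode().
rng = np.random.default_rng(0)
a, b = rng.normal(size=1024), rng.normal(size=1024)

for dim in [1024, 768, 512, 256, 128, 64]:
    ta, tb = truncate_and_normalize(a, dim), truncate_and_normalize(b, dim)
    print(dim, round(float(ta @ tb), 4))
```

In practice, recent Sentence Transformers versions expose the same behavior directly via the `truncate_dim` argument when constructing `SentenceTransformer`, which is how the `InformationRetrievalEvaluator` runs above were configured.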
\n\n### Training Logs\n| Epoch | Step | Training Loss | dim_1024_cosine_ndcg@10 | dim_768_cosine_ndcg@10 | dim_512_cosine_ndcg@10 | dim_256_cosine_ndcg@10 | dim_128_cosine_ndcg@10 | dim_64_cosine_ndcg@10 |\n|:------:|:----:|:-------------:|:-----------------------:|:----------------------:|:----------------------:|:----------------------:|:----------------------:|:---------------------:|\n| 1.0 | 8 | - | 0.7841 | 0.7835 | 0.7836 | 0.7791 | 0.7665 | 0.7226 |\n| 1.2747 | 10 | 58.1187 | - | - | - | - | - | - |\n| 2.0 | 16 | - | 0.8348 | 0.8366 | 0.8345 | 0.8301 | 0.8184 | 0.7861 |\n| 2.5494 | 20 | 24.4181 | - | - | - | - | - | - |\n| 3.0 | 24 | - | 0.8521 | 0.8504 | 0.8503 | 0.8457 | 0.8319 | 0.8007 |\n| 3.8240 | 30 | 16.1488 | - | - | - | - | - | - |\n| 4.0 | 32 | - | 0.8561 | 0.8548 | 0.8555 | 0.8509 | 0.8387 | 0.8073 |\n| 5.0 | 40 | 13.4897 | 0.8585 | 0.8556 | 0.8545 | 0.8528 | 0.8397 | 0.8111 |\n| 6.0 | 48 | - | 0.8578 | 0.8563 | 0.8550 | 0.8535 | 0.8410 | 0.8110 |\n| 6.2747 | 50 | 13.7469 | - | - | - | - | - | - |\n| 7.0 | 56 | - | 0.8579 | 0.8571 | 0.8555 | 0.8535 | 0.8403 | 0.8119 |\n\n\n### Framework Versions\n- Python: 3.12.10\n- Sentence Transformers: 4.1.0\n- Transformers: 4.51.3\n- PyTorch: 2.7.0+cu126\n- Accelerate: 1.7.0\n- Datasets: 3.6.0\n- Tokenizers: 0.21.1\n\n## Citation\n\n### BibTeX\n\n#### Sentence Transformers\n```bibtex\n@inproceedings{reimers-2019-sentence-bert,\n title = \"Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks\",\n author = \"Reimers, Nils and Gurevych, Iryna\",\n booktitle = \"Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing\",\n month = \"11\",\n year = \"2019\",\n publisher = \"Association for Computational Linguistics\",\n url = \"https://arxiv.org/abs/1908.10084\",\n}\n```\n\n#### MatryoshkaLoss\n```bibtex\n@misc{kusupati2024matryoshka,\n title={Matryoshka Representation Learning},\n author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya 
Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},\n year={2024},\n eprint={2205.13147},\n archivePrefix={arXiv},\n primaryClass={cs.LG}\n}\n```\n\n#### MultipleNegativesRankingLoss\n```bibtex\n@misc{henderson2017efficient,\n title={Efficient Natural Language Response Suggestion for Smart Reply},\n author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},\n year={2017},\n eprint={1705.00652},\n archivePrefix={arXiv},\n primaryClass={cs.CL}\n}\n```\n\n\n\n\n\n", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "csanz91/lampistero_rag_embeddings_2", "base_model_relation": "base" }, { "model_id": "NAMAA-Space/zarra", "gated": "unknown", "card": "---\nlibrary_name: model2vec\nlicense: mit\nmodel_name: Abdelkareem/zarra\ntags:\n- embeddings\n- static-embeddings\n- sentence-transformers\ndatasets:\n- allenai/c4\nlanguage:\n- ar\nbase_model:\n- jinaai/jina-embeddings-v3\npipeline_tag: sentence-similarity\n---\n\n# Zarra: Arabic Static Embedding Model\n\n\n\n![image/png](https://cdn-uploads.huggingface.co/production/uploads/628f7a71dd993507cfcbe587/t4ALUMHL25wTuNzgNQwUg.png)\n\n\n**Zarra** is a static embedding model built using the Model2Vec distillation framework. 
\nIt is a distilled version of a Sentence Transformer, specifically optimized for the Arabic language.\nUnlike traditional transformer-based models, Zarra produces static embeddings, enabling ultra-fast inference on both CPU and GPU\u2014making it ideal for resource-constrained environments or real-time applications.\n\n## Why Zarra?\n\u26a1 Exceptional Speed: Delivers embeddings up to 500x faster than sentence transformers.\n\n\ud83e\udde0 Compact & Efficient: Up to 50x smaller in size, allowing easy deployment on edge devices.\n\n\ud83e\uddf0 Versatile: Well-suited for search, clustering, classification, deduplication, and more.\n\n\ud83c\udf0d Arabic-First: Specifically trained on high-quality Arabic data, ensuring relevance and performance across a range of Arabic NLP tasks.\n\n\n


\n\n\n## About Model2Vec\n\n\nThe Model2Vec distillation technique transfers knowledge from large transformer models into lightweight static embedding spaces, preserving semantic quality while dramatically improving speed and efficiency.\nZarra represents the best of both worlds: the semantic power of transformers and the speed and simplicity of static vectors.\n\n## Installation\n\nInstall model2vec using pip:\n```\npip install model2vec\n```\n\n## Usage\n\n### Using Model2Vec\n\nThe [Model2Vec library](https://github.com/MinishLab/model2vec) is the fastest and most lightweight way to run Model2Vec models.\n\nLoad this model using the `from_pretrained` method:\n```python\nfrom model2vec import StaticModel\n\n# Load a pretrained Model2Vec model\nmodel = StaticModel.from_pretrained(\"NAMAA-Space/zarra\")\n\n# Compute text embeddings\nembeddings = model.encode([\"Example sentence\"])\n```\n\n### Using Sentence Transformers\n\nYou can also use the [Sentence Transformers library](https://github.com/UKPLab/sentence-transformers) to load and use the model:\n\n```python\nfrom sentence_transformers import SentenceTransformer\n\n# Load a pretrained Sentence Transformer model\nmodel = SentenceTransformer(\"NAMAA-Space/zarra\")\n\n# Compute text embeddings\nembeddings = model.encode([\"Example sentence\"])\n```\n\n## How it Works\n\nModel2vec creates a small, fast, and powerful model that outperforms other static embedding models by a large margin on all tasks we could find, while being much faster to create than traditional static embedding models such as GloVe. Best of all, you don't need any data to distill a model using Model2Vec.\n\nIt works by passing a vocabulary through a sentence transformer model, then reducing the dimensionality of the resulting embeddings using PCA, and finally weighting the embeddings using [SIF weighting](https://openreview.net/pdf?id=SyK00v5xx). 
During inference, we simply take the mean of all token embeddings occurring in a sentence.\n\n\n## Benchmark on Arabic\n\n\n## Speed\n\n| Model | Speed (sentences/second) | Device |\n|---------------------------------------|--------------------------|--------|\n| zarra | 26893.63 | cpu |\n| bojji | 27478.15 | cpu |\n| potion-multilingual-128M | 27145.31 | cpu |\n| paraphrase-multilingual-MiniLM-L12-v2 | 2363.24 | cuda |\n| silma_ai_embedding_sts_v0.1 | 627.13 | cuda |\n| muffakir_embedding | 621.77 | cuda |\n| get_multilingual_base | 895.41 | cuda |\n| arabic_retrieval_v1.0 | 618.56 | cuda |\n| arabic_triplet_matryoshka_v2 | 610.64 | cuda |\n\n- Zarra and Bojji excel in speed, achieving 26893.63 and 27478.15 sentences per second on CPU, respectively, far surpassing CUDA-based models like arabic_triplet_matryoshka_v2 (610.64).\n\n- Top Performer: Bojji is the fastest model, slightly ahead of Zarra and potion-multilingual-128M (27145.31), highlighting the efficiency of Model2Vec-based models on CPU.\n\n- Key Observation: The high speed of Zarra and Bojji on CPU makes them ideal for resource-constrained environments, offering significant advantages over CUDA-dependent models.\n\n## Size of the Model\n\n| Model | Parameters (M) | Size (MB) | Relative to Largest (%) | Less than Largest (x) |\n|----------------------------------|----------------|-----------|-------------------------|-----------------------|\n| zarra | 64.00 | 244.14 | 41.92 | 2.39 |\n| bojji | 124.88 | 476.40 | 81.79 | 1.22 |\n| potion-multilingual-128M | 128.09 | 488.63 | 83.89 | 1.19 |\n| paraphrase-multilingual-MiniLM-\u2026 | 117.65 | 448.82 | 77.06 | 1.30 |\n| silma_ai_embedding_sts_v0.1 | 135.19 | 515.72 | 88.54 | 1.13 |\n| muffakir_embedding | 135.19 | 515.72 | 88.54 | 1.13 |\n| arabic_retrieval_v1.0 | 135.19 | 515.73 | 88.54 | 1.13 |\n| arabic_triplet_matryoshka_v2 | 135.19 | 515.72 | 88.54 | 1.13 |\n| get_multilingual_base | 305.37 | 582.45 | 100.00 | 1.00 |\n\n\n\n- Zarra is the smallest model, 
with only 64 million parameters and 244.14 MB in size, making it 2.39 times smaller than the largest model (get_multilingual_base).\n\n- Bojji is slightly larger at 124.88 million parameters and 476.40 MB, but still significantly smaller than most other models.\n\n- Top Performer: Zarra leads in compactness, offering the smallest footprint, which is critical for deployment on resource-limited devices.\n\n- Key Observation: The compact size of Zarra and Bojji aligns with their design goal of efficiency, making them highly suitable for edge computing and real-time applications.\n\n\n| Model | Avg | MIRAC | MLQAR | Massi | Multi | STS17 | STS22 | XNLI_ |\n|---------------------------------------|-------|-------|-------|-------|-------|-------|-------|-------|\n| arabic_triplet_matryoshka_v2 | 0.6610 | 0.6262 | 0.5093 | 0.5577 | 0.5868 | 0.8531 | 0.6396 | 0.8542 |\n| muffakir_embedding | 0.6494 | 0.6424 | 0.5267 | 0.5462 | 0.5943 | 0.8485 | 0.6291 | 0.7583 |\n| arabic_retrieval_v1.0 | 0.6473 | 0.6159 | 0.5674 | 0.5832 | 0.5993 | 0.8002 | 0.6254 | 0.7393 |\n| gate_arabert-v1 | 0.6444 | 0.5774 | 0.4808 | 0.5345 | 0.5847 | 0.8278 | 0.6310 | 0.8746 |\n| get_multilingual_base | 0.6440 | 0.7177 | 0.5698 | 0.5071 | 0.5521 | 0.7881 | 0.6145 | 0.7584 |\n| arabic_sts_matryoshka | 0.6413 | 0.5828 | 0.4840 | 0.5457 | 0.5494 | 0.8290 | 0.6242 | 0.8740 |\n| silma_ai_embedding_sts_v0.1 | 0.6138 | 0.3799 | 0.5011 | 0.5600 | 0.5749 | 0.8559 | 0.6122 | 0.8125 |\n| Arabic-MiniLM-L12-v2-all-nli-triplet | 0.5431 | 0.2240 | 0.3612 | 0.4775 | 0.5698 | 0.8111 | 0.5540 | 0.8043 |\n| paraphrase-multilingual-MiniLM-L12-v2 | 0.5208 | 0.2191 | 0.3496 | 0.4515 | 0.5573 | 0.7916 | 0.4908 | 0.7859 |\n| bojji | 0.5177 | 0.2941 | 0.3989 | 0.4667 | 0.5433 | 0.7233 | 0.5880 | 0.6094 |\n| zarra | 0.4822 | 0.2295 | 0.3473 | 0.4119 | 0.5237 | 0.6469 | 0.6218 | 0.5942 |\n| potion-multilingual-128M | 0.4699 | 0.1658 | 0.3150 | 0.4285 | 0.5338 | 0.6511 | 0.5951 | 0.5999 |\n| all_minilm_l6_v2 | 0.2843 | 0.0005 
| 0.0064 | 0.1905 | 0.4934 | 0.5089 | 0.2518 | 0.5384 |\n\n### Sorted by STS17_main (Score)\n\n| Model Name | STS17_main |\n|---------------------------------------|------------|\n| silma_ai_embedding_sts_v0.1 | 0.856 |\n| arabic_triplet_matryoshka_v2 | 0.853 |\n| muffakir_embedding | 0.849 |\n| arabic_sts_matryoshka | 0.829 |\n| gate_arabert-v1 | 0.828 |\n| Arabic-MiniLM-L12-v2-all-nli-triplet | 0.811 |\n| arabic_retrieval_v1.0 | 0.800 |\n| paraphrase-multilingual-MiniLM-L12-v2 | 0.792 |\n| get_multilingual_base | 0.788 |\n| bojji | 0.723 |\n| potion-multilingual-128M | 0.651 |\n| zarra | 0.647 |\n| all_minilm_l6_v2 | 0.509 |\n\n### Sorted by STS22.v2_main (Score)\n\n| Model Name | STS22.v2_main |\n|---------------------------------------|---------------|\n| arabic_triplet_matryoshka_v2 | 0.640 |\n| gate_arabert-v1 | 0.631 |\n| muffakir_embedding | 0.629 |\n| arabic_retrieval_v1.0 | 0.625 |\n| arabic_sts_matryoshka | 0.624 |\n| zarra | 0.622 |\n| get_multilingual_base | 0.615 |\n| silma_ai_embedding_sts_v0.1 | 0.612 |\n| potion-multilingual-128M | 0.595 |\n| bojji | 0.588 |\n| Arabic-MiniLM-L12-v2-all-nli-triplet | 0.554 |\n| paraphrase-multilingual-MiniLM-L12-v2 | 0.491 |\n| all_minilm_l6_v2 | 0.252 |\n\n\n## Additional Resources\n\n- [Zarra & Bojji Blog](https://kareemai.com/blog/posts/minishlab/blog_zaraah.html)\n- [NAMAA Collection](https://huggingface.co/collections/NAMAA-Space/zaraah-683f1f8a1eec1ec8f2badee5)\n- [MinishLab](https://minishlab.github.io/)\n- [Model2Vec Repo](https://github.com/MinishLab/model2vec)", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": null, "base_model_relation": null }, { "model_id": "alikia2x/jina-embedding-v3-m2v-256", "gated": "False", "card": "---\nbase_model: 
jinaai/jina-embeddings-v3\nlanguage:\n- multilingual\n- af\n- am\n- ar\n- as\n- az\n- be\n- bg\n- bn\n- br\n- bs\n- ca\n- cs\n- cy\n- da\n- de\n- el\n- en\n- eo\n- es\n- et\n- eu\n- fa\n- fi\n- fr\n- fy\n- ga\n- gd\n- gl\n- gu\n- ha\n- he\n- hi\n- hr\n- hu\n- hy\n- id\n- is\n- it\n- ja\n- jv\n- ka\n- kk\n- km\n- kn\n- ko\n- ku\n- ky\n- la\n- lo\n- lt\n- lv\n- mg\n- mk\n- ml\n- mn\n- mr\n- ms\n- my\n- ne\n- nl\n- 'no'\n- om\n- or\n- pa\n- pl\n- ps\n- pt\n- ro\n- ru\n- sa\n- sd\n- si\n- sk\n- sl\n- so\n- sq\n- sr\n- su\n- sv\n- sw\n- ta\n- te\n- th\n- tl\n- tr\n- ug\n- uk\n- ur\n- uz\n- vi\n- xh\n- yi\n- zh\nlibrary_name: model2vec\nlicense: mit\nmodel_name: onnx\ntags:\n- embeddings\n- static-embeddings\n- sentence-transformers\n---\n\n# onnx Model Card\n\nThis [Model2Vec](https://github.com/MinishLab/model2vec) model is a distilled version of the [jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) Sentence Transformer. It uses static embeddings, allowing text embeddings to be computed orders of magnitude faster on both GPU and CPU. 
It is designed for applications where computational resources are limited or where real-time performance is critical.\n\n\n## Installation\n\nInstall model2vec using pip:\n```\npip install model2vec\n```\n\n## Usage\n\nLoad this model using the `from_pretrained` method:\n```python\nfrom model2vec import StaticModel\n\n# Load a pretrained Model2Vec model\nmodel = StaticModel.from_pretrained(\"alikia2x/jina-embedding-v3-m2v-256\")\n\n# Compute text embeddings\nembeddings = model.encode([\"Example sentence\"])\n```\n\nAlternatively, you can distill your own model using the `distill` method:\n```python\nfrom model2vec.distill import distill\n\n# Choose a Sentence Transformer model\nmodel_name = \"BAAI/bge-base-en-v1.5\"\n\n# Distill the model\nm2v_model = distill(model_name=model_name, pca_dims=256)\n\n# Save the model\nm2v_model.save_pretrained(\"m2v_model\")\n```\n\n## How it works\n\nModel2Vec creates a small, fast, and powerful model that outperforms other static embedding models by a large margin on all tasks we could find, while being much faster to create than traditional static embedding models such as GloVe. Best of all, you don't need any data to distill a model using Model2Vec.\n\nIt works by passing a vocabulary through a sentence transformer model, then reducing the dimensionality of the resulting embeddings using PCA, and finally weighting the embeddings using Zipf weighting. 
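These distillation steps (encode the vocabulary, reduce with PCA, apply a frequency-based weight) can be sketched in plain numpy. Everything below is illustrative only: random vectors stand in for the sentence-transformer token embeddings, and the exact Zipf weighting the library applies may differ from the `log(1 + rank)` choice used here.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, dim, pca_dims = 1000, 768, 256

# Stand-in for the embeddings obtained by pushing the whole vocabulary
# through a sentence transformer (illustrative random data).
token_embeddings = rng.normal(size=(vocab_size, dim)).astype(np.float32)

# 1. PCA via SVD on the centered embeddings.
centered = token_embeddings - token_embeddings.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
reduced = centered @ vt[:pca_dims].T            # (vocab_size, pca_dims)

# 2. Zipf-style weighting: assuming tokens are sorted by frequency,
# a log(1 + rank) factor down-weights the most frequent tokens.
weights = np.log(1.0 + np.arange(1, vocab_size + 1))
static_table = reduced * weights[:, None]

# 3. Inference is just a mean over the static rows of a sentence's tokens.
def encode(token_ids):
    return static_table[np.asarray(token_ids)].mean(axis=0)

sentence_vec = encode([5, 42, 7])               # shape: (pca_dims,)
```

The whole "model" that survives distillation is `static_table` plus the tokenizer, which is why lookup-and-mean inference is orders of magnitude faster than running a transformer.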
During inference, we simply take the mean of all token embeddings occurring in a sentence.\n\n## Additional Resources\n\n- [All Model2Vec models on the hub](https://huggingface.co/models?library=model2vec)\n- [Model2Vec Repo](https://github.com/MinishLab/model2vec)\n- [Model2Vec Results](https://github.com/MinishLab/model2vec?tab=readme-ov-file#results)\n- [Model2Vec Tutorials](https://github.com/MinishLab/model2vec/tree/main/tutorials)\n\n## Library Authors\n\nModel2Vec was developed by the [Minish Lab](https://github.com/MinishLab) team consisting of [Stephan Tulkens](https://github.com/stephantul) and [Thomas van Dongen](https://github.com/Pringled).\n\n## Citation\n\nPlease cite the [Model2Vec repository](https://github.com/MinishLab/model2vec) if you use this model in your work.\n```\n@software{minishlab2024model2vec,\n authors = {Stephan Tulkens, Thomas van Dongen},\n title = {Model2Vec: Turn any Sentence Transformer into a Small Fast Model},\n year = {2024},\n url = {https://github.com/MinishLab/model2vec},\n}\n```", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "alikia2x/jina-embedding-v3-m2v", "base_model_relation": "finetune" }, { "model_id": "alikia2x/jina-embedding-v3-m2v-1024", "gated": "False", "card": "---\nbase_model: jinaai/jina-embeddings-v3\nlanguage:\n- multilingual\n- af\n- am\n- ar\n- as\n- az\n- be\n- bg\n- bn\n- br\n- bs\n- ca\n- cs\n- cy\n- da\n- de\n- el\n- en\n- eo\n- es\n- et\n- eu\n- fa\n- fi\n- fr\n- fy\n- ga\n- gd\n- gl\n- gu\n- ha\n- he\n- hi\n- hr\n- hu\n- hy\n- id\n- is\n- it\n- ja\n- jv\n- ka\n- kk\n- km\n- kn\n- ko\n- ku\n- ky\n- la\n- lo\n- lt\n- lv\n- mg\n- mk\n- ml\n- mn\n- mr\n- ms\n- my\n- ne\n- nl\n- 'no'\n- om\n- or\n- pa\n- pl\n- ps\n- pt\n- ro\n- ru\n- sa\n- sd\n- si\n- sk\n- 
sl\n- so\n- sq\n- sr\n- su\n- sv\n- sw\n- ta\n- te\n- th\n- tl\n- tr\n- ug\n- uk\n- ur\n- uz\n- vi\n- xh\n- yi\n- zh\nlibrary_name: model2vec\nlicense: mit\nmodel_name: onnx\ntags:\n- embeddings\n- static-embeddings\n- sentence-transformers\n---\n\n# alikia2x/jina-embedding-v3-m2v-1024\n\nThis [Model2Vec](https://github.com/MinishLab/model2vec) model is a distilled version of the \n[jinaai/jina-embeddings-v3](https://huggingface.co/jinaai/jina-embeddings-v3) Sentence Transformer. \nIt uses static embeddings, allowing text embeddings to be computed orders of magnitude faster on both GPU and CPU. \nIt is designed for applications where computational resources are limited or where real-time performance is critical.\n\n\n## Installation\n\nInstall model2vec using pip:\n```\npip install model2vec\n```\n\n## Usage\n\n### Via `model2vec`\n\nLoad this model using the `from_pretrained` method:\n\n```python\nfrom model2vec import StaticModel\n\n# Load a pretrained Model2Vec model\nmodel = StaticModel.from_pretrained(\"alikia2x/jina-embedding-v3-m2v-1024\")\n\n# Compute text embeddings\nembeddings = model.encode([\"Hello\"])\n```\n\n### Via `sentence-transformers`\n\n```bash\npip install sentence-transformers\n```\n\n```python\nfrom sentence_transformers import SentenceTransformer\n\nmodel = SentenceTransformer(\"alikia2x/jina-embedding-v3-m2v-1024\")\n\n# embedding:\n# array([[ 1.1825741e-01, -1.2899181e-02, -1.0492010e-01, ...,\n# 1.1131058e-03, 8.2779792e-04, -7.6874542e-08]],\n# shape=(1, 1024), dtype=float32)\nembeddings = model.encode([\"Hello\"])\n```\n\n### Via ONNX\n\n```bash\npip install onnxruntime transformers\n```\n\nYou need to download `onnx/model.onnx` in this repository first.\n\n```python\nimport onnxruntime\nfrom transformers import AutoTokenizer\nimport numpy as np\n\ntokenizer_model = \"alikia2x/jina-embedding-v3-m2v-1024\"\nonnx_embedding_path = \"path/to/your/model.onnx\"\n\ntexts = [\"Hello\"]\ntokenizer = 
AutoTokenizer.from_pretrained(tokenizer_model)\nsession = onnxruntime.InferenceSession(onnx_embedding_path)\n\ninputs = tokenizer(texts, add_special_tokens=False, return_tensors=\"np\")\ninput_ids = inputs[\"input_ids\"]\n\n# Flatten all token ids into one array; offsets[i] marks where\n# sequence i starts in the flattened array.\nlengths = [len(seq) for seq in input_ids[:-1]]\noffsets = [0] + np.cumsum(lengths).tolist()\nflattened_input_ids = input_ids.flatten().astype(np.int64)\n\ninputs = {\n \"input_ids\": flattened_input_ids,\n \"offsets\": np.array(offsets, dtype=np.int64),\n}\n\noutputs = session.run(None, inputs)\nembeddings = outputs[0]\nembeddings = embeddings.flatten()\n\n# [ 1.1825741e-01 -1.2899181e-02 -1.0492010e-01 ... 1.1131058e-03\n# 8.2779792e-04 -7.6874542e-08]\nprint(embeddings)\n```\n\nNote: A quantized (INT8) version of this model is also available, offering reduced memory usage with minimal performance impact.\nSimply replace `onnx/model.onnx` with the `onnx/model_INT8.onnx` file.\nOur testing shows less than a 1% drop in the F1 score on a real downstream task.\n\n## How it works\n\nModel2Vec creates a small, fast, and powerful model that outperforms other static embedding models by a large margin on all tasks we could find, while being much faster to create than traditional static embedding models such as GloVe. Best of all, you don't need any data to distill a model using Model2Vec.\n\nIt works by passing a vocabulary through a sentence transformer model, then reducing the dimensionality of the resulting embeddings using PCA, and finally weighting the embeddings using Zipf weighting. 
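The flattened `input_ids` plus `offsets` interface in the ONNX snippet above resembles PyTorch's `EmbeddingBag`: each offset marks where a sequence starts in the flattened array, and each segment is mean-pooled. A minimal numpy sketch of that pooling, with a hypothetical random `table` standing in for the model's real embedding matrix:

```python
import numpy as np

# Hypothetical stand-in for the static embedding table inside model.onnx.
rng = np.random.default_rng(0)
table = rng.normal(size=(100, 8)).astype(np.float32)

# Two token sequences of different lengths, flattened, with offsets
# marking the start of each sequence -- the interface the model expects.
sequences = [[3, 14, 15], [9, 26]]
flat = np.array([t for s in sequences for t in s], dtype=np.int64)
lengths = [len(s) for s in sequences[:-1]]
offsets = np.array([0] + np.cumsum(lengths).tolist(), dtype=np.int64)  # [0, 3]

# Mean-pool each segment, as EmbeddingBag(mode="mean") would.
bounds = offsets.tolist() + [len(flat)]
embeddings = np.stack([table[flat[a:b]].mean(axis=0)
                       for a, b in zip(bounds[:-1], bounds[1:])])
# embeddings[0] is the mean of table rows 3, 14, 15
```

Passing flattened ids with offsets avoids padding ragged batches, which is why the ONNX export exposes this interface instead of a rectangular `input_ids` tensor.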
During inference, we simply take the mean of all token embeddings occurring in a sentence.\n\n## Additional Resources\n\n- [All Model2Vec models on the hub](https://huggingface.co/models?library=model2vec)\n- [Model2Vec Repo](https://github.com/MinishLab/model2vec)\n- [Model2Vec Results](https://github.com/MinishLab/model2vec?tab=readme-ov-file#results)\n- [Model2Vec Tutorials](https://github.com/MinishLab/model2vec/tree/main/tutorials)\n\n## Library Authors\n\nModel2Vec was developed by the [Minish Lab](https://github.com/MinishLab) team consisting of [Stephan Tulkens](https://github.com/stephantul) and [Thomas van Dongen](https://github.com/Pringled).\n\n## Citation\n\nPlease cite the [Model2Vec repository](https://github.com/MinishLab/model2vec) if you use this model in your work.\n```\n@software{minishlab2024model2vec,\n authors = {Stephan Tulkens, Thomas van Dongen},\n title = {Model2Vec: Turn any Sentence Transformer into a Small Fast Model},\n year = {2024},\n url = {https://github.com/MinishLab/model2vec},\n}\n```", "metadata": "\"N/A\"", "depth": 1, "children": [], "children_count": 0, "adapters": [], "adapters_count": 0, "quantized": [], "quantized_count": 0, "merges": [], "merges_count": 0, "total_derivatives": 0, "spaces": [], "spaces_count": 0, "parents": [ "jinaai/jina-embeddings-v3" ], "base_model": "alikia2x/jina-embedding-v3-m2v", "base_model_relation": "finetune" } ] }