SentenceTransformer based on BAAI/bge-base-en-v1.5

This is a sentence-transformers model finetuned from BAAI/bge-base-en-v1.5. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: BAAI/bge-base-en-v1.5
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
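The similarity function listed above can be illustrated with a small NumPy sketch (illustrative only, on made-up vectors): cosine similarity is the dot product of two vectors divided by the product of their norms, yielding a score in [-1, 1].

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity: dot product scaled by the vector norms."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

a = np.array([1.0, 0.0, 1.0])
b = np.array([1.0, 1.0, 0.0])
print(cosine_similarity(a, b))  # 0.5 (dot = 1, norms sqrt(2) * sqrt(2) = 2)
print(cosine_similarity(a, a))  # 1.0 (a vector is maximally similar to itself)
```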

Model Sources

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
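The three modules above form a pipeline: the BERT encoder produces per-token embeddings, the Pooling layer keeps only the [CLS] token (pooling_mode_cls_token: True), and Normalize() L2-normalizes the result. A minimal NumPy sketch of the last two stages, on a dummy token-embedding matrix (not real model output):

```python
import numpy as np

# Dummy encoder output: (seq_len, hidden) token embeddings; row 0 is [CLS].
token_embeddings = np.array([[3.0, 4.0],
                             [1.0, 0.0],
                             [0.0, 2.0]])

# Pooling with pooling_mode_cls_token=True keeps only the [CLS] row.
cls_embedding = token_embeddings[0]

# Normalize() divides by the L2 norm, so cosine similarity reduces to a dot product.
sentence_embedding = cls_embedding / np.linalg.norm(cls_embedding)
print(sentence_embedding)                  # [0.6 0.8]
print(np.linalg.norm(sentence_embedding))  # 1.0
```

Because of the final Normalize() module, downstream cosine similarity between two embeddings is just their dot product.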

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("ayushexel/emb-bge-base-en-v1.5-squad-5-epochs")
# Run inference
sentences = [
    'What increased the power in the House of Commons?',
    'Through Victoria\'s reign, the gradual establishment of a modern constitutional monarchy in Britain continued. Reforms of the voting system increased the power of the House of Commons at the expense of the House of Lords and the monarch. In 1867, Walter Bagehot wrote that the monarch only retained "the right to be consulted, the right to encourage, and the right to warn". As Victoria\'s monarchy became more symbolic than political, it placed a strong emphasis on morality and family values, in contrast to the sexual, financial and personal scandals that had been associated with previous members of the House of Hanover and which had discredited the monarchy. The concept of the "family monarchy", with which the burgeoning middle classes could identify, was solidified.',
    'Most prime ministers in parliamentary systems are not appointed for a specific term in office and in effect may remain in power through a number of elections and parliaments. For example, Margaret Thatcher was only ever appointed prime minister on one occasion, in 1979. She remained continuously in power until 1990, though she used the assembly of each House of Commons after a general election to reshuffle her cabinet.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
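Since the embeddings are unit-normalized, ranking passages against a query reduces to a matrix product followed by an argsort. A hedged sketch of that retrieval pattern with made-up 4-dimensional vectors (a real run would obtain the arrays from model.encode as above):

```python
import numpy as np

# Hypothetical embeddings standing in for model.encode output.
query_emb = np.array([1.0, 0.0, 0.0, 0.0])
passage_embs = np.array([
    [0.9, 0.1, 0.0, 0.0],   # passage 0: close to the query
    [0.0, 1.0, 0.0, 0.0],   # passage 1: orthogonal to the query
    [0.5, 0.5, 0.5, 0.5],   # passage 2: partial overlap
])
# Unit-normalize so the dot product equals cosine similarity.
passage_embs /= np.linalg.norm(passage_embs, axis=1, keepdims=True)

scores = passage_embs @ query_emb   # cosine scores
ranking = np.argsort(-scores)       # best match first
print(ranking)                      # [0 2 1]
```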

Evaluation

Metrics

Triplet

Metric            Value
cosine_accuracy   0.4128
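cosine_accuracy is the fraction of (anchor, positive, negative) triplets for which the anchor is closer to the positive than to the negative under cosine similarity. A small sketch of the computation on made-up vectors:

```python
import numpy as np

def cosine_accuracy(anchors, positives, negatives):
    """Share of triplets where sim(anchor, positive) > sim(anchor, negative)."""
    def cos(a, b):
        return np.sum(a * b, axis=1) / (np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1))
    return float(np.mean(cos(anchors, positives) > cos(anchors, negatives)))

# Two toy triplets: the first is ranked correctly, the second is not.
anchors   = np.array([[1.0, 0.0], [0.0, 1.0]])
positives = np.array([[0.9, 0.1], [1.0, 1.0]])
negatives = np.array([[0.0, 1.0], [0.0, 1.0]])
print(cosine_accuracy(anchors, positives, negatives))  # 0.5
```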

Training Details

Training Dataset

Unnamed Dataset

  • Size: 44,286 training samples
  • Columns: question, context, and negative
  • Approximate statistics based on the first 1000 samples:

                  question   context   negative
    type          string     string    string
    min tokens    6          28        31
    mean tokens   14.46      150.23    154.5
    max tokens    41         512       507
  • Samples:
    1. question: What was published in the IEEE Journal in 1988?
       context: A literature compendium for a large variety of audio coding systems was published in the IEEE Journal on Selected Areas in Communications (JSAC), February 1988. While there were some papers from before that time, this collection documented an entire variety of finished, working audio coders, nearly all of them using perceptual (i.e. masking) techniques and some kind of frequency analysis and back-end noiseless coding. Several of these papers remarked on the difficulty of obtaining good, clean digital audio for research purposes. Most, if not all, of the authors in the JSAC edition were also active in the MPEG-1 Audio committee.
       negative: As at most other universities, Notre Dame's students run a number of news media outlets. The nine student-run outlets include three newspapers, both a radio and television station, and several magazines and journals. Begun as a one-page journal in September 1876, the Scholastic magazine is issued twice monthly and claims to be the oldest continuous collegiate publication in the United States. The other magazine, The Juggler, is released twice a year and focuses on student literature and artwork. The Dome yearbook is published annually. The newspapers have varying publication interests, with The Observer published daily and mainly reporting university and other news, and staffed by students from both Notre Dame and Saint Mary's College. Unlike Scholastic and The Dome, The Observer is an independent publication and does not have a faculty advisor or any editorial oversight from the University. In 1987, when some students believed that The Observer began to show a conservative bias, a lib...
    2. question: What can be a intrinsic component of landscaping?
       context: Indoor lighting is usually accomplished using light fixtures, and is a key part of interior design. Lighting can also be an intrinsic component of landscape projects.
       negative: Plants depend on certain edaphic (soil) and climatic factors in their environment but can modify these factors too. For example, they can change their environment's albedo, increase runoff interception, stabilize mineral soils and develop their organic content, and affect local temperature. Plants compete with other organisms in their ecosystem for resources. They interact with their neighbours at a variety of spatial scales in groups, populations and communities that collectively constitute vegetation. Regions with characteristic vegetation types and dominant plants as well as similar abiotic and biotic factors, climate, and geography make up biomes like tundra or tropical rainforest.
    3. question: What is the term used to describe the making beer?
       context: Beer is the world's most widely consumed and likely the oldest alcoholic beverage; it is the third most popular drink overall, after water and tea. The production of beer is called brewing, which involves the fermentation of starches, mainly derived from cereal grains—most commonly malted barley, although wheat, maize (corn), and rice are widely used. Most beer is flavoured with hops, which add bitterness and act as a natural preservative, though other flavourings such as herbs or fruit may occasionally be included. The fermentation process causes a natural carbonation effect which is often removed during processing, and replaced with forced carbonation. Some of humanity's earliest known writings refer to the production and distribution of beer: the Code of Hammurabi included laws regulating beer and beer parlours, and "The Hymn to Ninkasi", a prayer to the Mesopotamian goddess of beer, served as both a prayer and as a method of remembering the recipe for beer in a culture with few lit...
       negative: In many societies, beer is the most popular alcoholic drink. Various social traditions and activities are associated with beer drinking, such as playing cards, darts, or other pub games; attending beer festivals; engaging in zythology (the study of beer); visiting a series of pubs in one evening; visiting breweries; beer-oriented tourism; or rating beer. Drinking games, such as beer pong, are also popular. A relatively new profession is that of the beer sommelier, who informs restaurant patrons about beers and food pairings.
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
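MultipleNegativesRankingLoss treats, for each question in a batch, its paired context as the positive and every other context in the batch (plus any explicit negative column) as negatives; it is cross-entropy over the scaled cosine-similarity matrix. A minimal NumPy sketch with scale=20.0, on tiny made-up embeddings:

```python
import numpy as np

def mnr_loss(q_embs, c_embs, scale=20.0):
    """In-batch-negatives cross-entropy: row i should select column i."""
    # Normalize so the dot product is cosine similarity (cos_sim).
    q = q_embs / np.linalg.norm(q_embs, axis=1, keepdims=True)
    c = c_embs / np.linalg.norm(c_embs, axis=1, keepdims=True)
    logits = scale * (q @ c.T)                   # (batch, batch) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # stabilize the softmax
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))   # diagonal holds the positives

q = np.array([[1.0, 0.0], [0.0, 1.0]])
c = np.array([[1.0, 0.1], [0.1, 1.0]])
print(mnr_loss(q, c))         # small: each question matches its own context
print(mnr_loss(q, c[::-1]))   # large: positives swapped, ranking is wrong
```

The scale of 20.0 sharpens the softmax, so even modest cosine gaps between the positive and the in-batch negatives translate into confident predictions.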
    

Evaluation Dataset

Unnamed Dataset

  • Size: 5,000 evaluation samples
  • Columns: question, context, and negative_1
  • Approximate statistics based on the first 1000 samples:

                  question   context   negative_1
    type          string     string    string
    min tokens    7          32        31
    mean tokens   14.41      151.22    150.7
    max tokens    32         512       456
  • Samples:
    1. question: Who founded the Hudson River Sloop Clearwater and the Clearwater Festival to bring attention to the pollution caused by GE?
       context: General Electric heavily contaminated the Hudson River with polychlorinated biphenyls (PCBs) between 1947-77. This pollution caused a range of harmful effects to wildlife and people who eat fish from the river or drink the water. In response to this contamination, activists protested in various ways. Musician Pete Seeger founded the Hudson River Sloop Clearwater and the Clearwater Festival to draw attention to the problem. The activism led to the site being designated by the EPA as one of the superfund sites requiring extensive cleanup. Other sources of pollution, including mercury contamination and sewage dumping, have also contributed to problems in the Hudson River watershed.
       negative_1: General Electric heavily contaminated the Hudson River with polychlorinated biphenyls (PCBs) between 1947-77. This pollution caused a range of harmful effects to wildlife and people who eat fish from the river or drink the water. In response to this contamination, activists protested in various ways. Musician Pete Seeger founded the Hudson River Sloop Clearwater and the Clearwater Festival to draw attention to the problem. The activism led to the site being designated by the EPA as one of the superfund sites requiring extensive cleanup. Other sources of pollution, including mercury contamination and sewage dumping, have also contributed to problems in the Hudson River watershed.
    2. question: Wood fibers from wood strands, lumber, and what other source can be glued together to make larger units?
       context: Engineered wood products, glued building products "engineered" for application-specific performance requirements, are often used in construction and industrial applications. Glued engineered wood products are manufactured by bonding together wood strands, veneers, lumber or other forms of wood fiber with glue to form a larger, more efficient composite structural unit.
       negative_1: Engineered wood products, glued building products "engineered" for application-specific performance requirements, are often used in construction and industrial applications. Glued engineered wood products are manufactured by bonding together wood strands, veneers, lumber or other forms of wood fiber with glue to form a larger, more efficient composite structural unit.
    3. question: What, if it formed the "essential basis" of a state's consent to a treaty, may invalidate that consent?
       context: A state's consent may be invalidated if there was an erroneous understanding of a fact or situation at the time of conclusion, which formed the "essential basis" of the state's consent. Consent will not be invalidated if the misunderstanding was due to the state's own conduct, or if the truth should have been evident.
       negative_1: A state's consent may be invalidated if there was an erroneous understanding of a fact or situation at the time of conclusion, which formed the "essential basis" of the state's consent. Consent will not be invalidated if the misunderstanding was due to the state's own conduct, or if the truth should have been evident.
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • num_train_epochs: 5
  • warmup_ratio: 0.1
  • fp16: True
  • batch_sampler: no_duplicates
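With 44,286 training samples, a per-device batch size of 128, and 5 epochs, warmup_ratio: 0.1 translates into a concrete warmup schedule. A quick back-of-the-envelope check (assuming a single device), consistent with the training logs ending near step 1730:

```python
import math

num_samples = 44_286
batch_size = 128
num_epochs = 5
warmup_ratio = 0.1

steps_per_epoch = math.ceil(num_samples / batch_size)   # 346
total_steps = steps_per_epoch * num_epochs              # 1730
warmup_steps = int(total_steps * warmup_ratio)          # 173
print(steps_per_epoch, total_steps, warmup_steps)
```

So the learning rate ramps up linearly over roughly the first 173 optimizer steps before the linear decay begins.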

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 5
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • tp_size: 0
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch    Step   Training Loss   Validation Loss   gooqa-dev_cosine_accuracy
-1       -1     -               -                 0.3536
0.2890   100    0.7107          0.8116            0.3840
0.5780   200    0.4557          0.7507            0.4080
0.8671   300    0.408           0.7309            0.4104
1.1561   400    0.3204          0.7210            0.4112
1.4451   500    0.2481          0.7292            0.4100
1.7341   600    0.248           0.7228            0.4108
2.0231   700    0.241           0.7164            0.4112
2.3121   800    0.1465          0.7266            0.4166
2.6012   900    0.1535          0.7260            0.4126
2.8902   1000   0.152           0.7248            0.4210
3.1792   1100   0.1228          0.7319            0.4188
3.4682   1200   0.1063          0.7369            0.4112
3.7572   1300   0.1066          0.7350            0.4166
4.0462   1400   0.1028          0.7368            0.4186
4.3353   1500   0.0817          0.7409            0.4110
4.6243   1600   0.0837          0.7447            0.4136
4.9133   1700   0.0847          0.7434            0.4168
-1       -1     -               -                 0.4128

Framework Versions

  • Python: 3.11.0
  • Sentence Transformers: 4.0.1
  • Transformers: 4.50.3
  • PyTorch: 2.6.0+cu124
  • Accelerate: 1.5.2
  • Datasets: 3.5.0
  • Tokenizers: 0.21.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}