Pringled committed · verified
Commit 27ac526 · 1 Parent(s): 3c4b608

fix: Missing get_input_embeddings / set_input_embeddings on NomicBertModel


NomicBertModel doesn't implement `get_input_embeddings()` or `set_input_embeddings()`, so the transformers fallback in `EmbeddingAccessMixin` tries to resolve them automatically. This fails because:
- `_input_embed_layer` defaults to `embed_tokens`, but NomicBERT uses `word_embeddings`
- `base_model_prefix = "model"`, but `__init__` creates `self.embeddings`, not `self.model`

Adding these two methods fixes this.
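For illustration, a minimal sketch of the resolution path described above (simplified from the description; this is an assumption about the fallback's shape, not the literal `EmbeddingAccessMixin` source):

# Sketch of the automatic fallback, simplified from the two failure
# points listed above (assumption: not the exact transformers code).
def resolve_input_embeddings(model):
    # Step 1: descend into the submodule named by base_model_prefix.
    # NomicBertModel sets base_model_prefix = "model", but __init__ only
    # creates self.embeddings, so the lookup falls back to model itself.
    base = getattr(model, model.base_model_prefix, model)
    # Step 2: fetch the layer named by _input_embed_layer, which defaults
    # to "embed_tokens" -- NomicBERT names it word_embeddings instead.
    layer_name = getattr(model, "_input_embed_layer", "embed_tokens")
    return getattr(base, layer_name)  # AttributeError on NomicBertModel

Both lookups miss, so the explicit overrides below short-circuit the fallback entirely.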

Files changed (1)
  1. modeling_hf_nomic_bert.py +6 -0
modeling_hf_nomic_bert.py CHANGED
@@ -1050,6 +1050,12 @@ class NomicBertModel(NomicBertPreTrainedModel):
 
         self.apply(partial(_init_weights, initializer_range=config.initializer_range))
 
+    def get_input_embeddings(self):
+        return self.embeddings.word_embeddings
+
+    def set_input_embeddings(self, value):
+        self.embeddings.word_embeddings = value
+
     def forward(
         self,
         input_ids,
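With the overrides in place, the standard accessors resolve directly. A usage sketch (the checkpoint name is an assumption for illustration; substitute whichever repo ships this modeling file):

from transformers import AutoModel

# Assumption: a checkpoint that ships modeling_hf_nomic_bert.py, e.g.
# nomic-ai/nomic-embed-text-v1; substitute the repo this commit targets.
model = AutoModel.from_pretrained(
    "nomic-ai/nomic-embed-text-v1", trust_remote_code=True
)

# Previously the fallback failed to resolve these; now they hit the overrides.
emb = model.get_input_embeddings()  # -> model.embeddings.word_embeddings
model.set_input_embeddings(emb)     # round-trips the same nn.Embedding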