fix: Missing get_input_embeddings / set_input_embeddings on NomicBertModel
`NomicBertModel` doesn't implement `get_input_embeddings()` or `set_input_embeddings()`, so the transformers fallback in `EmbeddingAccessMixin` tries to resolve them automatically. This fails for two reasons:
- `_input_embed_layer` defaults to `embed_tokens`, but NomicBERT uses `word_embeddings`
- `base_model_prefix = "model"`, but `__init__` creates `self.embeddings`, not `self.model`
Implementing the two methods explicitly resolves both issues.
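A minimal, dependency-free sketch of the failure mode described above. The class and attribute names here (`NomicBertModelSketch`, `Embeddings`, `fallback_get_input_embeddings`) are hypothetical stand-ins for the real transformers resolution logic, which walks `base_model_prefix` and `_input_embed_layer` via `getattr`:

```python
class Embeddings:
    """Stand-in for NomicBERT's embedding module."""
    def __init__(self):
        self.word_embeddings = "wte"  # stand-in for an nn.Embedding


class NomicBertModelSketch:
    """Toy model mirroring NomicBertModel's attribute layout."""
    base_model_prefix = "model"          # but __init__ never sets self.model
    _input_embed_layer = "embed_tokens"  # but NomicBERT uses word_embeddings

    def __init__(self):
        self.embeddings = Embeddings()

    def fallback_get_input_embeddings(self):
        # Mirrors the generic fallback: resolve the base model via
        # base_model_prefix (falls back to self, since self.model is absent),
        # then look up the layer named by _input_embed_layer.
        base = getattr(self, self.base_model_prefix, self)
        return getattr(base, self._input_embed_layer)  # AttributeError

    # The two methods added by this fix, pointing at the real layer:
    def get_input_embeddings(self):
        return self.embeddings.word_embeddings

    def set_input_embeddings(self, value):
        self.embeddings.word_embeddings = value


m = NomicBertModelSketch()
try:
    m.fallback_get_input_embeddings()
except AttributeError as e:
    print("fallback fails:", e)
print("explicit lookup:", m.get_input_embeddings())
```

With the explicit overrides in place, callers that go through `get_input_embeddings()` / `set_input_embeddings()` no longer depend on the mismatched `base_model_prefix` and `_input_embed_layer` defaults.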
modeling_hf_nomic_bert.py

```diff
@@ -1050,6 +1050,12 @@ class NomicBertModel(NomicBertPreTrainedModel):

         self.apply(partial(_init_weights, initializer_range=config.initializer_range))

+    def get_input_embeddings(self):
+        return self.embeddings.word_embeddings
+
+    def set_input_embeddings(self, value):
+        self.embeddings.word_embeddings = value
+
     def forward(
         self,
         input_ids,
```