beaupi committed caa3990 (verified) · Parent: 8ae38ae

Upload granite-4.1-8b-oQ3.5 via oMLX

---
license: apache-2.0
library_name: transformers
tags:
- language
- granite-4.1
---

[![mof-class3-qualified](https://mot.isitopen.ai/modules/mof/assets/badge_class3_qualified.png)](https://mot.isitopen.ai/model/1160)

# Granite-4.1-8B

**Model Summary:**
Granite-4.1-8B is an 8B-parameter long-context instruct model finetuned from *Granite-4.1-8B-Base* using a combination of permissively licensed open-source instruction datasets and internally collected synthetic datasets. Granite 4.1 models have gone through an improved post-training pipeline, including supervised finetuning and reinforcement learning alignment, resulting in enhanced tool-calling, instruction-following, and chat capabilities.

- **Developers:** Granite Team, IBM
- **HF Collection:** [Granite 4.1 Language Models HF Collection](https://huggingface.co/collections/ibm-granite/granite-41-language-models)
- **Technical Blog:** [Granite-4.1 Blog](https://huggingface.co/blog/ibm-granite/granite-4-1)
- **GitHub Repository:** [ibm-granite/granite-4.1-language-models](https://github.com/ibm-granite/granite-4.1-language-models)
- **Website**: [Granite Docs](https://www.ibm.com/granite/docs/)
- **Release Date**: April 29th, 2026
- **License:** [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)

**Supported Languages:**
English, German, Spanish, French, Japanese, Portuguese, Arabic, Czech, Italian, Korean, Dutch, and Chinese. Users may finetune Granite 4.1 models for languages beyond these twelve.

**Intended use:**
The model is designed to follow general instructions and can serve as the foundation for AI assistants across diverse domains, including business applications, as well as for LLM agents equipped with tool-use capabilities.

*Capabilities*
* Summarization
* Text classification
* Text extraction
* Question-answering
* Retrieval Augmented Generation (RAG)
* Code-related tasks
* Function-calling tasks
* Multilingual dialog use cases
* Fill-In-the-Middle (FIM) code completions

<!-- <todo>Need to test the examples. (especially the tool calling and RAG ones)</todo>
-->

**Generation:**
This is a simple example of how to use the Granite-4.1-8B model.

Install the following libraries:

```shell
pip install torch torchvision torchaudio
pip install accelerate
pip install transformers
```
Then, copy the snippet from the section that is relevant for your use case.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda"
model_path = "ibm-granite/granite-4.1-8b"
tokenizer = AutoTokenizer.from_pretrained(model_path)
# drop device_map if running on CPU
model = AutoModelForCausalLM.from_pretrained(model_path, device_map=device)
model.eval()
# change input text as desired
chat = [
    { "role": "user", "content": "Please list one IBM Research laboratory located in the United States. You should only output its name and location." },
]
chat = tokenizer.apply_chat_template(chat, tokenize=False, add_generation_prompt=True)
# tokenize the text
input_tokens = tokenizer(chat, return_tensors="pt").to(device)
# generate output tokens
output = model.generate(**input_tokens,
                        max_new_tokens=100)
# decode output tokens into text
output = tokenizer.batch_decode(output)
# print output
print(output[0])
```

Expected output:
```shell
<|start_of_role|>user<|end_of_role|>Please list one IBM Research laboratory located in the United States. You should only output its name and location.<|end_of_text|>
<|start_of_role|>assistant<|end_of_role|>IBM Almaden Research Laboratory, San Jose, California, United States.<|end_of_text|>
```
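The decoded string contains the full transcript, including role and end-of-text markers. When an application only needs the assistant's reply, a minimal post-processing sketch (assuming the marker strings shown in the expected output above) is:

```python
def extract_assistant_reply(decoded: str) -> str:
    """Return the text of the last assistant turn from a decoded
    Granite chat transcript. Assumes the <|start_of_role|>,
    <|end_of_role|>, and <|end_of_text|> markers shown above."""
    marker = "<|start_of_role|>assistant<|end_of_role|>"
    # keep everything after the last assistant header ...
    reply = decoded.rsplit(marker, 1)[-1]
    # ... and drop the trailing end-of-text token, if present
    return reply.split("<|end_of_text|>", 1)[0].strip()

transcript = (
    "<|start_of_role|>user<|end_of_role|>Please list one IBM Research "
    "laboratory located in the United States.<|end_of_text|>\n"
    "<|start_of_role|>assistant<|end_of_role|>IBM Almaden Research "
    "Laboratory, San Jose, California, United States.<|end_of_text|>"
)
print(extract_assistant_reply(transcript))
# IBM Almaden Research Laboratory, San Jose, California, United States.
```

Alternatively, decoding with `tokenizer.batch_decode(output, skip_special_tokens=True)` may strip these markers at decode time, if they are registered as special tokens in the tokenizer.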
<!-- 📣 **Update [2025-10-07]:** Added a *default system prompt* to the chat template to guide the model towards more *professional, accurate, and safe* responses. -->

**Tool-calling:**
Granite-4.1-8B comes with enhanced tool-calling capabilities, enabling seamless integration with external functions and APIs. To define a list of tools, please follow OpenAI's function [definition schema](https://platform.openai.com/docs/guides/function-calling?api-mode=responses#defining-functions).

This is an example of how to use the Granite-4.1-8B model's tool-calling ability:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda"
model_path = "ibm-granite/granite-4.1-8b"
tokenizer = AutoTokenizer.from_pretrained(model_path)
# drop device_map if running on CPU
model = AutoModelForCausalLM.from_pretrained(model_path, device_map=device)
model.eval()

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather for a specified city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {
                        "type": "string",
                        "description": "Name of the city"
                    }
                },
                "required": ["city"]
            }
        }
    }
]

# change input text as desired
chat = [
    { "role": "user", "content": "What's the weather like in Boston right now?" },
]
chat = tokenizer.apply_chat_template(chat,
                                     tokenize=False,
                                     tools=tools,
                                     add_generation_prompt=True)
# tokenize the text
input_tokens = tokenizer(chat, return_tensors="pt").to(device)
# generate output tokens
output = model.generate(**input_tokens,
                        max_new_tokens=100)
# decode output tokens into text
output = tokenizer.batch_decode(output)
# print output
print(output[0])
```

Expected output:
```shell
<|start_of_role|>system<|end_of_role|>You are a helpful assistant with access to the following tools. You may call one or more tools to assist with the user query.
You are provided with function signatures within <tools></tools> XML tags:
<tools>
{"type": "function", "function": {"name": "get_current_weather", "description": "Get the current weather for a specified city.", "parameters": {"type": "object", "properties": {"city": {"type": "string", "description": "Name of the city"}}, "required": ["city"]}}}
</tools>
For each tool call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:
<tool_call>
{"name": <function-name>, "arguments": <args-json-object>}
</tool_call>. If a tool does not exist in the provided list of tools, notify the user that you do not have the ability to fulfill the request.<|end_of_text|>
<|start_of_role|>user<|end_of_role|>What's the weather like in Boston right now?<|end_of_text|>
<|start_of_role|>assistant<|end_of_role|><tool_call>
{"name": "get_current_weather", "arguments": {"city": "Boston"}}
</tool_call><|end_of_text|>
```
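To act on this response, the application has to parse the `<tool_call>` blocks out of the generated text, dispatch the named function, and feed the result back in a follow-up turn. A minimal parsing sketch, assuming the `<tool_call>` format shown in the expected output above (the dispatch table and the `get_current_weather` stub are illustrative, not part of the model card):

```python
import json
import re

def parse_tool_calls(text: str) -> list[dict]:
    """Extract every {"name": ..., "arguments": ...} object emitted
    between <tool_call> and </tool_call> tags."""
    pattern = re.compile(r"<tool_call>\s*(\{.*?\})\s*</tool_call>", re.DOTALL)
    return [json.loads(m) for m in pattern.findall(text)]

# illustrative stub standing in for a real weather API
def get_current_weather(city: str) -> dict:
    return {"city": city, "temperature_c": 7, "condition": "cloudy"}

DISPATCH = {"get_current_weather": get_current_weather}

generated = (
    '<|start_of_role|>assistant<|end_of_role|><tool_call>\n'
    '{"name": "get_current_weather", "arguments": {"city": "Boston"}}\n'
    '</tool_call><|end_of_text|>'
)

for call in parse_tool_calls(generated):
    result = DISPATCH[call["name"]](**call["arguments"])
    print(result)
# {'city': 'Boston', 'temperature_c': 7, 'condition': 'cloudy'}
```

The tool result would then typically be appended to the conversation as a tool-role message, following the chat template's conventions, before calling `generate` again.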

<!-- **Retrieval Augmented Generation:**
*Coming soon* -->

**Evaluation Results:**

<table>
<!-- <caption><b> All Results</b></caption> -->
<thead>
<tr>
<th style="text-align:left; background-color: #001d6c; color: white;">Benchmarks</th>
<th style="text-align:left; background-color: #001d6c; color: white;">Metric</th>
<th style="text-align:center; background-color: #001d6c; color: white;">3B Dense</th>
<th style="text-align:center; background-color: #001d6c; color: white;">8B Dense</th>
<th style="text-align:center; background-color: #001d6c; color: white;">30B Dense</th>
</tr>
</thead>
<tbody>
<tr>
<td colspan="5" style="text-align:center; background-color: #FFFFFF; color: #2D2D2D; font-style:italic;">
General Tasks
</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">MMLU</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">5-shot</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">67.02</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">73.84</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">80.16</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">MMLU-Pro</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">5-shot, CoT</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">49.83</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">55.99</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">64.09</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">BBH</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">3-shot, CoT</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">75.83</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">80.51</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">83.74</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">AGI EVAL</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">0-shot, CoT</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">65.16</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">72.43</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">77.80</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">GPQA</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">0-shot, CoT</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">31.70</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">41.96</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">45.76</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">SimpleQA</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;"></td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">3.68</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">4.82</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">6.81</td>
</tr>
<tr>
<td colspan="5" style="text-align:center; background-color: #FFFFFF; color: #2D2D2D; font-style:italic;">
Alignment Tasks
</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">AlpacaEval 2.0</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;"></td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">38.57</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">50.08</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">56.16</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">IFEval Avg</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;"></td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">82.30</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">87.06</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">89.65</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">ArenaHard</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;"></td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">37.80</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">68.98</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">71.02</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">MTBench Avg</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;"></td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">7.57</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">8.61</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">8.61</td>
</tr>
<tr>
<td colspan="5" style="text-align:center; background-color: #FFFFFF; color: #2D2D2D; font-style:italic;">
Math Tasks
</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">GSM8K</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">8-shot</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">86.88</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">92.49</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">94.16</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">GSM Symbolic</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">8-shot</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">81.32</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">83.70</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">75.70</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">Minerva Math</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">0-shot, CoT</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">67.94</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">80.10</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">81.32</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">DeepMind Math</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">0-shot, CoT</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">64.64</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">80.07</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">81.93</td>
</tr>
<tr>
<td colspan="5" style="text-align:center; background-color: #FFFFFF; color: #2D2D2D; font-style:italic;">
Code Tasks
</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">HumanEval</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">pass@1</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">81.71</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">85.37</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">88.41</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">HumanEval+</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">pass@1</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">76.83</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">79.88</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">85.37</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">MBPP</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">pass@1</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">71.16</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">87.30</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">85.45</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">MBPP+</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">pass@1</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">62.17</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">73.81</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">73.54</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">CRUXEval-O</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">pass@1</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">40.75</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">47.63</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">55.75</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">BigCodeBench</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">pass@1</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">32.19</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">35.00</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">38.77</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">MULTIPLE</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">pass@1</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">52.54</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">60.26</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">62.31</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">Eval+ Avg</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">pass@1</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">67.05</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">80.21</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">82.66</td>
</tr>
<tr>
<td colspan="5" style="text-align:center; background-color: #FFFFFF; color: #2D2D2D; font-style:italic;">
Tool Calling Tasks
</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">BFCL v3</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;"></td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">60.80</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">68.27</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">73.68</td>
</tr>
<tr>
<td colspan="5" style="text-align:center; background-color: #FFFFFF; color: #2D2D2D; font-style:italic;">
Multilingual Tasks
</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">MMMLU</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">5-shot</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">57.61</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">64.84</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">73.71</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">INCLUDE</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">5-shot</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">52.05</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">58.89</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">67.26</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">MGSM</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">8-shot</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">70.00</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">82.32</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">71.12</td>
</tr>
<tr>
<td colspan="5" style="text-align:center; background-color: #FFFFFF; color: #2D2D2D; font-style:italic;">
Safety
</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">SALAD-Bench</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;"></td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">93.95</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">95.80</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">96.41</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">AttaQ</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;"></td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">81.88</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">81.19</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">85.76</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">Tulu3 Safety Eval Avg</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;"></td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">66.84</td>
<td style="text-align:right; background-color: #DAE8FF; color: #2D2D2D;">75.57</td>
<td style="text-align:right; background-color: #FFFFFF; color: #2D2D2D;">78.19</td>
</tr>
</tbody></table>


<table>
<caption><b>Multilingual Benchmarks and the included languages:</b></caption>
<thead>
<tr>
<th style="text-align:left; background-color: #001d6c; color: white;">Benchmarks</th>
<th style="text-align:left; background-color: #001d6c; color: white;"># Langs</th>
<th style="text-align:center; background-color: #001d6c; color: white;">Languages</th>
</tr>
</thead>
<tbody>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">MMMLU</td>
<td style="text-align:center; background-color: #FFFFFF; color: #2D2D2D;">11</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">ar, de, en, es, fr, ja, ko, pt, zh, bn, hi</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">INCLUDE</td>
<td style="text-align:center; background-color: #FFFFFF; color: #2D2D2D;">14</td>
<!-- <td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">hindi, bengali, tamil, telugu, arabic, german, spanish, french, italian, japanese, korean, dutch, portuguese, chinese</td> -->
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">hi, bn, ta, te, ar, de, es, fr, it, ja, ko, nl, pt, zh</td>
</tr>
<tr>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">MGSM</td>
<td style="text-align:center; background-color: #FFFFFF; color: #2D2D2D;">5</td>
<td style="text-align:left; background-color: #FFFFFF; color: #2D2D2D;">en, es, fr, ja, zh</td>
</tr>
</tbody>
</table>

**Model Architecture:**

Granite-4.1-8B is built on a decoder-only dense transformer architecture. Its core components are GQA, RoPE, an MLP with SwiGLU, RMSNorm, and shared input/output embeddings.
454
+
455
+ <table>
456
+ <thead>
457
+ <tr>
458
+ <th style="text-align:left; background-color: #001d6c; color: white;">Model</th>
459
+ <th style="text-align:center; background-color: #001d6c; color: white;">3B Dense</th>
460
+ <th style="text-align:center; background-color: #001d6c; color: white;">8B Dense</th>
461
+ <th style="text-align:center; background-color: #001d6c; color: white;">30B Dense</th>
462
+ </tr></thead>
463
+ <tbody>
464
+ <tr>
465
+ <td style="text-align:left; background-color: #FFFFFF; color: black;">Embedding size</td>
466
+ <td style="text-align:center; background-color: #FFFFFF; color: black;">2560</td>
467
+ <td style="text-align:center; background-color: #DAE8FF; color: black;">4096</td>
468
+ <td style="text-align:center; background-color: #FFFFFF; color: black;">4096</td>
469
+ </tr>
470
+ <tr>
471
+ <td style="text-align:left; background-color: #FFFFFF; color: black;">Number of layers</td>
472
+ <td style="text-align:center; background-color: #FFFFFF; color: black;">40</td>
473
+ <td style="text-align:center; background-color: #DAE8FF; color: black;">40</td>
474
+ <td style="text-align:center; background-color: #FFFFFF; color: black;">64</td>
475
+ </tr>
476
+ <tr>
477
+ <td style="text-align:left; background-color: #FFFFFF; color: black;">Attention head size</td>
478
+ <td style="text-align:center; background-color: #FFFFFF; color: black;">64</td>
479
+ <td style="text-align:center; background-color: #DAE8FF; color: black;">128</td>
480
+ <td style="text-align:center; background-color: #FFFFFF; color: black;">128</td>
481
+ </tr>
482
+ <tr>
483
+ <td style="text-align:left; background-color: #FFFFFF; color: black;">Number of attention heads</td>
484
+ <td style="text-align:center; background-color: #FFFFFF; color: black;">40</td>
485
+ <td style="text-align:center; background-color: #DAE8FF; color: black;">32</td>
486
+ <td style="text-align:center; background-color: #FFFFFF; color: black;">32</td>
487
+ </tr>
488
+ <tr>
489
+ <td style="text-align:left; background-color: #FFFFFF; color: black;">Number of KV heads</td>
490
+ <td style="text-align:center; background-color: #FFFFFF; color: black;">8</td>
491
+ <td style="text-align:center; background-color: #DAE8FF; color: black;">8</td>
492
+ <td style="text-align:center; background-color: #FFFFFF; color: black;">8</td>
493
+ </tr>
494
+ <!--<tr>
495
+ <td style="text-align:left; background-color: #FFFFFF; color: black;">Mamba2 state size</td>
496
+ <td style="text-align:center; background-color: #FFFFFF; color: black;">-</td>
497
+ <td style="text-align:center; background-color: #DAE8FF; color: black;"></td>
498
+ <td style="text-align:center; background-color: #FFFFFF; color: black;"></td>
499
+ </tr>
500
+ <tr>
501
+ <td style="text-align:left; background-color: #FFFFFF; color: black;">Number of Mamba2 heads</td>
502
+ <td style="text-align:center; background-color: #FFFFFF; color: black;"></td>
503
+ <td style="text-align:center; background-color: #DAE8FF; color: black;"></td>
504
+ <td style="text-align:center; background-color: #FFFFFF; color: black;"></td>
505
+ </tr>-->
506
+
507
+ <tr>
508
+ <td style="text-align:left; background-color: #FFFFFF; color: black;">MLP / Shared expert hidden size</td>
509
+ <td style="text-align:center; background-color: #FFFFFF; color: black;">8192</td>
510
+ <td style="text-align:center; background-color: #DAE8FF; color: black;">12800</td>
511
+ <td style="text-align:center; background-color: #FFFFFF; color: black;">32768</td>
512
+ </tr>
513
+ <!--<tr>
514
+ <td style="text-align:left; background-color: #FFFFFF; color: black;">Num. Experts</td>
515
+ <td style="text-align:center; background-color: #FFFFFF; color: black;"></td>
516
+ <td style="text-align:center; background-color: #DAE8FF; color: black;"></td>
517
+ <td style="text-align:center; background-color: #FFFFFF; color: black;"></td>
518
+ </tr>
519
+ <tr>
520
+ <td style="text-align:left; background-color: #FFFFFF; color: black;">Num. active Experts</td>
521
+ <td style="text-align:center; background-color: #FFFFFF; color: black;"></td>
522
+ <td style="text-align:center; background-color: #DAE8FF; color: black;"></td>
523
+ <td style="text-align:center; background-color: #FFFFFF; color: black;"></td>
524
+ </tr>
525
+ <tr>
526
+ <td style="text-align:left; background-color: #FFFFFF; color: black;">Expert hidden size</td>
527
+ <td style="text-align:center; background-color: #FFFFFF; color: black;"></td>
528
+ <td style="text-align:center; background-color: #DAE8FF; color: black;"></td>
529
+ <td style="text-align:center; background-color: #FFFFFF; color: black;"></td>
530
+ </tr>-->
531
+
532
+ <tr>
533
+ <td style="text-align:left; background-color: #FFFFFF; color: black;">MLP activation</td>
534
+ <td style="text-align:center; background-color: #FFFFFF; color: black;">SwiGLU</td>
535
+ <td style="text-align:center; background-color: #DAE8FF; color: black;">SwiGLU</td>
536
+ <td style="text-align:center; background-color: #FFFFFF; color: black;">SwiGLU</td>
537
+ </tr>
538
+
539
+ <tr>
540
+ <td style="text-align:left; background-color: #FFFFFF; color: black;">Sequence length</td>
541
+ <td style="text-align:center; background-color: #FFFFFF; color: black;">131072</td>
542
+ <td style="text-align:center; background-color: #DAE8FF; color: black;">131072</td>
543
+ <td style="text-align:center; background-color: #FFFFFF; color: black;">131072</td>
544
+ </tr>
545
+ <tr>
546
+ <td style="text-align:left; background-color: #FFFFFF; color: black;">Position embedding</td>
547
+ <td style="text-align:center; background-color: #FFFFFF; color: black;">RoPE</td>
548
+ <td style="text-align:center; background-color: #DAE8FF; color: black;">RoPE</td>
549
+ <td style="text-align:center; background-color: #FFFFFF; color: black;">RoPE</td>
550
+ </tr>
551
+ <tr>
552
+ <td style="text-align:left; background-color: #FFFFFF; color: black;"># Parameters</td>
553
+ <td style="text-align:center; background-color: #FFFFFF; color: black;">3B</td>
554
+ <td style="text-align:center; background-color: #DAE8FF; color: black;">8B</td>
555
+ <td style="text-align:center; background-color: #FFFFFF; color: black;">30B</td>
556
+ </tr>
557
+ <!-- <tr>
558
+ <td style="text-align:left; background-color: #FFFFFF; color: black;"># Active parameters</td>
559
+ <td style="text-align:center; background-color: #FFFFFF; color: black;"></td>
560
+ <td style="text-align:center; background-color: #DAE8FF; color: black;"></td>
561
+ <td style="text-align:center; background-color: #FFFFFF; color: black;"></td>
562
+ </tr>-->
563
+ </tbody></table>
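The 131072-token sequence length implies a sizable key/value cache at inference time. As a rough sketch only, using the 8B variant's dimensions published in this repository's `config.json` (40 layers, 8 KV heads, head dim 4096/32 = 128, bf16), the `kv_cache_bytes` helper below is illustrative, not part of any runtime:

```python
# Rough KV-cache size estimate at full context, from config.json dimensions.
def kv_cache_bytes(seq_len, num_layers, num_kv_heads, head_dim, bytes_per_elem=2):
    # Keys and values: 2 tensors per layer, each [seq_len, num_kv_heads, head_dim].
    return 2 * num_layers * seq_len * num_kv_heads * head_dim * bytes_per_elem

hidden_size, num_heads = 4096, 32
head_dim = hidden_size // num_heads  # 128
size = kv_cache_bytes(seq_len=131072, num_layers=40, num_kv_heads=8, head_dim=head_dim)
print(f"{size / 2**30:.1f} GiB")  # → 20.0 GiB for one full-length sequence in bf16
```

Grouped-query attention (8 KV heads against 32 query heads) is what keeps this figure at 20 GiB rather than 4x that.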
564
+
565
+
566
+
567
+ **Training Data:**
568
+ Overall, our SFT data is drawn largely from three key sources: (1) publicly available datasets with permissive licenses, (2) internal synthetic data targeting specific capabilities, and (3) a select set of human-curated data.
569
+
570
+ **Supervised Fine-Tuning and Reinforcement Learning:**
571
+ The instruct model has been fine-tuned with significantly improved SFT and reinforcement learning pipelines, using a high-quality mix of the datasets mentioned above. Through rigorous SFT-RL cycles, we have improved Granite-4.1's tool-calling, instruction-following, and chat capabilities. For further details, please check our [Granite-4.1 Blog](https://huggingface.co/blog/ibm-granite/granite-4-1).
572
+
573
+ **Infrastructure:**
574
+ We trained the Granite 4.1 Language Models on an NVIDIA GB200 NVL72 cluster hosted by CoreWeave. Intra-rack communication occurs via the 72-GPU NVLink domain, while a non-blocking, full fat-tree NDR 400 Gb/s InfiniBand network provides inter-rack communication. This cluster provides a scalable and efficient infrastructure for training our models across thousands of GPUs.
575
+
576
+ **Ethical Considerations and Limitations:**
577
+ Granite 4.1 Instruct models are primarily finetuned on instruction-response pairs, mostly in English but also including multilingual data covering multiple languages. Although the model can handle multilingual dialog use cases, its performance may not match that on English tasks; in such cases, providing a small number of examples (few-shot prompting) can help the model generate more accurate outputs. While this model has been aligned with safety in mind, it may in some cases produce inaccurate, biased, or unsafe responses to user prompts. We urge the community to use this model with proper safety testing and tuning tailored to their specific tasks. To enhance safety in enterprise deployments, we recommend using Granite 4.1 Language models alongside [Granite Guardian](https://huggingface.co/ibm-granite/granite-guardian-4.1-8b), a model designed to detect and flag risks in inputs and outputs across the key dimensions outlined in the IBM AI Risk Atlas.
578
+
579
+ **Resources**
580
+ - ⭐️ Learn about the latest updates with Granite: https://www.ibm.com/granite
581
+ - 📄 Get started with tutorials, best practices, and prompt engineering advice: https://www.ibm.com/granite/docs/
582
+ - 💡 Learn about the latest Granite learning resources: https://ibm.biz/granite-learning-resources
583
+
584
chat_template.jinja ADDED
@@ -0,0 +1,114 @@
1
+ {%- set tools_system_message_prefix = 'You are a helpful assistant with access to the following tools. You may call one or more tools to assist with the user query.\n\nYou are provided with function signatures within <tools></tools> XML tags:\n<tools>' %}
2
+ {%- set tools_system_message_suffix = '\n</tools>\n\nFor each tool call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:\n<tool_call>\n{\"name\": <function-name>, \"arguments\": <args-json-object>}\n</tool_call>. If a tool does not exist in the provided list of tools, notify the user that you do not have the ability to fulfill the request.' %}
3
+ {%- set documents_system_message_prefix = 'You are a helpful assistant with access to the following documents. You may use one or more documents to assist with the user query.\n\nYou are given a list of documents within <documents></documents> XML tags:\n<documents>' %}
4
+ {%- set documents_system_message_suffix = '\n</documents>\n\nWrite the response to the user\'s input by strictly aligning with the facts in the provided documents. If the information needed to answer the question is not available in the documents, inform the user that the question cannot be answered based on the available data.' %}
5
+ {%- if available_tools is defined and available_tools %}
6
+ {%- set tools = available_tools %}
7
+ {%- endif %}
8
+ {%- set ns = namespace(tools_system_message=tools_system_message_prefix,
9
+ documents_system_message=documents_system_message_prefix,
10
+ system_message=''
11
+ ) %}
12
+ {%- if tools %}
13
+ {%- for tool in tools %}
14
+ {%- set ns.tools_system_message = ns.tools_system_message + '\n' + (tool | tojson) %}
15
+ {%- endfor %}
16
+ {%- set ns.tools_system_message = ns.tools_system_message + tools_system_message_suffix %}
17
+ {%- else %}
18
+ {%- set ns.tools_system_message = '' %}
19
+ {%- endif %}
20
+ {%- if documents %}
21
+ {%- for document in documents %}
22
+ {%- set ns.documents_system_message = ns.documents_system_message + '\n' + (document | tojson) %}
23
+ {%- endfor %}
24
+ {%- set ns.documents_system_message = ns.documents_system_message + documents_system_message_suffix %}
25
+ {%- else %}
26
+ {%- set ns.documents_system_message = '' %}
27
+ {%- endif %}
28
+ {%- if messages[0].role == 'system' %}
29
+ {%- if messages[0].content is string %}
30
+ {%- set ns.system_message = messages[0].content %}
31
+ {%- elif messages[0].content is iterable %}
32
+ {%- for entry in messages[0].content %}
33
+ {%- if entry.type== 'text' %}
34
+ {%- if ns.system_message != '' %}
35
+ {%- set ns.system_message = ns.system_message + '\n' %}
36
+ {%- endif %}
37
+ {%- set ns.system_message = ns.system_message + entry.text %}
38
+ {%- endif %}
39
+ {%- endfor %}
40
+ {%- endif %}
41
+ {%- if tools and documents %}
42
+ {%- set ns.system_message = ns.system_message + '\n\n' + ns.tools_system_message + '\n\n' + ns.documents_system_message %}
43
+ {%- elif tools %}
44
+ {%- set ns.system_message = ns.system_message + '\n\n' + ns.tools_system_message %}
45
+ {%- elif documents %}
46
+ {%- set ns.system_message = ns.system_message + '\n\n' + ns.documents_system_message %}
47
+ {%- endif %}
48
+ {%- else %}
49
+ {%- if tools and documents %}
50
+ {%- set ns.system_message = ns.tools_system_message + '\n\n' + ns.documents_system_message %}
51
+ {%- elif tools %}
52
+ {%- set ns.system_message = ns.tools_system_message %}
53
+ {%- elif documents %}
54
+ {%- set ns.system_message = ns.documents_system_message %}
55
+ {%- endif %}
56
+ {%- endif %}
57
+ {%- if ns.system_message %}
58
+ {{- '<|start_of_role|>system<|end_of_role|>' + ns.system_message + '<|end_of_text|>\n' }}
59
+ {%- endif %}
60
+ {%- for message in messages %}
61
+ {%- set content = namespace(val='') %}
62
+ {%- if message.content is string %}
63
+ {%- set content.val = message.content %}
64
+ {%- else %}
65
+ {%- if message.content is iterable %}
66
+ {%- for entry in message.content %}
67
+ {%- if entry.type== 'text' %}
68
+ {%- if content.val != '' %}
69
+ {%- set content.val = content.val + '\n' %}
70
+ {%- endif %}
71
+ {%- set content.val = content.val + entry.text %}
72
+ {%- endif %}
73
+ {%- endfor %}
74
+ {%- endif %}
75
+ {%- endif %}
76
+ {%- if (message.role == 'user') or (message.role == 'system' and not loop.first) %}
77
+ {{- '<|start_of_role|>' + message.role + '<|end_of_role|>' + content.val + '<|end_of_text|>\n' }}
78
+ {%- elif message.role == 'assistant' %}
79
+ {{- '<|start_of_role|>' + message.role + '<|end_of_role|>' + content.val }}
80
+ {%- if message.tool_calls %}
81
+ {%- for tool_call in message.tool_calls %}
82
+ {%- if (loop.first and content.val) or (not loop.first) %}
83
+ {{- '\n' }}
84
+ {%- endif %}
85
+ {%- if tool_call.function %}
86
+ {%- set tool_call = tool_call.function %}
87
+ {%- endif %}
88
+ {{- '<tool_call>\n{"name": "' }}
89
+ {{- tool_call.name }}
90
+ {{- '", "arguments": ' }}
91
+ {%- if tool_call.arguments is string %}
92
+ {{- tool_call.arguments }}
93
+ {%- else %}
94
+ {{- tool_call.arguments | tojson }}
95
+ {%- endif %}
96
+ {{- '}\n</tool_call>' }}
97
+ {%- endfor %}
98
+ {%- endif %}
99
+ {{- '<|end_of_text|>\n' }}
100
+ {%- elif message.role == 'tool' %}
101
+ {%- if loop.first or (messages[loop.index0 - 1].role != 'tool') %}
102
+ {{- '<|start_of_role|>user<|end_of_role|>' }}
103
+ {%- endif %}
104
+ {{- '\n<tool_response>\n' }}
105
+ {{- content.val }}
106
+ {{- '\n</tool_response>' }}
107
+ {%- if loop.last or (messages[loop.index0 + 1].role != 'tool') %}
108
+ {{- '<|end_of_text|>\n' }}
109
+ {%- endif %}
110
+ {%- endif %}
111
+ {%- endfor %}
112
+ {%- if add_generation_prompt %}
113
+ {{- '<|start_of_role|>assistant<|end_of_role|>' }}
114
+ {%- endif %}
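For readers who prefer plain code to Jinja, the assistant-turn tool-call rendering rules above can be approximated in Python. This is an illustrative sketch only (the `render_assistant` helper is hypothetical); `chat_template.jinja` remains the source of truth:

```python
import json

# Mirrors the assistant branch of chat_template.jinja: role markers, a newline
# before each tool call except a bare first one, JSON-encoded arguments.
def render_assistant(content, tool_calls=()):
    out = "<|start_of_role|>assistant<|end_of_role|>" + content
    for i, call in enumerate(tool_calls):
        if content or i > 0:  # template: (loop.first and content) or not loop.first
            out += "\n"
        args = call["arguments"]
        if not isinstance(args, str):
            args = json.dumps(args)
        out += ('<tool_call>\n{"name": "' + call["name"]
                + '", "arguments": ' + args + "}\n</tool_call>")
    return out + "<|end_of_text|>\n"

msg = render_assistant("", [{"name": "get_weather", "arguments": {"city": "Paris"}}])
```

Note that consecutive `tool` messages are merged by the template into a single `user` turn wrapping one `<tool_response>` block per result.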
config.json ADDED
@@ -0,0 +1,152 @@
1
+ {
2
+ "architectures": [
3
+ "GraniteForCausalLM"
4
+ ],
5
+ "attention_bias": false,
6
+ "attention_dropout": 0.0,
7
+ "attention_multiplier": 0.0078125,
8
+ "bos_token_id": 100257,
9
+ "embedding_multiplier": 12.0,
10
+ "eos_token_id": 100257,
11
+ "hidden_act": "silu",
12
+ "hidden_size": 4096,
13
+ "initializer_range": 0.1,
14
+ "intermediate_size": 12800,
15
+ "logits_scaling": 16.0,
16
+ "max_position_embeddings": 131072,
17
+ "mlp_bias": false,
18
+ "model_type": "granite",
19
+ "num_attention_heads": 32,
20
+ "num_hidden_layers": 40,
21
+ "num_key_value_heads": 8,
22
+ "pad_token_id": 100256,
23
+ "residual_multiplier": 0.22,
24
+ "rms_norm_eps": 1e-05,
25
+ "rope_scaling": null,
26
+ "rope_theta": 10000000,
27
+ "tie_word_embeddings": true,
28
+ "torch_dtype": "bfloat16",
29
+ "transformers_version": "4.53.3",
30
+ "use_cache": true,
31
+ "vocab_size": 100352,
32
+ "quantization": {
33
+ "group_size": 64,
34
+ "bits": 3,
35
+ "mode": "affine",
36
+ "lm_head": {
37
+ "bits": 8,
38
+ "group_size": 64,
39
+ "mode": "affine"
40
+ },
41
+ "model.embed_tokens": {
42
+ "bits": 8,
43
+ "group_size": 64,
44
+ "mode": "affine"
45
+ },
46
+ "model.layers.0.mlp.down_proj": {
47
+ "bits": 6,
48
+ "group_size": 64,
49
+ "mode": "affine"
50
+ },
51
+ "model.layers.0.self_attn.k_proj": {
52
+ "bits": 5,
53
+ "group_size": 64,
54
+ "mode": "affine"
55
+ },
56
+ "model.layers.0.self_attn.o_proj": {
57
+ "bits": 5,
58
+ "group_size": 64,
59
+ "mode": "affine"
60
+ },
61
+ "model.layers.0.self_attn.q_proj": {
62
+ "bits": 5,
63
+ "group_size": 64,
64
+ "mode": "affine"
65
+ },
66
+ "model.layers.0.self_attn.v_proj": {
67
+ "bits": 6,
68
+ "group_size": 64,
69
+ "mode": "affine"
70
+ },
71
+ "model.layers.1.self_attn.k_proj": {
72
+ "bits": 5,
73
+ "group_size": 64,
74
+ "mode": "affine"
75
+ },
76
+ "model.layers.1.self_attn.v_proj": {
77
+ "bits": 6,
78
+ "group_size": 64,
79
+ "mode": "affine"
80
+ },
81
+ "model.layers.2.self_attn.k_proj": {
82
+ "bits": 5,
83
+ "group_size": 64,
84
+ "mode": "affine"
85
+ },
86
+ "model.layers.3.self_attn.k_proj": {
87
+ "bits": 5,
88
+ "group_size": 64,
89
+ "mode": "affine"
90
+ }
91
+ },
92
+ "quantization_config": {
93
+ "group_size": 64,
94
+ "bits": 3,
95
+ "mode": "affine",
96
+ "lm_head": {
97
+ "bits": 8,
98
+ "group_size": 64,
99
+ "mode": "affine"
100
+ },
101
+ "model.embed_tokens": {
102
+ "bits": 8,
103
+ "group_size": 64,
104
+ "mode": "affine"
105
+ },
106
+ "model.layers.0.mlp.down_proj": {
107
+ "bits": 6,
108
+ "group_size": 64,
109
+ "mode": "affine"
110
+ },
111
+ "model.layers.0.self_attn.k_proj": {
112
+ "bits": 5,
113
+ "group_size": 64,
114
+ "mode": "affine"
115
+ },
116
+ "model.layers.0.self_attn.o_proj": {
117
+ "bits": 5,
118
+ "group_size": 64,
119
+ "mode": "affine"
120
+ },
121
+ "model.layers.0.self_attn.q_proj": {
122
+ "bits": 5,
123
+ "group_size": 64,
124
+ "mode": "affine"
125
+ },
126
+ "model.layers.0.self_attn.v_proj": {
127
+ "bits": 6,
128
+ "group_size": 64,
129
+ "mode": "affine"
130
+ },
131
+ "model.layers.1.self_attn.k_proj": {
132
+ "bits": 5,
133
+ "group_size": 64,
134
+ "mode": "affine"
135
+ },
136
+ "model.layers.1.self_attn.v_proj": {
137
+ "bits": 6,
138
+ "group_size": 64,
139
+ "mode": "affine"
140
+ },
141
+ "model.layers.2.self_attn.k_proj": {
142
+ "bits": 5,
143
+ "group_size": 64,
144
+ "mode": "affine"
145
+ },
146
+ "model.layers.3.self_attn.k_proj": {
147
+ "bits": 5,
148
+ "group_size": 64,
149
+ "mode": "affine"
150
+ }
151
+ }
152
+ }
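The `quantization` block above describes group-wise affine quantization: each group of `group_size` (64) weights gets its own scale and offset, with most layers at 3 bits and sensitive layers (embeddings, `lm_head`, early attention projections) kept at 5-8 bits. A minimal sketch of what one 3-bit group encodes, illustrative only and not the actual oMLX kernel:

```python
# Group-wise affine quantization of one 64-weight group: store small integers
# plus a per-group scale/offset, reconstruct as offset + scale * q.
def quantize_group(weights, bits=3):
    levels = 2 ** bits - 1                    # 7 codes above zero for 3 bits
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / levels if hi != lo else 1.0
    q = [round((w - lo) / scale) for w in weights]   # ints in [0, levels]
    deq = [lo + scale * v for v in q]                # dequantized approximation
    return q, deq

group = [0.10 * i for i in range(64)]         # one group of 64 weights
q, deq = quantize_group(group, bits=3)
assert all(0 <= v <= 7 for v in q)            # every code fits in 3 bits
```

The per-group parameters bound the reconstruction error to half the group's scale, which is why small groups (64 here) preserve accuracy better than per-tensor schemes at the same bit width.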
generation_config.json ADDED
@@ -0,0 +1,7 @@
1
+ {
2
+ "_from_model_config": true,
3
+ "bos_token_id": 100257,
4
+ "eos_token_id": 100257,
5
+ "pad_token_id": 100256,
6
+ "transformers_version": "4.53.3"
7
+ }
merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:02ffb208e444dff1492ee80d7604eaeb64bc348440e907ec8b286b35bddf88b0
3
+ size 4396137736
special_tokens_map.json ADDED
@@ -0,0 +1,30 @@
1
+ {
2
+ "bos_token": {
3
+ "content": "<|end_of_text|>",
4
+ "lstrip": false,
5
+ "normalized": false,
6
+ "rstrip": false,
7
+ "single_word": false
8
+ },
9
+ "eos_token": {
10
+ "content": "<|end_of_text|>",
11
+ "lstrip": false,
12
+ "normalized": false,
13
+ "rstrip": false,
14
+ "single_word": false
15
+ },
16
+ "pad_token": {
17
+ "content": "<|pad|>",
18
+ "lstrip": false,
19
+ "normalized": false,
20
+ "rstrip": false,
21
+ "single_word": false
22
+ },
23
+ "unk_token": {
24
+ "content": "<|unk|>",
25
+ "lstrip": false,
26
+ "normalized": false,
27
+ "rstrip": false,
28
+ "single_word": false
29
+ }
30
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,783 @@
1
+ {
2
+ "add_bos_token": false,
3
+ "add_prefix_space": false,
4
+ "added_tokens_decoder": {
5
+ "100256": {
6
+ "content": "<|pad|>",
7
+ "lstrip": false,
8
+ "normalized": false,
9
+ "rstrip": false,
10
+ "single_word": false,
11
+ "special": true
12
+ },
13
+ "100257": {
14
+ "content": "<|end_of_text|>",
15
+ "lstrip": false,
16
+ "normalized": false,
17
+ "rstrip": false,
18
+ "single_word": false,
19
+ "special": true
20
+ },
21
+ "100258": {
22
+ "content": "<|fim_prefix|>",
23
+ "lstrip": false,
24
+ "normalized": false,
25
+ "rstrip": false,
26
+ "single_word": false,
27
+ "special": false
28
+ },
29
+ "100259": {
30
+ "content": "<|fim_middle|>",
31
+ "lstrip": false,
32
+ "normalized": false,
33
+ "rstrip": false,
34
+ "single_word": false,
35
+ "special": false
36
+ },
37
+ "100260": {
38
+ "content": "<|fim_suffix|>",
39
+ "lstrip": false,
40
+ "normalized": false,
41
+ "rstrip": false,
42
+ "single_word": false,
43
+ "special": false
44
+ },
45
+ "100261": {
46
+ "content": "<|fim_pad|>",
47
+ "lstrip": false,
48
+ "normalized": false,
49
+ "rstrip": false,
50
+ "single_word": false,
51
+ "special": false
52
+ },
53
+ "100262": {
54
+ "content": "<|filename|>",
55
+ "lstrip": false,
56
+ "normalized": false,
57
+ "rstrip": false,
58
+ "single_word": false,
59
+ "special": false
60
+ },
61
+ "100263": {
62
+ "content": "<|reponame|>",
63
+ "lstrip": false,
64
+ "normalized": false,
65
+ "rstrip": false,
66
+ "single_word": false,
67
+ "special": false
68
+ },
69
+ "100264": {
70
+ "content": "<|start_of_role|>",
71
+ "lstrip": false,
72
+ "normalized": false,
73
+ "rstrip": false,
74
+ "single_word": false,
75
+ "special": true
76
+ },
77
+ "100265": {
78
+ "content": "<|end_of_role|>",
79
+ "lstrip": false,
80
+ "normalized": false,
81
+ "rstrip": false,
82
+ "single_word": false,
83
+ "special": true
84
+ },
85
+ "100266": {
86
+ "content": "<|unused_1|>",
87
+ "lstrip": false,
88
+ "normalized": false,
89
+ "rstrip": false,
90
+ "single_word": false,
91
+ "special": true
92
+ },
93
+ "100267": {
94
+ "content": "<|start_of_plugin|>",
95
+ "lstrip": false,
96
+ "normalized": false,
97
+ "rstrip": false,
98
+ "single_word": false,
99
+ "special": true
100
+ },
101
+ "100268": {
102
+ "content": "<|end_of_plugin|>",
103
+ "lstrip": false,
104
+ "normalized": false,
105
+ "rstrip": false,
106
+ "single_word": false,
107
+ "special": true
108
+ },
109
+ "100269": {
110
+ "content": "<|unk|>",
111
+ "lstrip": false,
112
+ "normalized": false,
113
+ "rstrip": false,
114
+ "single_word": false,
115
+ "special": true
116
+ },
117
+ "100270": {
118
+ "content": "<tool_call>",
119
+ "lstrip": false,
120
+ "normalized": false,
121
+ "rstrip": false,
122
+ "single_word": false,
123
+ "special": false
124
+ },
125
+ "100271": {
126
+ "content": "</tool_call>",
127
+ "lstrip": false,
128
+ "normalized": false,
129
+ "rstrip": false,
130
+ "single_word": false,
131
+ "special": false
132
+ },
133
+ "100272": {
134
+ "content": "<tool_response>",
135
+ "lstrip": false,
136
+ "normalized": false,
137
+ "rstrip": false,
138
+ "single_word": false,
139
+ "special": false
140
+ },
141
+ "100273": {
142
+ "content": "</tool_response>",
143
+ "lstrip": false,
144
+ "normalized": false,
145
+ "rstrip": false,
146
+ "single_word": false,
147
+ "special": false
148
+ },
149
+ "100274": {
150
+ "content": "<think>",
151
+ "lstrip": false,
152
+ "normalized": false,
153
+ "rstrip": false,
154
+ "single_word": false,
155
+ "special": false
156
+ },
157
+ "100275": {
158
+ "content": "</think>",
159
+ "lstrip": false,
160
+ "normalized": false,
161
+ "rstrip": false,
162
+ "single_word": false,
163
+ "special": false
164
+ },
165
+ "100276": {
166
+ "content": "<think_on>",
167
+ "lstrip": false,
168
+ "normalized": false,
169
+ "rstrip": false,
170
+ "single_word": false,
171
+ "special": true
172
+ },
173
+ "100277": {
174
+ "content": "<think_off>",
175
+ "lstrip": false,
176
+ "normalized": false,
177
+ "rstrip": false,
178
+ "single_word": false,
179
+ "special": true
180
+ },
181
+ "100278": {
182
+ "content": "<schema>",
183
+ "lstrip": false,
184
+ "normalized": false,
185
+ "rstrip": false,
186
+ "single_word": false,
187
+ "special": true
188
+ },
189
+ "100279": {
190
+ "content": "</schema>",
191
+ "lstrip": false,
192
+ "normalized": false,
193
+ "rstrip": false,
194
+ "single_word": false,
195
+ "special": true
196
+ },
197
+ "100280": {
198
+ "content": "<tools>",
199
+ "lstrip": false,
200
+ "normalized": false,
201
+ "rstrip": false,
202
+ "single_word": false,
203
+ "special": true
204
+ },
205
+ "100281": {
206
+ "content": "</tools>",
207
+ "lstrip": false,
208
+ "normalized": false,
209
+ "rstrip": false,
210
+ "single_word": false,
211
+ "special": true
212
+ },
213
+ "100282": {
214
+ "content": "<documents>",
215
+ "lstrip": false,
216
+ "normalized": false,
217
+ "rstrip": false,
218
+ "single_word": false,
219
+ "special": true
220
+ },
221
+ "100283": {
222
+ "content": "</documents>",
223
+ "lstrip": false,
224
+ "normalized": false,
225
+ "rstrip": false,
226
+ "single_word": false,
227
+ "special": true
228
+ },
229
+ "100284": {
230
+ "content": "<|unused_15|>",
231
+ "lstrip": false,
232
+ "normalized": false,
233
+ "rstrip": false,
234
+ "single_word": false,
235
+ "special": true
236
+ },
237
+ "100285": {
238
+ "content": "<|unused_16|>",
239
+ "lstrip": false,
240
+ "normalized": false,
241
+ "rstrip": false,
242
+ "single_word": false,
243
+ "special": true
244
+ },
245
+ "100286": {
246
+ "content": "<|unused_17|>",
247
+ "lstrip": false,
248
+ "normalized": false,
249
+ "rstrip": false,
250
+ "single_word": false,
251
+ "special": true
252
+ },
253
+ "100287": {
254
+ "content": "<|unused_18|>",
255
+ "lstrip": false,
256
+ "normalized": false,
257
+ "rstrip": false,
258
+ "single_word": false,
259
+ "special": true
260
+ },
261
+ "100288": {
262
+ "content": "<|unused_19|>",
263
+ "lstrip": false,
264
+ "normalized": false,
265
+ "rstrip": false,
266
+ "single_word": false,
267
+ "special": true
268
+ },
269
+ "100289": {
270
+ "content": "<|unused_20|>",
271
+ "lstrip": false,
272
+ "normalized": false,
273
+ "rstrip": false,
274
+ "single_word": false,
275
+ "special": true
276
+ },
277
+ "100290": {
278
+ "content": "<|unused_21|>",
279
+ "lstrip": false,
280
+ "normalized": false,
281
+ "rstrip": false,
282
+ "single_word": false,
283
+ "special": true
284
+ },
285
+ "100291": {
286
+ "content": "<|unused_22|>",
287
+ "lstrip": false,
288
+ "normalized": false,
289
+ "rstrip": false,
290
+ "single_word": false,
291
+ "special": true
292
+ },
293
+ "100292": {
294
+ "content": "<|unused_23|>",
295
+ "lstrip": false,
296
+ "normalized": false,
297
+ "rstrip": false,
298
+ "single_word": false,
299
+ "special": true
300
+ },
301
+ "100293": {
302
+ "content": "<|unused_24|>",
303
+ "lstrip": false,
304
+ "normalized": false,
305
+ "rstrip": false,
306
+ "single_word": false,
307
+ "special": true
308
+ },
309
+ "100294": {
310
+ "content": "<|unused_25|>",
311
+ "lstrip": false,
312
+ "normalized": false,
313
+ "rstrip": false,
314
+ "single_word": false,
315
+ "special": true
316
+ },
317
+ "100295": {
318
+ "content": "<|unused_26|>",
319
+ "lstrip": false,
320
+ "normalized": false,
321
+ "rstrip": false,
322
+ "single_word": false,
323
+ "special": true
324
+ },
325
+ "100296": {
326
+ "content": "<|unused_27|>",
327
+ "lstrip": false,
328
+ "normalized": false,
329
+ "rstrip": false,
330
+ "single_word": false,
331
+ "special": true
332
+ },
333
+ "100297": {
334
+ "content": "<|unused_28|>",
335
+ "lstrip": false,
336
+ "normalized": false,
337
+ "rstrip": false,
338
+ "single_word": false,
339
+ "special": true
340
+ },
341
+ "100298": {
342
+ "content": "<|unused_29|>",
343
+ "lstrip": false,
344
+ "normalized": false,
345
+ "rstrip": false,
346
+ "single_word": false,
347
+ "special": true
348
+ },
349
+ "100299": {
350
+ "content": "<|unused_30|>",
351
+ "lstrip": false,
352
+ "normalized": false,
353
+ "rstrip": false,
354
+ "single_word": false,
355
+ "special": true
356
+ },
357
+ "100300": {
358
+ "content": "<|unused_31|>",
359
+ "lstrip": false,
360
+ "normalized": false,
361
+ "rstrip": false,
362
+ "single_word": false,
363
+ "special": true
364
+ },
365
+ "100301": {
366
+ "content": "<|unused_32|>",
367
+ "lstrip": false,
368
+ "normalized": false,
369
+ "rstrip": false,
370
+ "single_word": false,
371
+ "special": true
372
+ },
373
+ "100302": {
374
+ "content": "<|unused_33|>",
375
+ "lstrip": false,
376
+ "normalized": false,
377
+ "rstrip": false,
378
+ "single_word": false,
379
+ "special": true
380
+ },
381
+ "100303": {
382
+ "content": "<|unused_34|>",
383
+ "lstrip": false,
384
+ "normalized": false,
385
+ "rstrip": false,
386
+ "single_word": false,
387
+ "special": true
388
+ },
389
+ "100304": {
390
+ "content": "<|unused_35|>",
391
+ "lstrip": false,
392
+ "normalized": false,
393
+ "rstrip": false,
394
+ "single_word": false,
395
+ "special": true
396
+ },
397
+ "100305": {
398
+ "content": "<|unused_36|>",
399
+ "lstrip": false,
400
+ "normalized": false,
401
+ "rstrip": false,
402
+ "single_word": false,
403
+ "special": true
404
+ },
405
+ "100306": {
406
+ "content": "<|unused_37|>",
407
+ "lstrip": false,
408
+ "normalized": false,
409
+ "rstrip": false,
410
+ "single_word": false,
411
+ "special": true
412
+ },
413
+ "100307": {
414
+ "content": "<|unused_38|>",
415
+ "lstrip": false,
416
+ "normalized": false,
417
+ "rstrip": false,
418
+ "single_word": false,
419
+ "special": true
420
+ },
421
+ "100308": {
422
+ "content": "<|unused_39|>",
423
+ "lstrip": false,
424
+ "normalized": false,
425
+ "rstrip": false,
426
+ "single_word": false,
427
+ "special": true
428
+ },
429
+ "100309": {
430
+ "content": "<|unused_40|>",
431
+ "lstrip": false,
432
+ "normalized": false,
433
+ "rstrip": false,
434
+ "single_word": false,
435
+ "special": true
436
+ },
437
+ "100310": {
438
+ "content": "<|unused_41|>",
439
+ "lstrip": false,
440
+ "normalized": false,
441
+ "rstrip": false,
442
+ "single_word": false,
443
+ "special": true
444
+ },
445
+ "100311": {
446
+ "content": "<|unused_42|>",
447
+ "lstrip": false,
448
+ "normalized": false,
449
+ "rstrip": false,
450
+ "single_word": false,
451
+ "special": true
452
+ },
453
+ "100312": {
454
+ "content": "<|unused_43|>",
455
+ "lstrip": false,
456
+ "normalized": false,
457
+ "rstrip": false,
458
+ "single_word": false,
459
+ "special": true
460
+ },
461
+ "100313": {
462
+ "content": "<|unused_44|>",
463
+ "lstrip": false,
464
+ "normalized": false,
465
+ "rstrip": false,
466
+ "single_word": false,
467
+ "special": true
468
+ },
469
+ "100314": {
470
+ "content": "<|unused_45|>",
471
+ "lstrip": false,
472
+ "normalized": false,
473
+ "rstrip": false,
474
+ "single_word": false,
475
+ "special": true
476
+ },
477
+ "100315": {
478
+ "content": "<|unused_46|>",
479
+ "lstrip": false,
480
+ "normalized": false,
481
+ "rstrip": false,
482
+ "single_word": false,
483
+ "special": true
484
+ },
485
+ "100316": {
486
+ "content": "<|unused_47|>",
487
+ "lstrip": false,
488
+ "normalized": false,
489
+ "rstrip": false,
490
+ "single_word": false,
491
+ "special": true
492
+ },
493
+ "100317": {
494
+ "content": "<|unused_48|>",
495
+ "lstrip": false,
496
+ "normalized": false,
497
+ "rstrip": false,
498
+ "single_word": false,
499
+ "special": true
500
+ },
501
+ "100318": {
502
+ "content": "<|unused_49|>",
503
+ "lstrip": false,
504
+ "normalized": false,
505
+ "rstrip": false,
506
+ "single_word": false,
507
+ "special": true
508
+ },
509
+ "100319": {
510
+ "content": "<|unused_50|>",
511
+ "lstrip": false,
512
+ "normalized": false,
513
+ "rstrip": false,
514
+ "single_word": false,
515
+ "special": true
516
+ },
517
+ "100320": {
518
+ "content": "<|unused_51|>",
519
+ "lstrip": false,
520
+ "normalized": false,
521
+ "rstrip": false,
522
+ "single_word": false,
523
+ "special": true
524
+ },
525
+ "100321": {
526
+ "content": "<|unused_52|>",
527
+ "lstrip": false,
528
+ "normalized": false,
529
+ "rstrip": false,
530
+ "single_word": false,
531
+ "special": true
532
+ },
533
+ "100322": {
534
+ "content": "<|unused_53|>",
535
+ "lstrip": false,
536
+ "normalized": false,
537
+ "rstrip": false,
538
+ "single_word": false,
539
+ "special": true
540
+ },
541
+ "100323": {
542
+ "content": "<|unused_54|>",
543
+ "lstrip": false,
544
+ "normalized": false,
545
+ "rstrip": false,
546
+ "single_word": false,
547
+ "special": true
548
+ },
549
+ "100324": {
550
+ "content": "<|unused_55|>",
551
+ "lstrip": false,
552
+ "normalized": false,
553
+ "rstrip": false,
554
+ "single_word": false,
555
+ "special": true
556
+ },
557
+ "100325": {
558
+ "content": "<|unused_56|>",
559
+ "lstrip": false,
560
+ "normalized": false,
561
+ "rstrip": false,
562
+ "single_word": false,
563
+ "special": true
564
+ },
565
+ "100326": {
566
+ "content": "<|unused_57|>",
567
+ "lstrip": false,
568
+ "normalized": false,
569
+ "rstrip": false,
570
+ "single_word": false,
571
+ "special": true
572
+ },
573
+ "100327": {
574
+ "content": "<|unused_58|>",
575
+ "lstrip": false,
576
+ "normalized": false,
577
+ "rstrip": false,
578
+ "single_word": false,
579
+ "special": true
580
+ },
581
+ "100328": {
582
+ "content": "<|unused_59|>",
583
+ "lstrip": false,
584
+ "normalized": false,
585
+ "rstrip": false,
586
+ "single_word": false,
587
+ "special": true
588
+ },
589
+ "100329": {
590
+ "content": "<|unused_60|>",
591
+ "lstrip": false,
592
+ "normalized": false,
593
+ "rstrip": false,
594
+ "single_word": false,
595
+ "special": true
596
+ },
597
+ "100330": {
598
+ "content": "<|unused_61|>",
599
+ "lstrip": false,
600
+ "normalized": false,
601
+ "rstrip": false,
602
+ "single_word": false,
+ "special": true
+ },
+ "100331": {
+ "content": "<|unused_62|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100332": {
+ "content": "<|unused_63|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100333": {
+ "content": "<|unused_64|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100334": {
+ "content": "<|unused_65|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100335": {
+ "content": "<|unused_66|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100336": {
+ "content": "<|unused_67|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100337": {
+ "content": "<|unused_68|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100338": {
+ "content": "<|unused_69|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100339": {
+ "content": "<|unused_70|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100340": {
+ "content": "<|unused_71|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100341": {
+ "content": "<|unused_72|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100342": {
+ "content": "<|unused_73|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100343": {
+ "content": "<|unused_74|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100344": {
+ "content": "<|unused_75|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100345": {
+ "content": "<|unused_76|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100346": {
+ "content": "<|unused_77|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100347": {
+ "content": "<|unused_78|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100348": {
+ "content": "<|unused_79|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100349": {
+ "content": "<|unused_80|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100350": {
+ "content": "<|unused_81|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "100351": {
+ "content": "<|unused_82|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ }
+ },
+ "bos_token": "<|end_of_text|>",
+ "clean_up_tokenization_spaces": false,
+ "eos_token": "<|end_of_text|>",
+ "extra_special_tokens": {},
+ "model_max_length": 1000000000000000019884624838656,
+ "pad_token": "<|pad|>",
+ "padding_side": "left",
+ "tokenizer_class": "GPT2Tokenizer",
+ "unk_token": "<|unk|>"
+ }
vocab.json ADDED
The diff for this file is too large to render. See raw diff