mszymanska committed
Commit cd3ef8c · verified
1 Parent(s): 9c3d56e

docs: add Tested inference path section, reorganize model card

Files changed (1):
  1. README.md +15 -10
README.md CHANGED
@@ -61,6 +61,20 @@ inference: false
 
 This card only reports metadata present in the Hugging Face repository, existing card frontmatter, or public config files. Missing benchmark, dataset, or training-run details are left explicit rather than reconstructed.
 
+## Tested inference path
+
+> **Inference for this checkpoint has been tested with [`LibraxisAI/mlx-batch-server`](https://github.com/LibraxisAI/mlx-batch-server).**\
+> This is the recommended tested path for operator-controlled local inference on Apple Silicon.
+
+| Aspect | Status |
+|---|---|
+| Tested runtime | `LibraxisAI/mlx-batch-server` |
+| Target hardware | Apple Silicon |
+| Inference mode | Local / self-hosted |
+| Hugging Face Hosted Inference | Disabled for this repository (`inference: false`) |
+
+This does not claim compatibility with every possible serving stack. It documents the path that has been exercised for this published checkpoint.
+
 ## Usage
 
 ### CLI
@@ -93,7 +107,7 @@ print(response)
 
 ## Example output
 
-No public sample output is currently declared for this checkpoint. Run the usage example above against your own prompt or audio/image input to inspect behavior.
+No public sample output is currently declared for this checkpoint.
 
 ## Quantization notes
 
@@ -126,15 +140,6 @@ No public sample output is currently declared for this checkpoint. Run the usage
 note = {MLX checkpoint published by LibraxisAI}
 }
 ```
-
-## Inference tested on
-
-[`LibraxisAI/mlx-batch-server`](https://github.com/LibraxisAI/mlx-batch-server)
-
-## Related
-
-- Base model: [`huihui-ai/Huihui4-48B-A4B-abliterated`](https://huggingface.co/huihui-ai/Huihui4-48B-A4B-abliterated)
-
 ---
 
 𝚅𝚒𝚋𝚎𝚌𝚛𝚊𝚏𝚝𝚎𝚍. with AI Agents by VetCoders (c)2024-2026 LibraxisAI