Update README.md to add SGLang deployment option
README.md
CHANGED

@@ -221,6 +221,13 @@ Please refer to the [inference](inference/README.md) folder for detailed instruc
 
 For local deployment, we recommend setting the sampling parameters to `temperature = 1.0, top_p = 1.0`. For the Think Max reasoning mode, we recommend setting the context window to at least **384K** tokens.
 
+SGLang provides day-0 support of DeepSeek-V4. See the [SGLang Cookbook](https://docs.sglang.io/cookbook/autoregressive/DeepSeek/DeepSeek-V4) for up-to-date details.
+
 ## License
 
 This repository and the model weights are licensed under the [MIT License](LICENSE).
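The diff above recommends `temperature = 1.0, top_p = 1.0` for local deployment. As a minimal sketch of what a client request to an SGLang-served, OpenAI-compatible endpoint might look like with those settings, the helper below builds the request payload. The model id, server URL, and helper name are illustrative assumptions, not part of this commit; the 384K context window is a server-side launch setting, not a request field.

```python
import json

# Sketch: build an OpenAI-compatible chat-completions payload using the
# sampling parameters recommended in the README (temperature=1.0, top_p=1.0).
# The model id below is a placeholder, not taken from the commit.
# Note: the 384K context window is configured when launching the server,
# not per request.

def build_request(prompt: str) -> dict:
    """Return a chat-completions payload with the recommended sampling params."""
    return {
        "model": "deepseek-ai/DeepSeek-V4",  # placeholder model id
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 1.0,  # recommended for local deployment
        "top_p": 1.0,        # recommended for local deployment
    }

if __name__ == "__main__":
    # Serialize the payload as it would be POSTed to /v1/chat/completions.
    print(json.dumps(build_request("Hello"), indent=2))
```

Any OpenAI-compatible client can then send this payload to the locally running server; only the base URL changes between deployment backends.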