Update vLLM install instructions to use nightly wheel
Laguna XS.2 support landed upstream in vllm-project/vllm#41129, so the install-from-source dance is no longer needed. Point users at the nightly wheel until the next vLLM release.
README.md

````diff
@@ -130,12 +130,10 @@ Laguna XS.2 is supported in vLLM and Transformers, and TRT-LLM thanks to the sup
 Serve Laguna XS.2 locally with vLLM and query it from any OpenAI-compatible client (see [Controlling reasoning](#controlling-reasoning) for tool calls, streaming, and reasoning extraction):
 
 > [!NOTE]
-> Laguna XS.2 support
+> Laguna XS.2 support has been merged into vLLM ([vllm-project/vllm#41129](https://github.com/vllm-project/vllm/pull/41129)) and will ship in the next release. Until then, install a nightly wheel:
 
 ```shell
-
-git fetch origin pull/41129/head:laguna && git checkout laguna
-pip install -e .
+pip install vllm --pre --extra-index-url https://wheels.vllm.ai/nightly
 
 vllm serve poolside/Laguna-XS.2 \
   --max-model-len 131072 \
```
````