joerowell committed 598f732 (verified · 1 parent: 06367f7)

Update vLLM install instructions to use nightly wheel


Laguna XS.2 support landed upstream in vllm-project/vllm#41129, so the install-from-source dance is no longer needed. Point users at the nightly wheel until the next vLLM release.

Files changed (1): README.md (+2 −4)
@@ -130,12 +130,10 @@ Laguna XS.2 is supported in vLLM and Transformers, and TRT-LLM thanks to the sup
 Serve Laguna XS.2 locally with vLLM and query it from any OpenAI-compatible client (see [Controlling reasoning](#controlling-reasoning) for tool calls, streaming, and reasoning extraction):
 
 > [!NOTE]
-> Laguna XS.2 support is on the open vLLM PR ([vllm-project/vllm#41129](https://github.com/vllm-project/vllm/pull/41129)); install from source until it lands in a release.
+> Laguna XS.2 support has been merged into vLLM ([vllm-project/vllm#41129](https://github.com/vllm-project/vllm/pull/41129)) and will ship in the next release. Until then, install a nightly wheel:
 
 ```shell
-git clone https://github.com/vllm-project/vllm.git && cd vllm
-git fetch origin pull/41129/head:laguna && git checkout laguna
-pip install -e .
+pip install vllm --pre --extra-index-url https://wheels.vllm.ai/nightly
 
 vllm serve poolside/Laguna-XS.2 \
   --max-model-len 131072 \
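
The README text in this hunk tells users to query the served model from any OpenAI-compatible client. A minimal sketch of such a request, assuming vLLM's default `localhost:8000` address and the standard OpenAI chat-completions request shape (neither comes from this commit):

```python
import json

# Assumed default address for a local `vllm serve` instance; vLLM exposes
# an OpenAI-compatible API under /v1. Adjust host/port to your setup.
BASE_URL = "http://localhost:8000/v1"


def build_chat_request(prompt: str, model: str = "poolside/Laguna-XS.2") -> dict:
    """Build an OpenAI-style chat-completions request body for the local server."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
    }


# Serialize the payload; POST it to f"{BASE_URL}/chat/completions" with any
# HTTP client (requests, httpx, curl, or the openai SDK pointed at BASE_URL).
payload = build_chat_request("Summarize the nightly-wheel install step.")
print(json.dumps(payload))
```

The same payload works unchanged with hosted OpenAI-compatible endpoints, which is the point of serving through vLLM.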