Update link to LlamaStack remote vLLM guide in serving_with_llamastack.rst (#11112)

Signed-off-by: Yuan Tang <[email protected]>
terrytangyuan authored Dec 12, 2024
1 parent 8fb26da commit 24a36d6
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion docs/source/serving/serving_with_llamastack.rst
@@ -24,7 +24,7 @@ Then start Llama Stack server pointing to your vLLM server with the following co
   config:
     url: http://127.0.0.1:8000
-Please refer to `this guide <https://github.com/meta-llama/llama-stack/blob/main/docs/source/getting_started/distributions/self_hosted_distro/remote_vllm.md>`_ for more details on this remote vLLM provider.
+Please refer to `this guide <https://llama-stack.readthedocs.io/en/latest/distributions/self_hosted_distro/remote-vllm.html>`_ for more details on this remote vLLM provider.

Inference via Embedded vLLM
---------------------------
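For context, the `config:` / `url:` fragment visible in the hunk is an excerpt of a larger Llama Stack run configuration that registers vLLM as a remote inference provider. A minimal sketch of what the surrounding section might look like, assuming the `remote::vllm` provider type described in the linked guide and a vLLM server listening on port 8000 (key names and structure are an assumption and may differ between Llama Stack versions):

```yaml
# Hypothetical excerpt of a Llama Stack run config (e.g. run.yaml).
# Key names follow the remote vLLM provider guide linked in the diff;
# verify against the Llama Stack version you are running.
providers:
  inference:
    - provider_id: vllm          # arbitrary local identifier for this provider
      provider_type: remote::vllm
      config:
        url: http://127.0.0.1:8000   # base URL of the already-running vLLM server
```

The point of the changed line is only where the guide for this provider lives (the doc moved from the GitHub repo to Read the Docs); the configuration itself is unchanged by the commit.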

0 comments on commit 24a36d6