[Docs] add Dynamo/aibrix integration and kubeai/aks link (#32767)
Signed-off-by: Paco Xu <paco.xu@daocloud.io>
docs/deployment/integrations/aibrix.md (new file)
@@ -0,0 +1,5 @@
+# AIBrix
+
+[AIBrix](https://github.com/vllm-project/aibrix) is a cloud-native control plane that integrates with vLLM to simplify Kubernetes deployment, scaling, routing, and LoRA adapter management for large language model inference.
+
+For installation and usage instructions, please refer to the [AIBrix documentation](https://aibrix.readthedocs.io/).
docs/deployment/integrations/dynamo.md (new file)
@@ -0,0 +1,7 @@
+# NVIDIA Dynamo
+
+[NVIDIA Dynamo](https://github.com/ai-dynamo/dynamo) is an open-source framework for distributed LLM inference that can run vLLM on Kubernetes with flexible serving architectures (e.g. aggregated/disaggregated, optional router/planner).
+
+For Kubernetes deployment instructions and examples (including vLLM), see the [Deploying Dynamo on Kubernetes](https://github.com/ai-dynamo/dynamo/blob/main/docs/kubernetes/README.md) guide.
+
+Background reading: InfoQ news coverage, [NVIDIA Dynamo simplifies Kubernetes deployment for LLM inference](https://www.infoq.com/news/2025/12/nvidia-dynamo-kubernetes/).
@@ -5,6 +5,7 @@
 Please see the Installation Guides for environment specific instructions:
 
 - [Any Kubernetes Cluster](https://www.kubeai.org/installation/any/)
+- [AKS](https://www.kubeai.org/installation/aks/)
 - [EKS](https://www.kubeai.org/installation/eks/)
 - [GKE](https://www.kubeai.org/installation/gke/)
 