biondizzle/vllm
Path: vllm/.buildkite
Commit: 39cefbdf17e2e906e0eae3e82bd601f66137deb4

Latest commit: 2f32a68d75 — [CI] Update several models in registry that are available online now (#30514), by Michael Goin, 2025-12-12 18:28:13 -08:00
Signed-off-by: mgoin <mgoin64@gmail.com>
Signed-off-by: Michael Goin <mgoin64@gmail.com>
Co-authored-by: Isotr0py <2037008807@qq.com>
| Name | Last commit | Date |
| --- | --- | --- |
| image_build | [ci] Refactor CI file structure (#29343) | 2025-12-08 17:25:43 -09:00 |
| lm-eval-harness | [CI/Build][AMD] Add Llama4 Maverick FP8 to AMD CI (#28695) | 2025-12-04 16:07:20 -08:00 |
| performance-benchmarks | [vLLM Benchmark Suite] Add default parameters section and update CPU benchmark cases (#29381) | 2025-12-02 09:00:23 +00:00 |
| scripts | [CI] refine more logic when generating and using nightly wheels & indices, add cuda130 build for aarch64, specify correct manylinux version (#30341) | 2025-12-12 00:42:30 +08:00 |
| test_areas | [ci] Refactor CI file structure (#29343) | 2025-12-08 17:25:43 -09:00 |
| check-wheel-size.py | [CI] Raise VLLM_MAX_SIZE_MB to 500 due to failing Build wheel - CUDA 12.9 (#26722) | 2025-10-14 10:52:05 -07:00 |
| ci_config.yaml | [ci] Refactor CI file structure (#29343) | 2025-12-08 17:25:43 -09:00 |
| release-pipeline.yaml | [CI/Build] Add x86 CPU wheel release pipeline (#28848) | 2025-12-12 19:21:35 +00:00 |
| test-amd.yaml | [ROCm][CI] Use mi325_4 agent pool for V1 e2e tests (#30526) | 2025-12-12 01:37:24 +00:00 |
| test-pipeline.yaml | [CI] Update several models in registry that are available online now (#30514) | 2025-12-12 18:28:13 -08:00 |