biondizzle/vllm
vllm/entrypoints/cli at commit 15ae8e0784d3889c6aa2c487ca00df4e3fde6f44
Latest commit: e0d6b4a867 — [CLI] add --max-tokens to vllm complete (#28109)
Signed-off-by: Iceber Gu <caiwei95@hotmail.com>
2025-11-07 12:21:40 +00:00
Name            Last commit                                                               Last updated
benchmark/      [Frontend] Add vllm bench sweep to CLI (#27639)                           2025-10-29 05:59:48 -07:00
__init__.py     [Frontend] Add vllm bench sweep to CLI (#27639)                           2025-10-29 05:59:48 -07:00
collect_env.py  [Chore]:Extract math and argparse utilities to separate modules (#27188)  2025-10-26 04:03:32 -07:00
main.py         [Chore]:Extract math and argparse utilities to separate modules (#27188)  2025-10-26 04:03:32 -07:00
openai.py       [CLI] add --max-tokens to vllm complete (#28109)                          2025-11-07 12:21:40 +00:00
run_batch.py    [Chore]:Extract math and argparse utilities to separate modules (#27188)  2025-10-26 04:03:32 -07:00
serve.py        [V0 deprecation] Remove VLLM_USE_V1 usage in most modules (#27955)        2025-11-04 20:51:16 -08:00
types.py        [Chore]:Extract math and argparse utilities to separate modules (#27188)  2025-10-26 04:03:32 -07:00
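The commit messages above reference subcommands such as `vllm complete` (which gained a `--max-tokens` flag in #28109) and `vllm bench sweep`. As an illustration only — not vLLM's actual implementation, whose real wiring lives in the files listed here — a CLI entrypoints package like this one is typically built on argparse subparsers, with each subcommand registering its own flags and handler:

```python
import argparse


def cmd_complete(args: argparse.Namespace) -> str:
    # Hypothetical handler: a real implementation would run text
    # completion, capping generation at --max-tokens tokens.
    return f"complete(prompt={args.prompt!r}, max_tokens={args.max_tokens})"


def build_parser() -> argparse.ArgumentParser:
    # Top-level "vllm" parser with one subparser per subcommand.
    parser = argparse.ArgumentParser(prog="vllm")
    sub = parser.add_subparsers(dest="command", required=True)

    # "complete" subcommand; --max-tokens is modeled on the flag
    # added in #28109 (default value here is an assumption).
    complete = sub.add_parser("complete", help="Generate a completion")
    complete.add_argument("prompt")
    complete.add_argument("--max-tokens", type=int, default=16)
    complete.set_defaults(func=cmd_complete)
    return parser


args = build_parser().parse_args(["complete", "hi", "--max-tokens", "8"])
print(args.func(args))  # → complete(prompt='hi', max_tokens=8)
```

Each file in the listing would then contribute one such subcommand (e.g. `openai.py` for the serving/completion commands, `benchmark/` for `vllm bench …`), all dispatched from `main.py`.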