biondizzle/vllm
Files at commit a2268617cfe91c4eebed1944327d8869ad628b8b
Path: vllm/entrypoints/openai/engine
Latest commit: bc2c0c86ef by Csrayz
[Frontend] Fix usage incorrectly returned with empty `stream_options` (#36379)
Signed-off-by: Csrayz <33659823+Csrayz@users.noreply.github.com>
2026-03-13 03:33:04 +00:00
__init__.py    [Refactor] [6/N] to simplify the vLLM openai chat_completion serving architecture (#32240)    2026-01-13 13:01:39 +00:00
protocol.py    [Frontend] Fix usage incorrectly returned with empty `stream_options` (#36379)                 2026-03-13 03:33:04 +00:00
serving.py     [Bugfix] Fix crash when tool_choice=required exceeds max_tokens (#36841)                     2026-03-12 03:28:45 -07:00