biondizzle / vllm
vllm / vllm / entrypoints (at commit 72d9c316d3f6ede485146fe5aabd4e61dbc59069)
Latest commit: 584f0ae40d, [V1] Make AsyncLLMEngine v1-v0 opaque (#11383)
Signed-off-by: Ricky Xu <xuchen727@hotmail.com>
2024-12-21 15:14:08 +08:00
File            Last commit                                                             Date
openai/         [V1] Make AsyncLLMEngine v1-v0 opaque (#11383)                          2024-12-21 15:14:08 +08:00
__init__.py     Change the name to vLLM (#150)                                          2023-06-17 03:07:40 -07:00
api_server.py   [Bugfix] Fix request cancellation without polling (#11190)              2024-12-17 12:26:32 -08:00
chat_utils.py   [Frontend] Add OpenAI API support for input_audio (#11027)              2024-12-16 22:09:58 -08:00
launcher.py     [Core][Bugfix][Perf] Introduce MQLLMEngine to avoid asyncio OH (#8157)  2024-09-18 13:56:58 +00:00
llm.py          [Feature] Add load generation config from model (#11164)                2024-12-19 10:50:38 +00:00
logger.py       [Frontend] API support for beam search (#9087)                          2024-10-05 23:39:03 -07:00
utils.py        [Bugfix] Fix request cancellation without polling (#11190)              2024-12-17 12:26:32 -08:00