biondizzle / vllm
vllm / examples / pooling @ ea37530b474fa738a99a53a8975af4e389b968c7
Latest commit: c61a98f529 by junuxyz: [CI][BugFix] ShellCheck cleanup to remove baseline and preserve runtime behavior (#34514) (Signed-off-by: junuxyz <216036880+junuxyz@users.noreply.github.com>), 2026-02-17 12:22:56 +00:00
classify        [Doc] Update usage of --limit-mm-per-prompt (#34148)                                        2026-02-09 21:12:13 -08:00
embed           [CI][BugFix] ShellCheck cleanup to remove baseline and preserve runtime behavior (#34514)   2026-02-17 12:22:56 +00:00
plugin          (bugfix): Fixed encode in LLM entrypoint for IOProcessr plugin prompts (#34618)             2026-02-16 07:33:55 -08:00
pooling         [Frontend][last/5] Make pooling entrypoints request schema consensus. (#31127)              2026-02-09 06:42:38 +00:00
score           [new model] add COLQwen3 code & Inference (#34398)                                          2026-02-14 12:15:19 +08:00
token_classify  [Frontend][2/n] Make pooling entrypoints request schema consensus | ChatRequest (#32574)    2026-01-22 10:32:44 +00:00
token_embed     [new model] add COLQwen3 code & Inference (#34398)                                          2026-02-14 12:15:19 +08:00