biondizzle/vllm
tests/models/multimodal (at commit 60d17251c920ae3c9d02e4b4101b738e4905aee4)
Latest commit b952f4d3c3 by Isotr0py: [v1] Add PrefixLM support to FlexAttention backend (#27938)
Signed-off-by: Isotr0py <mozf@mail2.sysu.edu.cn>
2025-12-07 15:51:36 +00:00
generation/       [v1] Add PrefixLM support to FlexAttention backend (#27938)                          2025-12-07 15:51:36 +00:00
pooling/          Support tokenization_kwargs override (#29794)                                        2025-12-06 09:12:53 +00:00
processing/       Revert "[Renderer] Separate out RendererConfig from ModelConfig (#30145)" (#30199)   2025-12-07 00:00:22 -08:00
__init__.py       [CI/Build] Move model-specific multi-modal processing tests (#11934)                 2025-01-11 13:50:05 +08:00
test_mapping.py   Revert "[Renderer] Separate out RendererConfig from ModelConfig (#30145)" (#30199)   2025-12-07 00:00:22 -08:00