biondizzle/vllm
Path: vllm/model_executor/models/transformers
Commit: bf6a3d0ff5a69e0a30567f2ad417530c002eaa4e
Latest commit: d0e186c16f by Cyrus Leung
[V0 Deprecation] Remove unused context_len and seq_len from M-RoPE (#28395)
Signed-off-by: DarkLight1337 <tlleungac@connect.ust.hk>
2025-11-11 00:30:06 +08:00
__init__.py     Refactor Transformers backend to use mixins (#26906)                         2025-10-16 21:50:39 +00:00
base.py         [Bugfix] Fix encoder-only model support for transformers backend (#28021)    2025-11-04 22:24:41 -08:00
causal.py       Refactor Transformers backend to use mixins (#26906)                         2025-10-16 21:50:39 +00:00
legacy.py       Fix pooling adapters for Transformers backend (#27338)                       2025-10-23 20:23:55 -07:00
moe.py          [BugFix] Support EP/DP + EPLB with MTP (#25311)                              2025-11-05 15:22:17 +00:00
multimodal.py   [V0 Deprecation] Remove unused context_len and seq_len from M-RoPE (#28395)  2025-11-11 00:30:06 +08:00
pooling.py      Fix pooling adapters for Transformers backend (#27338)                       2025-10-23 20:23:55 -07:00
utils.py        Fix issues from #28242 (#28257)                                              2025-11-07 04:23:17 +00:00