biondizzle/vllm
Commit Graph: 3 commits at 00efdc84baf313cb775ca99a011b0e9a13539bdd

Author:  wbn
SHA1:    dacaf5a400
Date:    2023-12-10 10:12:53 -08:00
Message: Replace head_mapping params with num_kv_heads to attention kernel. (#1997)
         Co-authored-by: wangguoya <wangguoya@baidu.com>
         Co-authored-by: Yang Zhao <zhaoyangstar@foxmail.com>
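
The change referenced in #1997 concerns how a query head finds its key/value head under grouped-query or multi-query attention. As a rough illustration (not the actual vLLM kernel interface; the names and example sizes below are assumptions), passing num_kv_heads lets the kernel derive the mapping that an explicit head_mapping tensor previously had to spell out:

    # Illustrative sketch only: derive the query-head -> KV-head mapping from
    # num_kv_heads instead of materializing a head_mapping tensor.
    import numpy as np

    num_heads = 32        # query heads (example value)
    num_kv_heads = 8      # shared KV heads (GQA); 1 would be MQA

    # Old style: an explicit per-head mapping passed to the kernel.
    head_mapping = np.repeat(np.arange(num_kv_heads), num_heads // num_kv_heads)

    # New style: compute the same mapping from num_kv_heads alone.
    queries_per_kv = num_heads // num_kv_heads
    derived = np.array([h // queries_per_kv for h in range(num_heads)])

    assert (head_mapping == derived).all()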

Author:  Yanming W
SHA1:    e0c6f556e8
Date:    2023-11-23 16:31:19 -08:00
Message: [Build] Avoid building too many extensions (#1624)
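
For context on #1624, consolidating several per-kernel CUDA extensions into one module is a common way to cut compile time and import overhead. The sketch below only illustrates that general pattern; the extension name and source paths are made up and are not taken from vLLM's setup.py:

    # Minimal sketch of building one combined CUDA extension instead of many.
    from setuptools import setup
    from torch.utils.cpp_extension import BuildExtension, CUDAExtension

    ext_modules = [
        CUDAExtension(
            name="example._C",               # single combined extension (hypothetical name)
            sources=[
                "csrc/attention.cu",         # hypothetical source paths
                "csrc/cache_kernels.cu",
                "csrc/pybind.cpp",
            ],
        ),
    ]

    setup(
        name="example-kernels-sketch",
        ext_modules=ext_modules,
        cmdclass={"build_ext": BuildExtension},
    )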

Author:  Woosuk Kwon
SHA1:    928de46888
Date:    2023-10-16 00:59:57 -07:00
Message: Implement PagedAttention V2 (#1348)
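
As a rough, NumPy-level illustration of the idea behind #1348 (not the CUDA kernel itself), PagedAttention reads a sequence's KV cache through a block table that maps logical blocks to fixed-size physical blocks, and a V2-style kernel additionally splits the sequence into partitions whose partial softmax results are merged at the end. All shapes, block sizes, and function names below are assumptions made for this sketch:

    # Reference sketch of block-table KV gathering plus partitioned softmax.
    import numpy as np

    def paged_attention_single_head(q, kv_blocks, block_table, seq_len,
                                    block_size=16, partition_size=32):
        """q: (head_dim,); kv_blocks: (num_blocks, 2, block_size, head_dim)."""
        head_dim = q.shape[0]
        scale = 1.0 / np.sqrt(head_dim)

        # Gather the logical K/V for this sequence through the block table.
        keys, values = [], []
        for logical_block in range((seq_len + block_size - 1) // block_size):
            physical = block_table[logical_block]
            keys.append(kv_blocks[physical, 0])
            values.append(kv_blocks[physical, 1])
        k = np.concatenate(keys)[:seq_len]       # (seq_len, head_dim)
        v = np.concatenate(values)[:seq_len]

        # V2-style partitioned reduction: each partition produces a partial
        # softmax (running max, exp-sum, weighted value accumulator).
        partials = []
        for start in range(0, seq_len, partition_size):
            end = min(start + partition_size, seq_len)
            logits = (k[start:end] @ q) * scale
            m = logits.max()
            exp = np.exp(logits - m)
            partials.append((m, exp.sum(), exp @ v[start:end]))

        # Merge partial results into the final attention output.
        global_max = max(m for m, _, _ in partials)
        denom = sum(s * np.exp(m - global_max) for m, s, _ in partials)
        numer = sum(acc * np.exp(m - global_max) for m, _, acc in partials)
        return numer / denom

    # Tiny usage example with random data.
    rng = np.random.default_rng(0)
    kv_blocks = rng.standard_normal((8, 2, 16, 64))
    out = paged_attention_single_head(rng.standard_normal(64), kv_blocks,
                                      block_table=[3, 0, 5], seq_len=40)

Merging the per-partition (max, sum, accumulator) triples is just log-sum-exp bookkeeping, which is what lets the partitions be computed independently and in parallel.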