biondizzle/vllm
12,987 Commits | 2 Branches | 140 Tags

Commit Graph: 1 commit at d084e9fca7d5d40cbb62eb5fe8ab5cbc6c769cf0
Author: xuebwang-amd
SHA1: 5a1271d83a
Message: [Quantization] fix attention quantization of gpt_oss model (#27334)
Signed-off-by: xuebwang-amd <xuebwang@amd.com>
Date: 2025-11-11 12:06:00 -05:00