vllm / vllm / model_executor / layers / quantization / kernels
(at commit 56fe4c297c7d9d872eccc19e3edbf1d75e1a30e2)
Latest commit: 56fe4c297c "[TPU][Quantization] TPU W8A8 (#11785)" by Robert Shaw
Co-authored-by: Woosuk Kwon <woosuk.kwon@berkeley.edu>
2025-01-08 19:33:29 +00:00
mixed_precision/    [TPU][Quantization] TPU W8A8 (#11785)    2025-01-08 19:33:29 +00:00
scaled_mm/          [TPU][Quantization] TPU W8A8 (#11785)    2025-01-08 19:33:29 +00:00
__init__.py         [TPU][Quantization] TPU W8A8 (#11785)    2025-01-08 19:33:29 +00:00