[Quantization] Bump to use latest bitsandbytes (#20424)

Signed-off-by: Jee Jee Li <pandaleefree@gmail.com>
Author: Jee Jee Li
Date: 2025-07-03 21:58:46 +08:00
Committed by: GitHub
Parent: 7f0367109e
Commit: 1819fbda63
8 changed files with 14 additions and 14 deletions

@@ -10,7 +10,7 @@ Compared to other quantization methods, BitsAndBytes eliminates the need for cal
Below are the steps to utilize BitsAndBytes with vLLM.
```bash
-pip install bitsandbytes>=0.45.3
+pip install bitsandbytes>=0.46.1
```
vLLM reads the model's config file and supports both in-flight quantization and pre-quantized checkpoints.
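Since the diff above raises the minimum bitsandbytes version to 0.46.1, a quick runtime check can catch an outdated install before vLLM fails later. The following is a minimal sketch using only the standard library; the helper names (`meets_minimum`, `check_bitsandbytes`) are hypothetical, not part of vLLM or bitsandbytes:

```python
# Hypothetical sketch: verify an installed package meets a minimum version.
# Only the package name "bitsandbytes" and the floor "0.46.1" come from the
# diff above; everything else is illustrative.
from importlib.metadata import version, PackageNotFoundError


def meets_minimum(installed: str, minimum: str) -> bool:
    """Compare dotted version strings numerically, e.g. '0.46.1' vs '0.45.3'."""
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(installed) >= as_tuple(minimum)


def check_bitsandbytes(minimum: str = "0.46.1") -> bool:
    """Return True if bitsandbytes is installed and at least `minimum`."""
    try:
        return meets_minimum(version("bitsandbytes"), minimum)
    except PackageNotFoundError:
        return False
```

Note that this naive tuple comparison handles plain `X.Y.Z` versions only; for pre-release or post-release tags, `packaging.version.Version` would be the robust choice.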