[Misc] Update awq and awq_marlin to use vLLMParameters (#7422)

This commit is contained in:
Dipika Sikka
2024-08-13 17:08:20 -04:00
committed by GitHub
parent d3bdfd3ab9
commit b1e5afc3e7
4 changed files with 73 additions and 82 deletions

@@ -12,4 +12,6 @@ compressed-tensors, nm-testing/tinyllama-oneshot-w4a16-group128-v2, main
compressed-tensors, nm-testing/tinyllama-oneshot-w8a16-per-channel, main
compressed-tensors, nm-testing/Meta-Llama-3-8B-FP8-compressed-tensors-test, main
compressed-tensors, nm-testing/Phi-3-mini-128k-instruct-FP8, main
compressed-tensors, neuralmagic/Phi-3-medium-128k-instruct-quantized.w4a16, main
awq, casperhansen/mixtral-instruct-awq, main
awq_marlin, casperhansen/mixtral-instruct-awq, main
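
Each row in the test list above appears to follow a simple `quantization_method, model_stub, revision` CSV convention. The sketch below is a hypothetical illustration of parsing such rows, not the actual vLLM test-harness code; the `ModelCase` name and `parse_model_cases` helper are assumptions for the example.

```python
# Hypothetical parser for rows like:
#   awq, casperhansen/mixtral-instruct-awq, main
# Illustrative only -- not the real vLLM test harness.
from typing import List, NamedTuple


class ModelCase(NamedTuple):
    quant_method: str  # e.g. "awq", "awq_marlin", "compressed-tensors"
    model_stub: str    # Hugging Face model identifier
    revision: str      # git revision/branch, usually "main"


def parse_model_cases(text: str) -> List[ModelCase]:
    cases = []
    for line in text.strip().splitlines():
        fields = [f.strip() for f in line.split(",")]
        if len(fields) == 3:  # skip malformed/blank rows
            cases.append(ModelCase(*fields))
    return cases


rows = """\
awq, casperhansen/mixtral-instruct-awq, main
awq_marlin, casperhansen/mixtral-instruct-awq, main
"""
cases = parse_model_cases(rows)
```

The two new rows exercise the same checkpoint under both the `awq` and `awq_marlin` quantization paths, which is what this commit's parameter refactor touches.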