Support GPTQ/Marlin format quantization (4bit weight, f16 input) #284
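For context on what the requested feature involves, here is a minimal NumPy sketch of group-wise 4-bit weight dequantization to f16 followed by an f16 matmul. The packing layout, the group size, and the names `dequantize_4bit` and `GROUP_SIZE` are illustrative assumptions, not this repository's API; real GPTQ checkpoints and Marlin kernels use their own packed formats and fused CUDA dequantization, which this sketch does not reproduce.

```python
# Minimal sketch (not the repository's implementation): dequantize a
# 4-bit, group-quantized weight matrix to f16 and use it in an f16 matmul.
# Packing layout, group size, and function names are illustrative assumptions.
import numpy as np

GROUP_SIZE = 128  # assumed group size; GPTQ commonly uses 128


def dequantize_4bit(packed: np.ndarray, scales: np.ndarray,
                    zeros: np.ndarray) -> np.ndarray:
    """Unpack two 4-bit values per byte and apply per-group scale/zero.

    packed: uint8 array of shape (K // 2, N), two 4-bit weights per byte
    scales: f16 array of shape (K // GROUP_SIZE, N)
    zeros:  f16 array of shape (K // GROUP_SIZE, N)
    returns: f16 weight matrix of shape (K, N)
    """
    lo = (packed & 0x0F).astype(np.float16)
    hi = (packed >> 4).astype(np.float16)
    # Interleave low/high nibbles back into K rows.
    k_half, n = packed.shape
    q = np.empty((k_half * 2, n), dtype=np.float16)
    q[0::2] = lo
    q[1::2] = hi
    # Broadcast each group's scale and zero point over its GROUP_SIZE rows.
    s = np.repeat(scales, GROUP_SIZE, axis=0)
    z = np.repeat(zeros, GROUP_SIZE, axis=0)
    return ((q - z) * s).astype(np.float16)


if __name__ == "__main__":
    K, N, M = 256, 64, 4
    rng = np.random.default_rng(0)
    packed = rng.integers(0, 256, size=(K // 2, N), dtype=np.uint8)
    scales = np.full((K // GROUP_SIZE, N), 0.01, dtype=np.float16)
    zeros = np.full((K // GROUP_SIZE, N), 8.0, dtype=np.float16)
    x = rng.standard_normal((M, K)).astype(np.float16)  # f16 activations
    w = dequantize_4bit(packed, scales, zeros)          # f16 weights
    y = x @ w                                           # f16 matmul
    print(y.shape)  # (4, 64)
```

In a real kernel the dequantization is fused into the GEMM so the 4-bit weights are expanded to f16 in registers rather than materialized as a full f16 matrix; the sketch above only illustrates the arithmetic of the (weight - zero) * scale mapping.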