Support bf16 inputs for GPTQ/Marlin format quantization (#90) #287

Annotations

6 warnings

This job succeeded