Parameterizing Asymmetric Quantization Ranges for Stable and Efficient Quantization-Aware Training
Asymmetric quantization ranges can be parameterized in several ways, including scale/offset, min/max, and beta/gamma, and these parameterizations exhibit different dynamics during quantization-aware training. Selecting and tuning the parameterization carefully can significantly affect the stability and efficiency of training.
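As a concrete illustration, the sketch below (a minimal NumPy example with hypothetical helper names; an 8-bit unsigned integer grid is assumed) shows how the scale/offset and min/max parameterizations describe the same asymmetric quantizer. During quantization-aware training, gradients flow to whichever pair is treated as the learnable parameters, which is one source of the differing training dynamics.

```python
import numpy as np

def fake_quant_scale_offset(x, scale, offset, n_bits=8):
    """Fake-quantize x under a scale/offset parameterization:
    q = clip(round(x / scale) + offset, 0, 2^b - 1), then dequantize."""
    qmax = 2 ** n_bits - 1
    q = np.clip(np.round(x / scale) + offset, 0, qmax)
    return (q - offset) * scale

def fake_quant_min_max(x, x_min, x_max, n_bits=8):
    """Same quantizer under a min/max parameterization: scale and offset
    are derived from the range endpoints, so during QAT the gradients
    would reach (x_min, x_max) instead of (scale, offset) directly."""
    qmax = 2 ** n_bits - 1
    scale = (x_max - x_min) / qmax
    offset = np.round(-x_min / scale)
    return fake_quant_scale_offset(x, scale, offset, n_bits)

# Both calls realize the same quantization grid over [-1, 2].
x = np.linspace(-1.0, 2.0, 7)
print(fake_quant_min_max(x, x_min=-1.0, x_max=2.0))
```

Note that beta/gamma is deliberately omitted from the sketch, since its exact definition varies; the point is only that algebraically equivalent parameterizations place the learnable degrees of freedom in different places.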