FQ-ViT: Post-Training Quantization for Fully Quantized Vision Transformer

54 citations · 55 Mendeley readers

Abstract

Network quantization significantly reduces model inference complexity and has been widely used in real-world deployments. However, most existing quantization methods were developed mainly for Convolutional Neural Networks (CNNs) and suffer severe degradation when applied to fully quantized vision transformers. In this work, we demonstrate that many of these difficulties arise from serious inter-channel variation in LayerNorm inputs, and we present Power-of-Two Factor (PTF), a systematic method to reduce the performance degradation and inference complexity of fully quantized vision transformers. In addition, observing an extremely non-uniform distribution in attention maps, we propose Log-Int-Softmax (LIS) to preserve that distribution and simplify inference using 4-bit quantization and the BitShift operator. Comprehensive experiments on various transformer-based architectures and benchmarks show that our Fully Quantized Vision Transformer (FQ-ViT) outperforms previous works while using even lower bitwidth on attention maps. For instance, we reach 84.89% top-1 accuracy with ViT-L on ImageNet and 50.8 mAP with Cascade Mask R-CNN (Swin-S) on COCO. To our knowledge, we are the first to achieve near-lossless accuracy (∼1% degradation) on fully quantized vision transformers. The code is available at https://github.com/megvii-research/FQ-ViT.
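To make the LIS idea above concrete, the following is a minimal sketch (not the authors' implementation) of log2-based 4-bit quantization of attention probabilities: each probability in (0, 1] is stored as an integer exponent q with p ≈ 2^(-q), so that downstream multiplication by an integer value reduces to a right bit shift. The function names and the clipping constant are illustrative assumptions.

```python
import numpy as np

def log2_quantize(attn, bits=4):
    """Map attention probabilities in (0, 1] to integer exponents.

    q = clip(round(-log2(p)), 0, 2**bits - 1), so p is approximated
    by 2**(-q). Exact powers of two round-trip without error.
    The 1e-8 floor is an illustrative guard against log2(0).
    """
    q = np.round(-np.log2(np.clip(attn, 1e-8, 1.0)))
    return np.clip(q, 0, 2**bits - 1).astype(np.int32)

def log2_dequantize(q):
    """Recover the approximate probability 2**(-q)."""
    return 2.0 ** (-q.astype(np.float64))

probs = np.array([0.5, 0.25, 0.125, 0.0625])
q = log2_quantize(probs)          # array([1, 2, 3, 4])

# Multiplying an integer value v by 2**(-q) is just a bit shift:
v = 1024
shifted = v >> int(q[0])          # equivalent to v * 0.5 -> 512
```

This is why a BitShift operator can replace floating-point multiplication in the attention-times-value product: powers of two commute with integer shifts, at the cost of coarser resolution for probabilities far from a power of two.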

Citation (APA)

Lin, Y., Zhang, T., Sun, P., Li, Z., & Zhou, S. (2022). FQ-ViT: Post-Training Quantization for Fully Quantized Vision Transformer. In IJCAI International Joint Conference on Artificial Intelligence (pp. 1173–1179). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2022/164
