Histogram-equalized quantization for logic-gated residual neural networks - CEA Grenoble
Conference paper, 2022

Histogram-equalized quantization for logic-gated residual neural networks

Abstract

Adjusting the quantization according to the data or to the model loss appears mandatory to reach high accuracy with quantized neural networks. This work presents Histogram-Equalized Quantization (HEQ), an adaptive framework for linear and symmetric quantization. HEQ automatically adapts the quantization thresholds using a unique step size optimization. We empirically show that HEQ achieves state-of-the-art performance on CIFAR-10. Experiments on the STL-10 dataset further show that HEQ enables proper training of our proposed logic-gated (OR, MUX) residual networks, reaching higher accuracy at lower hardware complexity than previous work.
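To make the linear, symmetric quantization setting concrete, the sketch below selects the step size from a percentile of the magnitude histogram of the data. This is a minimal illustrative stand-in, not the authors' HEQ step-size optimization: the function name and the percentile heuristic are our own assumptions.

```python
def symmetric_linear_quantize(values, bits=4, percentile=0.99):
    """Linear, symmetric quantization with a histogram-driven step size.

    Illustrative sketch only: the clipping threshold is taken as a
    percentile of |values| (a simple histogram statistic), standing in
    for the step-size optimization described in the paper.
    """
    qmax = 2 ** (bits - 1) - 1  # e.g. 7 codes per side for 4-bit signed
    mags = sorted(abs(v) for v in values)
    # Clipping threshold from the data's magnitude distribution.
    t = mags[min(int(percentile * len(mags)), len(mags) - 1)]
    s = t / qmax if t > 0 else 1.0  # quantization step size
    # Round to the nearest level, clip symmetrically, map back to reals.
    codes = [max(-qmax, min(qmax, round(v / s))) for v in values]
    return [c * s for c in codes], s

# Example: 4-bit symmetric quantization of a small vector.
xq, step = symmetric_linear_quantize([-2.0, -1.0, 0.0, 1.0, 2.0], bits=4)
```

With 4 bits the output uses at most 15 distinct levels, and the reconstruction error of any non-clipped value is bounded by half the step size.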
Main file
ISCAS_Final_Submission.pdf (706 KB)
Origin: Files produced by the author(s)

Dates and versions

cea-04556166, version 1 (23-04-2024)

Identifiers

Cite

Van Thien Nguyen, William Guicquero, Gilles Sicard. Histogram-equalized quantization for logic-gated residual neural networks. ISCAS 2022 - 2022 IEEE International Symposium on Circuits and Systems, May 2022, Austin, TX, United States. pp.1289-1293, ⟨10.1109/ISCAS48785.2022.9937290⟩. ⟨cea-04556166⟩