To perform low-bit quantization for each image individually, we develop a dynamic quantization scheme that explores the optimal bit-width for every input sample. Experimental results show that our method can be easily embedded into mainstream quantization frameworks and boosts their performance.
Paper: Zhenhua Liu, Yunhe Wang, Kai Han, Siwei Ma and Wen Gao. "Instance-Aware Dynamic Neural Network Quantization". CVPR 2022.
A bit-controller generates the bit-width of each layer for different samples and is jointly optimized with the main network. You can find the details in the paper.
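As a rough, framework-agnostic illustration of the idea (not the repository's actual code in src/gumbelsoftmax.py or src/quant.py), the sketch below shows how a per-sample bit-width could be chosen from a small candidate set with the Gumbel-Softmax trick and then used for uniform quantization. The candidate set, the toy controller weights, and all helper names here are assumptions made for illustration only.

```python
import numpy as np

def gumbel_softmax(logits, tau=1.0, rng=np.random):
    """Soft (differentiable) sample from a categorical distribution over bit-widths."""
    gumbel = -np.log(-np.log(rng.uniform(1e-9, 1.0, size=logits.shape)))
    y = (logits + gumbel) / tau
    y = np.exp(y - y.max(axis=-1, keepdims=True))
    return y / y.sum(axis=-1, keepdims=True)

def uniform_quantize(x, bits):
    """Symmetric uniform quantization of a tensor to the given bit-width."""
    qmax = 2 ** (bits - 1) - 1
    max_abs = np.abs(x).max()
    scale = max_abs / qmax if max_abs > 0 else 1.0
    return np.clip(np.round(x / scale), -qmax - 1, qmax) * scale

# Toy "bit-controller": maps per-image features to logits over candidate bit-widths.
candidate_bits = [2, 4, 8]                                 # assumed candidate set
controller_w = np.random.randn(16, len(candidate_bits)) * 0.1

def select_bits(image_feature):
    logits = image_feature @ controller_w                  # per-sample logits
    probs = gumbel_softmax(logits, tau=1.0)                # soft selection (training-time relaxation)
    return candidate_bits[int(np.argmax(probs))]           # hard choice at inference time

feature = np.random.randn(16)                              # stand-in for pooled image features
weights = np.random.randn(64, 64)                          # stand-in for one layer's weights
bits = select_bits(feature)
print(f"selected bit-width: {bits}")
print("quantized weights sample:", uniform_quantize(weights, bits)[0, :4])
```

In the actual method the soft Gumbel-Softmax probabilities let the bit-controller be trained end to end with the main network, while the hard argmax choice is used at inference time.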
Dataset used: ImageNet2012
DynamicQuant
├── src
│   ├── dataset.py          # dataset loader
│   ├── gumbelsoftmax.py    # implementation of Gumbel-Softmax
│   ├── quant.py            # dynamic quantization
│   └── resnet.py           # ResNet network
├── eval.py                 # inference entry
└── readme.md               # README
After installing MindSpore via the official website, you can start evaluation as follows:
python eval.py --dataset_path [DATASET]
result: {'acc': 0.6901} ckpt= ./resnet18_dq.ckpt
Checkpoint can be downloaded at https://download.mindspore.cn/model_zoo/research/cv/DynamicQuant/.
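For reference, the snippet below is a minimal sketch of what the evaluation entry roughly does: load the downloaded checkpoint into the dynamic-quantization ResNet-18 and measure top-1 accuracy on the ImageNet validation set. The `create_dataset` and `resnet18` helper names and their signatures are assumptions; see `eval.py` in the repository for the actual interface.

```python
import mindspore as ms
from mindspore import nn
from mindspore.train import Model

from src.dataset import create_dataset   # assumed helper in src/dataset.py
from src.resnet import resnet18          # assumed constructor in src/resnet.py

net = resnet18(num_classes=1000)                        # dynamic-quantization ResNet-18
param_dict = ms.load_checkpoint("./resnet18_dq.ckpt")   # downloaded checkpoint
ms.load_param_into_net(net, param_dict)

# Assumed signature: path to the ImageNet validation split, evaluation mode, batch size.
dataset = create_dataset("/path/to/imagenet/val", do_train=False, batch_size=32)
loss = nn.SoftmaxCrossEntropyWithLogits(sparse=True, reduction="mean")
model = Model(net, loss_fn=loss, metrics={"acc"})

print("result:", model.eval(dataset))   # expected to report an accuracy around 0.69
```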
For more information, please check the official MindSpore Model Zoo homepage.