
An FPGA Implementation of a Mobile Convolutional Neural Network (cited by: 6)

Hardware implementation of a convolutional neural network for mobile terminal based on FPGA
Abstract: Convolutional neural networks are an important model in deep learning and are widely used in image processing and other fields. Commonly used network models are structurally complex and have many parameters, making them unsuitable for running on mobile devices. Based on the ideas of modularization and hardware reuse, this paper presents an FPGA-based hardware implementation of a handwritten-digit recognition network. The network structure is improved following the MobileNet principle, achieving hardware acceleration of the algorithm while effectively reducing the network's parameter count and overall computation. Experimental results on the MNIST dataset show that, compared with a traditional network structure, the improved structure reduces the number of parameters by 23.26% and the amount of computation by 31.32%, implementing the entire network with fewer resources and lower power consumption while maintaining the same speed.
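The MobileNet principle referred to in the abstract replaces standard convolutions with depthwise separable convolutions (a depthwise filter per input channel followed by a 1x1 pointwise convolution). A minimal sketch of the parameter-count comparison is shown below; the channel sizes are illustrative and are not the paper's actual layer configuration:

```python
# Sketch (not from the paper): parameter counts of a standard k x k convolution
# versus a MobileNet-style depthwise separable convolution. Bias terms omitted.

def standard_conv_params(c_in: int, c_out: int, k: int = 3) -> int:
    # Each of the c_out output channels has a k*k*c_in filter.
    return k * k * c_in * c_out

def depthwise_separable_params(c_in: int, c_out: int, k: int = 3) -> int:
    # Depthwise stage: one k*k filter per input channel.
    # Pointwise stage: a 1x1 convolution mixing c_in channels into c_out.
    return k * k * c_in + c_in * c_out

if __name__ == "__main__":
    c_in, c_out = 32, 64                      # illustrative channel counts
    std = standard_conv_params(c_in, c_out)
    dws = depthwise_separable_params(c_in, c_out)
    print(f"standard: {std}, separable: {dws}, "
          f"reduction: {1 - dws / std:.1%}")
```

The savings per layer are much larger than the whole-network figures reported in the abstract (23.26% parameters, 31.32% computation) because only part of the network is restructured and fully connected layers dominate the parameter budget in small MNIST-scale models.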
Authors: 李炳辰 (LI Bing-chen), 黄鲁 (HUANG Lu), School of Microelectronics, University of Science and Technology of China, Hefei 230026, China
Source: Microelectronics & Computer (《微电子学与计算机》, PKU Core journal), 2019, No. 9, pp. 7-11 (5 pages)
Funding: National Natural Science Foundation of China, General Program (61874102)
Keywords: FPGA; convolutional neural network; hardware acceleration; MobileNet; mobile terminal


相关作者

内容加载中请稍等...

相关机构

内容加载中请稍等...

相关主题

内容加载中请稍等...

浏览历史

内容加载中请稍等...
;
使用帮助 返回顶部