Efficient Design and Optimization of Artificial Neural Networks: SW and HW

Date

May 2023

Abstract

Artificial Neural Networks (ANNs) have achieved significant advances in machine intelligence due to their ability to learn complex tasks. However, deploying them on resource-constrained devices or in real-time applications requires attention to the network's physical size, or area. High-Level Synthesis (HLS) is a design methodology that simplifies the implementation of complex algorithms, such as ANNs, by describing them in high-level programming languages. This work investigates the construction of efficient ANNs by combining techniques such as architecture search, pruning, quantization, and compression to optimize the network's architecture, input/output data size, and computation precision. The optimization framework minimizes the area of the equivalent SystemC ANN model by reducing the bit width allocated to either the neurons or the weights. The primary focus is handwriting recognition on the MNIST dataset, which serves as a prototype problem for understanding neural networks in general. The contributions of this thesis include an explorer that performs architecture search and an optimization framework that minimizes the network's area, providing insight into the trade-offs between accuracy and hardware efficiency.
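To illustrate the kind of bit-width reduction the abstract describes, the following is a minimal sketch of uniform weight quantization. It is not taken from the thesis: the function name `quantize` and the uniform min/max scheme are assumptions, shown only to convey how fewer bits trade precision for smaller storage (and hence area) per weight.

```python
import numpy as np

def quantize(weights, bits):
    """Uniform quantization of weights to a given bit width (illustrative sketch).

    Maps each value onto one of 2**bits evenly spaced levels spanning the
    weights' range, then back to real values, mimicking reduced-precision
    storage of network parameters.
    """
    levels = 2 ** bits
    w_min, w_max = weights.min(), weights.max()
    step = (w_max - w_min) / (levels - 1)
    return np.round((weights - w_min) / step) * step + w_min

w = np.array([-0.75, -0.2, 0.0, 0.31, 0.9])
w8 = quantize(w, 8)  # fine grid: small quantization error
w2 = quantize(w, 2)  # coarse grid: large quantization error
```

In an HLS flow, the analogous step would be narrowing fixed-point types for neurons or weights and measuring the resulting accuracy and area; this sketch only shows the numerical effect of the reduced bit width.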

Keywords

Engineering, Electronics and Electrical
