DOI: 10.1109/tvlsi.2021.3097341
OpenAccess: Closed

Dynamic Block-Wise Local Learning Algorithm for Efficient Neural Network Training

Gwangho Lee, Sunwoo Lee, Dongsuk Jeon

Backpropagation
Computer science
Block (permutation group theory)
2021
In the backpropagation algorithm, the error computed at the output of the neural network must propagate backward through the layers to update the weights of each layer, which makes it difficult to parallelize the training process and requires frequent off-chip memory access. Local learning algorithms generate error signals locally for weight updates, removing the need to backpropagate error signals. However, prior works rely on large, complex auxiliary networks for reliable training, which incurs large computational overhead and undermines the advantages of local learning. In this work, we propose a local learning algorithm that significantly reduces computational complexity while also improving training performance. Our algorithm combines multiple consecutive layers into a block and performs local learning on a block-by-block basis, while dynamically changing block boundaries during training. In experiments, our approach achieves 95.68% and 79.42% test accuracy on the CIFAR-10 and CIFAR-100 datasets, respectively, using a small fully connected layer as the auxiliary network, closely matching the performance of the backpropagation algorithm. Multiply-accumulate (MAC) operations and off-chip memory accesses are also reduced by up to 15% and 81%, respectively, compared to backpropagation.
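The block-wise local learning idea in the abstract can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation: each block here is a single linear + ReLU layer paired with a small linear auxiliary classifier, trained with a local cross-entropy loss whose gradient stops at the block input, so no error signal ever crosses a block boundary. The paper's dynamic adjustment of block boundaries during training is omitted, and all names, sizes, and hyperparameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

class LocalBlock:
    """One block of the network, trained only through its own auxiliary head."""
    def __init__(self, d_in, d_out, n_classes, lr=0.1):
        self.W = rng.normal(0.0, 1.0 / np.sqrt(d_in), (d_in, d_out))   # block weights
        self.A = rng.normal(0.0, 1.0 / np.sqrt(d_out), (d_out, n_classes))  # auxiliary head
        self.lr = lr

    def train_step(self, x, y_onehot):
        h = np.maximum(x @ self.W, 0.0)        # block forward (linear + ReLU)
        p = softmax(h @ self.A)                # auxiliary classifier prediction
        loss = -np.log((p * y_onehot).sum(axis=1) + 1e-12).mean()
        n = x.shape[0]
        dlogits = (p - y_onehot) / n           # cross-entropy gradient at the head
        dA = h.T @ dlogits
        dh = (dlogits @ self.A.T) * (h > 0)    # local gradient; never leaves the block
        dW = x.T @ dh
        self.A -= self.lr * dA
        self.W -= self.lr * dW
        return h, loss                         # h is handed on with no gradient path

# toy data: two classes separable from the first two input features
X = rng.normal(size=(256, 16))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
Y = np.eye(2)[y]

blocks = [LocalBlock(16, 32, 2), LocalBlock(32, 32, 2)]
losses = []
for epoch in range(100):
    h = X
    for b in blocks:                 # each block updates itself independently
        h, loss = b.train_step(h, Y)
    losses.append(loss)              # local loss of the last block's auxiliary head

print(f"first-epoch loss {losses[0]:.3f}, last-epoch loss {losses[-1]:.3f}")
```

Because each block's update depends only on its own input activations and its auxiliary head, the blocks could in principle be updated in a pipelined or parallel fashion, which is the hardware motivation the abstract gives for avoiding full backpropagation.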