Manhattan Rule for Robust In-Situ Training of Memristive Deep Neural Network Accelerators
Zhang, Ellina, Cai, Jack, Amirsoleimani, Amirali, Rahimi Azghadi, Mostafa, Genov, Roman, and Ahmadi, Majid (2024) Manhattan Rule for Robust In-Situ Training of Memristive Deep Neural Network Accelerators. In: Proceedings of the Midwest Symposium on Circuits and Systems. pp. 1324-1328. From: MWSCAS 2024: IEEE 67th International Midwest Symposium on Circuits and Systems, 11-14 August 2024, Springfield, MA, USA.
Abstract
In this work, we propose an alternative training approach for memristive circuits, Manhattan rule training, which uses only the sign of the gradient for weight updates. We present an in-depth analysis in both in-situ and ex-situ settings and show that our method not only simplifies circuit design but also improves neural network robustness against device non-idealities. Using MemTorch and our custom in-situ training framework, we implemented the Manhattan rule for MNIST classification and ECG signal detection tasks and achieved close to state-of-the-art performance under noise. Our work also provides a thorough comparison of Manhattan and conventional training methods under various device non-idealities, establishing a benchmark for the design of biomedical neural circuits.
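The sign-only update the abstract describes can be summarized in a few lines. The sketch below is a minimal PyTorch illustration of a Manhattan-rule step, not code from the paper; the function name, step size, and the commented training-loop placeholders are assumptions for illustration only.

```python
import torch

def manhattan_step(parameters, step_size=1e-2):
    """Manhattan-rule update: move each weight by a fixed step in the
    direction opposite the sign of its gradient, ignoring the gradient's
    magnitude. On memristive hardware this corresponds to applying a
    single fixed programming pulse per device, raising or lowering its
    conductance based only on the sign."""
    with torch.no_grad():
        for p in parameters:
            if p.grad is not None:
                p.sub_(step_size * torch.sign(p.grad))

# Illustrative usage (model, loader, and criterion are placeholders):
# for x, y in loader:
#     loss = criterion(model(x), y)
#     model.zero_grad()
#     loss.backward()
#     manhattan_step(model.parameters(), step_size=1e-2)
```

Because every update has the same magnitude, the write circuitry never needs to modulate pulse amplitude or duration per weight, which is presumably the circuit-design simplification the abstract refers to.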
| Item ID: | 87458 |
|---|---|
| Item Type: | Conference Item (Research - E1) |
| ISBN: | 9798350387179 |
| ISSN: | 1548-3746 |
| Keywords: | Deep Neural Network, Ex-situ, In-situ, Inference, Manhattan Training, Memristor |
| Copyright Information: | ©2024 IEEE |
| Date Deposited: | 03 Dec 2025 02:56 |
| FoR Codes: | 40 ENGINEERING > 4008 Electrical engineering > 400801 Circuits and systems @ 30%; 46 INFORMATION AND COMPUTING SCIENCES > 4611 Machine learning > 461104 Neural networks @ 70% |
| SEO Codes: | 28 EXPANDING KNOWLEDGE > 2801 Expanding knowledge > 280115 Expanding knowledge in the information and computing sciences @ 100% |
