TY - JOUR
T1 - Low-energy robust neuromorphic computation using synaptic devices
AU - Kuzum, Duygu
AU - Jeyasingh, Rakesh Gnana David
AU - Wong, H. S. Philip
N1 - Funding Information: Manuscript received July 18, 2012; accepted August 22, 2012. Date of publication October 25, 2012; date of current version November 16, 2012. The review of this paper was arranged by Editor A. C. Seabaugh. This work was supported in part by DARPA Synapse, by the National Science Foundation, and by the Nanoelectronics Research Initiative of the Semiconductor Research Corporation.
PY - 2012
Y1 - 2012
N2 - Brain-inspired computing is an emerging field that aims to reach brainlike performance in real-time processing of sensory data. The challenges that need to be addressed toward reaching such a computational system include building a compact massively parallel architecture with scalable interconnection devices, ultralow-power consumption, and robust neuromorphic computational schemes for the implementation of learning in hardware. In this paper, we discuss programming strategies, material characteristics, and spike schemes that enable the implementation of symmetric and asymmetric synaptic plasticity with devices using phase-change materials. We demonstrate that energy consumption can be optimized by tuning the device operation regime and the spike scheme. Our simulations illustrate that a crossbar array consisting of synaptic devices and neurons can achieve hippocampus-like associative learning with symmetric synapses and sequence learning with asymmetric synapses. Pattern completion for patterns with 50% missing elements is achieved via associative learning with symmetric plasticity. The robustness of learning against input noise, variation in sensory data, and device resistance variation is investigated through simulations.
AB - Brain-inspired computing is an emerging field that aims to reach brainlike performance in real-time processing of sensory data. The challenges that need to be addressed toward reaching such a computational system include building a compact massively parallel architecture with scalable interconnection devices, ultralow-power consumption, and robust neuromorphic computational schemes for the implementation of learning in hardware. In this paper, we discuss programming strategies, material characteristics, and spike schemes that enable the implementation of symmetric and asymmetric synaptic plasticity with devices using phase-change materials. We demonstrate that energy consumption can be optimized by tuning the device operation regime and the spike scheme. Our simulations illustrate that a crossbar array consisting of synaptic devices and neurons can achieve hippocampus-like associative learning with symmetric synapses and sequence learning with asymmetric synapses. Pattern completion for patterns with 50% missing elements is achieved via associative learning with symmetric plasticity. The robustness of learning against input noise, variation in sensory data, and device resistance variation is investigated through simulations.
KW - Hopfield network
KW - neuromorphic
KW - phase-change materials
KW - plasticity
KW - spike-timing-dependent plasticity (STDP)
KW - synaptic device
UR - http://www.scopus.com/inward/record.url?scp=84870294722&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84870294722&partnerID=8YFLogxK
U2 - 10.1109/TED.2012.2217146
DO - 10.1109/TED.2012.2217146
M3 - Article
SN - 0018-9383
VL - 59
SP - 3489
EP - 3494
JO - IEEE Transactions on Electron Devices
JF - IEEE Transactions on Electron Devices
IS - 12
M1 - 6340321
ER -