Keyphrases
- DNN Training (100%)
- Transposable (100%)
- In-memory Computing (100%)
- Phase Change Random Access Memory (PCRAM) (100%)
- Memory Architecture (100%)
- Energy Efficient (100%)
- Weight Update (40%)
- Memory Array (26%)
- Low-precision (20%)
- Bitcell (20%)
- Row-column (13%)
- Bit Precision (13%)
- Non-ideal Characteristics (13%)
- Electrical Engineering (13%)
- 2T1R (13%)
- Learning Algorithm (13%)
- Computer Engineering (13%)
- Summer Months (13%)
- Memristor (13%)
- On chip (13%)
- Multiple Rows (6%)
- IBM-1 (6%)
- Low-precision ADC (6%)
- Write Driver (6%)
- Bitline (6%)
- Propa (6%)
- Binary Phase (6%)
- Parallel Activation (6%)
- Stochastic Rounding (6%)
- Memory Computing (6%)
- Rounding Method (6%)
- Chunk-based (6%)
- Pulse Width (6%)
- Analog Computing (6%)
- Hybrid Design (6%)
- Gradient Accumulation (6%)
- Forward-backward (6%)
- On-device Training (6%)
- Area Footprint (6%)
- Memory Access Cost (6%)
- 90-nm CMOS Technology (6%)
- Tempe (6%)
- Backward Propagation (6%)
- Nonlinearity (6%)
- Compact Model (6%)
- Learning Task (6%)
- Multiple Devices (6%)
- Array Size (6%)
- In-memory (6%)
- Drone (6%)
- Analog Data (6%)
- Backpropagation (6%)
- Data Storage (6%)
- Multi-bit (6%)
- Comm (6%)
- Neuromorphic (6%)
- Synapse (6%)
- Energy Engineering (6%)
- Device-independent (6%)
- Analog Memory (6%)
- Storage Devices (6%)
- Model Size (6%)
- Design Decisions (6%)
- Good Accuracy (6%)
- Continual Learning (6%)
- Highly Parallel (6%)
- Design Trade (6%)
- Emerging NVM (6%)
- Practical Learning (6%)
- Detailed Model (6%)
- Postsynaptic Density Protein 95 (PSD-95) (6%)
- Arizona State University (6%)
- Energy Requirement (6%)
- Computing Architecture (6%)
- Newark (6%)
- Binary Data (6%)
- One-phase (6%)
- Dynamic Range (6%)
- Stochasticity (6%)
- New Jersey (6%)
- Special Purpose (6%)
- Write Energy (6%)
- Institutes of Technology (6%)
Computer Science
- Memory Architecture (100%)
- Deep Neural Network (100%)
- Energy Efficient (100%)
- Memory Array (44%)
- Learning Algorithm (22%)
- Computer Architecture (22%)
- Backpropagation (11%)
- Data Storage Device (11%)
- Multiple Device (11%)
- Integrated Circuits (11%)
- Chip Memory Access (11%)
- Bitline (11%)
- Information Storage (11%)
- Emerging Application (11%)
- Approximation (Algorithm) (11%)
- Special Purpose (11%)
- Synaptic Weight (11%)