Hardware perceptron and ReLU measurements and results
Data files
Aug 11, 2021 version files, 70.73 MB total:

- AC_100K.csv (58.23 KB)
- AC_10Meg.csv (36.54 KB)
- AC_1Meg.csv (58.39 KB)
- AC_5Meg.csv (58.54 KB)
- AC_delay.csv (58.64 KB)
- Accuracy_NonLinearity.csv (75 B)
- Accuracy_ReLUs_Mismatch.csv (4.80 KB)
- Accuracy_vs_ReLUs_Gain_Error.csv (217 B)
- AccuracyVsNoise.csv (321 B)
- AccuracyVsNoiseZoomIn.csv (798 B)
- DC_Gain100_25Times.csv (1.49 KB)
- DC_Gain100.csv (359 B)
- DC_Gain100K.csv (164 B)
- DC_Gain10K.csv (153 B)
- DC_Gain1K.csv (177 B)
- Experimental.csv (11.45 KB)
- G_slow.csv (3.45 MB)
- mnist_test_8x8_0d2Labels.csv (20 KB)
- mnist_training_8x8_0d2Data_0d2.csv (33.59 MB)
- mnist_training_8x8_0d2Data.csv (32.98 MB)
- mnist_training_8x8_0d2Labels.csv (120 KB)
- READMe.txt (3.67 KB)
- SimulatedArray.csv (11.44 KB)
- Single_Device_Forming.csv (11.60 KB)
- Single_Device_IV.csv (239.35 KB)
- Software.csv (11.41 KB)
- TestAccuracyVsHiddenNeurons.csv (225 B)
- TrainingAccuracyVsHiddenNeurons.csv (163 B)
Abstract
Recently, in-memory analog computing with memristive crossbar arrays has attracted considerable attention for its favorable power consumption, area, and computing throughput. With this computing approach, different types of neural networks can be implemented for different applications, with memristors serving as the synapses. In previous work, however, digital processors have been used to implement the activation functions, or neurons. Implementing the neurons in analog hardware further improves power consumption, area, and throughput by removing unnecessary data conversion and communication.
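The architecture described here, a two-layer fully connected perceptron with a ReLU between the synaptic (memristive) layers, can be sketched in software. This is a minimal illustrative model, not the hardware implementation; the layer sizes and random weights are hypothetical placeholders for the memristor conductances.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # ReLU activation: the function realized in analog hardware in this work
    return np.maximum(x, 0.0)

# Hypothetical dimensions: 64 inputs (one 8x8 image), 16 hidden neurons, 10 classes
H = 16
W1 = rng.standard_normal((64, H)) * 0.1   # first synaptic layer (crossbar array)
W2 = rng.standard_normal((H, 10)) * 0.1   # second synaptic layer

def forward(x):
    """Forward pass of a two-layer fully connected perceptron with ReLU."""
    return relu(x @ W1) @ W2

x = rng.standard_normal(64)     # stand-in for a flattened 8x8 input image
scores = forward(x)
print(scores.shape)             # (10,) -- one score per digit class
```

In the hardware version, each matrix-vector product is computed in a memristive crossbar and the ReLU by the analog circuit characterized in the data files.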
In this study, we designed a ReLU activation function and built a fully hardware-based two-layer fully connected perceptron using memristive arrays, and verified its operation by classifying downsampled MNIST images. We measured the DC and AC characteristics of the designed ReLU; the forming, set, and reset behavior of the memristive arrays; and the perceptron's behavior during training and inference on the classification task. We also studied the non-idealities of both the ReLU design and the memristors, which are critical for future integrated designs. Moreover, the downsampled 8x8 MNIST images that we generated from the original MNIST dataset are included in the data and can be reused in future studies with networks of limited size.
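The dataset's exact downsampling procedure from 28x28 to 8x8 is described in its README rather than here; as one plausible sketch, block averaging over roughly equal pixel bins can be done as follows. The function name and binning scheme are assumptions for illustration only.

```python
import numpy as np

def downsample_28_to_8(img28):
    """Reduce a 28x28 image to 8x8 by averaging over roughly equal pixel bins.

    Hypothetical method: 28 is not divisible by 8, so rows/columns are split
    into 8 bins of size 4 or 3 and each block is replaced by its mean.
    """
    assert img28.shape == (28, 28)
    rows = np.array_split(np.arange(28), 8)   # bin sizes: 4,4,4,4,3,3,3,3
    cols = np.array_split(np.arange(28), 8)
    out = np.empty((8, 8))
    for i, r in enumerate(rows):
        for j, c in enumerate(cols):
            out[i, j] = img28[np.ix_(r, c)].mean()
    return out

# Example with a synthetic image (values 0..783)
img = np.arange(784, dtype=float).reshape(28, 28)
small = downsample_28_to_8(img)
print(small.shape)  # (8, 8)
```

The resulting 64-element vectors match the input size of the limited-size perceptron used in this work.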