Hardware perceptron and ReLU measurements and results
Kiani, Fatemeh; Xia, Qiangfei (2021), Hardware perceptron and ReLU measurements and results, Dryad, Dataset, https://doi.org/10.5061/dryad.w3r2280rf
In-memory analog computing with memristive crossbar arrays has recently attracted considerable attention because of its advantages in power consumption, area, and computing throughput. With this approach, different types of neural networks can be implemented for different applications, with memristors serving as the synapses. In previous work, however, the activation functions (neurons) have been implemented with digital processors. Implementing the neurons in analog hardware further improves power consumption, area, and throughput by removing unnecessary data conversions and communication.
In this study, we designed a ReLU activation function and built a fully hardware-based two-layer fully connected perceptron using memristive arrays, and verified its operation by classifying downsampled MNIST images. We measured the DC and AC characteristics of the designed ReLU circuit; the forming, set, and reset behavior of the memristive arrays; and the perceptron's behavior during training and inference in the classification task. We also studied the non-idealities of both the ReLU design and the memristors, which are critical for future integrated designs. In addition, the dataset includes the downsampled 8×8 MNIST images that we generated from the original MNIST dataset, which can be used in future studies of networks with limited size.
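The exact procedure used to produce the 8×8 images is not described above; as a rough sketch of how a 28×28 MNIST image can be reduced to 8×8, one plausible approach (an assumption, not necessarily the authors' method) is a central crop to 24×24 followed by averaging non-overlapping 3×3 blocks:

```python
import numpy as np

def downsample_28_to_8(img):
    """Downsample a 28x28 image to 8x8 by central cropping and
    block averaging. This is an illustrative method only; the
    dataset's actual downsampling procedure may differ."""
    assert img.shape == (28, 28)
    cropped = img[2:26, 2:26]             # central 24x24 crop
    blocks = cropped.reshape(8, 3, 8, 3)  # 8x8 grid of 3x3 blocks
    return blocks.mean(axis=(1, 3))       # average each 3x3 block

# Example with a synthetic image (values 0..783 in row-major order)
img = np.arange(28 * 28, dtype=float).reshape(28, 28)
small = downsample_28_to_8(img)
print(small.shape)  # (8, 8)
```

The block-average form keeps each output pixel proportional to the local intensity, which matches how a crossbar column would sum the corresponding input currents.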
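For readers unfamiliar with the network topology, the computation carried out by the hardware can be sketched in software as a two-layer fully connected perceptron with a ReLU hidden layer. Each matrix-vector product corresponds to one memristive crossbar and the ReLU to the analog activation circuit; the hidden-layer size below is an arbitrary placeholder, not a value taken from the dataset:

```python
import numpy as np

def relu(x):
    # Analog ReLU circuit modeled as an ideal rectifier
    return np.maximum(x, 0.0)

def forward(x, w1, w2):
    """Two-layer fully connected perceptron forward pass.
    w1 @ x models the first crossbar, relu() the analog neuron,
    and w2 @ h the second (output) crossbar."""
    h = relu(w1 @ x)
    return w2 @ h

rng = np.random.default_rng(0)
x = rng.random(64)                        # flattened 8x8 input image
w1 = rng.standard_normal((32, 64)) * 0.1  # hidden size 32 is assumed
w2 = rng.standard_normal((10, 32)) * 0.1  # 10 output classes (digits)
logits = forward(x, w1, w2)
pred = int(np.argmax(logits))             # predicted digit class
```

In the hardware version, the weights are stored as memristor conductances, so the matrix-vector products above happen in the analog domain via Ohm's law and Kirchhoff's current law.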