In this paper, we show for the first time how the unavoidable device variability of emerging non-volatile resistive memory devices can be exploited to design efficient low-power, low-footprint Extreme Learning Machine (ELM) architectures. In particular, we utilize the uncontrollable off-state resistance (Roff/HRS) spread of nanoscale filamentary resistive memory devices to realize the random input weights and random hidden-neuron biases that ELM characteristically requires.
We propose a novel RRAM-ELM architecture. To validate our approach, experimental data from different filamentary resistive switching devices (CBRAM, OXRAM) are used for full network simulations. The learning capability of our RRAM-ELM architecture is illustrated with two real-world applications: (i) a diabetes diagnosis test (classification) and (ii) SinC curve fitting (regression).