The paper in Nature Communications is here: go.nature.com/2OJ1HzG
Memristors and Resistive Neural Network (ResNN)
The memristor was predicted by Leon Chua in 1971 and linked to practical devices by R. Stanley Williams and colleagues at Hewlett-Packard Labs in 2008. Memristance, which establishes the missing relationship between magnetic flux and electric charge, has the same unit as resistance, but its value depends on the history of the applied voltage. Extensively studied over the last decade, memristors have been identified as leading candidates for next-generation memory and in-memory computing. Specifically, their ability to weight input voltage pulses and to modulate conductance in a voltage-dependent manner closely resembles the behavior of a biological synapse. At the same time, their ability to integrate input voltage pulses in space and time and to switch ON once a threshold is reached mimics the temporal leaky integrate-and-fire behavior of a neuron. These properties have led to demonstrations of memristor-based neural networks, known as Resistive Neural Networks (ResNN). Unfortunately, such a ResNN converts electricity to unwanted heat during operation. In addition, memristors are passive elements that dissipate the energy of the signals propagating through the network without recycling it.
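The leaky integrate-and-fire behavior described above can be sketched in a few lines of plain Python. This is a generic behavioral model, not a model of any specific device; the leak rate and threshold are illustrative parameters chosen for the example.

```python
def lif_response(pulses, leak=0.9, threshold=1.0):
    """Leaky integrate-and-fire sketch: accumulate input pulses with
    leakage; emit a spike and reset once the state crosses threshold.
    Parameters are illustrative, not measured device values."""
    state = 0.0
    spikes = []
    for p in pulses:
        state = state * leak + p      # leaky temporal integration
        if state >= threshold:        # threshold switching ("fire")
            spikes.append(1)
            state = 0.0               # reset after firing
        else:
            spikes.append(0)
    return spikes
```

A train of identical sub-threshold pulses will accumulate until the neuron fires, while weaker or sparser pulses leak away without ever triggering a spike, which is the essence of spatiotemporal integration.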
Memcapacitor and Capacitive Neural Network (CapNN)
In 2010, Massimiliano Di Ventra, Yuriy Pershin, and Leon Chua reported that capacitive and inductive systems can also possess memory characteristics, leading to the so-called memcapacitors and meminductors, respectively. Inspired by this, we exploited a simple idea: a fuse, embodied by a diffusive memristor (a volatile bidirectional resistance switch with a large ON/OFF ratio), could bypass one of two capacitors to achieve capacitive switching. Such a pseudo-memcapacitor, in addition to its hysteresis loop in the charge-voltage plot under quasi-static voltage sweeps, features unique volatile accumulative switching thanks to the Ag migration dynamics in the dielectric, which enables the device to emulate the temporal integration of a neuron with high fidelity. By integrating the pseudo-memcapacitor onto the gate of a transistor, we developed a neuro-transistor, which is not only much smaller than previously reported CMOS neurons but also possesses the diffusion dynamics critical for neuromorphic functions that CMOS components lack. In contrast to artificial neurons based on two-terminal passive memristors, such as the phase-change, redox, and magnetoresistive devices reported in the literature so far, our neuro-transistor is demonstrated to be active and capable of propagating signals from one neural layer to the next without extra peripheral circuits, a must for multilayer deep neural network applications.
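A minimal behavioral sketch of the pseudo-memcapacitor idea: two capacitors in series, with the volatile diffusive-memristor "fuse" placed across one of them. When the fuse is OFF, the device presents the ordinary series capacitance; when it switches ON, it shorts out the second capacitor and the effective capacitance jumps. The capacitance values here are arbitrary illustrative numbers.

```python
def effective_capacitance(c1, c2, fuse_on):
    """Two series capacitors C1, C2 with a volatile switch (the
    diffusive-memristor 'fuse') across C2. Illustrative model only."""
    if fuse_on:
        return c1                     # C2 bypassed: capacitance jumps to C1
    return c1 * c2 / (c1 + c2)        # fuse OFF: ordinary series combination
```

The switch between these two capacitance states, driven by a volatile (self-resetting) resistance switch, is what produces capacitive switching and the hysteresis in the charge-voltage plot described above.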
The First Integrated Capacitive Neural Network and Computing
Together with pseudo-memcapacitive synapses, we built a fully integrated capacitive neural network and used it to demonstrate efficient vector-matrix multiplication performed directly by the laws of physics. Compared with the prevalent resistive neural networks, capacitive neural networks are envisioned to resemble their biological counterparts more closely and to consume less energy, owing to the capacitive nature of the building blocks.
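The "multiplication by physics" idea can be sketched abstractly: in a capacitive array, applying a voltage vector V to the columns makes each output row collect a charge Q_i = Σ_j C[i][j]·V[j], so a single read operation yields a matrix-vector product. The sketch below computes that sum numerically; it illustrates the principle, not the actual circuit in the paper.

```python
def capacitive_vmm(C, V):
    """Charge collected on each output line of a capacitor array:
    Q_i = sum_j C[i][j] * V[j]. C is a list of rows; V a voltage vector.
    A numerical illustration of charge-domain vector-matrix multiplication."""
    return [sum(c_ij * v_j for c_ij, v_j in zip(row, V)) for row in C]
```

Because every product-and-sum happens in parallel as charge accumulation, the energy cost scales with the capacitances being charged rather than with the number of arithmetic operations.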
The memcapacitive implementation of a neural network provides a largely unexplored alternative, at least as competitive as resistive neural networks, for the hardware implementation of neuromorphic computing. Its promise of better emulation fidelity and improved power efficiency could spur new research directions in the memory-device and machine-learning communities.