Gaussian synapses for probabilistic neural networks

The exponential growth in computational power over the past five decades was driven by the scaling of complementary metal oxide semiconductor (CMOS) technology. Scaling has three major aspects: energy scaling keeps the power budget of computation constant, size scaling packs more transistors into the same area for higher computational density, and complexity scaling delivers architectural improvements for higher computing efficiency. Unfortunately, the current “Dark Silicon” era is deprived of these quintessential scaling trends owing to fundamental limitations at the material, device and architectural levels, and it awaits innovation to restore the growth in computational power.

Current research in neuromorphic computing has great potential to advance computational capabilities, which led me to look into neural networks. The brain's ability to process large amounts of information and seamlessly arrive at conclusions in problems such as pattern classification, while consuming a minuscule amount of power (~20 W) in a relatively small volume, is extremely fascinating. This is achieved through a highly complex architecture of billions of neurons connected through trillions of synapses, resulting in massively parallel computation. Additionally, the brain employs analog in-memory computation, which enhances its energy efficiency compared with deterministic digital computing based on the von Neumann architecture, where memory and computation are separated. Compared with current state-of-the-art supercomputers, the brain demonstrates extreme energy and area efficiency while trading off speed and information storage capacity.

As part of a device group, we were prompted by the decline in scaling to think of novel device ideas that could be implemented in neural networks to reinstate the different aspects of scaling. This resulted in the idea of the “Gaussian synapse”, realized through the heterogeneous integration of two-dimensional (2D) field-effect transistors (FETs) biased in their respective subthreshold regimes. The Gaussian synapse facilitates energy scaling, the 2D materials enable size scaling without loss of electrostatic control, and probabilistic neural networks (PNNs) enable complexity scaling through their ability to seamlessly capture non-linear decision boundaries using fewer components than typical artificial neural networks (ANNs).

The Gaussian synapse is realized by connecting an n-type molybdenum disulfide (MoS2) FET in series with a p-type black phosphorus (BP) FET, which yields a Gaussian transfer function. The FETs are then top-gated to introduce dynamic control over the amplitude, mean and standard deviation of the Gaussian, enabling its use in PNNs. We then used the Gaussian synapses to classify EEG signals into the different brain waves characterized by their frequency: alpha, beta, gamma, delta and theta waves. Looking ahead, ultra-low-power devices with in-memory computing will further drive the scaling needed to achieve highly efficient Gaussian synapses.
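To make the device idea a little more concrete, the short numerical sketch below shows how two opposing subthreshold exponentials, combined in series, produce a peaked, Gaussian-like transfer curve. The current expressions, threshold voltages, slope factors and prefactors are generic textbook placeholders chosen purely for illustration; they are not fitted to the MoS2 or BP devices in the paper.

```python
import numpy as np

# Illustrative constants (assumed, not extracted from the paper)
V_T = 0.026          # thermal voltage at room temperature, in volts
I0_N = 1e-12         # n-type FET subthreshold current prefactor (assumed)
I0_P = 1e-12         # p-type FET subthreshold current prefactor (assumed)

def subthreshold_n(vg, n_factor=1.5, v_th=0.3):
    """Exponentially rising subthreshold current of the n-type FET."""
    return I0_N * np.exp((vg - v_th) / (n_factor * V_T))

def subthreshold_p(vg, n_factor=1.5, v_th=0.7):
    """Exponentially falling subthreshold current of the p-type FET."""
    return I0_P * np.exp(-(vg - v_th) / (n_factor * V_T))

def series_current(vg):
    """Series combination: whichever FET is less conductive limits the
    current, so the composite transfer curve is small at low and high
    gate voltages and peaks in between, approximating a Gaussian."""
    i_n = subthreshold_n(vg)
    i_p = subthreshold_p(vg)
    return 1.0 / (1.0 / i_n + 1.0 / i_p)

vg = np.linspace(0.0, 1.0, 201)
i_out = series_current(vg)
print(f"peak near Vg = {vg[np.argmax(i_out)]:.2f} V")  # peak sits between the two thresholds
```

Roughly speaking, the top gates in the real devices play the role of the `v_th` and `n_factor` knobs above: shifting a threshold moves the mean of the Gaussian, while changing the effective subthreshold slope or the current level tunes its width and amplitude.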

https://www.nature.com/articles/s41467-019-12035-6
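On the network side, each Gaussian synapse plays the role of a tunable kernel in a probabilistic neural network. The toy classifier below illustrates that idea in software, sorting a dominant EEG frequency into the standard delta, theta, alpha, beta and gamma bands. The exemplar frequencies, kernel width and single-feature setup are illustrative placeholders and are not the architecture or data pipeline used in the paper.

```python
import numpy as np

# Canonical EEG band exemplars in Hz, used here only as stand-in "training"
# points; a real PNN would be trained on measured spectral features.
BAND_EXEMPLARS = {
    "delta": [1.0, 2.0, 3.0],
    "theta": [5.0, 6.0, 7.0],
    "alpha": [9.0, 10.0, 11.0],
    "beta":  [15.0, 20.0, 25.0],
    "gamma": [35.0, 45.0, 60.0],
}

def gaussian_kernel(x, mu, sigma):
    """A tunable bell curve: the function a Gaussian synapse realizes in hardware."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

def classify_frequency(freq_hz, sigma=2.0):
    """PNN-style decision: sum the Gaussian responses of each class's
    exemplars and pick the class with the largest summed response."""
    scores = {
        band: sum(gaussian_kernel(freq_hz, mu, sigma) for mu in exemplars)
        for band, exemplars in BAND_EXEMPLARS.items()
    }
    return max(scores, key=scores.get)

print(classify_frequency(10.5))   # -> 'alpha'
print(classify_frequency(40.0))   # -> 'gamma'
```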
