Despite incredible advances over the last few decades, in which microchips have become smaller, faster, and cheaper with each passing year, the technology industry is fast approaching a potential post-Moore's Law scenario: the doubling of computing power roughly every two years will eventually be limited by how many transistors companies can cram into the nanoscale dimensions of a chip. So it makes sense that some of these companies are investing in alternatives like neuromorphic computing, where chips are built to perform computations far more efficiently, the way a human brain would.
Researchers at Stanford University and the U.S. Department of Energy's Sandia National Laboratories recently pushed this possibility one step closer to reality with the development of an "artificial synapse" that mimics the energy-efficient way a synapse in the human brain operates as it processes and stores memories. The flexible component could someday be an integral part of an interface connecting the human brain to a computer, all while using much less energy than conventional silicon-based microchips.
“It works like a real synapse but it’s an organic electronic device that can be engineered,” explained Alberto Salleo, associate professor of materials science and engineering at Stanford and senior author of the paper, which was published in Nature. “It’s an entirely new family of devices because this type of architecture has not been shown before. For many key metrics, it also performs better than anything that’s been done before with inorganics.”
According to the findings, the device is a form of organic transistor, which the team is calling an electrochemical neuromorphic organic device (ENODe). Made out of inexpensive organic materials — mostly hydrogen and carbon — the device is designed to be potentially compatible with the human brain.
Inspired by the Human Brain
The neuromorphic approach to computing is inspired by how the human brain works. In a biological brain, when something is learned, electrical impulses flow from one neuron to another across a synapse, which includes a pre-synaptic ending that contains neurotransmitters and a post-synaptic ending with receptor sites for those neurotransmitters. A small gap, the synaptic cleft, separates these pre-synaptic and post-synaptic endings. The most energy is required when an impulse first crosses between neurons; subsequent crossings need less, resulting in a model that is both energy-efficient and adaptable.
The human brain also differs from the conventional von Neumann architecture in that processing information itself creates memory, rather than processing information in one unit and then storing it in a separate memory. An artificial synapse that imitates the function of a biological synapse could therefore have big implications for future developments in artificial intelligence.
“Deep learning algorithms are very powerful but they rely on processors to calculate and simulate the electrical states and store them somewhere else, which is inefficient in terms of energy and time,” points out Yoeri van de Burgt, lead author of the paper. “Instead of simulating a neural network, our work is trying to make a neural network.”
Interestingly, the team's design for the artificial synapse resembles that of a battery, utilizing thin polymer films with three terminals connected through a salty liquid. Protons flow through these terminals, which are analogous to the pre-synaptic and post-synaptic endings of a biological synapse. Unlike digital transistors, which can only exist in a 1 or a 0 state, the team was able to program 500 states of conductivity into its artificial synapse, all within a range of about one volt, needing only around 0.5 millivolts to switch between states. That's about one-tenth of what's used by conventional hardware. However, the artificial synapse still uses about 10,000 times more energy than its biological counterpart.
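To get a feel for what "500 states of conductivity" means in practice, here is a minimal sketch of snapping a continuous synaptic weight onto one of a fixed number of discrete levels. This is purely illustrative: the function name, the weight range, and the mapping are assumptions for the example, not details of the team's device.

```python
# Illustrative sketch: map a continuous weight onto one of N_STATES
# discrete levels, mimicking a device with a limited set of programmable
# conductance states. Numbers and names here are assumptions.

N_STATES = 500   # distinct conductance levels reported for the device


def quantize_weight(w, lo=-1.0, hi=1.0, n_states=N_STATES):
    """Snap a continuous weight in [lo, hi] to the nearest of n_states levels.

    Returns the quantized weight and the index of its state.
    """
    w = min(max(w, lo), hi)            # clamp to the representable range
    step = (hi - lo) / (n_states - 1)  # spacing between adjacent states
    state = round((w - lo) / step)     # index of the nearest state
    return lo + state * step, state


weight, state = quantize_weight(0.123)
print(f"state {state} of {N_STATES}, quantized weight {weight:.5f}")
```

With 500 levels across the range, adjacent states differ by only about 0.004 in weight, so the quantization error stays small; a digital bit, by contrast, offers just two levels.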
When tested in a simulation approximating a neural network, the artificial synapse proved that it could recognize the handwritten digits 0 through 9 with an accuracy of up to 97 percent. The team now plans to build more of these devices and test them in arrays with other datasets.
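The idea of such a simulation can be sketched in a few lines: train an ordinary digit classifier, then quantize its learned weights to 500 discrete levels and see how much accuracy survives. This is not the team's actual experiment; it is a stand-in using scikit-learn's small 8x8 digits dataset and a logistic-regression classifier, with the 500-level quantization as the only device-inspired constraint.

```python
# Hedged sketch (not the team's actual simulation): train a simple linear
# classifier on handwritten digits, then quantize its weights to 500
# discrete levels to mimic a device with limited conductance states.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=5000)
clf.fit(X_train, y_train)


def quantize(arr, n_states=500):
    """Round every value in arr to the nearest of n_states evenly spaced levels."""
    lo, hi = arr.min(), arr.max()
    step = (hi - lo) / (n_states - 1)
    return lo + np.round((arr - lo) / step) * step


clf.coef_ = quantize(clf.coef_)  # replace learned weights with quantized ones
accuracy = clf.score(X_test, y_test)
print(f"accuracy with 500-level quantized weights: {accuracy:.3f}")
```

With this many levels, quantization barely dents the classifier's accuracy, which is the intuition behind using many-state analog devices as neural-network weights.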
The possibility of using the artificial synapse to interface with functioning human neurons is admittedly still some ways off, but this is yet another development that offers a promising alternative to silicon-based computing. At a minimum, these artificial synapses may become an integral part of a computer that functions more like a human brain, which could help accelerate the processing of visual and auditory data in applications such as voice-controlled interfaces in autonomous cars. With accidents still a distinct possibility for self-driving cars, making the onboard system more efficient and responsive would no doubt be a welcome improvement.
Images: Stanford University