Cognitive computing: IBM uses phase-change material to model your brain's neurons

Big Blue's getting in your head, man


IBM scientists claim to have created – for the first time – artificial spiking neurons using a phase-change material, opening up the possibility of building hardware neural networks for AI.

The brain is the biggest inspiration for researchers working in cognitive computing: the exact mechanisms that describe how a brain learns remain a mystery, but the whitecoats know it handles learning tasks far more efficiently than any computer.

To capture the essence of intelligence, researchers have turned to mimicking the brain. IBM has built artificial neurons that integrate incoming signals and fire electrical pulses, recreating the biological processes happening in grey matter.

Biology versus technology

In biological neurons, a thin lipid bilayer membrane keeps electrical charge within the cell. If an impulse carried along a dendrite – one of the neuron's branched projections – is large enough, it raises the electrical potential across the membrane and the neuron fires off a zap of electricity.


In the artificial neurons, however, the lipid membrane is replaced by two electrodes with a layer of chalcogenide-based phase-change material sandwiched in between. The material starts in its amorphous, poorly conducting phase. Each input pulse passing through the device heats the material and partially crystallises it, nudging its conductance upwards.

Once the conductance reaches a threshold level, the neuron fires an output spike and the device resets: a strong pulse melts the material and rapidly quenches it back into its amorphous phase, ready to start integrating again.
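As a rough illustration – a toy model, not IBM's device physics – the accumulate-and-reset behaviour can be sketched in a few lines of Python, with a simple counter standing in for the device's conductance:

```python
class PhaseChangeNeuron:
    """Toy integrate-and-fire neuron: a counter stands in for the
    conductance of the phase-change cell."""

    def __init__(self, threshold=10, step=1):
        self.conductance = 0        # proxy for the crystallised fraction
        self.threshold = threshold
        self.step = step

    def receive(self, pulse):
        """Integrate one input pulse; return True if the neuron fires."""
        if pulse:
            self.conductance += self.step   # each pulse nudges conductance up
        if self.conductance >= self.threshold:
            self.conductance = 0            # reset, like the melt-quench
            return True
        return False

neuron = PhaseChangeNeuron()
spikes = sum(neuron.receive(1) for _ in range(25))
print(spikes)  # 25 pulses against a threshold of 10 -> fires twice
```

The names and parameter values here are invented for illustration; the real device's conductance evolves continuously with pulse amplitude and duration rather than in fixed steps.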

The “integrate-and-fire” mechanism works across a range of timescales and frequencies similar to the brain's (10⁸ nanoseconds, corresponding to a 10Hz update frequency) and beyond.

The ability of the phase-change material to reset means that the artificial neuron can be reused. The switching cycles between the crystalline and amorphous phases can be repeated 10¹² times – corresponding to over 300 years of operation if the artificial neuron were working at a frequency of 100Hz, Big Blue's paper stated.

Another biological trait IBM has captured is the randomness of the artificial neurons – a property known as stochasticity.

The positions of the atoms in the material are never quite the same after the artificial neuron goes through the integrate-and-fire process: each melt-and-reset cycle leaves the amorphous region with a slightly different configuration, so every firing event differs slightly from the last.
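This stochastic behaviour can be imitated in the toy model above by making the reset imperfect – again, my own crude stand-in, not IBM's device model. Each reset leaves behind a small random residual conductance, so the interval between spikes jitters from cycle to cycle:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def spike_times(n_pulses=200, threshold=10.0):
    """Drive a leaky-free toy neuron with identical pulses and record
    when it fires; the randomised reset makes the intervals jitter."""
    state, times = 0.0, []
    for t in range(n_pulses):
        state += 1.0                          # one unit per input pulse
        if state >= threshold:
            times.append(t)
            state = random.uniform(0.0, 1.5)  # imperfect, noisy reset
    return times

times = spike_times()
intervals = [b - a for a, b in zip(times, times[1:])]
print(sorted(set(intervals)))  # identical inputs, yet intervals vary
```

Even though every input pulse is identical, the leftover state after each reset shifts when the next threshold crossing happens – the same qualitative effect as the atoms never settling back into exactly the same arrangement.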

This is where neuromorphic computing diverges from conventional computing, Tomas Tuma, lead author of the study and researcher working at IBM’s Zurich Research Laboratory, told The Register.

“Conventional computers are never perfect, but any randomness is suppressed. In neuromorphic computing, however, we don’t mind the randomness. Actually, this random behaviour is parallel to the brain. Not all the neurons in the brain work the same, some are dead or not as effective,” Tuma said.

Stochasticity is actually essential in harnessing the full power of neural networks, Evangelos Eleftheriou, co-author of the study and IBM Fellow at the Zurich Research Laboratory, added.

Machine intelligence

A single neuron is not as effective as a network of neurons for unsupervised learning – a type of machine learning used in AI.

The artificial neuron has the potential to detect correlations in large streams of data acting as its input signal. IBM fed it 1,000 streams of binary events, of which 100 were mutually correlated.

Initially, the neuron fired at a high rate as it hunted for correlations among the signals. But over time the system evolved: the feedback loop meant uncorrelated signals were gradually depressed, while correlated signals began to take over.

The growing strength of the correlated signals produces a growing electrical response, eventually leading to a large spike once the 100 correlated streams have been singled out.
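A scaled-down toy version of that experiment – this is not the learning rule from the paper, just a crude illustration of the potentiate/depress feedback loop – can be run with a neuron whose input weights are plastic. When the neuron fires, active inputs are strengthened and silent ones weakened, so the mutually correlated streams end up dominating:

```python
import random

random.seed(1)

N, CORR = 100, 10   # scaled down from the paper's 1,000 streams / 100 correlated
RATE = 0.2          # probability a stream emits an event on each step
weights = [0.5] * N
threshold = 12.0

for step in range(2000):
    common = 1 if random.random() < RATE else 0  # shared driver of the correlated group
    x = [common if i < CORR else int(random.random() < RATE) for i in range(N)]
    potential = sum(w * xi for w, xi in zip(weights, x))
    if potential >= threshold:                   # the neuron fires
        for i in range(N):
            delta = 0.01 if x[i] else -0.005     # potentiate active, depress silent
            weights[i] = min(1.0, max(0.0, weights[i] + delta))

avg_corr = sum(weights[:CORR]) / CORR
avg_unc = sum(weights[CORR:]) / (N - CORR)
print(avg_corr > avg_unc)  # the correlated streams end up with stronger weights
```

All the parameter values (rates, thresholds, weight updates) are invented for the sketch; the point is only the feedback dynamic the article describes – uncorrelated inputs get depressed, correlated ones take over.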

Neural networks can make sense of data more quickly because they pool the computing power of many neurons. The input signals could themselves be fed from other neurons, resulting in a more thorough and faster search for correlations in the data. The output signals trickling down would become increasingly refined as they passed through successive layers of neurons.

Diagram of how artificial neuron works. Photo credit: Nature Nanotechnology and Tuma et al.

Pattern recognition is the main aim in machine learning, said Professor Leslie Smith, a researcher working in the Cognitive Computation research group at the University of Stirling, who was not involved in IBM’s research.

“You want neurons firing as it means that patterns can be spotted in real time,” Smith told The Register. “For example, you don’t want autonomous vehicles to analyse images one by one. It needs to be looking and analysing data all the time to adapt to its surroundings,” Smith said.

Neuromorphic computing is a relatively new area and is growing in popularity. "It grew in the 1960s with Marvin Minsky and Seymour Papert, then died down in the 1980s. But it's coming back into vogue again," Smith said.

The number of possible applications for neuromorphic computing stretches beyond AI. It could also serve the Internet of Things craze: sensors running on cognitive computing could collect and analyse volumes of weather data at the edge for faster forecasts, IBM said.

The need to deal with huge sets of data quickly and at low power is becoming increasingly apparent. “We are in the cognitive era of computing,” Tuma told The Register. “In the future, neuromorphic chips could even be used as co-processors for a variety of applications to assist the main processor," he claimed. ®

The research has been published in this month's Nature Nanotechnology journal.

