Scientists have created a new material for computer chips that mimics the way the human brain works, according to a study published in the journal Science Advances. The authors, from the University of Cambridge, UK, argue that it could slash the amount of energy used by artificial intelligence.
AI systems currently consume vast amounts of electricity because they rely on chips that constantly move data back and forth between memory and processing units. As AI becomes more widespread across industries, this energy demand is growing rapidly. The new research centres on a type of component called a memristor, designed to process and store information in a way that closely resembles how neurons connect and communicate in the brain. Brain-inspired computing could cut that energy use by as much as 70% by handling both memory and processing in the same place, just as the brain does.
“Energy consumption is one of the key challenges in current AI hardware,” said lead author Dr Babak Bakhit, from Cambridge’s Department of Materials Science and Metallurgy. “To address that, you need devices with extremely low currents, excellent stability, outstanding uniformity across switching cycles and devices, and the ability to switch between many distinct states.”
Most existing memristors work by forming tiny conductive pathways inside a material, but these pathways tend to behave unpredictably and require relatively high voltages to operate. The Cambridge team took a different approach, developing a new thin film made from hafnium oxide (with small amounts of strontium and titanium added) that switches states in a far more controlled and stable way.
“Filamentary devices suffer from random behaviour,” said Bakhit. “But because our devices switch at the interface, they show outstanding uniformity from cycle to cycle and from device to device.”
In tests, the new devices used switching currents about a million times lower than some conventional alternatives, and were able to hold hundreds of distinct, stable states, which is key for the kind of adaptable, learning hardware that next-generation AI would need. The devices also replicated a fundamental learning mechanism seen in the brain, where connections between neurons strengthen or weaken depending on the timing of signals.
“These are the properties you need if you want hardware that can learn and adapt, rather than just store bits,” said Bakhit.
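The timing-dependent learning rule described above is often modelled as spike-timing-dependent plasticity (STDP). The sketch below is purely illustrative and is not taken from the paper: the function name, constants, and the pair-based exponential form are assumptions chosen to show the general idea that a connection strengthens when the incoming signal arrives just before the outgoing one, and weakens when it arrives just after.

```python
import math

# Illustrative pair-based STDP rule. The constants below are
# assumed for this sketch; the study's actual plasticity model
# is not described in the article.
A_PLUS = 0.1    # maximum strengthening per spike pair
A_MINUS = 0.12  # maximum weakening per spike pair
TAU_MS = 20.0   # timing window (milliseconds)

def stdp_delta(pre_ms, post_ms):
    """Weight change for one pre/post spike pair.

    If the presynaptic spike precedes the postsynaptic one,
    the connection strengthens; if it follows, it weakens.
    The effect decays exponentially with the timing gap.
    """
    dt = post_ms - pre_ms
    if dt > 0:
        return A_PLUS * math.exp(-dt / TAU_MS)
    if dt < 0:
        return -A_MINUS * math.exp(dt / TAU_MS)
    return 0.0

# Pre fires 5 ms before post: connection strengthens
print(stdp_delta(10.0, 15.0) > 0)  # True
# Pre fires 5 ms after post: connection weakens
print(stdp_delta(15.0, 10.0) < 0)  # True
```

In a memristive synapse, this weight change would correspond to nudging the device between its many stable conductance states, which is why holding hundreds of distinct states matters for learning hardware.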
There is still one significant hurdle to clear: the manufacturing process currently requires temperatures of around 700°C, which is higher than standard chip-making processes can accommodate. “This is currently the main challenge in our device fabrication process,” Bakhit said. “But we’re now working on ways to bring the temperature down to make it more compatible with standard industry processes.”
The breakthrough came after nearly three years of failed experiments, with the first promising results emerging only late last year. “I spent almost three years on this,” said Bakhit. “There were a huge number of failures. But at the end of November, we saw the first really good results. It’s still early days, of course, but if we can solve the temperature issue, this technology could be game-changing, because the energy consumption is so much lower and at the same time the device performance is highly promising.”
If the team can solve the temperature problem and integrate the devices into standard chips, the technology could represent a major step forward in making AI far less energy-hungry.
Babak Bakhit et al., “HfO2-based memristive synapses with asymmetrically extended p-n heterointerfaces for highly energy-efficient neuromorphic hardware.” Sci. Adv. 12, eaec2324 (2026). DOI: 10.1126/sciadv.aec2324