
The Free Energy Principle for Thinking Machines

25 January 2022
Giulio Prisco


Researchers at the RIKEN Center for Brain Science have shown that the free-energy principle can explain how neural networks are optimized for efficiency.

The free-energy principle is a mathematical optimization principle that many systems, including living systems, are thought to follow: such systems always act to minimize a mathematical function called free energy. While the principle is not yet rigorously derivable from known science, it appears to hold in many important cases.

The free-energy principle is based on Bayesian inference, a mathematical technique for updating probability estimates as new information becomes available. This makes it useful for planning, optimization, and decision making.
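As a simple illustration (not taken from the study), a single Bayesian update can be sketched in a few lines of Python; the prior and likelihoods here are invented numbers:

```python
# Hedged sketch of Bayes' rule for a binary hypothesis:
#   P(H|D) = P(D|H) * P(H) / P(D)
# All values below are hypothetical, for illustration only.

def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Return the posterior probability of a hypothesis after one observation."""
    evidence = likelihood_if_true * prior + likelihood_if_false * (1 - prior)
    return likelihood_if_true * prior / evidence

# Start 50/50; the observation is 4x more likely if the hypothesis is true.
posterior = bayes_update(prior=0.5, likelihood_if_true=0.8, likelihood_if_false=0.2)
print(posterior)  # 0.8
```

Each new observation can feed the previous posterior back in as the next prior, which is the "updating as new information becomes available" the principle relies on.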

Living systems continuously update themselves with new incoming sensory data, and they remember their own past decisions. The researchers have shown that the same is true of neural networks.

Biological optimization makes our bodies and behavior as efficient as possible. In the brain, neural networks are optimized to allow efficient control of behavior and transmission of information. This is done while still maintaining the ability to adapt and reconfigure to changing environments.

The researchers compared the free-energy principle with well-established rules that control how the strength of neural connections within a network can be altered by changes in sensory input.

In his recent and widely praised book, “Being You: A New Science of Consciousness” (2021), neuroscientist Anil Seth describes the free-energy principle. He writes that it's “a piece of mathematical philosophy” according to which “organisms maintain themselves in the low-entropy states that ensure their continued existence by actively minimising this measurable quantity called free energy.” And it's “basically the same thing as sensory prediction error.”
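To make the "sensory prediction error" reading concrete, here is a minimal sketch (my illustration, not Seth's or the study's code) in which an internal estimate is repeatedly nudged to reduce its squared prediction error against incoming sensory samples:

```python
# Hedged sketch: treating (squared) sensory prediction error as the
# quantity to minimize. The sensory stream and learning rate are invented.

def minimize_prediction_error(observations, estimate=0.0, rate=0.1):
    """Gradient steps on squared error between an estimate and each observation."""
    for obs in observations:
        error = obs - estimate          # sensory prediction error
        estimate += rate * error        # move the internal model toward the data
    return estimate

# The estimate drifts toward the mean of the (hypothetical) sensory stream.
print(minimize_prediction_error([1.0, 1.2, 0.9, 1.1] * 50))
```

The estimate settles near the average of the stream, which is the sense in which minimizing prediction error keeps an internal model matched to its sensory input.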

There's also a forthcoming book, “Active Inference: The Free Energy Principle in Mind, Brain, and Behavior” (2022), co-authored by Karl Friston, who introduced the free-energy principle. His book is likely to provide an end-to-end explanation.

The researchers have demonstrated that standard neural networks “perform planning and adaptive behavioral control by taking their previous ‘decisions’ into account,” says researcher Takuya Isomura in a press release issued by RIKEN. “Importantly, they do so the same way that they would when following the free-energy principle.”

The open-access study, published in Communications Biology, shows that the free-energy principle underlies any neural network that minimizes energy cost. As a proof of concept, it then shows that a free-energy-minimizing neural network can solve mazes.

This research points the way to a better understanding of general neural networks. It suggests a better understanding of brain networks and their impairments and malfunctions in disorders such as schizophrenia. And it may contribute to the development of thinking machines based on neural networks.

“Our findings guarantee that an arbitrary neural network can be cast as an agent that obeys the free-energy principle, providing a universal characterization for the brain,” adds Isomura. “Our theory can dramatically reduce the complexity of designing self-learning neuromorphic hardware to perform various types of tasks, which will be important for a next-generation artificial intelligence.”
