FQXI ARTICLE

The Thermodynamic Limits on Intelligence: Q&A with David Wolpert
Calculating the energy needed to acquire and compute information could help explain the (in)efficiency of human brains and guide the search for extra-terrestrial intelligence.
by Miriam Frankel
FQXi Awardees: David Wolpert
March 25, 2022


Searching For Signs of Intelligence
Credit: sdecoret, Shutterstock
The question of just what exactly distinguishes living matter from dead lumps of atoms, such as rock, is one of the greatest mysteries of physics. Even our best theories can’t really describe life, agency, consciousness or intelligence. But, with the help of an FQXi grant of over $118,000, David Wolpert, a physicist at the Santa Fe Institute, New Mexico, is trying to crack the puzzle with information theory and statistical physics. He wants to understand what constitutes and limits intelligence—defined as the ability to acquire information from surroundings and use it for computation in order to stay alive. The findings may one day help explain why human brains aren’t more efficient—and how to best search for intelligent life in the universe.

You propose that intelligence is intimately tied to information gathering. As such, it makes sense that information theory, first developed by Claude Shannon in the 1940s, is a good approach to understanding it. But you are also combining this analysis with statistical physics, an approach normally used by physicists working on thermodynamics—the science of heat and energy transfer—and trying to consider the properties of large groups of atoms. What’s the benefit of considering intelligence in terms of statistical physics or thermodynamics? And what challenges come with applying it to living systems?

When it comes to life, it’s very natural to use statistical physics because you’re trying to generalize across many different biochemical systems—all of which are far away from thermodynamic equilibrium, and rely on thermodynamic phenomena to stay that way; these are what are called ’non-equilibrium systems.’ Think, for example, of a cup of hot tea cooling down to the same temperature as its surroundings—a state of equilibrium—after which nothing changes. But if you look at everything around us—all the interesting systems—they’re not at equilibrium.

Do you mean, for example, biological processes such as those responsible for pumping certain ions into our cells and thereby creating a higher concentration of such ions inside them compared to outside—a state out of equilibrium?

Yes, or, as another example, the biological process of a journalist interviewing a nerd over Zoom. None of these processes are in equilibrium. In fact, arguably, equilibrium means death. Certainly, equilibrium means no intelligence.

Fortunately though, there was actually a bit of a revolution in statistical physics, starting in the 21st century, with the birth of the field of non-equilibrium statistical physics, or "stochastic thermodynamics" as it is often called. This new field was not just more complicated math built on what came before. It was more like "wait, let’s look at this thing a little differently"—and then when you’ve done that, you’ve knocked open the piñata, and all this formalism falls out. It has basically started to explode. And it provides us with a way of doing statistical physics for evolving systems that are arbitrarily far away from thermal equilibrium.

So how can this help us understand intelligence?

When you’re getting "semantic information"—information which carries meaning for a given system—from the environment, you are outside of thermal equilibrium. If we were to stop that information from coming in, if we were to intervene in the dynamic process coupling the system to its environment, that system would relax to thermal equilibrium (Kolchinsky, A. & Wolpert, D. H., Interface Focus 8, 20180041 (2018)). Imagine you are a cell and you can go right or left—and there’s more food to your right and less food to your left. You probe in both directions to see where the amount of food is going up. If you don’t do this, you’ll run out of food and die.

You are trying to work out how much energy it takes to maintain intelligence in this way under various limiting constraints. Why?


David Wolpert
Santa Fe Institute
Whenever you do a very simple computational operation, such as the very simple operations underlying the functioning of neurons in the brain, there’s a minimal thermodynamic cost, that is, a minimal amount of energy required. We know the minimum energetic cost of simply erasing one bit—0 or 1—of information: it is Boltzmann’s constant (which is tiny, on the order of 10⁻²³ joules per kelvin) multiplied by the temperature and the natural logarithm of 2 (roughly 0.69). This is teeny. Even if you scale it up to the human brain or to living systems, it’s insignificant. So why is it that with systems such as the human brain, where there are massive evolutionary incentives to reduce the cost of intelligence, the best we can do is get it down to the brain consuming 20% of all your calories? Or, as another example, why is it that the digital computers we use, such as your phone and your Mac, use so much energy, despite all these brilliant computer engineers trying to use as little energy as possible? Why is it that cells require so much energy to do the very, very simple kinds of intelligence that they do? It must be because you can’t actually get anywhere close to the tiny theoretical limit. There must be other constraints on how the system can behave that keep it from getting anywhere close to the theoretical minimum energetic cost.
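The Landauer bound Wolpert describes—Boltzmann’s constant times temperature times the natural log of 2—can be made concrete with a few lines of arithmetic. This is a rough illustrative sketch: the body temperature and the 20-watt figure for brain power are common ballpark assumptions, not numbers from the interview.

```python
import math

# Landauer's principle: the minimum energy to erase one bit is k_B * T * ln(2).
K_B = 1.380649e-23   # Boltzmann's constant in J/K (exact in the 2019 SI)
T_BODY = 310.0       # approximate human body temperature in kelvin (assumption)

def landauer_limit(temperature_k: float) -> float:
    """Minimum energy in joules to erase one bit at the given temperature."""
    return K_B * temperature_k * math.log(2)

e_bit = landauer_limit(T_BODY)      # roughly 3e-21 J per bit erased

# A commonly cited rough figure for human brain power is ~20 W (assumption).
# At the Landauer limit, that budget would pay for an enormous number of
# bit erasures per second—illustrating how far real brains sit above it.
brain_power_w = 20.0
erasures_per_second = brain_power_w / e_bit
```

The point of the comparison is Wolpert’s: the theoretical floor is so low that a 20-watt brain could, in principle, erase on the order of 10²¹ bits every second, so the actual energy budget of neurons must be dominated by other constraints.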

What might those constraints be?

Your brain is constrained in that whatever intelligent actions you take have to be built out of neurons—with all kinds of noisy stuff going on. That’s a massive constraint. And the digital engineers making your phone and your Mac are under the constraint that they have to use what’s called CMOS (complementary metal-oxide-semiconductor) technology. Those constraints must be what’s causing the energetic needs of intelligent systems to be so large. We know what these constraints are in many real systems. But for much less sophisticated systems, where we’re dealing with much simpler sets of more fundamental constraints, this relationship between energy, constraints and intelligence is still very, very complicated.

Can this help us better understand human intelligence?

We might gain much greater insights into living organisms here on Earth, as well as into the origins of life. If we can understand the relationship between the constraints, the energetics and intelligence, we might be able to answer questions such as "Why the hell can’t we have brains that are much less costly?" There are great incentives for evolution to keep us as dumb as possible. So to understand the evolutionary history of humans—why, with us, evolution decided to keep going despite the costs—we need to learn about the thermodynamic characteristics, the energetic constraints and the energetic requirements of intelligent systems.

Could this help build better intelligent systems, such as AI?

Hopefully, it could one day help us build more energetically efficient ones.

What other insights might this line of research lead to?

If we could understand this, and the general physics of it, that would give us a way to understand, for example, what we’re looking at when we send probes to the Jovian atmosphere. If there are living intelligent beings on Jupiter, they’re not going to be like us, made out of organic chemical molecules. They’ll be something else—something very strange. If there are creatures living in the photosphere of the sun—building huge civilizations there—they would not be like us. However, if we can find systems whose physics is the same as the physics that we identified for intelligent systems, then we’ll be in a better place to discover them.
