
Zenith Grant Awardee

John Bechhoefer

Simon Fraser University

Co-Investigators

David Sivak, Simon Fraser University; Susanne Still, University of Hawaii at Manoa

Project Title

Maxwell's demon in the real world: Experiments on the constraints governing information processing

Project Summary

Why is information processing so costly? About 5% of US energy consumption is devoted to computing, and about 20% of the calories consumed by a human body are devoted to operating the brain. These numbers are far greater than the fundamental limits implied by thermodynamics, but why? Some of the energy consumption may be unavoidable, since the most lenient limits apply only to devices that operate very slowly, whereas practical concerns favor faster, costlier operation. Other inefficiencies, however, could potentially be cured by a better understanding of the underlying mechanisms. This is our focus.

More than a hundred and fifty years ago, Maxwell proposed a thought experiment that captivated physicists and the public alike. In modern language, Maxwell (and later Szilard and others) argued that acquiring information about a system can allow work to be extracted from a single reservoir of heat, something the second law of thermodynamics seems to forbid. The way out of this paradox is to understand that “information is physical”: running the device that acquires and memorizes information about a system also requires work. Indeed, the second law implies that it requires at least as much work as can be extracted using the information. But other factors, such as the speed of operation and details of system design, can limit the efficiency with which information can be used to fuel devices.

We will study these factors using a “feedback trap,” an instrument we have been developing over the last ten years to study and control the motion of microscopic particles. Using light, we can create essentially arbitrary potentials that apply any desired forces. We will then introduce a remarkable experimental realization in which, simply by observing a weight bouncing up and down on a spring, we can lift it without effort. Such a device is a ratchet: it converts thermal fluctuations into stored work, which may later be used as desired to power other machines; in effect, the ratchet is fueled by information. We will investigate its efficiency in converting this information into stored work, focusing on the constraints that prevent maximum efficiency from being realized. One recently identified structural constraint on efficiency is a mismatch between what can be seen and what can be done to a system; we will conduct a diverse array of experiments on simple microscopic systems in which we impose such structural constraints and study them systematically. The insights gained in these experiments will then be leveraged to construct experimental versions of the two elementary logical operations underlying computers; we will test their reversibility under slow operation and study the minimum costs when they are constrained to operate at finite speed.
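To make the ratchet idea concrete, here is a minimal numerical sketch (an illustration, not the actual experimental protocol): an overdamped bead on a spring under an effective gravitational pull, in dimensionless units where kT, the spring constant, and the relaxation time all equal 1. All parameter values are illustrative assumptions.

import numpy as np

# Minimal information-ratchet sketch (illustrative, not the experiment):
# an overdamped bead in a harmonic "spring" potential plus a constant
# gravity-like force, in dimensionless units (kT = 1, spring constant = 1,
# relaxation time = 1).
rng = np.random.default_rng(1)
dt = 1e-3          # time step (fraction of the relaxation time)
g = 0.5            # effective gravitational force on the bead
x = 0.0            # bead position
x0 = 0.0           # rest position of the spring (the ratchet variable)
raised = 0.0       # total distance the rest position has been raised

for _ in range(100_000):
    # Overdamped Langevin step: spring force + gravity + thermal kick.
    force = -(x - x0) - g
    x += force * dt + np.sqrt(2 * dt) * rng.standard_normal()
    # Feedback rule: when a thermal fluctuation carries the bead above
    # the rest position, move the rest position up to the bead.  The
    # spring's potential energy does not increase during this move, so
    # no work needs to be supplied; the bead has been ratcheted upward.
    if x > x0:
        raised += x - x0
        x0 = x

# Approximate gravitational energy stored by harvesting fluctuations.
print("stored work (units of kT):", g * raised)

Because the rest position only ever moves up, the bead's mean height ratchets steadily upward, storing gravitational energy harvested from the thermal bath; the information gained by observing each fluctuation is what makes this possible.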

Our work will lead to the creation of real-world, thermodynamically ultra-efficient information-processing systems. These physical realizations will clarify the way forward, both to constructing technological systems and to understanding biological information processing, and will serve as exemplars demanding engagement in philosophical discussions, thereby sharpening our understanding of the role information plays in thermodynamics.

Technical Abstract

Recent advances in theory and experiment have established that the second law of thermodynamics, properly extended to account for information, sets ultimate limits on the conversion of information to the ordered flow of energy, i.e., work. This improved understanding enables the construction of small-scale information engines and may lead to deeper insights into biological systems, such as molecular motors. However, real systems are subject to constraints that may not allow them, not even in principle, to achieve the ultimate limits. We propose to investigate tighter, more realistic bounds on information-to-work conversion, set by two generic factors that decrease efficiency in information-powered motors and ratchets: temporal (finite-time) and structural constraints.
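A standard way to write the information-extended second law referred to here (our notation, added for orientation) is

\[ \langle W \rangle \;\ge\; \Delta F \;-\; k_{\mathrm B}T\,I , \]

where \(\langle W \rangle\) is the average work supplied, \(\Delta F\) the free-energy change, and \(I\) the mutual information (in nats) gained by measurement. Equivalently, measurement permits the extraction of at most \(k_{\mathrm B}T\,I\) of work beyond the conventional bound; for a one-bit Szilard engine, \(I = \ln 2\), giving a ceiling of \(k_{\mathrm B}T \ln 2\) per cycle.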

To that end, we will perform a suite of connected experiments on a classical system: a Brownian particle diffusing in water (the heat bath) in a controlled potential created by a novel feedback trap. This methodology achieves essentially arbitrary potentials and force fields at nanometer length scales and millisecond time scales.
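For orientation, a feedback trap of this kind is commonly modeled (a generic description, not a specification of the apparatus) by a discrete-time overdamped Langevin update, in which the force applied during each interval is computed from a virtual potential evaluated at the latest noisy position measurement:

\[ x_{n+1} = x_n + \frac{D\,\Delta t}{k_{\mathrm B}T}\,F(\bar{x}_n) + \sqrt{2D\,\Delta t}\;\xi_n, \qquad \bar{x}_n = x_n + \chi_n, \qquad F(x) = -\,\partial_x V_{\mathrm{virtual}}(x), \]

where \(D\) is the particle's diffusion coefficient, \(\Delta t\) the feedback interval (of order milliseconds), \(\xi_n\) a standard Gaussian variate, and \(\chi_n\) the measurement noise (of order nanometers). Because \(V_{\mathrm{virtual}}\) is defined in software, it can be given nearly any shape, which is what makes the potentials essentially arbitrary.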

Finite-time constraints occur when a protocol manipulating a system must finish rapidly. Ultimate thermodynamic limits require very slow manipulations, yet real-world motors and ratchets need efficient operation at meaningfully rapid rates. In finite-time operation, thermodynamic costs depend on the protocol, but there will exist an optimal protocol that minimizes cost (e.g., work supplied) for the chosen completion time. We will test predictions for optimal bit-erasure and work-extraction protocols in finite time.
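As a concrete example of the kind of finite-time bound at stake (a commonly derived scaling for slowly driven overdamped systems, quoted as background rather than as this project's specific prediction), the minimum average work to erase one bit in a protocol of duration \(\tau\) takes the form

\[ \langle W \rangle_{\min} \;\approx\; k_{\mathrm B}T \ln 2 \;+\; \frac{A}{\tau}, \]

where the Landauer term \(k_{\mathrm B}T \ln 2\) is reached only as \(\tau \to \infty\) and the coefficient \(A\) depends on the friction and the geometry of the memory; an optimal protocol is one that achieves the smallest possible excess work for the chosen completion time.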

Structural constraints arise when measurements of a system are not “well aligned” with its manipulations, making some of the acquired information “irrelevant,” that is, useless for work extraction. We will investigate cases such as a combined Szilard engine and measuring device, encoded respectively in the x and y coordinates of a two-dimensional potential, in which the imposed control potential is misaligned with those coordinates. We will also examine a ratchet with continuous state variables and measurements, where the mismatch is between the measurement scale (resolution) and the manipulation scale (the size of the controllable region of state space). By varying the degree of mismatch, we can quantitatively demonstrate its consequences in terms of reduced operating efficiency.
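A generic way to express this structural penalty (our schematic formulation; the project's precise information decomposition may differ) is to split the acquired mutual information \(I\) into a relevant part \(I_{\mathrm{rel}}\), about degrees of freedom the controller can actually manipulate, and an irrelevant remainder, so that

\[ W_{\mathrm{ext}} \;\le\; k_{\mathrm B}T\, I_{\mathrm{rel}} \;\le\; k_{\mathrm B}T\, I , \]

and the efficiency deficit grows with the fraction of measured information that cannot be acted upon.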

An important application of these ideas is to computing. We will create Brownian versions of a complete set of logical gates (capable, when compounded, of realizing any logical operation). We will verify that sufficiently slow operations are reversible, then investigate the finite-time and structural limitations due to mismatch.
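As a worked illustration of the entropy accounting behind these costs (an illustrative calculation, not a planned measurement): a NAND gate maps two input bits to one output bit, so it is logically irreversible, and the Landauer bound charges at least \(k_{\mathrm B}T \ln 2\) per bit of logical entropy destroyed. The short script below, assuming uniformly random inputs, computes that minimum cost.

import math

# Landauer accounting for a NAND gate over uniformly random inputs
# (an illustrative assumption, not a statement about the experiments).
inputs = [(a, b) for a in (0, 1) for b in (0, 1)]
outputs = [1 - (a & b) for a, b in inputs]

h_in = math.log2(len(inputs))            # input entropy: 2 bits
p1 = outputs.count(1) / len(outputs)     # P(output = 1) = 3/4
h_out = -(p1 * math.log2(p1) + (1 - p1) * math.log2(1 - p1))

bits_lost = h_in - h_out                 # logical entropy destroyed
print(f"logical entropy destroyed: {bits_lost:.3f} bits")
print(f"minimum dissipation: {bits_lost * math.log(2):.3f} kT per NAND operation")

A logically reversible implementation (for example, one that retains a copy of its inputs) destroys no logical entropy and has no such floor, which is one reason slow, reversible operation is the natural baseline against which finite-time costs are measured.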

The result of our work will be to understand, at last, why so much energy is required in practice to carry out computation and information processing, whether in technological or biological contexts. ∼5% of US energy consumption is devoted to computing, and ∼20% of the calories consumed by a human body are devoted to operating the brain, in each case far more energy than the ultimate thermodynamic limits require. Our experiments will test theories accounting for these higher costs and show how information-processing costs may be reduced through better-informed design.
