Contests Home

Current Essay Contest

Previous Contests

**Undecidability, Uncomputability, and Unpredictability Essay Contest**

*December 24, 2019 - April 24, 2020*

Contest Partners: Fetzer Franklin Fund and The Peter and Patricia Gruber Foundation

read/discuss • winners

**What Is “Fundamental”**

*October 28, 2017 to January 22, 2018*

*Sponsored by the Fetzer Franklin Fund and The Peter & Patricia Gruber Foundation*

read/discuss • winners

**Wandering Towards a Goal**

How can mindless mathematical laws give rise to aims and intention?

*December 2, 2016 to March 3, 2017*

Contest Partner: The Peter and Patricia Gruber Fund.

read/discuss • winners

**Trick or Truth: The Mysterious Connection Between Physics and Mathematics**

*Contest Partners: Nanotronics Imaging, The Peter and Patricia Gruber Foundation, and The John Templeton Foundation*

Media Partner: Scientific American

read/discuss • winners

**How Should Humanity Steer the Future?**

*January 9, 2014 - August 31, 2014*

*Contest Partners: Jaan Tallinn, The Peter and Patricia Gruber Foundation, The John Templeton Foundation, and Scientific American*

read/discuss • winners

**It From Bit or Bit From It**

*March 25 - June 28, 2013*

*Contest Partners: The Gruber Foundation, J. Templeton Foundation, and Scientific American*

read/discuss • winners

**Questioning the Foundations**

Which of Our Basic Physical Assumptions Are Wrong?

*May 24 - August 31, 2012*

*Contest Partners: The Peter and Patricia Gruber Foundation, SubMeta, and Scientific American*

read/discuss • winners

**Is Reality Digital or Analog?**

*November 2010 - February 2011*

*Contest Partners: The Peter and Patricia Gruber Foundation and Scientific American*

read/discuss • winners

**What's Ultimately Possible in Physics?**

*May - October 2009*

*Contest Partners: Astrid and Bruce McWilliams*

read/discuss • winners

**The Nature of Time**

*August - December 2008*

read/discuss • winners


FQXi ESSAY CONTEST

October 29, 2020


First Prizes

Undecidability and indeterminism

Essay Abstract

The famous theorem of Bell (1964) left two loopholes for determinism underneath quantum mechanics, viz. non-local deterministic hidden variable theories (like Bohmian mechanics) or theories denying free choice of experimental settings (like 't Hooft's cellular automaton interpretation of quantum mechanics). However, a precise analysis of the role of randomness in quantum theory, and especially its undecidability, closes these loopholes, so that, accepting the statistical predictions of quantum mechanics, determinism is excluded full stop. The main point is that Bell's theorem does not exploit the full empirical content of quantum mechanics, which consists of long series of outcomes of repeated measurements (idealized as infinite binary sequences). It only extracts the long-run relative frequencies derived from such series, and hence merely asks hidden variable theories to reproduce certain single-case Born probabilities. For the full outcome sequences of a fair quantum coin flip, quantum mechanics predicts that these sequences (almost surely) have a typicality property called 1-randomness in logic, which is definable via computational incompressibility à la Kolmogorov and is much stronger than, e.g., uncomputability. Chaitin's remarkable version of Gödel's (first) incompleteness theorem implies that 1-randomness is unprovable (even in set theory). Combined with a change of emphasis from single-case Born probabilities to randomness properties of outcome sequences, this is the key to the above claim.

Authors Bio

Klaas Landsman (1963) is a professor of mathematical physics at Radboud University (Nijmegen, the Netherlands). He was a postdoc at DAMTP in Cambridge from 1989-1997. He mainly works in mathematical physics, mathematics (notably non-commutative geometry), and foundations of physics. His book Foundations of Quantum Theory: From Classical Concepts to Operator Algebras (Springer, 2017, Open Access) combines these interests. He is associate editor of Foundations of Physics and of Studies in History and Philosophy of Modern Physics and is a member of FQXi.

download essay • discuss essay • back to top
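The computational incompressibility invoked in the abstract above can be made concrete with a toy experiment (an illustration, not from the essay): compressed length is a crude, computable upper-bound stand-in for Kolmogorov complexity, whereas genuine 1-randomness is itself uncomputable, as the abstract notes. A minimal Python sketch:

```python
import random
import zlib

def compressed_len(bits: str) -> int:
    """Length in bytes of the zlib-compressed bit string: a crude,
    computable upper-bound proxy for Kolmogorov complexity."""
    return len(zlib.compress(bits.encode("ascii"), level=9))

# A highly regular sequence compresses far below its raw length...
periodic = "01" * 5000

# ...while a pseudorandom sequence resists compression (though it is
# NOT 1-random: the generator plus seed is a short description of it).
rng = random.Random(42)
pseudo = "".join(rng.choice("01") for _ in range(10000))

print(compressed_len(periodic), compressed_len(pseudo))
```

The periodic string compresses to a few dozen bytes while the pseudorandom one stays near its entropy bound, which is the intuition behind defining randomness via incompressibility.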


Undecidability and unpredictability: not limitations, but triumphs of science

Essay Abstract

It is a widespread belief that results like Gödel's incompleteness theorems or the intrinsic randomness of quantum mechanics represent fundamental limitations to humanity's striving for scientific knowledge. As the argument goes, there are truths that we can never uncover with our scientific methods, hence we should be humble and acknowledge a reality beyond our scientific grasp. Here, I argue that this view is wrong. It originates in a naive form of metaphysics that sees the physical and Platonic worlds as a collection of things with definite properties, such that all answers to all possible questions exist ontologically somehow but are epistemically inaccessible. This view is not only a priori philosophically questionable, but also at odds with modern physics. Hence, I argue for replacing this perspective with a worldview in which a structural notion of ‘real patterns’, not ‘things’, is regarded as fundamental. Instead of a limitation on what we can know, undecidability and unpredictability then become mere statements of undifferentiation of structure. This gives us a notion of realism that is better informed by modern physics, and an optimistic outlook on what we can achieve: we can know what there is to know, despite the apparent barriers of undecidability results.

Authors Bio

Markus P. Mueller obtained his PhD in 2007 at the Technical University of Berlin. After a postdoctoral position at the Perimeter Institute for Theoretical Physics (where he is still a Visiting Fellow), he was a Junior Research Group Leader at Heidelberg University, Germany. From 2015 to 2017, he was an Assistant Professor at the Departments of Applied Mathematics and Philosophy at the University of Western Ontario, where he held a Canada Research Chair in the Foundations of Physics. Since 2017, he has been a Group Leader at the Institute for Quantum Optics and Quantum Information (IQOQI) in Vienna.

download essay • discuss essay • back to top


Second Prize

Noisy Deductive Reasoning: How Humans Construct Math, and How Math Constructs Universes

Essay Abstract

We present a computational model of mathematical reasoning according to which mathematics is a fundamentally stochastic process. That is, on our model, whether or not a given formula is deemed a theorem in some axiomatic system is not a matter of certainty, but is instead governed by a probability distribution. We then show that this framework gives a compelling account of several aspects of mathematical practice. These include: 1) the way in which mathematicians generate research programs, 2) the role of abductive reasoning in mathematics, 3) the way in which multiple proofs of a proposition can strengthen our degree of belief in that proposition, 4) the nature of the hypothesis that there are multiple formal systems that are isomorphic to physically possible universes, and 5) the prior distribution that a Bayes rational mathematician ought to have over possible mathematical systems. Thus, by embracing a model of mathematics as not perfectly predictable, we generate a new and fruitful perspective on the epistemology and practice of mathematics.

Authors Bio

David Wolpert is a professor at the Santa Fe Institute, external faculty at the Complexity Science Hub in Vienna, and adjunct professor at ASU. He is the author of three books (and co-editor of several more), over 200 papers, has three patents, is an associate editor at over half a dozen journals, has received numerous awards, and is a fellow of the IEEE. David Kinney is an Omidyar Postdoctoral Fellow at the Santa Fe Institute. He received his PhD in Philosophy in 2019 from the London School of Economics. His work focuses on formal epistemology and philosophy of science.

download essay • discuss essay • back to top
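Point 3 of the abstract, that multiple proofs of a proposition can strengthen our degree of belief in it, admits a simple Bayesian toy illustration (the model and numbers here are hypothetical and are not the authors' framework): treat each purported proof as a noisy signal that survives checking more often when the claim really is a theorem.

```python
def posterior_after_proofs(prior: float, n_proofs: int,
                           p_check_if_theorem: float = 0.9,
                           p_check_if_not: float = 0.3) -> float:
    """Bayesian update of belief that a formula is a theorem, given
    n_proofs independent purported proofs that each passed checking.
    Hypothetical rates: a proof survives scrutiny with probability
    0.9 if the claim is true, but 0.3 if an error slipped through."""
    belief = prior
    for _ in range(n_proofs):
        num = belief * p_check_if_theorem
        den = num + (1 - belief) * p_check_if_not
        belief = num / den
    return belief

print(posterior_after_proofs(0.5, 1))  # one proof raises belief to 0.75
print(posterior_after_proofs(0.5, 3))  # three independent proofs raise it further
```

Each additional independent proof multiplies the odds by the same likelihood ratio, so belief climbs toward (but never reaches) certainty, matching the paper's picture of mathematics as stochastic rather than perfectly predictable.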


Third Prizes

Indeterminism, causality and information: Has physics ever been deterministic?

Essay Abstract

A tradition handed down among physicists maintains that classical physics is a perfectly deterministic theory, capable of predicting the future with absolute certainty independently of any interpretation. It also holds that it was quantum mechanics that introduced fundamental indeterminacy into physics. We show that there exist alternative stories to be told in which classical mechanics, too, can be interpreted as a fundamentally indeterministic theory. On the one hand, this leaves room for the many possibilities of an open future; on the other, it brings into classical physics some of the conceptual issues typical of quantum mechanics, such as the measurement problem. We discuss here some of the issues of an alternative, indeterministic classical physics and their relation to the theory of information and the notion of causality.

Authors Bio

Flavio Del Santo is a PhD student in theoretical physics at the University of Vienna and the Institute for Quantum Optics and Quantum Information (IQOQI) Vienna. His main research interests comprise the foundations of quantum mechanics, and the history and philosophy of science.

download essay • discuss essay • back to top


A Gödelian Hunch from Quantum Theory

Essay Abstract

What if the paradoxical nature of quantum theory could find its source in some undecidability analogous to that of Gödel's incompleteness theorem? This essay aims at arguing for such a Gödelian hunch, already suggested by Szangolies, via two case studies. Firstly, using a narrative based on the Newcomb problem, the theological motivational origin of quantum contextuality is introduced in order to show how this result might be related to a Liar-like undecidability. A topological generalization of contextuality by Abramsky et al., in which the logical structure of quantum contextuality is compared with “Liar cycles”, is also presented. Secondly, the measurement problem is analyzed as emerging from a logical error. A personal analysis of the related Wigner's friend thought experiment and a recent paradox by Frauchiger and Renner is presented, introducing the notion of “meta-contextuality” as a Liar-like feature underlying the neo-Copenhagen interpretations of quantum theory. Finally, this quantum Gödelian hunch opens a discussion of the paradoxical nature of quantum physics and the emergence of time itself from self-contradiction.

Authors Bio

I am a second-year PhD student studying quantum foundations. During my master's studies, I worked on quantum contextuality, supervised by Alexei Grinbaum at CEA Paris-Saclay, and I did my master's thesis on superpositions of quantum causal orders, supervised by Cyril Branciard at the Institut Néel (Grenoble). My PhD thesis, supervised by Cyril Branciard, aims at theoretically clarifying a conceptual connection between quantum contextuality and quantum causality.

download essay • discuss essay • back to top


Undecidability, Fractal Geometry and the Unity of Physics

Essay Abstract

An uncomputable class of geometric model is described and used as part of a possible framework for drawing together the three great but largely disparate theories of 20th Century physics: general relativity, quantum theory and chaos theory. This class of model derives from the fractal invariant sets of certain nonlinear deterministic dynamical systems. It is shown why such subsets of state-space can be considered formally uncomputable, in the same sense that the Halting Problem is undecidable. In this framework, undecidability is only manifest in propositions about the physical consistency of putative hypothetical states. By contrast, physical processes occurring in space-time continue to be represented computably. This dichotomy provides a non-conspiratorial approach to the violation of Statistical Independence in the Bell Theorem, thereby pointing to a possible causal deterministic description of quantum physics.

Authors Bio

Tim Palmer is a Royal Society (350th Anniversary) Research Professor in the Physics Department at the University of Oxford. Tim's PhD (under Dennis Sciama) provided the first quasi-local expression for gravitational energy-momentum in general relativity. Through most of his research career, Tim worked on the chaotic dynamics of the climate system and pioneered the development of ensemble methods for weather and climate prediction, for which he won the Institute of Physics' Dirac Gold Medal. However, Tim has retained an interest in the foundations of physics and has published a number of papers on non-computability in quantum physics (the first in 1995).

download essay • discuss essay • back to top


Fourth Prizes

Unpredictability and Randomness

Essay Abstract

Randomness is somewhat the opposite of determinism. This essay tries to put these two on the same page. It argues for the premise that randomness is a consequence of a deterministic process. It also provides yet another viewpoint on hidden variable theory.

Authors Bio

Engineering, Computer Science, PhD candidate

download essay • discuss essay • back to top


Why is the universe comprehensible?

Essay Abstract

Why is the universe comprehensible? How is it that we can come to know its regularities well enough to exploit them for our own gain? In this essay I argue that the nature of our comprehension lies in the mutually agreed-upon methodology we use to attain it and in the basic stability of the universe. But I also argue that the very act of comprehension itself places constraints on what we can comprehend by forcing us to establish a context for our knowledge. In this way the universe has managed to conspire to make itself objectively comprehensible to subjective observers.

Authors Bio

Ian Durham is a physicist with Saint Anselm College who studies the foundations of physics, formal models of consciousness and free will, and relativistic quantum information. Incomprehensibly he has served as department chair for nearly a decade without committing a felony.

download essay • discuss essay • back to top


Discretionary Prize for an interesting literary discourse

Computational Complexity as Anthropic Principle

Essay Abstract

Discoveries over the last two centuries, such as deterministic chaos, computational complexity, and the black hole information paradox, have shown us that Laplacian demons are impossible. This need not be seen as a defeat, for we might be able to use these limits to constrain our theories and thus bring us closer to a more accurate picture of our world.

Authors Bio

Rick Searle is a writer and educator living in central Pennsylvania. He is an affiliate scholar for the Institute for Ethics and Emerging Technology where his essays occur regularly and a member of The Future of Life Institute. He is the author and editor of the book Rethinking Machine Ethics in the Age of Ubiquitous Technology. He blogs at Utopia or Dystopia: where past meets future.

download essay • discuss essay • back to top


Discretionary Prize for a creative approach to the problem

Epistemic Horizons: This Sentence is 1/√2(|True> + |False>)

Essay Abstract

In [Found. Phys. 48.12 (2018): 1669], the notion of 'epistemic horizon' was introduced as an explanation for many of the puzzling features of quantum mechanics. There, it was shown that Lawvere's theorem, which forms the categorical backdrop to phenomena such as Gödelian incompleteness, Turing undecidability, Russell's paradox and others, applied to a measurement context, yields bounds on the maximum knowledge that can be obtained about a system. We give a brief presentation of the framework, and then proceed to study it in the particular setting of Bell's theorem. Closing the circle in the antihistorical direction, we then proceed to use the obtained insights to shed some light on the supposed incompleteness of quantum mechanics itself, as famously argued by Einstein, Podolsky, and Rosen.

Authors Bio

Jochen Szangolies acquired a PhD in quantum information theory at the Heinrich-Heine-University in Düsseldorf. He has worked on quantum contextuality, quantum correlations and their detection, as well as the foundations of quantum mechanics. He is the author of "Testing Quantum Contextuality: The Problem of Compatibility".

download essay • discuss essay • back to top
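The diagonal construction that, per the abstract, Lawvere's theorem abstracts (and that underlies Gödelian incompleteness, Turing undecidability, and Russell's paradox) can be sketched in a few lines. This is a generic illustration of diagonalization, not the essay's categorical framework: given any enumeration of binary sequences, flipping the diagonal yields a sequence the enumeration necessarily misses.

```python
from typing import Callable

# An infinite binary sequence, represented as a function from index to bit.
Seq = Callable[[int], bool]

def diagonalize(enum: Callable[[int], Seq]) -> Seq:
    """Cantor/Lawvere diagonal: a sequence that differs from the k-th
    enumerated sequence at position k, hence lies outside the range of
    `enum`, no matter what `enum` is."""
    return lambda n: not enum(n)(n)

# Example enumeration: the k-th sequence is the binary expansion of k.
enum = lambda k: (lambda n: bool((k >> n) & 1))
d = diagonalize(enum)

# d disagrees with every enum(k) at position k, by construction:
print(all(d(k) != enum(k)(k) for k in range(100)))  # True
```

The same self-referential flip reappears in each of the phenomena the abstract lists; Lawvere's theorem packages them as one statement about fixed points.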
