CATEGORY:
Undecidability, Uncomputability, and Unpredictability Essay Contest (2019-2020)
TOPIC:
Undecidability, Fractal Geometry and the Unity of Physics by Tim Palmer
Author Tim Palmer wrote on Jan. 25, 2020 @ 17:48 GMT
Essay Abstract
An uncomputable class of geometric model is described and used as part of a possible framework for drawing together the three great but largely disparate theories of 20th Century physics: general relativity, quantum theory and chaos theory. This class of model derives from the fractal invariant sets of certain nonlinear deterministic dynamical systems. It is shown why such subsets of state-space can be considered formally uncomputable, in the same sense that the Halting Problem is undecidable. In this framework, undecidability is only manifest in propositions about the physical consistency of putative hypothetical states. By contrast, physical processes occurring in space-time continue to be represented computably. This dichotomy provides a non-conspiratorial approach to the violation of Statistical Independence in the Bell Theorem, thereby pointing to a possible causal deterministic description of quantum physics.
Author Bio
Tim Palmer is a Royal Society (350th Anniversary) Research Professor in the Physics Department at the University of Oxford. Tim's PhD (under Dennis Sciama) provided the first quasi-local expression for gravitational energy-momentum in general relativity. Through most of his research career, Tim worked on the chaotic dynamics of the climate system and pioneered the development of ensemble methods for weather and climate prediction, for which he won the Institute of Physics' Dirac Gold Medal. However, Tim has retained an interest in the foundations of physics and has published a number of papers on non-computability in quantum physics (the first in 1995).
Download Essay PDF File
Note: This Essay PDF was replaced on 2020-04-17 07:52:00 UTC.
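As illustrative background for readers new to the abstract's central object (my own sketch, not part of the essay): the fractal invariant sets referred to arise as attractors of nonlinear deterministic systems. A few lines of Python exhibit the idea for the Lorenz system, a standard example of such a system.

```python
# Illustration only (not the essay's model): trajectories of a chaotic
# deterministic system converge onto a bounded fractal invariant set.
def lorenz_step(x, y, z, dt=0.002, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz equations."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dt * dx, y + dt * dy, z + dt * dz

def evolve(state, n_steps):
    """Iterate n_steps times from an initial state (x, y, z)."""
    x, y, z = state
    for _ in range(n_steps):
        x, y, z = lorenz_step(x, y, z)
    return x, y, z

# Two nearly identical initial states: both remain on the bounded attractor
# (the invariant set), yet sensitive dependence drives them far apart on it.
a = evolve((1.0, 1.0, 1.0), 50_000)
b = evolve((1.0, 1.0, 1.0 + 1e-6), 50_000)
print(a)
print(b)
```

The essay's claim concerns a much stronger property than this sketch shows: membership of such an invariant set can be formally uncomputable, whereas the step-by-step evolution above remains computable.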
David Brown wrote on Jan. 26, 2020 @ 01:54 GMT
From page 6, "In a local deterministic theory, each pair of entangled particles is described by a supplementary variable λ, often referred to as a hidden variable ... " — is this meant to be the definition of a "local deterministic theory" or is it a statement about the conventional wisdom of physicists concerning a "local deterministic theory"? Is it possible that time, space, energy, quantum information, and the concept of “variable” are merely approximations that are not quite correct? Consider 2 questions: (1) Is Milgrom the Kepler of contemporary cosmology? (2) Are Riofrio, Sanejouand, and Pipino neglected geniuses instead of crackpots as Motl and others believe them to be? Consider Fredkin’s idea: Infinities, infinitesimals, perfectly continuous variables, and local sources of randomness do not occur in nature. I have presented various speculations: dark-matter-compensation-constant = (3.9±.5) * 10^-5, supersymmetry does not occur in nature, magnetic monopoles do not occur in free space, measurement is a natural process that separates the boundary of the multiverse from the interior of the multiverse, and so on. Motl thinks that I am a crackpot, and so do Steven Weinberg, Sheldon Glashow, Frank Wilczek, and even Milgrom himself. They might be correct. There might be scientific proof that I am a crackpot: I claim that the Gravity Probe science team misinterpreted their own experiment. According to the Wikipedia article “Gravity Probe B” (as of 25 January 2020), “Analysis team realised that more error analysis was necessary (particularly around the polhode motion of the gyros) …”
Let us consider Clifford Will’s article:
“Viewpoint: Finally, results from Gravity Probe B”, May 31, 2011, Physics (physics.aps.org). There is the statement: “Finally, during a planned 40-day, end-of-mission calibration phase, the team discovered that when the spacecraft was deliberately pointed away from the guide star by a large angle, the misalignment induced much larger torques on the rotors than expected. From this they inferred that even the very small misalignments that occurred during the science phase of the mission induced torques that were probably several hundred times larger than the designers had estimated.” The Gravity Probe B science team attribute the problem to “random patches of electrostatic potential fixed to the surface of each rotor, and similar patches on the inner surface of its spherical housing … The team was able to account for these effects in a parametrized model.” I attempted to convince the Gravity Probe B science team that Milgrom is the Kepler of contemporary cosmology, that the 4 ultra-precise gyroscopes worked correctly, that the gyroscopes indicated dark-matter-compensation-constant is nonzero (contrary to Newton and Einstein), and that their parametrized model is a post-hoc rationalization for the expected outcome. The Gravity Probe B science team (or at least 3 of them) dismissed my opinion. My guess is that quantum information reduces to Fredkin-Wolfram information, and that MOND is a direct consequence of string theory with the finite nature hypothesis. See the section “Abnormal Acceleration of the Pioneer 10, 11, Galileo and Ulysses Probes” in Pipino’s 2019 article “Evidences for Varying Speed of Light with Time”.
Author Tim Palmer replied on Jan. 27, 2020 @ 19:25 GMT
You can take it as a statement about conventional wisdom, allowing me to relate my uncomputable model to more conventional computable models.
Dizhechko Boris Semyonovich wrote on Jan. 27, 2020 @ 16:55 GMT
Dear Tim Palmer, Your essay is the most verbose and most abstract of those that I have seen here. Fractals, attractors, Cantor sets and p-adic numbers are very cool. To say a lot and to show your awareness of everything is a feature of scientific luminaries. I cannot compete with you with my neo-Cartesian generalization of modern physics, which is based on the identity of Descartes' space and matter and which claims that space moves because it is matter. In my essay I briefly show that the uncertainty principle takes on the opposite meaning, i.e. it becomes a principle of definiteness of the points of space, which is matter; I further show the relationship of the probability density of states with the Lorentz factor; I then explain the mass-energy equivalence formula by the fact that for each corpuscle there is a flow of forces equal to the product of the Planck constant and the speed of light - ch (the Casimir force); I further propose a definition of mass as a flux of centrifugal acceleration through a closed surface of a corpuscle, etc.
I invite you to discuss my essay, in which I show the successes of the new Cartesian generalization of modern physics, based on Descartes' identity of space and matter: “The transformation of uncertainty into certainty. The relationship of the Lorentz factor with the probability density of states. And more from a new Cartesian generalization of modern physics” by Dizhechko Boris Semyonovich.
Author Tim Palmer replied on Jan. 27, 2020 @ 19:26 GMT
Thank you. I look forward to reading your essay.
Jochen Szangolies wrote on Jan. 28, 2020 @ 07:37 GMT
Dear Tim,
congratulations on an eminently readable and engaging essay on a difficult topic! I will need some time to fully digest your arguments, but I wanted to leave a few preliminary comments---also because our two approaches have some overlap, in particular as regards undecidability and Bell/EPR.
I'll state my biases upfront: I'm skeptical of any sort of 'completion' of quantum mechanics by hidden (or not-so-hidden) variables, that is, viewing quantum mechanics as a statistical theory of some deeper level of reality, and I'm in particular skeptical of superdeterminism.
That said, I'm always happy to see somebody giving an 'alternative' approach a strong outing---and that you certainly do. I do see now that I had dismissed these topics perhaps too quickly, so I've got to thank you for that. But on to some more detailed points.
You note the similarity between the Liouville equation and von Neumann's equation; you probably know this, but that similarity can be made much more explicit by considering phase-space quantization. There, the Moyal equation emerges as a deformation of the Liouville equation (with deformation parameter hbar), and contains the same empirical content as von Neumann's (they are linked by the Wigner-Weyl transformation).
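For reference, the deformation described here can be written out explicitly (these are the standard Wigner-Weyl/Moyal formulas, not specific to the essay; sign conventions vary between texts):

```latex
% Classical Liouville equation for a phase-space density \rho(q,p,t):
\frac{\partial \rho}{\partial t} = \{H, \rho\}
% Quantum (Moyal) evolution of the Wigner function W(q,p,t):
\frac{\partial W}{\partial t} = \{\!\{H, W\}\!\} = \{H, W\} + \mathcal{O}(\hbar^2)
% where the Moyal bracket deforms the Poisson bracket:
\{\!\{f, g\}\!\} = \frac{2}{\hbar}\, f \,
  \sin\!\Big(\tfrac{\hbar}{2}\big(\overleftarrow{\partial_q}\overrightarrow{\partial_p}
  - \overleftarrow{\partial_p}\overrightarrow{\partial_q}\big)\Big)\, g
```

As \(\hbar \to 0\) the Moyal bracket reduces to the Poisson bracket, which is the precise sense in which von Neumann's equation is an \(\hbar\)-deformation of Liouville dynamics.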
I'm of two minds whether this supports your contention, or not. On the one hand, you can explicitly link the quantum evolution to that of a stochastic system; on the other, the deformation by hbar essentially means that there's no finer grain to the phase space, you can't localize the state of the system any further.
I'd be interested in how your approach works with things like the Pusey-Barrett-Rudolph theorem, which is generally thought to exclude viewing quantum mechanics as a stochastic theory of some more fundamental variables---although, as usual, and as you seem to be adept at exploiting, there are various assumptions and caveats with all 'no-go' theorems. I think there's an assumption that successive preparations of a system can be made independently; I'm not sure, but maybe that fails, taking one out of the invariant set.
I'll also have to have a more thorough look at how your model system is supposed to yield Bell inequality violation. Of course, having a Hilbert space formulation is not sufficient---you can formulate classical mechanics in Hilbert space, too (the Koopman-von Neumann formalism).
I've got to go now, I'll add more later when I have some time. In the meantime, congratulations on a very strong and interesting entry into this contest!
Author Tim Palmer replied on Jan. 28, 2020 @ 08:03 GMT
Thanks indeed for these very kind remarks.
A few comments. I do not really view my approach as a completion of quantum mechanics in the sense of providing extra structure to be added to the quantum theoretic formalism. As mentioned in the Appendix to the essay, the closed Hilbert Space of quantum mechanics only arises in the singular limit where my finite fractal parameter p is set equal to infinity, and this is an unphysical limit! Hence, rather than complete quantum mechanics, my own view is that, guided by quantum theory, we have to go back to basics taking ideas based around non-computability and fractal geometry seriously!
You are right to be sceptical of superdeterminism. However, the usual reasons to be sceptical (e.g. that it would imply statistically inequivalent sub-ensembles of particle pairs in a Bell experiment) simply do not apply to this model. Instead, I focus on a violation of Statistical Independence which only has implications when considering hypothetical counterfactual measurements in a Bell experiment. This interpretation of the violation of SI only makes sense in the type of non-computable model proposed.
In fact this same point also leads to a negation of the Pusey-Barrett-Rudolph theorem, through a violation of Preparation Independence. However, once again such a violation only occurs when considering counterfactual alternative preparations.
The bottom line here (something I focus on in the essay) is that we have to think very carefully about what we mean by things like free choice and causality in these quantum no-go theorems: counterfactual-based and space-time-based definitions (cf. Newton clapping his hands in the quad) are inequivalent in the type of non-computable model I am proposing here.
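For readers wanting the numbers behind the Bell Theorem under discussion, here is the standard textbook CHSH computation (a generic illustration, not Palmer's model): quantum singlet correlations E(a,b) = −cos(a−b) give |S| = 2√2, beating the bound |S| ≤ 2 that locality plus Statistical Independence enforces on deterministic hidden-variable assignments.

```python
import math
from itertools import product

def E(a, b):
    """Singlet-state correlation for analyser angles a, b (radians)."""
    return -math.cos(a - b)

# Standard CHSH angle choices (Alice: 0, pi/2; Bob: pi/4, 3pi/4).
a0, a1 = 0.0, math.pi / 2
b0, b1 = math.pi / 4, 3 * math.pi / 4
S = E(a0, b0) - E(a0, b1) + E(a1, b0) + E(a1, b1)
print(abs(S))  # |S| = 2*sqrt(2) ~ 2.828

def lhv_S(assign):
    """CHSH value for deterministic outcomes A0, A1, B0, B1 in {-1, +1}."""
    A0, A1, B0, B1 = assign
    return A0 * B0 - A0 * B1 + A1 * B0 + A1 * B1

# Exhaustive search over all deterministic local assignments: bound is 2.
classical_max = max(abs(lhv_S(s)) for s in product((-1, 1), repeat=4))
print(classical_max)  # 2
```

The essay's point is about how the classical bound is evaded: not by non-locality or conspiracy, but by denying that the counterfactual settings entering the derivation all correspond to states on the invariant set.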
Joe A Nahhas replied on Jan. 29, 2020 @ 00:39 GMT
I can produce general relativity experimental numbers and special relativity experimental numbers 5000 times, using any of 5000 physical sciences laws and any level of mathematics including 5th grade arithmetic. I can produce the entire theory of Einstein's relativity from Newton's equation, contrary to what mainstream scientists claim, and I can produce it 5000 times as visual effects between (27.321 days, 365.256 days) motion (PhD dissertation subject, 1990, University of Michigan nuclear engineering department). I introduced "Hacking Physical Reality" and ended "Nobel Physics" decades ago. I know I sound unbelievable, but it is a well-established fact.
Jochen Szangolies replied on Jan. 29, 2020 @ 16:57 GMT
I've been thinking about differences and similarities between our respective models. I focus on a function which assigns values for all measurements and all states of a certain system, and show that there must be measurements such that this function is undefined---which yields the backdrop for Bell inequality violations. This also needs a restriction on admissible counterfactuals---otherwise, the EPR argument simply yields value-definiteness for complementary observables. I argue that different post-measurement states support different counterfactuals, and that thus, it is not permissible to make inferences about the value that a measurement in a different direction would have yielded, while leaving the outcome at the distant particle invariant. The measurement that has actually been performed is part of the antecedent conditions necessary to make inferences about the value at the distant particle, thus, in a situation in which that measurement is changed, one cannot expect to be able to similarly make these inferences.
In your model, it seems to me, counterfactuality---or the inhibition thereof---enters one step earlier: it is not even permissible to draw conclusions based on the possibility of having made a different measurement, because the state of the world in which one would have made that measurement does not lie on the invariant set. Your invariant set then plays sort of the same role as my f(n,k) does---only I am considering which observables on the object system can be assigned definite values, while you are essentially considering simultaneously admissible pairs of (measurement direction, outcome).
If that's right (and do correct me if it's not), then maybe our approaches are not so far from each other. Perhaps my model could be accompanied with a restriction that only those observables for which a definite value can be obtained do, in fact, get measured; on the other hand, you could perhaps hold that only observables for which the pair (measurement direction, outcome) lies on the invariant set have definite values, while a measurement in a different direction yields a random outcome---and perhaps a change in the invariant set, sort of analogous to an updating of the post-measurement state. In this way, one could perhaps trade superdeterminism for what might end up being a sort of nonlocal influence.
One question, related to whether your approach is a 'completion' of quantum mechanics: do you assign definite values to observables for which the quantum state does not allow a definite prediction? I mean, certainly, the admissible measurements can't simply be the eigenbasis of the state, that is, those where the quantum state yields a definite value. Then, in some sense, it seems to me that your approach does amount to a completion in at least a 'partial' way, in that there exist possible measurements that have a definite outcome on the invariant set, but whose value can't be predicted from the quantum state of the system, which thus yields an incomplete description. Or am I way off-base there?
Author Tim Palmer replied on Jan. 30, 2020 @ 14:57 GMT
Regarding your question, the reason I do not consider my approach to be a completion of quantum mechanics is that (in my approach) there is a class of complex Hilbert states (those with irrational squared amplitudes or irrational phases) that are not probabilistic representations of any underpinning deterministic states. In this sense, large parts of the complex Hilbert space of quantum theory have no underpinning ontic representation at all; it is less a matter of completing quantum mechanics than of thinning out the state space of quantum mechanics to leave only those quantum states that can be given a probabilistic representation in terms of some underlying deterministic ontology. Giving up arithmetic closure at the level of Hilbert states is not a problem, since arithmetic closure can be reinstated at the deeper deterministic level (e.g. through the ring-theoretic properties of p-adic integers).
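As a minimal illustration of the p-adic machinery invoked here (my sketch, not the essay's construction): p-adic integers form a ring, and the p-adic norm makes the Cantor-like, ultrametric structure of that number system explicit.

```python
from fractions import Fraction

def p_adic_valuation(x, p):
    """v_p(x): the power of p dividing x (negative if p divides the denominator)."""
    if x == 0:
        return float('inf')
    v = 0
    num, den = x.numerator, x.denominator
    while num % p == 0:
        num //= p
        v += 1
    while den % p == 0:
        den //= p
        v -= 1
    return v

def p_adic_norm(x, p):
    """|x|_p = p^(-v_p(x)): numbers highly divisible by p are p-adically small."""
    if x == 0:
        return 0.0
    return float(p) ** (-p_adic_valuation(x, p))

# 2-adically, 1024 is "close to zero", while 3 has norm 1.
print(p_adic_norm(Fraction(1024), 2))  # 2^-10 = 0.0009765625
print(p_adic_norm(Fraction(3), 2))     # 1.0
# Ultrametric inequality: |x + y|_p <= max(|x|_p, |y|_p)
x, y = Fraction(8), Fraction(24)
print(p_adic_norm(x + y, 2) <= max(p_adic_norm(x, 2), p_adic_norm(y, 2)))  # True
```

The ultrametric property is what gives p-adic number systems their totally disconnected, Cantor-set-like topology, which is presumably the connection to the fractal state-space geometry of the essay.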
Jochen Szangolies replied on Feb. 2, 2020 @ 15:47 GMT
Well, the issue of completion, to me, is whether an approach assigns values to quantities that the usual quantum formalism leaves indeterminate (where quantum mechanics therefore is incomplete), and this I think yours does. After all, if the values of measurements that are present within the invariant set were still left undetermined, and their outcome irreducibly probabilistic, it seems to me not much would be won.
I have to say, I still can't shake some uneasiness regarding superdeterminism. I'm not bothered by the lack of free choice/free will, but I am not sure whether such a theory can ever be considered empirically adequate in any sense. Usually, if we perform a measurement, we consider ourselves to be acquiring new information about the system; but it seems to me that in a superdeterministic world, there is formally no information gain at all---the outcome of the measurement does not reduce my uncertainty about the world any more than the fact that I perform that measurement does. So how do we really learn anything about the world at all?
Furthermore, it seems that one can cook up some superdeterministic scheme for any metaphysical predilection one is reluctant to give up. Take a good old-fashioned Mach-Zehnder interferometer setup: the fact that one of the detectors always stays dark tells you that there must be interference between photons taking 'both paths', if you will permit me this inexact phrasing.
Suppose now we could switch either of the two detectors on at any given time, and whenever we do so, we observe a detection event at the 'bright' detector. We could interpret that as evidence for interference---but equally well, for a superdeterministic rule that exactly correlates which detector we switch on with the path the photon took.
There are also worries of empirical underdetermination that seem to me to go beyond the standard Duhem-Quine notion. It's not surprising that I can explain the same data with different theories; but superdeterminism introduces another layer into that trouble. Usually, theories explaining the same evidence have to have some broad notion of (perhaps categorical) equivalence, but superdeterminism decouples the explanatory notions from the empirical evidence---the ultimate layer may thus take on radically different forms, each with some superdeterministic selection rule tuned to yield the observed evidence from that.
Another issue is that of empirical justification. We take our faith in a theory to be reasonable based on the empirical data corroborating it; but the data does not corroborate a superdeterministic theory in the same way, because our observations are not independent of the data. Hence, are we ever justified in believing a superdeterministic explanation? How could evidence ever justify our belief in a theory that ultimately questions that very evidence?
Eckard Blumschein replied on Feb. 7, 2020 @ 11:44 GMT
"... the closed Hilbert Space of quantum mechanics only arises in the singular limit where my finite fractal parameter p is set equal to infinity, and this is an unphysical limit! Hence, rather than complete quantum mechanics, my own view is that, guided by quantum theory, we have to go back to basics ..."
There you may meet the humble effort of an old engineer:
Eckard Blumschein
Joe A Nahhas wrote on Jan. 29, 2020 @ 00:23 GMT
Physical reality can be hacked. First method of hacking physical reality is visual hacking. Visual hacking of physical reality is a display of physical objects motion in real time and physical objects motion has nowhere to hide caught naked for the first time since the beginning of time on display and in real time.
1 – Visual Hacking of Earth’s motion or a display of Earth’s motion in real time = 27.321 days cycle wrongly assigned to the Moon.
2 – Hacking the Sun’s motion or a display of the Sun’s motion in real time = 365.256 days cycle wrongly assigned to Earth.
3 – Physical sciences 5000 laws of physics, astronomy, physical chemistry, physical biology, physical engineering and technology in its entirety is based on light sources as a measuring tool and as used it only measures physics lab physical motion or Earth’s motion in 27.321 days.
4 – The (27.321 days, 365.256 days) Time cycles distance equivalence cycles = (R meters, C meters); R = Earth’s theoretical radius = 6371000 meters and C = 299792458 meter claimed as light speed/second
5 – The Space-Time errors = NASA’s space data sheets
6 – The Inverse Space-Time errors = CERN’s atomic/nuclear data
Meaning: Physical sciences' 5000 physics laws can be produced as (27.321 days, 365.256 days, 6371000 meters, 299792458 meters) space-time errors. That is the subject of this contest of extermination of modern and Nobel Prize winners' physics and physicists, from 1610 Copernicus to 2020 Nobel winners, using any level mathematics including 5th grade arithmetic and starting with physics' most erroneous equation E = MC2. Are you ready to hack and strip the incontestable truth of physical reality?
Jonathan J. Dickau wrote on Jan. 30, 2020 @ 19:33 GMT
I very much like this idea Tim...
But I will have to re-read your paper a few times to fully grasp the depth of your reasoning. It seems reminiscent of some of the wild-sounding ideas about Cantorian space from Mohammed El Naschie when he was editing 'Chaos, Solitons & Fractals', but with a different flavor. I think maybe your ideas have a more solid basis, but with El Naschie it is hard to tell - because so many of his references are self-citations from earlier work, hidden behind a paywall.
I also talk about fractals in my essay, but the context is rather different. For what it is worth, I like the work of Nottale on Scale Relativity, and I admire the breadth of its explanatory power as a model, though I don't think he got every detail right. When the publisher sent me a copy of his book for review, I enthusiastically recommended its publication. And it inspired my departed colleague Ray Munroe, who I think used it in an FQXi essay.
More later,
Jonathan
Author Tim Palmer replied on Jan. 31, 2020 @ 07:49 GMT
I think the mathematics in my talk is pretty solid. As to the physics, well at the end of the day it will come down to experiment. I expect the crucial experiment to test invariant set theory will lie in the field of table-top experiments which probe the accuracy of quantum theory in domains where the self gravitation of a quantum system is not negligible. For example, based on the idea that gravity represents a clustering of states on the invariant set, the theory predicts that gravity is inherently decoherent and cannot itself encode entanglement.
Jonathan J. Dickau replied on Jan. 31, 2020 @ 15:30 GMT
I like that answer Tim...
There was recently published a paper describing an experiment that claimed to disprove objective reality, using a system with 6 entangled qubits. I think this is wrong. There are too many co-linear points, and the entire apparatus is co-planar. There are also 6 points instead of the 7 required by projective geometry. An experiment designed to correct these flaws could also search for the effects you describe. A ball of osmium placed at one end of the bench could be used to detect gravity-induced decoherence, and so on.
In other words; I think it could be done.
All the Best,
Jonathan
Jonathan J. Dickau replied on Jan. 31, 2020 @ 15:41 GMT
For what it's worth...
I had some interaction with Phil Pearle, when he was first developing statevector reduction theory, which later blossomed into CSL. I have followed that evolution somewhat. But I recall a recent paper by Ivan Agullo that also talked about gravity-induced decoherence and broken EM symmetry, which I will try to find.
I'd love to discuss this further. I will try to read your paper again first.
Best,
Jonathan
Robert H McEachern wrote on Feb. 1, 2020 @ 20:53 GMT
Tim,
On page 6 of your essay, you state that "The principal obstacle in drawing together chaos and quantum theory is therefore not the linearity of the Schrodinger equation, but the Bell Theorem."
You appear to be unaware of the fact that Bell's theorem only applies to entangled, perfectly identical particles, like identical twins. There is no evidence that such idealized particles actually exist in the real world. Consequently, it is easy to demonstrate that entangled, non-identical, "fraternal twin" particles will reproduce the observed "Bell correlations", with supposedly impossible-to-obtain detection efficiencies, and without any need for hidden variables, non-locality or any other non-classical explanation. This has a direct bearing on your issue of "drawing together chaos and quantum theory", since the underlying cause for the "quantum" behaviors turns out to be one single bit of information removed from chaos (unrepeatable behavior).
Rob McEachern
Lorraine Ford wrote on Feb. 5, 2020 @ 12:27 GMT
Tim Palmer,
You recently co-wrote an arXiv paper titled "Rethinking Superdeterminism" together with physicist Sabine Hossenfelder [1].
I happen to think it is rather strange for an internationally renowned meteorologist to think that the climate and everything else is superdetermined anyway. Here is an exchange I had today with your co-author Sabine Hossenfelder about whether the fires and the destruction in Australia are/were superdetermined [2]:
Lorraine Ford 1:31 AM, February 05, 2020
Re "your paper with Dr. H[ossenfelder]" (on superdeterminism): I hope Dr. H[ossenfelder] and Dr. P[almer] are enjoying the smell of burnt koala flesh and fur wafting over from Australia. It was all superdetermined, according to them.
Sabine Hossenfelder 2:34 AM, February 05, 2020
Lorraine, You think you are witty. You are wrong.
Lorraine Ford 3:16 AM, February 05, 2020
Sabine, I DON'T think I'm witty. I'm Australian, living with smoke-hazy skies, the horror of a billion animal deaths, let alone the people who have died, and more than 10 million acres of land burnt. You are saying that this was all superdetermined.
Sabine Hossenfelder 4:12 AM, February 05, 2020
Lorraine, Correct. If you have a point to make, then make it and stop wasting our time.
1. https://arxiv.org/abs/1912.06462v2
2. http://backreaction.blogspot.com/2020/02/guest-post-undecidability.html
Author Tim Palmer replied on Feb. 5, 2020 @ 13:28 GMT
Lorraine
Perhaps the most important thing to say in relation to my essay is that there is a difference between "superdeterminism" and "determinism". The former questions whether, in a hidden variable model of the Bell experiment, the distribution of hidden variables is independent of the measurement settings. Without bizarre conspiracies, such distributions certainly are independent in classical models. However, in my essay I discuss a non-classical hidden-variable model, with properties of non-computability (in relation to state-space geometry), in which this independence can be violated without conspiracy.
This has nothing to do with the Australian bush fires. In discussing the climate system (which is essentially a classical system) the concept of superdeterminism never arises explicitly. However, as a classical system it is implicitly not superdeterministic (we are not aware of any bizarre conspiracies in the climate system).
However, I think you are a little confused between the issues of determinism and superdeterminism. Somewhat perversely, given the name, it is possible for a superdeterministic model to actually not be deterministic!
Instead, I think the question you are asking is about determinism, e.g. whether it was "predetermined" that the 2019/20 Australian bush fires would occur ten million, or indeed ten billion, years ago. Put another way, was the information that led to these fires somehow contained on spacelike hypersurfaces in the distant past? I sense that you feel it is somehow ridiculous to think that this is so, and I know colleagues who think like you. However, not everyone does, and logically there is nothing to disprove the notion that the information was indeed contained on these hypersurfaces (albeit in a very inaccessible, highly intertwined form).
However, this is not a discussion I would wish to have on these pages, not least because it rather deviates from the point of the essay which is that undecidability and non-computability provide a novel means to violate the Statistical Independence assumption in the Bell Theorem, without invoking conspiracy or violating the statistical equivalence of real-world sub-ensembles of particles.
Hope this helps.
Perhaps, since you mentioned the bush fires, I could tell you that I have a proposal for the Australian government if they want to reduce the risk of these fires in the future. My idea was published in the Sydney Morning Herald (and other Australian outlets) last week:
https://www.smh.com.au/environment/climate-change/we-should-be-turning-our-sunshine-into-jet-fuel-20200123-p53u09.html
Lorraine Ford replied on Feb. 6, 2020 @ 00:43 GMT
Tim Palmer,
Thanks for your detailed reply. I think I WAS a little confused about the difference between determinism and superdeterminism: thanks for explaining. However, you are still in effect saying that every single koala death by fire was pre-determined.
I will put the determinism issue another way, in terms of the problem of decidability: how we make decisions, and how we symbolically represent decision making. The issue is: what exactly do the symbols and numbers of physics represent, and what do the yellow blobs (Figure 3) represent, i.e. what is their
deeper meaning? I have made various versions of the following case several times on the backreaction blogspot:
According to physics there are no IF…THEN…. algorithmic steps in the laws of nature, there are only lawful relationships that are representable by equations. Try to do IF…THEN… with equations. You can’t. So according to deterministic physics, you CAN’T make decisions, you CAN’T learn from your mistakes, and you CAN’T be responsible for your actions.
Where are the models showing how IF…THEN… is done, using nothing but equations? IF…THEN… is about outcomes (the THEN… bit) that arise from
logical analysis of situations (the IF… bit), but equations can’t represent logical analysis. IF…THEN… is about non-deterministic outcomes, because logical analysis of situations is non-deterministic: there are no laws covering logical analysis. And you can’t derive IF…THEN… from equations.
The point of what I’m saying is this: If physicists need to use IF…THEN… logical analysis to represent the world (e.g. your Figure 1), then they are assuming that there exists a logical aspect of the world that is not representable by deterministic equations, and not derivable from deterministic equations. Your idea is that the world is deterministic, but the fact that you need to use symbolic representations of logical analysis and logical steps to represent the world contradicts the idea that the world is deterministic.
(Please don’t appeal to computer models. As a former computer programmer and analyst, I know that computers are 100% deterministic: they don’t do IF…THEN… logical analysis. They deterministically process symbolic representations of IF…THEN… steps, which deterministically process symbolic representations of information.)
Author Tim Palmer replied on Feb. 6, 2020 @ 07:44 GMT
We are going a bit off topic here. However, as I discuss in my essay, one can view free will as an absence of constraints that would otherwise prevent one from doing what one wants to do, a definition that is compatible with determinism. From this one could form a theory of how we make decisions based on maximising some objective function which somehow encodes our desires. This does allow one to learn from previous bad decisions, since such previous experiences would provide us with data that a certain type of decision, if repeated, would lead to a reduction, not an increase, in that objective function.
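As a minimal sketch of the objective-function picture above (my own toy illustration, not from the essay; the action names and weights are invented), a fully deterministic program can nonetheless "decide" by maximising an objective, and "learn" by deterministically updating that objective after a bad outcome:

```python
# Hypothetical toy: deterministic decision-making as objective maximisation.
def choose(actions, objective):
    # Deterministic decision: pick the action with the highest objective value.
    return max(actions, key=objective)

# Assumed preference weights encoding "desires" (invented for illustration).
weights = {"repeat_mistake": 1.0, "try_alternative": 0.5}

def objective(action):
    return weights[action]

first = choose(weights, objective)   # -> "repeat_mistake"

# A bad outcome is observed: deterministically lower that action's weight,
# i.e. "learn from a previous bad decision".
weights[first] -= 0.8

second = choose(weights, objective)  # -> "try_alternative"
```

Every step here is deterministic, yet the program's behaviour changes in the light of past experience, in the limited sense described above.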
However, we are veering into an area that has exercised philosophers for thousands of years and I suggest this is not the right place to discuss such matters. Of course, I respect your alternative point of view - there are many eminent philosophers and scientists who would agree with you.
Lorraine Ford replied on Feb. 6, 2020 @ 09:53 GMT
Tim,
What do your diagrams and equations actually represent? Do they represent a world ruled by equations or do they represent a world that is taking logical steps and performing logical analysis? Where does the logical analysis and steps that you are personally taking end? Do the logical steps in Figure 1 represent logical steps
you are taking (e.g. to solve a problem or equation) or do the logical steps in Figure 1 represent logical steps the world is taking?
Domenico Oricchio wrote on Feb. 5, 2020 @ 17:46 GMT
I have a problem with the idea that chaos is incompatible with relativistic invariance. I can't give an example now, but a differential equation that is relativistically invariant and chaotic could be possible: I am thinking that in the solution set of the Einstein field equations there could be a solution that covers the space with a non-integer dimension, thus producing chaos in the dynamics of the metric tensor. I think that, for example, a black hole merger has an attractor (a fixed point or almost a limit cycle).
The Einstein field equations in the weak-field regime are linearizable, so there is a nearly linear approximation.
I don't understand: is quantum non-locality an effect of quantum field theory? Gauge bosons interact between parts of the system and transmit quantum information. So is saying that a system must satisfy Bell's theorem not equivalent to asking: must a gauge boson exist?
Author Tim Palmer replied on Feb. 6, 2020 @ 07:51 GMT
Actually what I say is that chaos is only superficially incompatible with relativistic invariance. The key is to "geometrise" chaos and that can be done by considering the invariant sets of chaos. I then try to show that these invariant sets may in turn help make chaos compatible with quantum theory.
My own view is that the resolution of the Bell Theorem is not through quantum field theory, since that is an extension of quantum theory. Rather my belief is that there is a deeper deterministic formalism based on non-computable fractal invariant sets which has quantum theory as a singular limit.
I am currently working on an extension of these invariant set ideas to incorporate the formalism of relativistic quantum field theory.
Domenico Oricchio replied on Feb. 6, 2020 @ 14:43 GMT
Your essay is interesting.
Reading it made me think.
For example, if there were a chaotic state in general relativity, then is the Hausdorff measure of a particle trajectory a relativistic invariant? If it were not, then there would be an observer for whom the trajectory is non-fractal, but this seems unlikely to me (it is like a change of topology, to change from a chaotic trajectory to a non-chaotic trajectory).
Also for the Bell theorem (or the Einstein-Podolsky-Rosen paradox), it is possible to study the Feynman diagram for the cross section in the scattering of two polarized Dirac particles (I read the results today in Greiner's book) and to obtain the probability of the final state (with helicities). If there are interactions, and so gauge bosons, then there are no instantaneous effects; the collapse of Alice's state communicates the state to Bob via the gauge boson interaction, at the speed of light.
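The fractal dimension at issue here is easy to compute for a textbook example. The sketch below (my own illustration, not from the essay) estimates the box-counting dimension of the middle-thirds Cantor set, which for this set agrees with the Hausdorff dimension, ln 2 / ln 3 ≈ 0.631:

```python
import math

def cantor_intervals(depth):
    # Build the intervals of the middle-thirds Cantor set at a given depth:
    # each interval is split into its left and right thirds.
    intervals = [(0.0, 1.0)]
    for _ in range(depth):
        nxt = []
        for a, b in intervals:
            third = (b - a) / 3
            nxt.append((a, a + third))
            nxt.append((b - third, b))
        intervals = nxt
    return intervals

def box_counting_dimension(depth):
    # At depth d the set is covered by 2**d boxes of size 3**-d,
    # so the estimate log(N) / log(1/scale) equals ln 2 / ln 3 exactly.
    n_boxes = len(cantor_intervals(depth))
    scale = 3.0 ** -depth
    return math.log(n_boxes) / math.log(1 / scale)
```

Since the estimate is the same at every depth, the non-integer dimension is a property of the set itself, independent of the resolution at which it is probed.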
Colin Walker wrote on Feb. 6, 2020 @ 23:20 GMT
Hi Tim,
It is quite a revolutionary program you have embarked on, overthrowing the infinitesimal and subverting the continuum. Your standard of rationality includes its mathematical definition: that any rational quantity can be expressed as a ratio of whole numbers. The conviction that the infinite and the infinitesimal have no place in physics goes well with the idea that appropriate mathematics ought to be involved. Nearly all of the infinities have been expelled from physics.
But there is still the century-old foundational problem with infinity in physics that appropriate mathematics might help resolve. It strikes me as ironic that Hilbert's admonishment about how "the infinite is nowhere to be found in reality" still stands today, considering his name is on the Einstein-Hilbert action which is involved in the unlimited gravitational energy required for inflation. This point is made clear by Paul Steinhardt who, when asked where the energy for inflation comes from, confirmed that it comes from a bottomless supply of gravitational energy. Since inflation requires infinite energy, the theory is inadmissible by Hilbert's standard of rationality, and so is the general theory of relativity which is supposed to deliver that energy.
On the presumption that it is essentially classical Newtonian gravitational potential energy which is apparently the source of the unlimited energy, it should be of interest that a relativistic version of gravitational potential energy can be constructed from a consideration of the composition of relativistic gravitational redshift due to a sphere.
For example, given a test particle of mass m, the classical element of potential energy due to a spherical shell of matter is du = -F(r) dr, where F(r) is the force of gravity at radius r. The redshift due to the shell is given by dz = du / mc^2. The total redshift, z, from all shells can be composed relativistically as the product, 1 + z = PRODUCT[1 + dz] = exp[INTEGRAL dz], using Wikipedia's Pi notation (here "PRODUCT") for the Volterra product integral. The composite redshift due to a complete sphere of mass M, at radius R, is then z = exp(GM/Rc^2) - 1, not the conventional relativistic (1 - 2GM/Rc^2)^{-1/2} - 1, and not the first-order approximation, GM/Rc^2. The corresponding relativistic gravitational potential energy must have a similar exponential form to be consistent with the composition of relativistic gravitational redshift.
Unlike Newtonian potential energy, which is negative, relativistic gravitational potential energy is positive, and equal to mc^2 exp(-GM/Rc^2). In the absence of a gravitational field, it is equal to the rest energy. Gravitational potential energy is taken from that rest energy, and thus has a finite limit. Newtonian potential energy, -GMm/R, is an approximation to mc^2 [exp(-GM/Rc^2) - 1] for weak fields. Relativistic gravitational potential energy is an exponential map of the classical potential energy.
Relativistic gravitational potential energy gives an escape velocity sensibly limited to the speed of light, as might be expected from a relativistic theory, whereas this condition is violated in both the classical theory and general relativity. The singularity-free metric corresponding to the escape velocity is the same as Brans-Dicke. In that theory, inertial and gravitational mass differ slightly, by a presently undetectable amount. I suspect this discrepancy could arise from failing to account properly for the exponential nature of gravitational energy.
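As a numerical sanity check on the three redshift expressions above, here is a short sketch (my own illustration, not from the original paper) that builds the composite redshift as a finite product over shells and compares it with the exponential form, the conventional Schwarzschild expression, and the first-order term, using the Sun as a weak-field test case:

```python
import math

# Physical constants and solar parameters (approximate values).
G = 6.674e-11   # m^3 kg^-1 s^-2
c = 2.998e8     # m/s
M = 1.989e30    # kg (solar mass)
R = 6.957e8     # m  (solar radius)

x = G * M / (R * c**2)  # dimensionless potential GM/Rc^2, ~2.1e-6 for the Sun

def z_first_order():
    # Weak-field approximation: z = GM/Rc^2
    return x

def z_schwarzschild():
    # Conventional GR surface redshift: (1 - 2GM/Rc^2)^(-1/2) - 1
    return (1 - 2 * x) ** -0.5 - 1

def z_composite(shells=1000):
    # Walker's composition: 1 + z = product over thin shells of (1 + dz),
    # which converges to exp(GM/Rc^2) as the shells become thin.
    z = 1.0
    for _ in range(shells):
        z *= 1 + x / shells
    return z - 1
```

For the Sun all three agree to first order, and the composite (exponential) value lies between the first-order and Schwarzschild values, since the three expansions differ only at order x^2.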
The original work can be found at the link in the file
shells2010dec29.pdf. It has some simple examples to demonstrate the essential concepts. I was not aware of product integrals when it was written in 2010. This derivation of the product integral addresses the issue of normalization, which can be inferred from the physics of the problem. I don't have an essay for this contest, but here is a link to
an essay from the last contest that shows some radical consequences of accepting the composite relativistic gravitational redshift.
It seems to me that there might be a way to incorporate these relativistic compositions for gravity into general relativity via the product integral and arrive at the Brans-Dicke metric. I wonder, what would be your intuition on this possibility?
Colin Walker
Author Tim Palmer wrote on Feb. 7, 2020 @ 08:15 GMT
Jonathan J. Dickau wrote on Feb. 13, 2020 @ 22:18 GMT
Hello again Tim,
After reading Lawrence Crowell's paper I have a greater appreciation for your work, and even more so for the fact that you are able to write so lucidly about it for lay audiences. I am impressed. I will have more questions now, after all that fuel for thought.
Would the correctness of your theory imply that the fabric of spacetime is fractal? This is a feature of several quantum gravity theories, in terms of the microstructure. Does that project onto the large scale structure of the cosmos in your view? Would it surprise you if I said it appears some of your starting assumptions would follow naturally, if my own theory pans out?
Tip of the old iceberg for you.
More later,
Jonathan
Author Tim Palmer replied on Feb. 14, 2020 @ 07:32 GMT
Thanks for these kind comments Jonathan.
You ask a good question. However, to be honest, I am not 100% sure at present what my model implies about the structure of space-time, so I prefer to be agnostic about this for now. However, I am working on a generalisation of my model so that the properties of momentum/position commutators are (like spin commutators) describable by number theory. This will allow me to start reformulating relativistic quantum field theory in a more deterministic framework, and from there answers to your questions should emerge. However, I want to do this slowly and carefully, and not jump to conclusions that may at first sight seem reasonable, but will ultimately turn out to be wrong.
Jonathan J. Dickau replied on Feb. 14, 2020 @ 15:52 GMT
That was a satisfying answer Tim...
This speaks to the question of what evidence for your theory we might see in the cosmos that could verify or refute it. I asked a similar question of Gerard 't Hooft at one point and his answer was similar: that it was too early to tell what the cosmic evidence would be.
The following year at FFP11; he elaborated in his talk about the desirability of and difficulties with obtaining Lorentz invariance in a CA based QG theory, but still no hard predictions about what we would observe (in black hole emissions perhaps) that would distinguish it from the standard.
I've seen or heard several predictions from Loop Quantum Gravity folks about possible signature detections - such as Lorentz invariance violations, comb filtered emissions from black holes, and so on. But I see that each time such a prediction is made, folks will jump on it as excluding a theory if the exact signature predicted is not found. And String Theory folks seemingly refrain from making any hard predictions at all.
All the Best,
Jonathan
Steve Dufourny replied on Mar. 5, 2020 @ 10:38 GMT
Hello to both of you,
The problem with these strings is that they are a kind of fashion now regarding what we have at the Planck scale, and regarding the 1D main cosmic field creating our reality through waves, fields and oscillations. But in fact a sure thing is that nobody can prove or be certain about this philosophical generality. The same goes for my gravitational coded aether made of spheres sent from the central cosmological sphere. We cannot affirm it, and all rational deterministic searchers accept the difference between a proved law, equation or axiom and an assumption. Nobody can affirm what we have at the Planck scale, nor anything about the philosophy of the generality of this universe. Have we coded particles, or waves creating our geometries, topologies, matters and properties and this emergent spacetime? We know that inside the theoretical sciences community we are all persuaded, and that vanity is important, but without proofs we cannot affirm; it is a fact.
What really is this space, this vacuum? Is it still this gravitational superfluid coded aether, or different fields? We simply don't know, and we must accept this and our limits in knowledge.
Regards
Eckard Blumschein replied on Mar. 5, 2020 @ 15:57 GMT
Let me support : "slowly and carefully, and not jump to conclusions that may at first sight seem reasonable, but will ultimately turn out to be wrong."
Having read
"⊆ means subset, ⊂ will not be used"
I reconsidered my question "Isn't it logically impossible to include a single real number?"
Steve Dufourny replied on Mar. 18, 2020 @ 15:32 GMT
The numbers and maths are not the problem, you know, Eckard, nor the finite ranked numbers or the different classes of numbers (rationals, reals, complex, irrationals or others), or the different infinities inside this physicality. The problem is our limitations in knowledge. You can say all you want about the single real number; all that I said is a fact. The set, the sets, the subsets are not the problem; the universal partition is the problem.
Jack James wrote on Feb. 17, 2020 @ 10:34 GMT
Dear Tim,
Great essay, congrats. Wish I had a background in physics to completely understand. Please indulge me if you have time.
1) Are you essentially suggesting that mathematical incomputability/undecidability exists as space-time, emergent from quantum non-linearity (as that is what the wave function seems to suggest), which may in effect be the cause of macroscopic gravity?
2) If something (anything that exists as part of detectable science) is incomplete as a matter of ontology (incomplete in the Godel sense) how could that ontology possibly verify determinism or superdeterminism?
Best,
Jack
(Essay: Misalignment Problem - You may enjoy the amalgamated sleuths section)
Author Tim Palmer replied on Feb. 17, 2020 @ 12:49 GMT
Yes I do think that relativistic space-time will be found to be emergent from this fractal state-space geometry. However, making this a precise notion, and not just an aspiration, is something that I am currently thinking hard about!
I'm not sure I fully understand your second question. However, it triggers in my mind an important question: are there experimentally testable consequences of determinism? Again, this is something my collaborator Sabine Hossenfelder and I are currently thinking about.
So, in short, I can't answer either question, but they both touch on important issues!
Jack James replied on Feb. 18, 2020 @ 01:27 GMT
Thanks Tim,
I am very glad you are thinking (with the great tools of physics) about the same questions I am.
Re Q2 I think you have grasped my question in your statement "are there experimentally testable consequences of determinism?" Because if Godel's incompleteness manifests physically (space-time & mass) then you could never test determinism because the physical system would have unknowable states that cannot be determined by the system itself. So you couldn't have a determinable system, could you?
Best,
Jack
Author Tim Palmer replied on Feb. 18, 2020 @ 08:04 GMT
My view (which I tried to express in the essay) is that such undecidability only manifests itself in questions about the structure of state space, not in questions about the structure of, or processes in, space-time. Hence I do think there are experimentally testable consequences of determinism.
Yehonatan Knoll wrote on Feb. 20, 2020 @ 17:41 GMT
Tim,
What Bell had in mind (and explicitly expressed so in many interviews) is that, if particles are little machines, then his inequality must be respected. Now, as with any statement regarding the physical world, it tacitly assumes also `common sense'. One can bend this vague notion to an arbitrary extent, but there is a more direct attack on Bell's theorem, which has been staring us in the face for over a century: Particles (and chaotic systems and humans) are not machines! (no new-age stuff)
You are invited to read my essay which is further relevant to your main area of expertise - predicting the behavior of chaotic systems. Ensemble average over initial conditions is probably not the right way to do so.
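Bell's "little machines" remark can be made concrete with a tiny exhaustive check (my own sketch, not from the above comment): enumerate every deterministic local strategy in the CHSH version of Bell's theorem and confirm that none exceeds the classical bound of 2, whereas quantum mechanics reaches 2√2 ≈ 2.83:

```python
from itertools import product

def chsh_value(a0, a1, b0, b1):
    # CHSH combination S = E(0,0) + E(0,1) + E(1,0) - E(1,1) when each
    # side is a deterministic "machine" with fixed outputs +-1 for each
    # of its two measurement settings.
    return a0 * b0 + a0 * b1 + a1 * b0 - a1 * b1

# Exhaust all 16 deterministic local strategies.
best = max(chsh_value(*s) for s in product([-1, 1], repeat=4))
# best == 2: the Bell/CHSH bound for any local deterministic machine.
```

Any mixture of such strategies (a local hidden-variable model obeying Statistical Independence) also stays at or below 2, which is why violating Statistical Independence is needed to evade the bound deterministically.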
Lawrence B. Crowell wrote on Feb. 22, 2020 @ 22:06 GMT
I finally got to reading your paper. I have been working to get a piece of instrumentation developed meant to go to another planet. In reading this I think what you say is maybe not that different from what I develop.
Your paper drives home the point of using the Blum, Shub, and Smale (BSS) concept of computability. This is an odd concept, for it involves complete computation of the reals to infinite precision, where our usual idea of close approximations does not count as real computation. This is a certain definition of incomputability. Since these I_U fractal subsets for an underlying fractal system are forms of Cantor sets, the p-adic number or metric system is used to describe them. As fractal sets are recursively enumerable, their complements are what are incomputable in a standard Church-Turing sense. Since this fractal is really defined in a set of such, there is a set of p-adic numbers or metrics, and by Matiyasevich this is not globally computable. By this there is no principal ideal for the entire set, or equivalently no single algorithm for all possible Diophantine equations. This is the approach I take in my FQXi paper. As a result, at this time I am relatively well disposed to your concept here.
This meaning of incomputability in the BSS system is different, but not that out of line with the standard Church-Gödel-Turing understanding. We can see that determinism is not always computable. The Busy Beaver function of Radó has the first five values 0, 1, 4, 6, 13, but beyond that things become tough. The 6th is thought to be 4098, though not proven as yet. The 7th is a number greater than 1.29×10^{865}. It is not possible to compute higher Busy Beaver numbers. The failure to do so is a form of the Berry paradox or undecidability. The Busy Beaver is then a sort of model idea of a strange attractor, with the exponential separation of differing initial conditions for two systems.
We have for coherent states, a general form of the laser states of light, the occurrence of states of the form |p, q⟩ that have both symplectic and Riemannian geometry. My mind is pondering what connection this concept of incomputability has to coherent states. Riemannian geometry occurs for spacetime, particularly if spacetime is a large-N entanglement or condensate of states, and an underlying quantum geometry may be ordered as such. Einstein in his Annus Mirabilis proposed that states of light have blackbody or Boltzmann thermal distributions, with a coherent set of states in his coefficients. This may really describe quantum gravitation as well.
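The exponential separation of nearby initial conditions mentioned above is easy to exhibit numerically; here is a minimal sketch (my own toy example) using the fully chaotic logistic map x → 4x(1−x), whose Lyapunov exponent is ln 2:

```python
# Two trajectories of the logistic map starting 1e-12 apart. Their
# separation grows roughly like 2^n per iteration (Lyapunov exponent
# ln 2) until it saturates at the size of the attractor.
def logistic(x):
    return 4.0 * x * (1.0 - x)

x, y = 0.4, 0.4 + 1e-12
max_sep = 0.0
for _ in range(100):
    x, y = logistic(x), logistic(y)
    max_sep = max(max_sep, abs(x - y))
# max_sep reaches order 1 despite the 1e-12 initial difference.
```

After roughly 40 iterations the two trajectories are effectively uncorrelated, which is the practical sense in which deterministic chaos defeats long-range prediction.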
Author Tim Palmer wrote on Feb. 23, 2020 @ 08:29 GMT
Please take a look at the referenced paper by Simant Dube. He finds essentially the same computability result as Blum et al, studying the fractal attractors of iterated function systems.
Eckard Blumschein wrote on Feb. 28, 2020 @ 11:45 GMT
Dear Tim Palmer,
If you are a physicist rather than an inflexible mathematician, you may hopefully be in a position to answer my question:
While I know that "a closed interval is an interval that includes all of its limit points", I guess there might be a fundamental point-based alternative to the "dot-based" mathematics from Dirichlet up to Heine and Borel. Given that real numbers constitute Euclidean points, isn't a distinction between closed and bounded then only justified for rational numbers? Isn't it logically impossible to include a single real number? Is the notion of a limit point really reasonable?
Well, this request relates to my own essay rather than directly to your essay. I am just curious.
Sincerely,
Eckard Blumschein
Author Tim Palmer replied on Mar. 2, 2020 @ 10:51 GMT
All I can say is that these are deep questions!!
Eckard Blumschein replied on Mar. 2, 2020 @ 13:42 GMT
Thank you.
I will try and explain at 3385 why I as a layman in mathematics feel forced to deal with fundamentals of mathematics.
Let me here quote from your abstract something easily understandable to everybody:
"...undecidability is only manifest in propositions about the physical consistency
of putative hypothetical states". In my words: Continue calculating as if.
Just an aside on your Hilbert quote after illustrating Cantor's dust: "The infinite is nowhere to be found." I argue that the property of being infinite is to be seen in every closed loop.
Steve Dufourny replied on Mar. 18, 2020 @ 15:28 GMT
Hi to both of you. Dear Eckard, it is too complex to find the real meaning of infinity. We can of course rank the different infinities inside this physicality, and still we know just a small number of these infinities. Cantor, Gödel, Euler and all the mathematical works are not the problem; the problem is our limitations inside the physicality and philosophically. We cannot understand, inside the physicality, all the finite series and all the different infinities, and it is still more complicated to encircle a kind of infinity beyond this physicality: is it conscious or not, and how does this thing create this physicality? Is it with strings and waves, fields, or points and a geometrodynamics, or, in my model, with coded 3D spheres and a gravitational coded aether? We cannot affirm, so that implies a pure uncertainty for our fundamental objects, and we cannot predict and rank everything, just as we cannot compute everything. We are simply limited in knowledge; even in closed loops you cannot find the answers for these infinities inside this physicality, and still less for this infinity beyond this physicality. We must recognise this simple fact. What do you think? Regards
Steve Dufourny wrote on Mar. 2, 2020 @ 09:38 GMT
Hello,
I liked your essay a lot. Several ideas are very relevant to the links between quantum mechanics and GR. In my model of spherisation I personally consider a gravitational coded aether, sent from the central cosmological sphere and made of finite series of spheres; I tell myself that we have a deeper logic than only our relativity and these photons as the main essence. This space, this vacuum, seems to be more than we can imagine. I have shared your essay on Facebook because it is one of my favorites. Regards
Author Tim Palmer replied on Mar. 2, 2020 @ 10:51 GMT
John C Hodge wrote on Mar. 3, 2020 @ 15:27 GMT
You demonstrate the point that synthesizing General Relativity (GR) and Quantum Theory (QT) seems to require much more complex mathematics. Along the way, for the problems of quantum entanglement, the Bell inequality, and (I submit) the quantum eraser, a theory would have to be non-local and causal. Also, there are many problem observations in astronomy and cosmology which should be addressed. I prefer to generalize the main goal as finding one theory that describes both the big and the small of our universe.
I think, like you, that dealing with non-computability could "...break the road-block in finding a satisfactory theory...". That is, non-computability of the mathematics is a problem. I suggest the novel path to this new theory is to remove the mathematics that make the models complex and then to restructure the principles to explain the problem observation and to correspond to GR and QT with approximations.
For example, interpret the Bell inequality as saying either that NO interaction takes place at less than the speed of light, or that ALL interactions take place at a speed much greater than light (such as Van Flandern and other observations finding the speed of gravity to be much faster than light). There is the Newtonian speculation that the interference of light involves an aether wave traveling much faster than the photons and then directing them. This suggests that matter causes the aether waves and the divergence of the aether directs the photons, as in GR. Thus, further unity can be achieved if the aether is the left side of GR's field equations (space-time) and the medium supporting real waves in QT. Just these two changes/insights can yield a simpler and more complete model.
I'm unsure how to treat chaos ideas. I think you're correct, we should assume determinism and self-similar (fractal) model even if the reality of the universe may be different.
Hodge
Michael Smith wrote on Mar. 4, 2020 @ 17:10 GMT
Great article Tim. You present your ideas clearly and logically so that even a non-physicist "newbie" such as myself can appreciate the main points.
I was drawn to your title as my (much less rigorous) theory approaches possible unification through fractal geometry as well. I too suggest that the space-time of general relativity may emerge from the higher-dimensional geometry of a quaternionic structure, but come at it in a rather unique way: through the cyclical nature of the prime numbers in base-12 (my article is titled "Primarily True").
I posit that my "base-12 prime vibration" as a fractal invariant pattern should emerge in space-time at 10-to-the-power-11 logarithmic fractals (in terms of particle density) as that would represent a complete logarithmic cycle or "octave" of a quaternion power cycle in base-12. This might help explain why there are that many galaxies in the observable universe, suns per mature spiral galaxy, atoms per DNA molecule and even neurons in the human brain. In the gap between each such fractal would therefore be where nature would emerge in an unstructured way, much like your conjecture.
One question: If the quaternion model were purely geometric such as I'm picturing - as simply prime positions on the base-12 circle (thus Euclidean), would it not then become computable after all? If I understand it correctly, Tarski's geometric decidability theorem seems to indicate that would be the case. Any insight would be much appreciated.
Cheers,
Michael
Peter Jackson wrote on Mar. 4, 2020 @ 21:09 GMT
Dear Tim,
Excellent essay. I agree chaos theory offers good insight into nature as well as a predictive tool. I think it brave to major on it; Bill McHarris did so in 2016, to a poor response. I hope you do better. (I drew more on new foundations & fuzzy sets.) I agree that resistance to non-finite maths is problematic, and thanks for reminding us of Hilbert's quote. Was he blind to 'infinite' Pi, space and time?
I was interested in your view that a deterministic foundation to QM should exist, one shared by myself and John Bell. I quote Bell this year and last, and describe a mechanism appearing to show we're correct! But of course too shocking for most to countenance! I hope you'll take a close look.
Beyond your (p6) pairs: if BOTH have N and S poles & parallel axes, and A & B polariser interactions give Poincaré sphere surface vector additions, can you think of any reason why A & B, by reversing their settings, couldn't reverse their own 'amplitude' outcomes? I found that's NOT a hypothesis Bohr tested!
Your p8 assertion that inequality violation can emerge from an uncomputable deterministic model seems to preclude a physical ontological understanding of process, as all others assume. Does it? If so I disagree, so I hope you'll explain why you believe so (if Bell's proof can be 'sidestepped', as he suggested).
I agree gravity is non-computable, but do you agree that may be in the same way weather parameters are? i.e. all low pressure areas have a density gradient due to rotational velocity, after Bernoulli, but all differ slightly and constantly evolve.
Great essay Tim, and I look forward to discussing various matters further - I think best after you've read mine.
Very Best
Peter
Flavio Del Santo wrote on Mar. 11, 2020 @ 16:48 GMT
Congratulations on a nice essay! I think that chaos theory is an underestimated element of the foundations of physics. I think you may find resonance with your ideas in Hossenfelder's essay (https://fqxi.org/community/forum/topic/3433) and partly in mine (https://fqxi.org/community/forum/topic/3436).
Best of luck for the contest!
Flavio
John David Crowell wrote on Mar. 13, 2020 @ 16:09 GMT
Tim. I enjoyed your article and the inclusion of chaos theory into the discussion of the unification of quantum mechanics and relativity. In my essay “Clarification Of Physics—“ I introduce a new perspective into the unification efforts. In the essay I propose a creation process that emerges from chaos, unifies quantum mechanics and relativity, and creates “our” finite multiverse and the visible universe. I would appreciate your comments on the essay. John D Crowell
Member Jeffrey Bub wrote on Mar. 30, 2020 @ 16:47 GMT
A novel and fascinating idea. It brings to mind Pitowsky's 'Resolution of the EPR and Bell paradoxes' by extending the concept of probability to non-measurable sets.
Author Tim Palmer wrote on Mar. 31, 2020 @ 07:01 GMT
Thanks Jeffrey for these kind remarks. I know what you mean about Pitowsky's work. However, on (what I call) the invariant set, the relevant measures can be described by elementary finite frequentist probability theory. The mathematics underpinning undecidability arises only when considering counterfactual states which do not lie on the invariant set. My approach is simply to deny ontic reality to such states by postulating the primacy of the invariant set. Without this, I think one would indeed be drawn to consider non-measurable sets, as did Pitowsky. However, the concept of non-measurability does not seem to make physical sense to me - as the famous Banach-Tarski paradox clearly indicates.
PS Reference [18] - my paper arXiv:1804.01734 on invariant set theory - has now been accepted to appear in Proceedings of the Royal Society A.
Vladimir Rogozhin wrote on Apr. 1, 2020 @ 12:13 GMT
Dear Tim,
You write:
"Our inability to synthesise general relativity theory and quantum theory into a satisfactory quantum theory of gravity is legendary and is widely considered as the single biggest challenge in contemporary theoretical physics."...
Quantum theory and the general relativity theory are phenomenological theories (parametric, operationalistic) without an ontological basification (justification+substantiation). It makes no sense to “unite” them, let each one work in its own “field”. Problem №1 for fundamental science and cognition in general is the ontological basification (substantiation) of mathematics, and therefore knowledge in general.
You conclude:
“From where do new ideas come? Do they pop out of the aether as some random flashes of inspiration with no obvious precedent? Or do these ideas mostly already exist, but in a completely separate setting?"

Ideas come to our minds from the primordial (absolute) generating structure that lies both in the “beginning” of the Universum (“top”) and in our heads (“bottom”). The task of physicists, mathematicians and philosophers is to understand the dialectics of Nature (catch on the “net” “Proteus of Nature” using the “goddess of form” Eidothea and “crazy” ontological ideas) and build this Superstructure - the ontological basis of Mathematics (“language of Nature”) and Knowledge as a whole: ontological framework, ontological carcass, ontological foundation. Today we need a global brainstorming session to “assemble” all the ideas for discussing and creating the Ontological Knowledge Base.
With kind regards, Vladimir
Fabien Paillusson wrote on Apr. 11, 2020 @ 15:28 GMT
Dear Tim,
It is a very nice and original idea you have presented in your essay. As many others have said I would probably need a few more reads to grasp all the details though.
A few questions, if I may:
- If an underlying fractal geometry can give rise to quantum-like behaviour, how does classicality emerge from this picture, if it does at all?
- Would you have a toy example, with the Lorenz attractor, of a non-computable counterfactual?
Many thanks.
Best,
Fabien
Author Tim Palmer replied on Apr. 11, 2020 @ 16:49 GMT
Thanks Fabien. Good questions.
My fractal model has a free parameter N. In the singular limit N=infinity all the fractal gaps close up and the state-space geometry is classical. However, for any finite value of N, no matter how big, the Bell counterfactuals lie in the fractal gaps and the state-space geometry is non-classical. Michael Berry has written about how old theories of physics are often the singular limits of new theories as some parameter of the new theory is set to zero or infinity.
The Cantor Set underpins the Lorenz attractor. Imagine a point X on the Cantor Set and perturb it with a perturbation delta X drawn randomly using the measure of the Euclidean line in which the Cantor Set is embedded. Then the perturbation almost certainly perturbs the point off the Cantor Set.
Such a perturbation can be thought of as corresponding to one of my counterfactuals: although I live in a world where I did X, what would have happened if I had instead done X+delta X? Suppose the delta X is dynamically unconstrained - i.e. something you just make up in your head without consideration of whether it satisfies the laws of physics - then if the world associated with X lies on the invariant set, the world associated with X+delta X almost certainly does not and so the answer to the question "what would have happened?" is undefined.
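To make this concrete, here is a rough numerical sketch in Python (my own illustrative construction, with made-up tolerances, not anything from the essay): build a point of the middle-thirds Cantor set from ternary digits in {0, 2}, perturb it with a small delta drawn with the uniform (Euclidean) measure, and test whether the perturbed point has fallen into one of the fractal gaps.

```python
import random

def cantor_point(digits=30, rng=None):
    """A point of the middle-thirds Cantor set, built by choosing each
    ternary digit from {0, 2} (truncated to `digits` places)."""
    rng = rng or random
    return sum(rng.choice([0, 2]) * 3.0 ** -(k + 1) for k in range(digits))

def off_cantor(x, digits=20):
    """True if the first `digits` ternary digits of x contain a 1,
    certifying that x lies in one of the Cantor-set gaps."""
    for _ in range(digits):
        x *= 3
        d = int(x)
        if d == 1:
            return True
        x -= d
    return False

rng = random.Random(0)
trials = 10_000
escaped = 0
for _ in range(trials):
    # Euclidean-random perturbation of a point on the fractal
    x = cantor_point(rng=rng) + rng.uniform(-1e-3, 1e-3)
    escaped += off_cantor(min(max(x, 0.0), 1.0))
print(f"fraction perturbed off the set: {escaped / trials:.3f}")
```

The flagged fraction comes out very close to 1: a perturbation chosen with the Euclidean measure almost surely knocks the point off the fractal, which is precisely the counterfactual situation described above.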
Edwin Eugene Klingman wrote on Apr. 12, 2020 @ 20:49 GMT
Dear Tim Palmer,
Any essay combining general relativity and Bell’s theorem is a ‘must read’.
In it you show that it’s possible to violate Bell’s inequalities with a locally causal but uncomputable deterministic theory, whose spacetime computations remain locally causal. Chaos is powerful, but I’m unsure what the ontological implications are.
A number of authors are concerned whether ‘classical physics’ is truly deterministic, and if not, how is this explained.
If one assumes that the deBroglie-like gravitomagnetic wave circulation is induced by the mass flow density of the particle [momentum-density], then the equivalent mass of the field energy induces more circulation. This means that the wave field is self-interacting. For ‘one free particle’ a stable soliton-like particle plus wave is essentially deterministic. But for many interacting particles, all of which are also self-interacting, then ‘determinism’ absolutely vanishes, in the sense of calculations or predictions, and the statistical approach becomes necessary.
This theory clearly supports ‘local’ entanglement, as the waves interact and self-interact, while rejecting Bell’s ‘qubit’-based projection: A, B = +1, -1 consistent with the Stern-Gerlach data (see Bohr postcard). For Bell experiments based on ‘real’ spin (3-vector) vs ‘qubit’ spin (good for spins in magnetic domains) the physics easily obtains the correlation which Bell claims is impossible, hence ‘long distance’ entanglement is not invoked and locality is preserved.
This is not a matter of math; it is a matter of ontology. I believe ontology is the issue for the many authors who also seem to support more ‘intuition’ in physics. My current essay, Deciding on the nature of time and space, treats intuition and ontology in a new analysis of special relativity, and I invite you to read it and comment.
Edwin Eugene Klingman
Author Tim Palmer wrote on Apr. 17, 2020 @ 07:52 GMT
Tim Palmer re-uploaded the file Palmer_FXQi_Palmer_1.pdf for the essay entitled "Undecidability, Fractal Geometry and the Unity of Physics" on 2020-04-17 07:52:00 UTC.
Member Simon DeDeo wrote on Apr. 20, 2020 @ 00:11 GMT
Hello Tim —
Wow, this rather blew my mind, and I'm still digesting it.
Let me ask you a really basic question. In a chaotic system, even approximate knowledge about a path into the far future depends sensitively upon the initial conditions—you need to keep going to more and more decimal places in the expansion.
This, in turn, means that we should expect meaningful facts about the future evolution of a chaotic system to be uncomputable. To be really specific, there should be many facts along the lines of "will these three objects collide with each other eventually" whose answer will be uncomputable.
Would it be fair to say that your results here (Eq 6) draw their power from this feature of chaotic dynamics? This would help me in understanding your results better.
So many lovely things here. I had never thought that Lewis' notion of "neighbourhood" in modal logic could be so usefully transposed to counterfactual thinking in physical systems. The idea that the p-adic metric is the "right" notion of "nearby", i.e., modally accessible, is extremely cool.
Yours,
Simon
PS, minor remark Re: finite time singularities in Navier-Stokes—we know (for sure) that they exist in General Relativity, and if you're a hardcore physical Church-Turing thesis person, this is one way we know that (classical) GR is incomplete.
Author Tim Palmer replied on Apr. 20, 2020 @ 07:38 GMT
Hi Simon
Thanks for your kind comments.
Yes indeed, if the question you ask of a chaotic system somehow probes its asymptotic states (e.g. "will three objects collide eventually") then one can (likely) reformulate the question in terms of the state-space geometry of these asymptotic states - and my claim is that such geometric questions are typically undecidable. However, the sensitivity of simple finite-time forecasts to the initial state is not itself an illustration of non-computability.
Indeed, I would say that my results (e.g. producing a viable model which can violate statistical independence without falling foul of the usual objections to such violation) arise because of this uncomputable property of chaotic dynamics. I should emphasise that in this picture, uncomputability leaves its mark on finite approximations to such dynamics in the form of computational irreducibility (the system can't be emulated by a simpler system). Hence we can still produce viable finite models where (6) is satisfied.
Re Lewis, my belief is that the potential pitfalls of unconstrained counterfactual reasoning have not been given sufficient attention in studying the foundations of physics. In this we are being beguiled by our intuition. You may be interested in a recent paper of mine:
https://www.mdpi.com/1099-4300/22/3/281
which tries to explain why we are so beguiled.
Finally, in classical GR with a cosmic censorship hypothesis, the singularities seem to be hidden from sight and are therefore not as ubiquitous as they might be - if only we could prove it! - in Navier-Stokes!
Tim
Member Seth Lloyd wrote on Apr. 24, 2020 @ 03:06 GMT
Dear Tim,
This is a very nice contribution to efforts to reconcile quantum indeterminacy with classical mechanics by invoking classical chaos theory. Your arguments are convincing. But where do complex numbers and amplitudes come in? They are necessary for quantum mechanics in general and non-local quantum correlations in particular. I'm sympathetic to getting intrinsic uncertainty out of classical chaos. But it seems like something is still missing. Please enlighten us even more!
Yours,
Seth
Author Tim Palmer wrote on Apr. 24, 2020 @ 07:51 GMT
Dear Seth
Thanks for your input. I fully agree that complex numbers are central to quantum theory.
To understand the emergence of complex numbers in my fractal model, could I refer you to the technical paper recently published in Proc. Roy. Soc.A (open access):
https://royalsocietypublishing.org/doi/10.1098/rspa.2019.0350
on which this essay is based - some aspects of which are summarised in the Appendix to the essay. In particular, I construct a fractal geometric model of (what I call) the state-space invariant set based on the concept of fractal helices (see Fig 4 in Section 3 of the paper). At a particular fractal iteration, the trajectory segments of the helix evolve to specific clusters in state space - these clusters representing measurement outcomes/eigenstates of observables. I then describe this helical structure symbolically (BTW symbolic dynamics is a powerful tool in nonlinear dynamical systems theory for describing dynamics on fractal attractors topologically). In the case of two measurement outcomes, the symbolic descriptions of the helix are then given by finite bit strings. Now in Section 2 of the paper, I show that I can define multiplicative complex roots of unity in terms of permutation/negation operators on these bit strings. A very simple illustration of this is to take the bit string
S={a_1, a_2}
where a_1, a_2 in {1, -1} - as a representation of a pair of trajectories labelled symbolically by which of the two distinct clusters ("1" and "-1") they evolve to. Now define the operator i by
i S = {-a_2, a_1}.
Then i^2 S = {-a_1, -a_2} = -S; that is, i^2 = -1, where -S denotes {-a_1, -a_2}.
In fact, even more I can define quaternionic multiplication and hence Pauli spin matrices (and hence Dirac gamma matrices) in terms of certain permutation/negation operators on longer finite bit strings. See the paper for more details.
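For concreteness, the length-2 illustration above can be checked in a few lines of Python (the function names are mine, and this is only the toy case, not the full quaternionic construction in the paper):

```python
def op_i(s):
    """The 'i' operator on a length-2 string of +/-1 entries:
    i{a1, a2} = {-a2, a1} - a cyclic permutation with one negation."""
    a1, a2 = s
    return (-a2, a1)

def neg(s):
    """The operator -1, i.e. -S = {-a1, -a2}."""
    return tuple(-a for a in s)

S = (1, -1)
assert op_i(op_i(S)) == neg(S)           # i^2 acts as -1
assert op_i(op_i(op_i(op_i(S)))) == S    # i^4 acts as +1
print("i S   =", op_i(S))
print("i^2 S =", op_i(op_i(S)), "= -S")
```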
This answers half of your question - about complex multiplication. The second half of your question - relating additive properties of such bit strings to the additive properties of complex numbers - is something I am currently writing up. It turns out that to do this I have to extend the number-theoretic properties of trigonometric functions which play a vital role in the particular discretisation of the Bloch sphere described in the paper cited above - see also below - to number-theoretic properties of hyperbolic functions. Whilst the former provide a natural way to discretise rotations in physical space, the latter provide a natural way to discretise Lorentz transformations in space time! In this way, I have some belief that the properties of the invariant set are more primitive than those of space-time, with the prospect of the latter emerging from the former. With the current lockdown, I should have a draft paper shortly! With this, I will have a complete answer to your question.
However, a crucially important point in all this is that I do not, and will not, recover in this way the full *continuum* field of complex numbers, but only a particular discrete subset (essentially those complex numbers with rational squared amplitudes and rational phase angles). These complex numbers play an important role in my model for describing the symbolic properties of the helix in a probabilistic way. Number theoretic properties of trigonometric functions applied to these discretised complex numbers provide the basis for my description of quantum complementarity (and indeed the Uncertainty Principle - see Section 2e of paper above). However, in my model there is no requirement for these complex numbers to be arithmetically closed. Such arithmetic closure arises at the deeper deterministic level and this can be described by the arithmetically closed p-adic integers, these providing the basis for a deterministic dynamic on the invariant set. (There is a rich theory of deterministic dynamical systems based on the p-adics.)
All this means that in describing my fractal model from a probabilistic perspective, I can and do (in the paper above) use the formalism of complex Hilbert vectors (and associated tensor products). However, these vectors are required, by the discretised nature of the helix of trajectories in state space, to have squared amplitudes which are rational numbers and complex phases which are rational multiples of pi. Importantly, almost all elements of the complex Hilbert Space *continuum* have no (ontic) correspondence with probabilistic descriptions of the invariant set helices.
My own view is that quantum theory's dependence on the *continuum* of complex numbers (i.e. through the continuum complex Hilbert space) is the origin of its deep conceptual problems, e.g. as arises in trying to understand the meaning of the Bell Theorem or the sequential Stern-Gerlach experiment, or the Mach-Zehnder interferometer, or GHZ, or....you name it!!. Indeed I think quantum theory's dependence on the complex continuum is the origin of the difficulties we have reconciling quantum theory and general relativity theory. Of course, in quantum theory, we don't have a deterministic underpinning and so breaking the arithmetic closure of Hilbert Space is a real theoretical problem. However, in a model where there is a deeper deterministic basis, breaking the arithmetic closure of Hilbert space in this way doesn't matter a jot - since it's not a fundamental description of the underlying theory!! Here, in my view, we physicists have been overly beguiled by one aspect of the beauty of mathematics - the complex continuum field C!!
Recall that in mathematics, C arose as a tool for solving polynomial equations. Perhaps we need to retrace our steps and ask whether taking this tool onboard wholesale for describing the equations of fundamental physics could actually now be causing us some big problems (the utility of C notwithstanding)! Perhaps we imported a virus which has rather grown over the centuries and now completely permeates the core of fundamental physics making it impossible to make vigorous leaps forward! The real-number continuum virus doesn't matter in classical physics, because discretised approximations can come arbitrarily close to the continuum limit. However, the complex-number continuum does matter in a much more essential way in quantum theory. Recall in Lucien Hardy's axioms for quantum theory, the complex continuum plays a central and inviolable role - in complete contrast with classical theory. Hence in order to find a discretised theory of quantum physics, which I think should be an important goal for physical theory, quantum theory must be a singular and not a smooth limit as the discretisation goes to zero. My deterministic model has this property.
I am going to pick up on one other point in your correspondence. You say that I try to reconcile quantum theory with classical mechanics. I don't really see my proposal as "classical" in the following sense. The dynamics of classical chaos are differential (or difference) equations and the fractal attractor is an asymptotic set of states on which, classically, one never actually arrives, at least from a generic initial condition in state space. However, from this classical perspective there is no essential/ontological difference between a state which is "almost" on the attractor, and one on the attractor precisely.
By contrast, here I am postulating a primitive role for this fractal geometry (rather than the differential equations). Because of this, as I try to discuss in the essay, the p-adic metric may be a better yardstick of distance in state space than the familiar Euclidean metric. The p-adic metric certainly does distinguish between points which are not on the fractal and those that are, no matter how close such points may be from a Euclidean perspective. In this sense although my model is certainly motivated by classical deterministic chaos, I would not call it classical.
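As an aside, the p-adic yardstick mentioned above can be sketched for integers in a few lines (an illustration only; the use of the p-adic metric on fractal state space in the paper is much richer):

```python
def vp(n, p):
    """p-adic valuation: the largest k with p**k dividing n (n != 0)."""
    k = 0
    while n % p == 0:
        n //= p
        k += 1
    return k

def dist_p(x, y, p):
    """p-adic distance |x - y|_p = p**(-v_p(x - y)); zero iff x == y."""
    return 0.0 if x == y else p ** -vp(x - y, p)

# Euclidean-far but 2-adically close: 1 and 1 + 2**20 differ by a
# number divisible by a high power of 2.
print(dist_p(1, 1 + 2**20, 2))   # 2**-20, about 9.5e-07
# Euclidean-near but 2-adically far apart: 1 and 2 differ by an odd number.
print(dist_p(1, 2, 2))           # 1
```

The point of the example is that "nearby" in the p-adic sense can be completely at odds with "nearby" in the Euclidean sense, which is exactly the property needed to distinguish points on the invariant set from Euclidean-close points off it.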
There is much more to be teased out of this model and I feel I am rather at the beginning of a journey with it, rather than the end.
Thanks again for your interest. Not sure how much you will have been enlightened, but I hope you see where I am coming from, at least!
Tim
Flavio Del Santo wrote on Apr. 26, 2020 @ 19:59 GMT
Dear Prof. Palmer,
I really liked your essay and especially how it emphasises the (overlooked) role of chaotic systems for the foundations of science.
I would greatly appreciate your opinion on my essay, which is based on the research I am carrying out with Nicolas Gisin. I think our approaches have some similarities, for we too rely on classical chaos to introduce indeterminism into classical physics.
I wish you the best of luck for the contest, and hope you get the prize you deserve.
Best wishes,
Flavio
Author Tim Palmer wrote on Apr. 26, 2020 @ 22:16 GMT
Dear Flavio
Thank you for your kind remarks. Nicolas Gisin and I have already discussed some of the matters discussed in your essay and whilst I do agree that your and Nicolas's ideas are very thought provoking, I would say that we are not in complete agreement.
Let me start by remarking that I fully agree that it is possible to treat chaotic classical deterministic systems by some finite indeterministic approximation. In fact this is exactly what we do in modelling climate:
https://www.nature.com/articles/s42254-019-0062-2
which is to say that we approximate a set of deterministic chaotic partial differential equations by a finite deterministic numerical approximation and represent the unresolved remainder of the system by constrained stochastic noise. It works well!
However, I do not believe this approach will work for quantum physics, if one believes that the complex Hilbert Space of quantum theory is somehow fundamental, the reason being (Lucien) Hardy's Continuity Axiom. By virtue of this axiom, the continuity of Hilbert Space is fundamental to quantum theory.
Put this way, the continuum appears to play a more vital role in quantum theory than it does in classical theory. This suggests that if we seek some finite theory of quantum physics - which I certainly do seek - then the resulting theory will have to be radically different from quantum theory (even with a stochastic collapse model) and will not be just some approximation to it.
I can in fact state this a little more precisely. I believe that by virtue of Hardy's Continuity Axiom, quantum theory will have to be a singular limit and not a smooth limit of a finite discretised theory of quantum physics, as the discretisation scale goes to zero.
In my essay I attempt to suggest a deterministic (i.e. not indeterministic) alternative to quantum theory in which quantum theory is a singular limit (as a certain fractal gap parameter goes to zero). However, until potential departures from quantum theory can be experimentally tested, and perhaps this day is not so far away, who knows whether this really is the right way forward.
Having said this, there are clearly many points of commonality between our essays and I look forward to discussing these with you sometime!
Best wishes
Tim
Michael muteru wrote on Apr. 28, 2020 @ 21:07 GMT
I too concur that fractals offer structured patterns to which human thought assigns meaning in topological landscapes. Can anthropic bias be key to unravelling new physics that bridges the gap between general relativity and quantum mechanics? Kindly read/rate how, why and where here: https://fqxi.org/community/forum/topic/3525. Thanks.
Kwame A Bennett wrote on May. 1, 2020 @ 20:16 GMT
I take a different look at fractals in my essay, "A grand Introduction to Darwinian mechanic". Please take a look and rate it:
https://fqxi.org/community/forum/topic/3549
Yutaka Shikano wrote on May. 4, 2020 @ 23:13 GMT
Dear Tim,
I enjoyed reading your essay and learned a lot about chaos theory. Because I have also studied quantum behaviour from the viewpoint of quantum walks, which are related to quantum chaos, I would appreciate a clarification of the relationship between stochastic and chaotic behaviour. From your viewpoint, what do you think about this relationship? As in my essay, chaotic dynamics is completely different from stochastic dynamics from the viewpoint of computation. Therefore, I would like to know your opinion.
Best wishes,
Yutaka
Author Tim Palmer replied on May. 5, 2020 @ 08:07 GMT
Dear Yutaka
Thank you for your question. From the perspective of my essay, stochastic and chaotic dynamics are very different concepts. Let me give an example. In my essay I wrote down the equations of the famous Lorenz model which is chaotic for certain parameter values. For standard chaotic values of the parameters, about 96% of the variance of the model lies in a two-dimensional sub-space of state space. Now one can choose a basis where you retain the dynamical equations in this two-dimensional subspace, but replace the dynamics in the third dimension with a stochastic process. The resulting attractor looks superficially like the Lorenz attractor. However, it differs in one vital regard - all the fractal gaps in the attractor are filled in by the stochastic process.
That is to say, replacing chaotic determinism with stochasticity completely negates my arguments about counterfactual incompleteness (associated with states which lie in the fractal gaps in my cosmological invariant set). Hence my arguments about why the violation of statistical independence is explainable in a suitable nonlinear dynamical framework are nullified if determinism is replaced with stochasticity.
It is for this reason that I am somewhat sceptical that models which attempt to replace real numbers with truncated rationals plus stochastic noise will work in explaining quantum physics.
In conclusion, there is a vital difference between chaotic and stochastic dynamics, in my opinion.
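A one-dimensional toy makes the point about gap-filling concrete (my own stand-in for the Lorenz example, using the Cantor-set iterated function system rather than the Lorenz equations; the noise amplitude is arbitrary): the random-iteration orbit respects the fractal gaps, while the same map plus stochastic noise does not.

```python
import random

def ifs_step(x, rng):
    """One step of the Cantor-set iterated function system,
    x -> x/3 or x/3 + 2/3: its attractor is the middle-thirds
    Cantor set, so the central gap (1/3, 2/3) is never entered."""
    return x / 3 if rng.random() < 0.5 else x / 3 + 2 / 3

rng = random.Random(2)
x = y = 0.5
in_gap_det = in_gap_noisy = 0
for _ in range(20_000):
    x = ifs_step(x, rng)                             # orbit confined to the fractal
    y = ifs_step(y, rng) + rng.uniform(-0.05, 0.05)  # same map plus noise
    y = min(max(y, 0.0), 1.0)
    in_gap_det += 1 / 3 < x < 2 / 3
    in_gap_noisy += 1 / 3 < y < 2 / 3
print("central-gap visits without noise:", in_gap_det)    # exactly 0
print("central-gap visits with noise:   ", in_gap_noisy)  # many hundreds
```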
With regards
Tim
George Gantz wrote on May. 6, 2020 @ 01:09 GMT
Tim -
An exquisite and erudite exposition on matters far beyond my formal training in math and physics (from some decades ago). I gather that you are positing some level of determinism arising from infinite recursion of fractal attractors. In lay terms, if our frame and timeframe are large enough, we can regain the confidence of determinism from the local instability of chaos, just as statistical mechanics rescues us from the chaos of the independent behaviors of individual particles. Am I following this correctly?
That said, I am dubious that determinism of any sort can be rescued. We can speculate with infinities but we cannot prove anything at all, as the reasoning will always fall short. This verse from the Rubaiyat captures the thought:
XXIX. Into this Universe, and Why not knowing
Nor Whence, like Water willy-nilly flowing;
And out of it, as Wind along the Waste,
I know not Whither, willy-nilly blowing.
Thanks - George Gantz, The Door That Has No Key: https://fqxi.org/community/forum/topic/3494
Author Tim Palmer wrote on May. 6, 2020 @ 07:42 GMT
Dear George
Thank you for your comment.
My goal is to formulate a finite theory of quantum physics where the fractal invariant set model of quantum physics is a smooth limit as a parameter of the finite model goes to infinity.
Finding such smooth limits is highly non-trivial in quantum physics. For example, if you try to discretise the complex Hilbert Space of quantum theory then you violate the Continuity Axiom of Hardy's axioms of quantum theory - and according to his axioms you would revert to classical theory. In this sense quantum theory is the singular and not the smooth limit of a finite discretised theory of Hilbert space as the discretisation goes to zero.
As Michael Berry has discussed, singular limits are quite commonplace in physics and in some sense represent a discontinuous jump when you go from "very large but finite" to truly infinite.
What I am trying to do is find a finite theory of quantum physics which has a smooth and not a singular limit as some parameter goes to infinity. In practice I can achieve this by assuming that the symbolic labels associated with the fractal iterates of the invariant set have periodic structure. This is entirely equivalent to the idea that rational numbers have a periodic representation in terms of their decimal expansions. The larger the periodicity the closer they are to irrationals.
With this I can effectively interpret the invariant set as a finite periodic limit cycle, but with very large periodicity. As discussed (albeit briefly) in the essay, the property of non-computability is then replaced by computational irreducibility. None of the key properties which allow me to reinterpret Bell's theorem are lost in going from strict non-computability to computational irreducibility.
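The analogy with periodic decimal expansions can be made concrete with a short sketch (illustrative only) that finds the period of p/q by watching the long-division remainders cycle:

```python
def decimal_period(p, q):
    """Length of the repeating block in the decimal expansion of p/q
    (0 for terminating decimals), found by cycling long-division remainders."""
    r = p % q
    seen = {}
    pos = 0
    while r and r not in seen:
        seen[r] = pos
        r = (r * 10) % q
        pos += 1
    return 0 if r == 0 else pos - seen[r]

print(decimal_period(1, 3))   # 1  : 0.(3)
print(decimal_period(1, 7))   # 6  : 0.(142857)
print(decimal_period(1, 8))   # 0  : 0.125 terminates
print(decimal_period(1, 97))  # 96 : a long period - 'closer to irrational'
```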
With regards
Tim
John Joseph Vastola wrote on May. 7, 2020 @ 19:12 GMT
Very intriguing essay! The central idea - which I understood to be that one might be able to get around Bell's theorem by having aspects of the underlying deterministic theory be uncomputable in a certain precise sense - is very clever. It's much better than a philosophical monstrosity like superdeterminism, too... Still, I admit I did not fully understand all of the technical details. Maybe I will reread it.
Here's a philosophical question, though. There's how the universe 'really is', and there's the collection of things we can ever know about it; these sets are almost certainly not equivalent. If there is some sort of deterministic theory that underlies quantum mechanics, but it has the property that it 'looks' probabilistic to us because of uncomputability etc, why should we prefer the deterministic theory? I guess it's possible that ideas like this could help with unification, but it seems to me necessary that the proposed unification would suggest some experiment that would distinguish between the different possibilities in order for that unification to be useful.
More generally, how can we ever know the 'true' behavior of quantum mechanics, given all these clever alternatives?
John
Author Tim Palmer wrote on May. 7, 2020 @ 20:43 GMT
Dear John
Certainly a new theory of quantum physics should suggest some hopefully experimentally testable differences from quantum theory.
In the technical paper https://royalsocietypublishing.org/doi/full/10.1098/rspa.2019.0350 on which this essay is based, I present some preliminary ideas on possible differences.
Thanks
Tim
Michael James Kewming wrote on May. 13, 2020 @ 00:00 GMT
Hi Tim,
Thank you for writing a very interesting essay! I certainly fell into the category of 'physicist who finds p-adic numbers exotic'. I have never encountered them but am eager to take a bit of a dive into them.
You certainly raise some very interesting points, particularly that undecidability is a property of the underlying state-space of the system and not of the physical process occurring in spacetime. Moreover, this led into a very nice discussion about counterfactuals and free will that I really appreciated.
If I understand correctly, a non-computable theory can violate the Bell inequality. This uncomputable theory is based on fractal attractors which correspond to the possible eigenstates of the state space being observed, i.e. determined by the Hamiltonian?
It's an interesting paradigm and I am eager to read more. Another question I would ask is: how does dissipation alter this paradigm? Does it change the state space, so that the fractal attractors now change to multiple steady-state attractors?
In any case, it was a very thought-provoking essay. I hope you have time to take a look at my essay on noisy machines, which considers the limitations of finite resources in undecidable systems.
Thanks,
Michael
Author Tim Palmer replied on May. 13, 2020 @ 07:48 GMT
Dear Michael
Thanks for your kind comments.
Regarding p-adics, I am reminded of a paper I once read by Herman Bondi, who said that if we were taught special relativity as children in primary school, as adults we would not find things like length contraction and time dilation the least bit strange or unusual. Similarly, I expect that if we were taught p-adic arithmetic in primary school, we would not find p-adic numbers strange or exotic as adults. Peter Scholze, who won the Fields Medal last year, is quoted as saying that he has got so used to p-adics that he now finds the real numbers strange and exotic!
Your letter raises a really interesting and important issue - the role of irreversibility. The fractal attractors I am considering have zero volume and hence zero measure relative to the measure of the Euclidean space in which they are embedded. The classical dynamical systems which generate these attractors asymptotically must therefore be irreversible: start with a finite volume and it shrinks to zero asymptotically.
What is the origin of this irreversibility? In terms of the attractor geometry, the irreversibility could be localised to some small region of state space, such that when the state of the system goes through this region, state-space volumes shrink a bit. In this way, it is possible for the dynamics to be Hamiltonian almost everywhere. But it cannot be strictly Hamiltonian everywhere. It is tempting to suppose that such irreversibility is associated with space-time singularities, but this is merely a conjecture.
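The volume-contraction point above can be made concrete with a toy system (my own illustrative sketch, not from the essay): the Hénon map is a standard dissipative system with a fractal attractor, and its Jacobian determinant is the constant -b, so every state-space area shrinks by the factor b each iterate and the asymptotic attractor has zero Lebesgue measure.

```python
import numpy as np

# Henon map with the classic parameters: a dissipative system whose
# attractor is fractal.  det(Jacobian) = -b at every point, so each
# iterate contracts any state-space area by b = 0.3; asymptotically
# the dynamics collapse onto a zero-measure (fractal) set.
a, b = 1.4, 0.3

def henon(p):
    x, y = p
    return np.array([1.0 - a * x**2 + y, b * x])

def jac_det(p, h=1e-6):
    # central-difference estimate of the Jacobian determinant at p
    J = np.empty((2, 2))
    for j in range(2):
        dp = np.zeros(2)
        dp[j] = h
        J[:, j] = (henon(p + dp) - henon(p - dp)) / (2.0 * h)
    return np.linalg.det(J)

print(jac_det(np.array([0.1, 0.2])))   # approximately -0.3
print(jac_det(np.array([-0.5, 0.4])))  # same value: contraction is uniform
```

The state-independence of the determinant is the key point: the dissipation is the same everywhere for this toy map, whereas the conjecture above localises it to a small region of state space.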
Regarding your essay, I think I am in agreement with your perspective. Although I am claiming that the universe as a whole has these properties of uncomputability, I don't think it makes sense to think of sub-systems of the universe as approximating the properties of the full system in any way at all. In my essay I refer to the inability of the full system to be fully emulated by a sub-system of the full system as Computational Irreducibility - a phrase that I think Stephen Wolfram coined.
Of course it is worth noting that in many practical cases, noise can and should be treated as a positive resource. Personally, I think human creativity arises because the brain has been able to harness noise in this way - please see:
https://www.mdpi.com/1099-4300/22/3/281
Best wishes
Tim
James Arnold wrote on May. 13, 2020 @ 21:50 GMT
Tim, a most sophisticated essay! I can believe that if anyone could accomplish what you've sought, "to provide some basis for believing that these theories [chaos, quantum, and GR] can be brought closer together through the unifying concept of non-computability", you would be the one to do it!
You are no fool on that errand, but regarding chaos, the dependence of a chaotic system on initial conditions, combined with multiple vectors of recurrent interaction, just makes for a recurring deterministic system that may, as you point out, eventually break out into simple (deterministic) turbulence. So chaos: deterministic but not always computable. Quantum theory: un-deterministic but computable at least as a probability. And General Relativity: deterministic and computable. I'm not optimistic.
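The "deterministic but not always computable in practice" point can be illustrated with the textbook logistic map (an illustrative sketch of my own, not from the essay or the comment above):

```python
# Sensitive dependence: the logistic map x -> 4x(1-x) is exactly
# deterministic, yet two initial conditions differing by 1e-12 are
# driven apart at a rate set by the Lyapunov exponent (ln 2 per
# iterate for this map), so any finite-precision forecast eventually
# fails even though the rule itself is trivial to compute.
def logistic(x, n):
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
    return x

for n in (10, 30, 60):
    gap = abs(logistic(0.2, n) - logistic(0.2 + 1e-12, n))
    print(n, gap)  # the gap grows roughly like 1e-12 * 2**n, then saturates
```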
I didn't understand your reason for thinking "there must also be some deterministic framework underpinning quantum physics."
Finally, more in my wheelhouse, you quote R. Kane “one is free when there are no constraints preventing one from doing as one wishes” – a poor definition that doesn’t distinguish between being determined to wish for something and being merely influenced to wish.
Overall, congratulations on an impressive essay.
Author Tim Palmer wrote on May. 14, 2020 @ 07:15 GMT
Dear James
Thanks for your kind comments.
Regarding free will. Bell's Theorem involves a mathematical assumption called Free Choice. I have proposed a revised definition called Free Choice on the Invariant Set. This basically means you can't choose to do things which are inconsistent with the laws of physics (the laws of physics in my proposed model derive from the fractal geometry of the invariant set). Put like this, I hope you will agree that this is not an unreasonable definition. We don't say that we are not free because we can't flap our arms and fly like birds!
However, in this definition one cannot predict ahead of time which choices will violate the laws of physics and which will not - this is linked to the non-computability of the invariant set. I therefore ask what a more operational definition of free will might be that evades this difficulty. The one I propose is such an operational definition. It's one I personally use in my day-to-day life.
If there is one real takeaway message from my essay that I hope will resonate with you, it is that the assumption of rather unrestricted counterfactual definiteness in physics has not been analysed enough. I think this issue should be discussed more in Philosophy of Physics circles. For example, in my essay I give a reason why Lewis's counterfactual theory of causation might be faulty because of an implicit use of Euclidean distance in state space.
Best wishes
Tim
James Arnold replied on May. 16, 2020 @ 12:56 GMT
Tim, does randomness (defined as something uncaused or unprovoked) defy the laws of physics? It is used regularly in quantum physics to describe the unpredictable. I maintain that it is the best explanation for nothing happening at all. I suggest "spontaneity" as an explanation for anything from the quantum level to human inspiration, which by definition exceeds the laws of physics, but is more credible than nothingness.
Author Tim Palmer replied on May. 16, 2020 @ 13:04 GMT
James
Personally, I have considerable difficulty with the concept of randomness in fundamental physics (even though it is an incredibly useful concept for many areas of applied physics). If you give me a bit string 010010....01 that you claim has been generated randomly, I will give you a deterministic rule for generating that same bit string. Now some might say that randomness is informationally incompressible determinism. Well I would say that in practice the two may well be indistinguishable. However, at a fundamental level the latter is generated by a deterministic rule and the former, presumably, is not. In my view the sooner we get back to thinking about physics deterministically (even though it may be computationally irreducible determinism) the better!!
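The bit-string challenge above can be sketched in a few lines (my own toy construction; the "rule" here is of course trivial rather than informationally compressed, which is exactly the point about incompressible determinism):

```python
# Any finite bit string can be reproduced by a deterministic rule:
# encode the string as an integer and read its bits back out.  A
# finite sample therefore cannot certify randomness; at best it can
# certify incompressibility of the generating rule.
def make_rule(bits):
    n = int(bits, 2)          # the "rule" is just this integer...
    width = len(bits)
    def rule(i):              # ...plus deterministic bit extraction
        return (n >> (width - 1 - i)) & 1
    return rule

s = "010010"
rule = make_rule(s)
regenerated = "".join(str(rule(i)) for i in range(len(s)))
print(regenerated)  # -> 010010
```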
Best wishes
Tim
Michael Alexeevich Popov wrote on May. 14, 2020 @ 09:25 GMT
Tim,
My intuition suggests that the Bell theorem could also be formulated as a "UUU mathematical problem" in physics. Hence, is there some so-called "nonclassical tacit math" behind Bell as well?
Thank you for essay.
Michael
Member Emily Christine Adlam wrote on May. 16, 2020 @ 13:09 GMT
This is a really exciting essay; I'm really intrigued by the connections you suggest between quantum mechanics and chaos theory, and am now keen to learn more about this area.
I did have one general question about the motivation for this approach. If I understand you correctly, the idea is that by constraining the state of the universe to evolve on some uncomputable fractal subset of state space, we get a natural way to violate statistical independence (without the denial of free will) and thus we can have violations of Bell's inequality without violations of locality. I wonder, though, why you consider it important to avoid violations of locality? As I understand it, the similarity of Schrodinger's equation to the Liouville equations leads you to consider underlying deterministic dynamics, and then, since Bell's theorem rules out any underlying deterministic local dynamics, you turn to non-computability as a means of violating statistical independence and thus invalidating Bell's theorem. But an alternative possible route would have been to accept the existence of nonlocality and consider how Schrodinger's equation could arise from an underlying deterministic non-local dynamics - is there a specific reason you chose not to go down this route?
I also have some questions about the sense in which 'locality' is preserved by your model. First, consider applying the constraint of evolution on a fractal subset to an indeterministic model. Then if the evolution of the universe is constrained to remain on the fractal subset, it would seem that the (non-deterministic) evolution of the universe at any one spacetime point must depend on the (non-deterministic) evolution of the universe at all points spacelike related to that point, as if the points evolved independently and non-deterministically then it would be possible to go off the fractal subset. So constraining the universe to lie on the fractal subset does not, in the absence of determinism, seem to give us a local theory.

So now consider a deterministic model as you propose. Here the evolution of the universe is fully determined by the initial state (I am assuming here that by 'deterministic' you are referring to initial-value determinism, as the term is commonly used), and so the constraint you suggest comes down to requiring that the universe has a fine-tuned initial state which ensures that its evolution always remains on the fractal subset. But surely if this sort of fine-tuning is allowed then we can quite easily explain nonlocality without needing to appeal to noncomputability or fractal subspaces - i.e. we can encode the choices of measurements and the measurement results for all Bell experiments which will ever be performed directly into the initial state, and thus produce experimental results which appear non-local even though they are in fact produced from local evolution from this fine-tuned initial state. I think most physicists are not keen to adopt this approach to eliminating nonlocality because it seems unreasonably conspiratorial and fine-tuned - do you think your fractal approach gets round this complaint in some way, and if so, how?
I was also interested in the approach you take to recovering 'free will.' The distinction you make between defining free will via counterfactuals vs defining free will as the absence of constraint clearly ties into long-standing arguments in philosophy about the nature of free will, and I think there are indeed good arguments in favour of the latter approach even before one comes to the specific theoretical model that you introduce here - indeed I would be fascinated to read a paper discussing the links between your proposal and the body of philosophical literature on this topic!
Author Tim Palmer replied on May. 16, 2020 @ 13:57 GMT
Dear Emily
Thank you for your interesting and important questions. In replying I need to make sure I don't end up writing another paper!
As far as my motivation is concerned, I did my PhD many years ago in GR (under the cosmologist Dennis Sciama) and in truth the reason why I have got so interested in Bell's Theorem is not because I am interested in Bell's Theorem per se, but rather that I think sorting this out properly is crucial to finding a viable approach to the synthesis of quantum and gravitational physics. These issues of locality vs nonlocality are very subtle: I can violate Bell's Theorem with the postulates "Free Choice on the Invariant Set" and "Factorisation on the Invariant Set". Technically these violate "Free Choice" and "Factorisation" in Bell's Theorem, but I would argue that they do so in a way that is comprehensible from the perspective of space-time causality in GR. To be honest, instead of talking about locality and nonlocality, I would rather ask: Does a nonlinear deterministic fractal state-space approach to quantum physics lead to a better model for synthesising with GR than conventional quantum theory? Well there are certainly some hints that it might!
Regarding fine-tuning, I sometimes use the analogy of the Tardis from the BBC TV series Doctor Who. From the outside it is a tiny 1950s Police Box; from the inside it is an incredibly spacious space-ship. A Cantor Set has a similar property. From the outside it has measure zero with respect to the measure of the embedding Euclidean Space and therefore appears insignificant. However, from the inside it has the cardinality of the continuum and therefore appears incredibly massive and spacious!!
So it all depends how you define this notion of "fine-tuned". From the intrinsic (technically Haar) measure of the invariant set, it is not at all fine tuned. Moreover using the p-adic metric (discussed in my essay), points which do not lie on the invariant set are necessarily distant from points which do, even though from a Euclidean point of view the distance between such points may appear tiny.
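The metric point above can be sketched numerically. This is my own toy version, using the standard 2**(-k) "first differing digit" ultrametric on base-3 digit sequences rather than Tim's actual p-adic construction, but it shares the key property: a point just off the middle-thirds Cantor set (its base-3 expansion contains a 1 early on) is far from Cantor-set points in the digit metric, however tiny the Euclidean gap.

```python
from fractions import Fraction

# Toy ultrametric on base-3 digit sequences: d(x, y) = 2**(-k), where
# k is the index of the first digit at which x and y differ (0 if they
# agree on all computed digits).  Cantor-set points have only digits
# 0 and 2; a nearby off-set point acquires a digit 1 early in its
# expansion and so sits at a comparatively large digit distance.
def base3_digits(x, n):
    digits = []
    for _ in range(n):
        x *= 3
        d = int(x)
        digits.append(d)
        x -= d
    return digits

def ultra_dist(x, y, n=30):
    dx, dy = base3_digits(Fraction(x), n), base3_digits(Fraction(y), n)
    for k, (a, b) in enumerate(zip(dx, dy)):
        if a != b:
            return 2.0 ** (-k)
    return 0.0

x = Fraction(1, 4)            # 0.020202... in base 3: on the Cantor set
y = x + Fraction(1, 3**20)    # Euclidean distance ~ 2.9e-10
print(float(y - x))           # tiny in the Euclidean metric
print(ultra_dist(x, y))       # 2**-18 ~ 3.8e-6: thousands of times larger
```

Exact rational arithmetic (`Fraction`) is used so the digit expansions are computed without floating-point error.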
Finally, you are much more knowledgeable about the philosophy literature on free will than me. I would love to discuss with you how my ideas may or may not connect up to this literature. You will see in the essay that I flagged one of the key ideas in Lewis's theory of counterfactual causality as potentially being incorrect if the metric on state space is not Euclidean. I'm sure there is much to discuss here!!
Best wishes
Tim
Member Tejinder Pal Singh wrote on May. 17, 2020 @ 18:01 GMT
Dear Tim,
It was a pleasure reading your essay and the valuable insights it gives. Would you happen to know whether Connes' non-commutative geometry formalism would classify as uncomputable?
As regards quantum gravity, I beg to submit that I have made important progress recently, and proposed the theory of Spontaneous Quantum Gravity, described for example in my paper
Nature does not play dice at the Planck scale. Independent of anything to do with the present contest, I will value your critique of my theory. I would like to reach out to many physicists with a request to examine this theory.
Many thanks,
Tejinder
Author Tim Palmer wrote on May. 17, 2020 @ 18:33 GMT
Dear Tejinder
Connes comments in his book that non-commutative differential geometry provides a way to represent the measure of fractal sets such as the Julia set. Since these fractionally dimensioned sets are non-computable, non-commutative geometry and non-computable geometries are related.
Hence, perhaps there are some interesting connections to be made between my invariant set model and Connes' non-commutative models of quantum physics.
Having said that, non-commutative geometry is not an area of mathematics of which I have any great knowledge.
I will take a look at your essay. The title sounds very appealing to me!!
Best wishes
Tim
Member Tejinder Pal Singh replied on May. 18, 2020 @ 00:41 GMT
Thanks Tim! This connection between non-computable geometries and non-commutative geometries is extremely interesting! I did not know about it. My new quantum theory of gravity builds on Connes' non-commutative geometry and Adler's trace dynamics.
Best,
Tejinder
Torsten Asselmeyer-Maluga wrote on May. 18, 2020 @ 21:17 GMT
Dear Tim,
what a wonderful essay and it rings a bell.
For eight years now I have studied general fractals (better known as wild embeddings) to get a spacetime representation for quantum states. You also discussed this interesting relation, and you are right: there is a direct link between such objects and general relativity as well. The space of leaves of a foliation is also a non-commutative geometry (a la Connes), and this space is related to wild embeddings (or fractals) as well.
Amazingly, these connections are naturally related to the structure of spacetime (I mean the 4-manifold). If you choose another differential structure, you automatically get these connections. But enough for now.
I enjoyed reading your essay very much and gave it the highest score.
Maybe you are also interested to read my essay. Because of Corona, I was too late this year....
Best wishes
Torsten
Jonathan J. Dickau wrote on May. 18, 2020 @ 23:23 GMT
Hi Tim,
I read your essay a while back and have not seen any comment from you on mine. Since there is some overlap in our topic emphases, it would be nice to have your opinion.
All the Best,
Jonathan