CATEGORY:
The Nature of Time Essay Contest (2008)
[back]
TOPIC:
Lessons from the Block Universe by Ken Wharton
[refresh]
Login or
create account to post reply or comment.
Ken Wharton wrote on Nov. 21, 2008 @ 10:16 GMT
Essay AbstractOur time-asymmetric intuitions make it difficult to be objective when considering the nature of time. But these difficulties can be overcome by using the framework of the "block universe", where every event is mapped onto a static, four-dimensional structure. In this perspective, time is represented as a spatial dimension, so the block universe can never "change"; there is no additional time dimension for such a concept to even make sense.
This essay argues that the block universe is by far the best framework for physical theories, as general relativity is simply incompatible with any alternative. The only part of physics that does not fit into such a "block" picture is quantum theory, as it was not originally developed in a block-universe framework. But far from implying that the block universe is incorrect, I argue that we can instead use lessons from the block universe to reconstruct quantum theory in a manner compatible with general relativity.
The essay then outlines how this might be accomplished. A block-universe quantum wavefunction must be represented in four-dimensional space-time, so the usual higher-dimensional "configuration space" is critically examined. The block universe view reveals that the extra information encoded in these higher dimensions is not actually needed, because all possible measurements do not occur on any given system. The need to "discard" the excess information in turn implies that every quantum system must solve a four-dimensional boundary value problem. Interestingly, this is also an approach that solves other outstanding interpretational problems from quantum theory, including the "collapse" of the wavefunction. Taking this research path would require a radical revision of nearly all aspects of quantum theory, but also promises to reshape our understanding of the nature of time.
Author Bio
Ken Wharton is a physics professor at San Jose State University. After attending Stanford (BS, Physics, 1992) and UCLA (PhD, Physics, 1998), he joined the Department of Physics and Astronomy at SJSU in 2001. While originally an experimental laser physicist, he is now a full-time quantum theorist who is actively pursuing the research program outlined in this essay. He has also been known to occasionally publish "hard" (scientifically-accurate) science fiction stories, including a novel that won the Special Citation for the 2001 Philip K. Dick Award.
Download Essay PDF File
T H Ray wrote on Nov. 22, 2008 @ 15:28 GMT
"...there's no objective way to distinguish an initial boundary condition from a final boundary condition without resorting to our time-asymmetric intuitions that don't apply in a block universe."
Nice.
This idea of "introcausality" just may hold the key to the preservation of continuous function physics in a way tractable to machine-computing algorithms, i.e., finite methods.
Intriguing insight, with a clearly defined research path. Thanks, Ken.
Tom
Peter Morgan wrote on Nov. 23, 2008 @ 14:31 GMT
Hi, Tom. Ken's arXiv:0706.4075 was mentioned to me on Friday by David Miller in Sydney, so finding this FQXi essay hours later was a surprise to me, both in timing and, by comparison, in content. In response to your comment, I guess I don't yet see that Ken's research path is clear.
Hi, Ken. I'm sorry to say that I have all sorts of trouble with details of your essay. I've been doing...
view entire post
Hi, Tom. Ken's arXiv:0706.4075 was mentioned to me on Friday by David Miller in Sydney, so finding this FQXi essay hours later was a surprise to me, both in timing and, by comparison, in content. In response to your comment, I guess I don't yet see that Ken's research path is clear.
Hi, Ken. I'm sorry to say that I have all sorts of trouble with details of your essay. I've been doing mathematics of quantum fields and of random fields for some time, a small part of which is outlined in my FQXi essay. The mathematics is avowedly set against a block-world Minkowski space -- I s'pose anyone who works in QFT would be s'prised to be told they ain't working in terms of block world /models/.
I note that discussions about placing initial and final conditions on our models are academic, insofar as we only have empirical data about the world-tube in the universe that our world (or, more solipsistically but in a similar world-tube sense, my body) occupies in the block-world. And, at any moment of our effective psychological present, our data is limited to what we actually have gathered and recorded in the past.
I could take issue with almost every paragraph of your paper, but I will instead pick your last paragraph's comment that "physicists would be better off using the block universe to re-envision quantum theory from scratch". This is part of what I would say I have tried to do (you will judge to what extent I've succeeded), however I have no ontological commitment to the block world, it is merely a useful way to organize our empirical data (which is always only about the past, albeit the past is always increasing, and of limited detail), and our approximate predictions about the empirical data we will gather in the future.
Over-briefly, I will say of one detail that your strictures about particle configuration space in quantum mechanics do not apply once we model the world using quantum or random fields, at least on my essentially statistical field-theoretic interpretation. Additionally, however, as soon as we speak of statistics in Physics we have to identify an ensemble, for which we have to identify distinct regions of space-time as measured to be similar enough for them to be put together as an ensemble, but measured to be different enough for the statistics to be nontrivial (interesting and/or useful is good too). This is hard enough if we use a Minkowski space background, but much easier then than if we run experiments against a dynamical curved space background. The point for your view is, I think, that insofar as the experimental verification of a model requires ensembles and statistics, this requires more mathematical structure than a block world has traditionally provided.
To put what I think is a significant question as well as my position rudely, do you see your block world as an ontology or as an effective ordering principle for empirical data? You put your heart on your sleeve for a block world, but to me you don't give a strong enough sense of your philosophical intentions. Feel free to reply expansively; it's your comment thread.
Despite the above, I have commented here because of some common feeling with your essay.
view post as summary
T H Ray wrote on Nov. 23, 2008 @ 15:41 GMT
Hi Peter. You write "Hi, Tom. Ken's arXiv:0706.4075 was mentioned to me on Friday by David Miller in Sydney, so finding this FQXi essay hours later was a surprise to me, both in timing and, by comparison, in content. In response to your comment, I guess I don't yet see that Ken's research path is clear."
It's clear to me.
Ken is quite correct that quantum mechanics did not evolve from a block universe model. The relativistic block universe came to us mathematically complete; quantum mechanics was knitted from experimental results. The addition of field theoretical quantum physics (e.g., your model) attempts to restore continuous function analysis to discrete phenomena.
I doubt that the David Miller you mention is the Karl Popper protege and scholar at Warwick U., U.K., with whom I am acquainted, but if he were, he should appreciate Ken's bold conjectural approach toward this problem. I know I do, and although Ken can speak for himself, if it's philosophical intentions you demand, I stand firmly in the Popper camp.
The research direction in Ken's work that I find clear--as I stated--is the possibility of strict computability, i.e., of an algorithm to model discrete phenomena that explains why the universe appears to obey continuous functions.
Tom
Peter Morgan wrote on Nov. 23, 2008 @ 19:26 GMT
Tom, I agree that QM didn't evolve from a block universe model. I think, and I think you accept in your comment, that relativistic quantum fields do, now, largely adopt a block world ground. You know, but Ken presumably doesn't yet, that I work with continuous models not because it's how I think the world must be, but only because /I/ find it convenient, as of now, to do so, even though the finite number of finite accuracy measurements that we can make and record cannot possibly justify a continuous model.
I guess the David Miller I mentioned appreciates something about Ken's work, at least in relation to mine, because he was reminded of Ken's work, and suggested it to me, upon seeing my FQXi essay, which I had asked him and a few others in Sydney to look at. This is a David Miller who wrote "Realism and time symmetry in quantum mechanics", Phys. Lett. A222, 31-36(1996). His web-page in Sydney is at
http://www.usyd.edu.au/time/people/miller.htm.
I appreciate the empiricist sentiment that citing Popper claims, but most Physicists and Philosophers of Physics are influenced enough by the devastating mid-century critiques of positivism that it's important to know in what ways they accommodate those critiques. Claiming to be Popperian no longer adequately informs us of your point of view.
I perhaps fail clearly to discern Ken's research path more because of the pairing of Ken's FQXi essay with his arXiv:0706.4075 than because of either of them taken separately. I worry that a block world structure is not a sufficiently strong guiding principle by itself for constructing a new mathematics, and I don't see clearly what other mathematical or physical principles Ken advocates.
Mark Stuckey wrote on Nov. 23, 2008 @ 19:52 GMT
Great essay, Ken. As you know from our meeting at Perimeter this fall, we agree on the use of a blockworld for fundamental physics. I’ve two comments/questions:
“Doesn't this imply that we need to come up with a more expansive view of space-time that is somehow compatible with both quantum theory and relativity? Balderdash. … GR is the correct tool to ask questions about space and time.”
GR has (at least one) temporal pathology, namely closed time-like curves (CTCs) allowing for self-inconsistency, e.g., a particle looping around a CTC segment so that it strikes itself before it entered the CTC segment, thereby keeping it from entering the segment to begin with. How do you propose GR be modified to rectify the existence of such CTCs and how is this incorporated in your formalism?
“So far, the closest approach to the block universe is the ‘de Broglie-Bohm Interpretation’.”
The Relational Blockworld (Foundations of Physics 38, No. 4, 348 – 383 (2008), quant-ph/0510090) is consistent with your argument that blockworld time be made compatible with quantum physics. The main difference between your approach and RBW is that RBW is fundamentally probabilistic so we don’t have “to reinvent every single piece of quantum theory in a block universe framework.” On the contrary, quantum physics as it stands makes perfect sense in RBW. [Our essay will be posted this week.]
A BW kindred spirit,
Mark
Dr. E (The Real McCoy) wrote on Nov. 23, 2008 @ 20:44 GMT
"Gradually the conviction gained recognition that all knowledge about things is exclusively a working-over of the raw material furnished by the senses. ... Galileo and Hume first upheld this principle with full clarity and decisiveness." --(Albert Einstein, Ideas and Opinions)
Hello Ken,
You write, "Our time-asymmetric intuitions make it difficult to be objective when considering the...
view entire post
"Gradually the conviction gained recognition that all knowledge about things is exclusively a working-over of the raw material furnished by the senses. ... Galileo and Hume first upheld this principle with full clarity and decisiveness." --(Albert Einstein, Ideas and Opinions)
Hello Ken,
You write, "Our time-asymmetric intuitions make it difficult to be objective when considering the nature of time. But these difficulties can be overcome by using the framework of the "block universe", where every event is mapped onto a static, four-dimensional structure. In this perspective, time is represented as a spatial dimension, so the block universe can never "change"; there is no additional time dimension for such a concept to even make sense."
Actually Godel had a huge problem with the block universe, as it implies time travel while denying the flow of time. Godel pointed out the paradoxical "timeless" implications of the block universe, as well as its inability to account for time as we experience it, and this problem has largely been swept under the rug, along with curiosities such as quantum entanglement, nonlocality and all the dualities--space/time, energy/mass, and wave/particle. Today we are told that that is "just the way things are" and not to worry about it. Perhaps this helps explains why physics has not really advanced in the past thirty years... for Einstein stated, "curiosity is more importnat than knowledge."
The block universe is an human-constructed artifact of certain interpretations of relativity, as physicists glossed over the fact that x4 or "ict" is very different from the three spatial dimensions, x1, x2, x3. But Einstein and Minkowski had it right there in Einstein's 1912 manuscript: x4 = ict. Ergo, if time moves, so must x4. My paper discusses the fourth expanding in far more detail.
Time as an Emergent Phenomenon: Traveling Back to the Heroic Age of Physics by Elliot McGucken
--http://fqxi.org/community/forum/topic/238
Yes--this
block time paradox/problem was swept under the rug on many levels, as well as the EPR paradox, and it is great that fqxi allows a forum to discuss such curious phenomena of our physical reality. MDT's simple principle, celebrating a hitherto unsung universal invariant--the fourth dimension is exapnding relative to the three spatial dimensions--provides a physical model liberating us from Godel's block universe while also accounting for the "spooky" action at a distance in the EPR Paradox.
You should read A World Without Time, by Palle Yourgrau
"For Godel, if there is time travel, there isn't time. The goal of the great logician was not to make room in physics for one's favorite episode of Star Trek, but rather to demonstrate that if one follows the logic of relativity further even than its father was willing to venture, the results will not just illuminate but eliminate the reality of time." -A World Without Time, Palle Yourgrau"
MDT posits that time travel into the past is not possible, as the past does not physically exist--an observation in line with all empirical observations. MDT chooses Godel, Einstein, and Minkowski over Star Trek.
You write, "This essay argues that the block universe is by far the best framework for physical theories, as general relativity is simply incompatible with any alternative."
General Relativity is completely compatible with MDT, as MDT's physical reality underlies all of relativity--indeed, relativity is derived from MDT in my paper. All of quantum mechanics is also completely compatibel with MDT.
You write, "The only part of physics that does not fit into such a "block" picture is quantum theory, as it was not originally developed in a block-universe framework. But far from implying that the block universe is incorrect, I argue that we can instead use lessons from the block universe to reconstruct quantum theory in a manner compatible with general relativity." Quantum gravity exists neither in reality, nor in any consistent theory.
You write, "Balderdash. Looking to quantum theory for answers about space-time is like looking to a roadmap for answers about geology: it's a tool designed for something else entirely."
Every physical measurement made of physical reality is governed by quantum mechanics. So please do not throw out all empirical evidence in contemplating the physical nature of time.
You say, "The solution to this dilemma is not to jettison the block universe; without the block universe we would never be able to make sense of relativity."
Actually, my essay liberates us from the block universe, while providing a *physical* foundation for relativity, while unfreezing time and providing a *physical* mechanism for entropy, nonlocality, quantum entanglement, all the dualities--wave/particle, space/time, mass/energy--and time and all its arrows and assymetries across all realms, as well as the pervasiveness of Huygens' Principle.
You write, "Instead, the solution is to reinvent every single piece of quantum theory in a block universe framework. It's a daunting task, but I'll outline how it might be done after I discuss why quantum theory can't just be "tweaked" into a block universe framework." Is this not what the quantum gravity regimes and string theorists have spent hundreds of millions of dollars trying to accomplish, with naught to show for it? As MDT shows that the block universe does not exist, there is no longer any need to send postdocs and graduate studnets dashing down dead-end roads. Indeed, we live in a strange era where physicits try to advance physics by paying other people to work out the details on their non-theories, raising funding by merely promising that they are on to something big, which oft neglects physical reality as a foundational premise.
You write, "So what happens to the "wasted" information in ø for those unperformed measurements? Well, in the Copenhagen interpretation, much of that information gets erased forever thanks to the "collapse" of the wavefunction. At this point one might ask: What is the point of using a high-dimensional configuration space to encode extraneous information that just gets erased anyway?"
By conducting physics in realms safe from measurement, as well as simple logic and reason and physical reality, one is generally guaranteed a lifetime of funding--a theorem proven time and again by string theory and other anti-theories.
You write, "Counterintuitive though this may be, our intuitions about time are notoriously unreliable. We need to free our intuition from time itself, taking all of our lessons from the static block universe."
Einstein would disagree--"The only real valuable thing is intuition."--Albert Einstein.
To reject *physical* intuition and replace it with the nonsensical block universe MDT does away with seems to go exactly against the spirit by which physics has ever advanced, according to Galileo, Einstein, and other noble physicists.
It seems a preposterous conclusion that quantum mechanics, which works so very well, must be thrown out and reformulated for something which MDT shows there is no need for--the block universe.
"In the long run my observations have convinced me that some men, reasoning preposterously, first establish some conclusion in their minds which, either because of its being their own or because of their having received it from some person who has their entire confidence, impresses them so deeply that one finds it impossible ever to get it out of their heads. Such arguments in support of their fixed idea ... gain their instant acceptance and applause. On the other hand whatever is brought forward against it, however ingenious and conclusive, they receive with disdain or with hot rage - if indeed it does not make them ill. Beside themselves with passion, some of them would not be backward even about scheming to suppress and silence their adversaries. I have had some experience of this myself. ... No good can come of dealing with such people, especially to the extent that their company may be not only unpleasant but dangerous."--(Galileo Galilei)
"my dear Kepler, what do you think of the foremost philosophers of this University? In spite of my oft-repeated efforts and invitations, they have refused, with the obstinacy of a glutted adder, to look at the planets or Moon or my telescope." --Galileo Galilei
We must forver keep physical reality in the front and center, along with logic and reason and *physical* intuition--otherwise progress in physics will grind to a halt, as it has for the past thirty years.
"But before mankind could be ripe for a science which takes in the whole of reality, a second fundamental truth was needed, which only became common property among philosophers with the advent of Kepler and Galileo. Pure logical thinking cannot yield us any knowledge of the empirical world; all knowledge of reality starts form experience and ends in it. Propositions arrived at by purely logical means are completely empty as regards reality. Because Galileo saw this, and particularly because he drummed it into the scientific world, he is the father of modern physics -- indeed, of modern science altogether." --Albert Einstein, Ideas and Opinions
view post as summary
Michael wrote on Nov. 24, 2008 @ 07:50 GMT
Hi Ken,
This is in answer to your question how RBW differs from your view. First see our essay soon to be posted. Second, read the following:“Why Quantum Mechanics Favors Adynamical and Acausal Interpretations such as Relational Blockworld over Backwardly Causal and Time-Symmetric Rivals” in a focus issue of Studies in the History and Philosophy of Modern Physics on time-symmetric approaches to quantum mechanics edited by Huw Price, Volume 39, Issue 4, pp. 732-747. M. Silberstein, M. Cifone and M. Stuckey. I'll try to attach it.
The general answer however is this: while, like yourself, we take blockworld (BW) as an essential feature of interpreting QM, we don't need to revamp QM, e.g., replace the Schrodinger equation with the Klein-Gordon equation. In the aforementioned paper we argue that retrocausal accounts of QM do not take acausal and adynamical thinking seriously enough and that their retrocausal devices amount to little more than a veiled assertion that in the BW the outcomes of QM experiments are already "there." For example, even Price admits that all such causal talk (retro or otherwise) is merely perspectivial. So the first problem is how to invoke BW in a non-trivial explanatory fashion. We show how to do this with an adynamical and acausal explanation that is fundamental to any dynamical explanation of QM and it involves BW in an essential fashion. Second problem, we show that the experimental set-up known as quantum liar experiment (QLE) is fatal for any purely dynamical time-like or retrocausal account that purports to save locality, while RBW has no problems.
In addition, RBW fully resolves the measurement problem and is fully compatible with special relativity (SR) as it is local while being non-separable and requires no FTL influences or action-at-a-distance. Perhaps most importantly of all, as the essay will elaborate, RBW leads to a completely unique solution to the problem of quantum gravity with profound implications for the various problems of time.
One last minor point. I think you might want to sharpen your claim that QM and BW are inherently incompatible. You seem to think that BW ENTAILS that there is only one outcome for every experiment in M4, while the Hilbert space of QM demands otherwise. But surely this isn't true,after all, Saunders and other Everettians defend the "QM block world" wherein all the outcomes exist in a BW setting. It's true that they must explain why it appears that there are only 3 spatial dimensions or why these 3 dimensions of space emerge from the more fundamental Hilbert space, but there are many such programs. Furthermore, the Everett interpretation squares perfectly with SR and locality. The burden of establishing comparative advantage is especially high for you given your need to radically revise QM.
Another kindred BW spirit.
Michael
attachments:
SHPMP557.pdf
Ken Wharton wrote on Nov. 24, 2008 @ 14:27 GMT
Tom,
Thank you for the kind words; I'm glad you found the essay stimulating. As for your discussion with Peter as to whether there is a "clear research path", you're right that certain paths forward are certainly clear, and he's right that this essay (and the arXiv paper) don't exactly make it clear *which* research paths I'm advocating. There are a lot of paths forward, and I'm still not certain which ones are most promising. After all, when going all the way back to 1927 and changing all these fundamental assumptions, the amount of work that needs to be done just to recover known experimental results is truly daunting. (Of course, I have my opinions on how best to proceed, but more on that some other time...)
Concerning your interest in the computability aspect, there's both good news and bad news. The good news (that you point out) is that there's actually something to compute: systems of well-defined equations and well-defined boundary conditions, with solutions of easy-to-interpret classical field values at given points in space-time. The bad news is that by imposing different portions of the boundary conditions at different times, the usual computational technique of starting with the initial solution and then incrementally calculating subsequent time-steps will no longer work. The whole 4D system needs to be solved globally, like a 3D spatial boundary problem. And when one can't solve the equations exactly, it's not at all clear how to proceed computationally. (Ideally, I'd like to "push" the computational uncertainties toward the center of the 4-volume, away from the boundaries.) Any insight on this issue would certainly be appreciated.
Best,
Ken
Ken Wharton wrote on Nov. 24, 2008 @ 14:28 GMT
Peter,
Thanks for your interest -- and no, I certainly don't consider it rude to ask me about my ontology! (More on this in a sec.) By the way, I had already read your recent arXiv paper, and I had been planning to introduce myself once I read up on random fields. This week I'll head over to your own essay and see what I can find out...
>I s'pose anyone who works in QFT would be...
view entire post
Peter,
Thanks for your interest -- and no, I certainly don't consider it rude to ask me about my ontology! (More on this in a sec.) By the way, I had already read your recent arXiv paper, and I had been planning to introduce myself once I read up on random fields. This week I'll head over to your own essay and see what I can find out...
>I s'pose anyone who works in QFT would be s'prised to be told they ain't working in terms of block world /models/.
In that case, what's the difference between classical fields, random fields, and quantum fields? As I see it, the moment one takes the classical meaning of the x-coordinate and turns it into an *operator*, one has left the block universe behind. Furthermore, any QFT theorist would admit that many interpretational questions are shunted down to non-relativistic quantum mechanics. (If it were otherwise, QFT theorists could answer all the outstanding quantum mysteries just by taking the non-relativistic limit of QFT.)
>I note that discussions about placing initial and final conditions on our models are academic, insofar as we only have empirical data about the... past.
What about a double-slit experiment that happened back in 2003? Why shouldn't we be able to go back and apply the empirical data as boundary conditions on both the state-preparation stage and the data-acquisition stage of the experiment? That would be a final boundary condition on the intermediate system, but it's all still in the past. (And if that's okay, then we could certainly imagine doing the same thing for possible future outcomes of future experiments as a computational tool for making predictions.)
>insofar as the experimental verification of a model requires ensembles and statistics, this requires more mathematical structure than a block world has traditionally provided.
I wasn't trying to imply that we can get *all* of our answers from the block universe, any more than Einstein was able to derive GR from just the equivalence principle. But I think we can certainly include ensembles and statistics in a block universe; see my next post for more detail.
>do you see your block world as an ontology or as an effective ordering principle for empirical data?
I see the block universe as a framework that is a logical necessity for any physics model that includes space and time. So in that sense it's an ordering principle, but I'm applying it to more than just empirical data. After all, I want to be able to say what happens between quantum measurements, at all points in space-time. The ontology I'm proposing is simply that of classical fields, where all the "quantum weirdness" comes about from the closed-hypersurface boundary conditions on those fields (and a probability that is applied to the whole hypersurface, not just the outcome).
>I worry that a block world structure is not a sufficiently strong guiding principle by itself for constructing a new mathematics, and I don't see clearly what other mathematical or physical principles Ken advocates.
Your worry is well-founded; the block universe is certainly not enough by itself. But in addition to this framework, along with insights concerning space-time from general relativity, the principle that has guided me this far is simply this:
Fundamental physics proposes an underlying ontology to explain empirical data. Einstein taught us the importance of making sure the ontology is consistent when the same empirical data are viewed from different reference frames. I simply want to extend this consistency requirement to time-reversed reference frames.
Every single standard interpretation of quantum mechanics fails on this count; they all give *fundamentally different explanations* for the same empirical data, if simply viewed in the opposite temporal order. Fixing this problem is a very demanding principle, that leads almost inevitably to the type of conclusions that I am drawing. However, I realize that few people feel this type of symmetry is important, so the essay and the arXiv paper each explore different motivations that reach the same conclusions.
Best,
Ken
view post as summary
Ken Wharton wrote on Nov. 24, 2008 @ 14:30 GMT
Hi Mark,
I must apologize for not having written since we met last month; I have your papers on top of a stack of must-read items, but haven't yet been able to devote the necessary time to them this semester. Soon, I promise.
>GR has (at least one) temporal pathology, namely closed time-like curves (CTCs) allowing for self-inconsistency, e.g., a particle looping around a CTC segment so that it strikes itself before it entered the CTC segment, thereby keeping it from entering the segment to begin with.
I think quantum effects save the day here, for two different reasons: one is that we need to replace classical particles with fields, so there's so such thing as an "all or nothing" trajectory (didn't Feynman work out an example like this with a light switch?). The other is that there's no way to prepare the initial state with sufficient accuracy to cause this precise dilemma. In fact, as I see it, the logical necessity of uncertainty principle is that it prevents exactly this sort of paradox. (And these paradoxes would no longer require CTCs if one takes an introcausal perspective where the future is always affecting the past.)
>The main difference between your approach and RBW is that RBW is fundamentally probabilistic so we don’t have “to reinvent every single piece of quantum theory in a block universe framework.”
I wasn't trying to lump in RBW in with the "established" interpretations; more on RBW after you post your essay.
But I will say that I've given a great deal of thought to how a theory can be "fundamentally probabilistic" and still work in a block universe. I do think it can be done, but not in the way that standard QM's Born Rule is probabilistic: those are *outcome* probabilities, which is a concept of dubious validity in a block universe. The key, I think, is to define probabilities over parameters that are unknown but well-defined. Then, once you know everything, all those probabilities can converge to 1 or 0 (in a block universe, any given event either happens or it doesn't). This is also deeply consistent with a Bayesian interpretation of probability, where probability is just a measure of belief based on available information, not anything fundamental that is somehow "out there" in the static block universe.
Cheers,
Ken
Ken Wharton wrote on Nov. 24, 2008 @ 14:30 GMT
Dr. E,
From your post, and from other very similar posts you've written on other threads, I take it that we are coming at this issue from diametrically opposite philosophical perspectives: you're railing against the block universe, while the block universe is central to my thinking. Apparently the detailed arguments for such a view in my paper have not swayed you, and your post has not swayed me. Perhaps we'll just have to agree to disagree.
Best,
Ken
Ken Wharton wrote on Nov. 24, 2008 @ 14:31 GMT
Hi Michael,
I'm looking forward to your essay... I'll try to carefully read through your papers this week as well.
>...their retrocausal devices amount to little more than a veiled assertion that in the BW the outcomes of QM experiments are already "there."
Wait a sec -- surely if you are using a block picture you must agree that the outcomes are, in a timeless sense, "already there"? I hope your point here is that other approaches merely assert the outcome without giving any tool to determine that outcome's relative likelihood. If so, I have such a tool; a probability measure of the entire hypersurface boundary; it's in the arXiv paper.
>Saunders and other Everettians defend the "QM block world" wherein all the outcomes exist in a BW setting.
I guess I merely dismissed such a picture in my essay without going into details... But I refuse to accept that any Everettian picture is compatible with a block universe until I see their version of general relativity. It would have to explain exactly how all these universes are connected together, something they seem to avoid pinning down. (Actually, I admit that I still probably wouldn't accept it, even then, because it would still violate the guiding principle I spelled out at the end of my earlier response to Peter.)
To me, Everett's Many Worlds Interpretation is the poster child for how awkward it can be to extrapolate non-block-universe concepts to their logical conclusions. Best to start off with block-universe-compatible concepts in the first place.
Cheers,
Ken
Peter Morgan wrote on Nov. 24, 2008 @ 18:01 GMT
Interesting responses to all your commenters. I feel clearer, anyway.
"what's the difference between classical fields, random fields, and quantum fields"
Between the first and the second, classical fields don't work well with probability: if we introduce thermal fluctuations, when we measure the field it will almost certainly be discontinuous (akin to the discontinuous paths of Brownian motion). To accommodate modern physics experiments, however, we *have to have probability* in the mix, which I would /prefer/ to have well-defined. Continuous random fields are the nicest mathematics for introducing probability into a block world of classical fields (at least, I think there's no contest from stochastic methods, such as are used in Stochastic ElectroDynamics, in GRW-type reduction mechanisms, or in Langevin-type equations). Perhaps, although here I am out of my mathematical area, my methods could be called a block world approach to the Fokker-Plank equation.
Continuous random fields are very close to quantum fields, but they are different in commuting instead of being non-commuting at time-like separation. That of course leads to a different measurement theory, which we certainly have to discuss carefully, but so much stays the same that we can feel relatively comfortable with the transition from QFT to continuous random fields. This seems a strong merit for random fields as a mathematics that moves us away from QFT, even if it's only a transition to a better mathematics for fundamental physics.
x,y,z,t are coordinates in QFT and for continuous random fields. The field is an operator-valued distribution, but QFT and random fields are set against a classical manifold. That changes if one moves to the mathematics of non-commutative geometry, but that's not the standard model.
I feel ambivalent about asserting a block world ontology for future events, insofar as I cannot experiment on the future, confined as I am to my psychological present, but a model has to model, aka predict, something about the future.
I'm also looking forward to the RBW essay.
Dr. E (The Real McCoy) wrote on Nov. 24, 2008 @ 20:59 GMT
Thanks Ken!
You write, "From your post, and from other very similar posts you've written on other threads, I take it that we are coming at this issue from diametrically opposite philosophical perspectives: you're railing against the block universe, while the block universe is central to my thinking. Apparently the detailed arguments for such a view in my paper have not swayed you, and your...
view entire post
Thanks Ken!
You write, "From your post, and from other very similar posts you've written on other threads, I take it that we are coming at this issue from diametrically opposite philosophical perspectives: you're railing against the block universe, while the block universe is central to my thinking. Apparently the detailed arguments for such a view in my paper have not swayed you, and your post has not swayed me. Perhaps we'll just have to agree to disagree."
There are vast problems with the block universe, including the fact that implies time travel, while offering no mechanism for the arrows of times, nor the assymetries of time. Also, the block universe freezes time, while robbing us of our free will.
These problems are not my mere "opinion," but rather they are realities noted by great minds including Kurt Godel and R. P. Feynman.
MDT resolves these problems, not by ignoring them and glossing over them, but by presenting a hitherto unsung universal invariant--the fourth dimension is expanding relative to the three spatial dimensions at c, distributing locality and fathering time: dx4/dt=ic.
Indeed, MDT finally provides, in Feyman's words, "the thing that makes the whole phenomena of the world seem to go one way."
While begin your essay with, "Our time-asymmetric intuitions make it difficult to be objective when considering the nature of time," Feynman instead embraces physical reality, and sees the ubiquitous presence of time's arrows and assymtries--and MDT agrees with Feynman, that we ought not ignore nature's *physical* reality, and that we need to find "the thing that makes the whole phenomena of the world seem to go one way."
"All knowledge of reality starts form experience and ends in it." --Einstein
And that "thing" is the invariant expansion of the fourth dimension--dx4/dt=ic.
Feynman stated, "Now if the world of nature is made of atoms, and we too are made of atoms and obey physical laws, the most obvious interpretation of this evident distinction between past and future, and this irreversibility of all phenomena, would be that some laws, some of the motion laws of the atoms, are going one way – that the atom laws are not such that they can go either way. There should be somewhere in the works some kind of principle that uxles only make wuxles, and never vice versa, and so the world is turning away from uxley character to wuxley character all the time – and this one-way business of the interactions of things should be the thing that makes the whole phenomena of the world seem to go one way. But we have not found this yet. That is, in all the laws of physics that we have found so far there does not seem to be any distinction between the past and the future. The moving picture should work the same going both ways, and the physicist who looks at it should not laugh."--(The Distinction of Past and Future, from The Character of Physical Law, Richard Feynman, 1965)
MDT also resolves the problems Godel had with a 4D block universe, which allowed time travel, while disallowing the flow of time. Time travel is impossible, because time, as measured on our watches, is an emergent phenomena that arises becase the fourth dimension is expanding relative to the three spatial dimensions at c, or dx4/dt=ic. In his 1912 Manuscript on Relativity, Einstein never stated that time is the fourth dimension, but rather he wrote x4 = ict. The fourth dimension is not time, but ict. Despite this, prominent physicists have oft equated time and the fourth dimension, leading to un-resolvable paradoxes and confusion regarding time’s physical nature, as physicists mistakenly projected properties of the three spatial dimensions onto a time dimension, resulting in curious concepts including frozen time and block universes in which the past and future are omni-present, thusly denying free will, while implying the possibility of time travel into the past, which visitors from the future have yet to verify. Beginning with the postulate that time is an emergent phenomenon resulting from a fourth dimension expanding relative to the three spatial dimensions at the rate of c, diverse phenomena from relativity, quantum mechanics, and statistical mechanics are accounted for. Time dilation, the equivalence of mass and energy, nonlocality, wave-particle duality, and entropy are shown to arise from a common, deeper physical reality expressed with dx4/dt=ic. This postulate and equation, from which Einstein’s relativity is derived, presents a fundamental model accounting for the emergence of time, the constant velocity of light, the fact that the maximum velocity is c, and the fact that c is independent of the velocity of the source, as photons are but matter surfing a fourth expanding dimension. In general relativity, Einstein showed that the dimensions themselves could bend, curve, and move. The present theory extends this principle, postulating that the fourth dimension is moving independently of the three spatial dimensions, distributing locality and fathering time. This physical model underlies and accounts for time in quantum mechanics, relativity, and statistical mechanics, as well as entropy, the universe’s expansion, and time’s arrows and assymetries.
Godel had a huge problem with the block universe, as it implies time travel while denying the flow of time. Godel pointed out the paradoxical "timeless" implications of the block universe, as well as its inability to account for time as we experience it, and this problem has largely been swept under the rug, along with curiosities such as quantum entanglement, nonlocality and all the dualities--space/time, energy/mass, and wave/particle. Today we are told that that is "just the way things are" and not to worry about it, nor ask foundational questions. Perhaps this helps explain why physics has not really advanced in the past thirty years... for Einstein stated, "curiosity is more important than knowledge."
"For Godel, if there is time travel, there isn't time. The goal of the great logician was not to make room in physics for one's favorite episode of Star Trek, but rather to demonstrate that if one follows the logic of relativity further even than its father was willing to venture, the results will not just illuminate but eliminate the reality of time." -A World Without Time, Palle Yourgrau"
Too many modern physicists resolve glaring problems these days by completely ignoring them. I think FQXI is sensitive to this--the fact that foundational questions are largely ignored by the modern academy; while those who ponder them are too often snarked.
"Forget time," we are told, while at the same *time* funding is sought for throwing out quantum mechanics and reality, so as to reformulate physics based on the unreal--the human construct of a block universe, while ignoring the real--the ever-flowing movement of time, which emerges because the fourth dimension is expanding relative to the three spatial dimensions at c.
Best,
Dr. E (The Real McCoy)
view post as summary
Mark Stuckey wrote on Nov. 25, 2008 @ 02:51 GMT
Thanks for your response, Ken. Hope you’re willing to continue this thread until I understand your position.
“I think quantum effects save the day here, for two different reasons: one is that we need to replace classical particles with fields, so there's so such thing as an "all or nothing" trajectory (didn't Feynman work out an example like this with a light switch?). The other is that there's no way to prepare the initial state with sufficient accuracy to cause this precise dilemma. In fact, as I see it, the logical necessity of uncertainty principle is that it prevents exactly this sort of paradox.”
I’m talking about classical objects, so the DeBroglie wavelengths are much smaller than the objects themselves. You’re not suggesting that such objects be modeled as and exhibit wave characteristics, are you?
Regarding your second point, are you claiming that a classical object won’t follow the self-inconsistent CTC simply because that path is highly improbable? If so, what makes it more improbable than any other? What happens when an object follows this improbable path? Or, do you mean impossibility rather than improbability? If so, what makes it impossible?
“(And these paradoxes would no longer require CTCs if one takes an introcausal perspective where the future is always affecting the past.)”
So, are you saying GR needs to be augmented with a self-consistency principle? How is it realized physically? Would I feel a mysterious force pushing the ball out of my hands as I’m about to start it on the self-inconsistent path? Would I suddenly change my mind, asking myself later, “Gee, why didn’t I release the ball?”
Thanks again for your patience,
Mark
Anonymous wrote on Nov. 25, 2008 @ 13:27 GMT
Ken,
You write "The whole 4D system needs to be solved globally, like a 3D spatial boundary problem. And when one can't solve the equations exactly, it's not at all clear how to proceed computationally. (Ideally, I'd like to "push" the computational uncertainties toward the center of the 4-volume, away from the boundaries.)" I am in full accord.
I envision the computational possibility for a sorting algorithm to perform strongly polynomial time claculations of least path, least energy between t and t' based on your probability calculations of future boundary conditions at t', for any arbitrarily chosen scale. To explain:
My ICCS 2007 paper, necsi.org/events/iccs7/papers/740473b577c92da06ccd77fad70c.p
df, proposes that the flow of information for a random field of complete probable future states to a partially ordered present is indistinguishable--as you have also concluded--from the flow of information past to future.
How about an n-dimensional, 2-point boundary value problem in which the path t to t' is maximally efficient, least action? I compare the classical 2-point boundary, 6 dimension problem of landing a rocket on the moon in shortest path at least fuel cost, with a universal control system in which negative feedback from the future informs the present state. Gravity is, in fact, just such a univeral negative feedback system. In other words, negative feedback informs the present, positive feedback informs the future, and stasis--or neutral feedback--is the aggregated smoothly continuous property of the complex system on the large scale.
As a result, what we call "the present," at an arbitrarily chosen frozen moment of time, is the least of all possible moments. Your block universe is therefore preserved without sacrificing the dynamical properties of an evolving system, and accounting for the closed hypersurface bondary conditions.
Peter, you write "Claiming to be Popperian no longer adequately informs us of your point of view." Fair enough! The philosophy to which I particularly refer is what Popper called "metaphysical realism." (Objective Knowledge; Realism & the Aim of Science.) I think Ken's proposal meets the criteria; future boundary conditions are necessarily metaphysical but not beyond comprehension and indirect measurement.
Tom
report post as inappropriate
Ken Wharton wrote on Nov. 26, 2008 @ 06:14 GMT
Hi Peter,
You may be "ambivalent" about treating the future the same as the past, but I argue it's this precise ambivalence that has let to so many problems when it comes time to reconcile QM with GR. Even though it's counter-intuitive, we need to force ourselves to treat the past and the future on the same footing. After all, *eventually* the future will be past, and we shouldn't have to treat those events differently in our equations once that happens. (Granted, learning about uncertain values makes them more certain, but that sort of thing equally applies to uncertain values both in the past and the future.)
I'll email you with some thoughts concerning random fields and probability... I've recently become interested in a possible overlap between stochastic fields and this two-time boundary framework, and would like to better understand if there's any connection with your own research -- because as you say, there are some common threads between our ideas. (On the other hand, I'm concerned that you're treating probability as a physical substance rather than just a consequence of uncertainty. That doesn't work in a static block universe, because those "real" probabilities must somehow *change* to become some certain outcomes.)
Cheers,
Ken
Ken Wharton wrote on Nov. 26, 2008 @ 06:18 GMT
Hi Mark,
I *am* suggesting that everything is really classical fields, but this is no stranger than a QFT theorist suggesting that everything is really quantum fields -- it's just a question of when you're allowed to approximate those fields as classical objects. (And no fair giving me a far-out scenario, and then appealing to common sense to prevent me from using fields! :-) After all,...
view entire post
Hi Mark,
I *am* suggesting that everything is really classical fields, but this is no stranger than a QFT theorist suggesting that everything is really quantum fields -- it's just a question of when you're allowed to approximate those fields as classical objects. (And no fair giving me a far-out scenario, and then appealing to common sense to prevent me from using fields! :-) After all, if one *ever* expects some weird field-like aspect of a macroscopic object to rear its head, it'll be in some sort of outlandish situation like this one...).
I found the Feynman reference -- it was a very similar example addressed in Wheeler/Feynman's 1949 paper (not the 1945 one). Check it out -- they conclude that all these paradoxes rely on an "all-or-nothing" sort of interaction, but once you allow continuous interactions (say, a glancing blow due to a slightly-misaligned trajectory through a CTC) there's always a resolution.
>Regarding your second point, are you claiming that a classical object won’t follow the self-inconsistent CTC simply because that path is highly improbable? If so, what makes it more improbable than any other?
Yes, that's almost what I'm claiming. Technically, I'm claiming that the precise initial conditions that would send it on such a trajectory, combined with the precise later conditions on space-time that the CTC exists, have a joint probability distribution that is related to the number of global solutions that satisfy all those conditions. If you postulate that there are no such solutions (as you do), then the probability is exactly zero.
>So, are you saying GR needs to be augmented with a self-consistency principle? How is it realized physically?
Now this question I'm surprised to hear coming from a "Block World Kindred Spirit"... To me, one of the biggest advantages of the block universe framework is that it *is* a consistency principle, in and of itself! Paradoxes can't happen in a block universe, by definition. In a non-block-universe framework, like standard QM, each of these possible paradoxes has to be ruled out, one at a time (although see arXiv: quant-ph/0506141 for David Pegg's excellent take on how QM can do this).
So GR doesn't need any new consistency principle as long as you don't impose so many boundary conditions that there's no solution, and QM wouldn't need one either if we re-build it along the lines I suggest in my essay. I would hope that adding a generic constraint on the allowed boundary conditions would naturally prevent overconstrained problems in classical GR (such as this one).
>Would I feel a mysterious force pushing the ball out of my hands as I’m about to start it on the self-inconsistent path? Would I suddenly change my mind, asking myself later, “Gee, why didn’t I release the ball?”
Well, everything's mysterious until you understand it... :-) They’ve done interference experiments with buckyballs, which are pretty darn big. Is the "force" that keeps those C60 molecules from hitting the dark fringes "mysterious" or not?
Regardless, you wouldn't ever *feel* such a quantum effect; you would just see the end result. That's because if you are measuring one set of parameters (like a force), then Heisenberg says you're losing information about some other parameters, and in my approach it's always in the unknown parameters where the "mysterious" quantum effects would come into play. (After all, the parameters that you measure are imposed as a boundary condition on the system.)
For that last part of your question, it sounds like you're trying to back me into a corner where I have to choose between free will and the block universe. I doubt that such a corner exists, but if it did, I'd come down on the side of the block universe every time. As a well-thought out explanation of why I might feel this way, check out Greg Egan's short story, "The Hundred Light Year Diary" in his collection "Axiomatic".
Cheers,
Ken
view post as summary
Ken Wharton wrote on Nov. 26, 2008 @ 06:19 GMT
Tom,
Thanks for your thoughts on the matter... I'll need to think about how a "sorting algorithm" might do the job. The problem with continuous, classical fields vs. classical particle paths is that the number of options to sort would seem to be much larger in the case of fields. (And don't forget, the field has to consistently solve a set of differential equations throughout the 4-volume. And those are Euler-Lagrange equations that already minimize the action, so that sort of minimization principle doesn't get you anything extra in this case.)
Cheers,
Ken
Hrvoje Nikolic wrote on Nov. 26, 2008 @ 10:06 GMT
Hi Ken,
I am glad to see that you also have a contribution here.
Have you seen my essay, which is also about the block universe/time? Now there are two of us.
What could be even more interesting to you, is my last paper
http://xxx.lanl.gov/abs/0811.1905
which also discusses the probabilistic interpretation of Klein-Gordon equation in a block-universe spirit.
T H Ray wrote on Nov. 26, 2008 @ 11:58 GMT
Ken,
You write "Thanks for your thoughts on the matter... I'll need to think about how a "sorting algorithm" might do the job. The problem with continuous, classical fields vs. classical particle paths is that the number of options to sort would seem to be much larger in the case of fields. (And don't forget, the field has to consistently solve a set of differential equations throughout the 4-volume. And those are Euler-Lagrange equations that already minimize the action, so that sort of minimization principle doesn't get you anything extra in this case.)"
Right. That's why finitely probable particle paths in a nonlocal quantum mechanical system are not the same as infinite possible paths in a local classical system. Suppose (and I do) that differential equations are not the only or even the best mathematics to model continuous functions, under an assumption that time is n-dimensional continuous. Then, insofar as gravity is time dependent, dissipative energy over hyperspatial manifolds in an infinitely self similar system restricts particle paths locally to a set confined to the 4D manifold defined by your two boundary points t and t'. A sorting algorithm (which implies strongly polynomial time solutions) applies to problems in such self-assembled phenomena as protein folding, where the final configuration is known but the path of the process through space is not. Soritng the paths by energy differential between the two boundary points to the stable state is a least energy solution (thus my suggesiton of an n-dimensional, 2-point boundary value model).
The key concepts here, besides the assumption of an n-dimension continuous time metric (n > 4), are 1) infinite self-similarity, which obviates a boundary between classical and quantum domains; and 2) removing the problem from the spherical volume to the flat hypersurface, where maps are mathematically simpler and better behaved.
Do hope we can continue a dialogue. Thanks.
Tom
Peter Morgan wrote on Nov. 26, 2008 @ 14:12 GMT
Hi Ken. I think we use mathematical models that have effective ways to treat asymmetry, rather than that we can insist that there is no asymmetry.
For me, there are three layers of modeling, (1) lists of finite data of finite accuracy from experience, raw data such as Gregor Weihs can still send you as about 1.5GB from his Bell-EPR violating experiments running up to 1998, (2) statistics of...
view entire post
Hi Ken. I think we use mathematical models that have effective ways to treat asymmetry, rather than that we can insist that there is no asymmetry.
For me, there are three layers of modeling, (1) lists of finite data of finite accuracy from experience, raw data such as Gregor Weihs can still send you as about 1.5GB from his Bell-EPR violating experiments running up to 1998, (2) statistics of that data, and (3) idealized models, which in one way or another generate probabilities, expected values, and correlations, etc.
For the Weihs raw data, there is no doubt that it is presented as points in space-time. The time is listed for every data point, and the place is implied by which list a given time is found in. There is a detailed discussion of how to compare the times that are assigned at the two places. We know the experiment was done in Innsbruck, running up to 1998. We have no such raw data for experiments done in 2037.
To construct statistics, we have to identify an ensemble of data points, which requires that we decide on a way to identify regions of space-time as similar enough (but not identical, otherwise the statistics are trivial). There may be multiple ways to do this that are interesting, not all based only on identification of space-time regions.
Note that I'm not advocating a frequency interpretation of probability, only that we expect to be able to construct statistics that usefully correspond to expected values that are generated by a measure-theoretic mathematics, and that ultimately we **must** be able to construct an engineering method (which rules out ad-hoc models). There is of course a significant industry of estimation of parameters in probability models from statistics, significance testing, etc.
If I were Max Tegmark, I might claim unabashedly that there is a mathematical model that is identical to the raw data that I could gather, timelessly, if I were God, Maxwell's Demon, or a Physicist Supernatura. However, for my lifetime I suppose there will only be time-asymmetric lists of past data, while probability models will gaily predict what statistics there will be in the future. I'm not saying that there cannot be a block world mathematical model of the sort I take you to suggest --- perhaps our world is a classical finite automaton at the Planck scale, or a classical continuous field that has absolutely no fluctuations in the ultraviolet limit, even though the field fluctuates FAPP Lorentz-invariantly above the Planck scale (I note that it is only because of my analysis in "Bell inequalities for random fields" that I can legitimately daydream anything of the sort) --- but our present inability to handle the complexity of the initial conditions of such models means that we have to deal with statistics and probabilities. I consider that to do Physics is to stay away from the metaphysics of considering ontological structure, that Physics is to leave questions of what lies beyond the immense gaps in our knowledge to Philosophy and Religion. Of course on Philosophy days and on Sundays it's only proper for us to go there.
To do Physics should also be to show some humility about our models, to accept that future Physicists are likely to think of better ways to organize our experience, even if they aren't as smart as Physicists today, if they have enough time.
PS. Probability as substance doesn't fit with my epistemological stance.
PPS. So far, I can't see enough usefulness from using initial and final conditions to go there. I think the constraints of raw data on our models come from anywhere/when we happen to collect it. We certainly can't collect all the raw data on a space-like hyperplane. Sometimes we engineer the raw data (actually, we engineer the ensembles, we can't make events happen precisely when we like), sometimes it just happens to us.
I hope this is interesting enough to read it (all). I walk in a fine tradition in these comment streams of too much already.
view post as summary
Mark Stuckey wrote on Nov. 28, 2008 @ 20:53 GMT
Hi Ken,
“I found the Feynman reference -- it was a very similar example addressed in Wheeler/Feynman's 1949 paper (not the 1945 one). Check it out -- they conclude that all these paradoxes rely on an "all-or-nothing" sort of interaction, but once you allow continuous interactions (say, a glancing blow due to a slightly-misaligned trajectory through a CTC) there's always a...
view entire post
Hi Ken,
“I found the Feynman reference -- it was a very similar example addressed in Wheeler/Feynman's 1949 paper (not the 1945 one). Check it out -- they conclude that all these paradoxes rely on an "all-or-nothing" sort of interaction, but once you allow continuous interactions (say, a glancing blow due to a slightly-misaligned trajectory through a CTC) there's always a resolution.”
The Novikov Conjecture would not be necessary if Polchinski's paradox was not a reality, i.e., if there were no self-inconsistent CTCs in GR solns. But, there are such CTCs in GR and the only resolution of cases like Polchinski’s paradox, besides pointing to their highly improbable nature, is that *something* will happen to prevent violations of the principle of non-contradiction. The Novikov Conjecture is an addendum to GR. Essentially, I’m trying to find out if your approach suggests an underlying physical mechanism that would prove Novikov’s conjecture.
“Now this question I'm surprised to hear coming from a "Block World Kindred Spirit"... To me, one of the biggest advantages of the block universe framework is that it *is* a consistency principle, in and of itself! Paradoxes can't happen in a block universe, by definition.”
That reality is a BW doesn’t rule out the existence of self inconsistencies, it would just be a BW with self inconsistencies. How would we describe such a reality? We don’t know because our brains are wired in accord with the principle of non-contradiction. As physicists we tacitly assume that reality doesn’t contain such violations, otherwise physics fails. So, I’m with you in that belief and I’m trying to find out how GR needs to be modified.
“So GR doesn't need any new consistency principle as long as you don't impose so many boundary conditions that there's no solution, and QM wouldn't need one either if we re-build it along the lines I suggest in my essay. I would hope that adding a generic constraint on the allowed boundary conditions would naturally prevent overconstrained problems in classical GR (such as this one).”
I don’t understand how self-inconsistent CTCs result from “overconstrained problems.” A CTC is simply a curve on the spacetime manifold whose metric is given by some soln to Einstein’s equations. If there were a problem with EEs being overly constrained in these cases, why is only one curve on the spacetime manifold affected?
“Well, everything's mysterious until you understand it... :-) They’ve done interference experiments with buckyballs, which are pretty darn big. Is the "force" that keeps those C60 molecules from hitting the dark fringes "mysterious" or not?”
Are you claiming that the “Bohm force,” responsible for interference in the twin-slit experiment, provides the physical mechanism underwriting Novikov’s Conjecture? In that case you’ve blurred an important distinction between screened-off particles and non-screened-off particles, e.g., macroscopic objects.
“Regardless, you wouldn't ever *feel* such a quantum effect; you would just see the end result. That's because if you are measuring one set of parameters (like a force), then Heisenberg says you're losing information about some other parameters, and in my approach it's always in the unknown parameters where the "mysterious" quantum effects would come into play. (After all, the parameters that you measure are imposed as a boundary condition on the system.)”
I still don’t see how the uncertainty associated with boundary conditions for any particular trajectory would serve to *absolutely prevent* the instantiation of that trajectory. Again, that argument could be applied to ANY trajectory, so why does ANYTHING have a trajectory?
According to statistical mechanics, there is an exceedingly small probability that all the air molecules in my office might suddenly move to the other side of the room. I’m not worried about this occurrence because the probability is low, but that’s not why there is no “problem.” There is no “problem” because I can compute the consequences to me, the room and its other contents should that happen. The problem with self-inconsistent CTCs is that we can’t compute their consequences. So, either the universe contains phenomena we can’t analyze or it doesn’t, in which case GR is to be corrected/amended. I’m just trying to figure out how you, via your approach, propose to amend it.
“For that last part of your question, it sounds like you're trying to back me into a corner where I have to choose between free will and the block universe. I doubt that such a corner exists, but if it did, I'd come down on the side of the block universe every time.”
No, I’m not addressing the issue of free will. Again, I’m trying to find out if your approach suggests an underlying physical mechanism that would ultimately prove Novikov’s conjecture.
Thanks very much for your response, Ken. I think we’re almost done!
Mark
view post as summary
Cristi Stoica wrote on Dec. 1, 2008 @ 09:18 GMT
Dear Ken,
Congratulations for your nicely written tutorial on the conceptualization of time as a 4-D block! Your exposition points out the most common pitfalls in representing/understanding the frozen time.
1. “Beyond Copenhagen, there are several other established interpretations, some of which are explicitly inconsistent with the block universe (one of them postulates many universes).”
I developed a “world theory” which provides easily a block view for the MWI and standard QM (only that we don’t necessarily have Lorentz invariance). On the other hand, Penrose presents a general relativistic spacetime able to split in many worlds (he splits them along lightlike 3d-surfaces, if you are interested, I will look for the article).
2. Your discussion of the wavefunction discontinuous collapse can be related to my solution, in which I replace this discontinuity with “delayed initial conditions”, sending back the Quantum Mechanics in the block time view.
My Smooth QM is deterministic (but compatible with free-will), and, contrary to Bohm’s theory, it relies only to the evolution equation (Schrodinger for purified states, Liouville - von Neumann for mixtures), and does not require other hidden variables than the initial conditions of the evolution PDE. These I called "delayed initial conditions". I find some similarity with your solution, in that both of them look like retro-causation. I understand that you solved the incompatibility between initial/final conditions by using Klein-Gordon equation; I solve it by moving the discussion to the entanglement between observed state and initial measurement device. Another important difference: my delayed initial conditions are partial, and spread in spacetime, not just at the beginning and end; they are "caused" by the measurements. I do not want to detail more my theories on your discussion thread, since it is appropriate to talk about your work here. I just pointed out some connections.
Best wishes,
Cristi Stoica
“Flowing with a Frozen River”,
http://fqxi.org/community/forum/topic/322
John Merryman wrote on Dec. 2, 2008 @ 23:32 GMT
Professor Wharton,
"After all, *eventually* the future will be past, and we shouldn't have to treat those events differently in our equations once that happens. (Granted, learning about uncertain values makes them more certain, but that sort of thing equally applies to uncertain values both in the past and the future.)"
From a layman's perspective, this is the problem I see with "block time." Yes, if time is a fundamental dimension proceeding from the past into the future, block time does make sense, but the reality is the present, with time flowing by it from future potential to past circumstance. Which is more fundamental, the earth rotating, or the linear progression of days? I would argue time is a consequence of motion, rather than the basis for it.
Ken Wharton wrote on Dec. 4, 2008 @ 05:24 GMT
Hrvoje, Tom, and Peter: Since your recent posts are getting a bit off-topic, let's move these discussions to email for now. (Hrvoje and Peter -- I already owe you responses to your latest emails, but probably won't get to them until next week. Tom, feel free to contact me at wharton(.at.)science.sjsu.edu.)
Ken Wharton wrote on Dec. 4, 2008 @ 05:25 GMT
Mark: I don't see Novikov's self-consistency principle as being an "addition" to GR; it's just a tautology: if you apply so many constraints to a system of physical equations such that there is no solution, then there's no solution. And if the solutions correspond to "reality", then it must be impossible to impose those inconsistent constraints in the first place. This must be true for *any* physical theory; not just GR. If you ask "What keeps me from imposing that many constraints?", the answer is simply that those constraints are self-inconsistent. You might as well ask why I can't both impose a net force on an object and also impose that its velocity remains constant.
Now, if your question boils down to which *sorts* of constraints one is allowed to impose on physical equations, without overconstraining the system... you're getting into exactly the questions that I am considering. You should also read Steve Weinstein's essay in this contest (and related recent arXiv post) for some very interesting insights. (Also, thanks for your detailed response to my questions on your own essay thread; I'll get to that as soon as I can, probably next week.)
Best,
Ken
Ken Wharton wrote on Dec. 4, 2008 @ 05:29 GMT
Cristi,
Thanks for your kind comments. I think you might find a lot of connections between your research and the approach of Larry Schulman (I cited his book as a reference in my essay). He also is trying to "nudge" the wavefunction into a system that can match two-time boundary conditions.
I like certain aspects of your essay very much -- particularly getting away from this "instantaneous" aspect of measurement that many people seem over-reliant on -- but I just wanted to comment that I think it's important to treat measurement and preparation on the same footing. After all, any preparation process could also serve as a non-destructive measurement of a yet-earlier preparation. So if you're going to use diagrams that make a clear distinction between the preparation and the measurement, I'd suggest showing that the final measurement might *also* have a time-duration to it, and draw the figure in a way that shows the process you envision might repeat itself over and over.
Of course, I would also urge you to consider that there might be other, hidden variables that get changed over the duration of the measurement, while the aspects that are actually measured stay constant throughout the measurement process. And going to a relativistic picture naturally gives you exactly the right number of extra parameters to make this work. See my arXiv paper (0706.4075) if you're interested in how this might work...
Cheers!
Ken
Ken Wharton wrote on Dec. 4, 2008 @ 05:49 GMT
John,
You write: "...reality is the present, with time flowing by it..."
Yes, I realize that's how most people see matters. And that's exactly why I made such a concerted attempt in this essay to try to convince the reader that such a picture just doesn't make sense when it comes to physics.
But here's another point I didn't go into in much detail. The analogy of time "flowing" is dangerous because the very word "flowing" (and the general concept of motion) is meaningless without a prior concept of time. Given our primitive concepts of space and time, flow and motion both make sense. The danger comes in when one tries to make an *analogy* between, say, the flow of water and the "flow of time". I tried to make the point in the essay that it's a terrible analogy, because now instead of flow velocity = distance/time, one ends up with the non-sensical notion of "time velocity"=time/time, which isn't anything meaningful at all.
We do have primitive notions of space and time (for more on this, see "The Stuff of Thought" by Steven Pinker) -- the question is how to overcome these intuitive concepts so that we can think about physics objectively. And the best way to do this is to move to a static block universe. If my essay didn't convince you, at least give Huw Price's book a try (Time's Arrow and Archimedes' Point). It's by far the best generally-accessible, non-trivial book on time that you'll find.
Best,
Ken
Cristi Stoica wrote on Dec. 4, 2008 @ 15:07 GMT
Dear Ken,
“I'd suggest showing that the final measurement might *also* have a time-duration to it, and draw the figure in a way that shows the process you envision might repeat itself over and over.”
Thank you for the suggestions. I totally agree with you. In fact, in a more detailed description, in the original article “Smooth Quantum Mechanics” (http://philsci-archive.pitt.edu/archive/00004199/), I specified it:
“After each observation, the quantum system gets entangled with the measurement device. Thus, even if the system is found in a precise state by the measurement, the entanglement with the measurement device makes its state to be again undetermined. The next measurement selects again an initial condition, to specify the state of the observed system. But now the system gets entangled with the measurement apparatus used for the last observation, and the cycle continues.”
Unfortunately, in the essay I omitted, because of the length limitation, to describe the cycle, as well as in the version I cite in the essay (the second one) of my paper SQM (http://philsci-archive.pitt.edu/archive/00004344/).
And you are right, a picture with this cycle will help a lot.
Thank you,
Cristi Stoica
“Flowing with a Frozen River”,
http://fqxi.org/community/forum/topic/322
John Merryman wrote on Dec. 4, 2008 @ 19:06 GMT
Professor Wharton,
Maybe I shouldn't have used the word "flow." Especially since your view of time is that it exists as a static higher dimension, so let me put this another way; Does the rotation of the earth turn tomorrow into yesterday?
It's not that I view the "present" as a "point" in time, since I view time itself as an abstraction, similar to temperature. My argument is that there is simply what might best be described as "energy" and as the arrangements of this energy change, each arrangement is replaced by the next, so this progression of events goes from future potential to past circumstance. So the only "flow" is an attribute of the energy.
Mark Stuckey wrote on Dec. 5, 2008 @ 02:44 GMT
Dear Ken,
Thanks for your reply. I want to press one point b/c I don't know that you appreciate the "problem" I’m trying to convey.
"Mark: I don't see Novikov's self-consistency principle as being an "addition" to GR; it's just a tautology: if you apply so many constraints to a system of physical equations such that there is no solution, then there's no solution. And if the...
view entire post
Dear Ken,
Thanks for your reply. I want to press one point b/c I don't know that you appreciate the "problem" I’m trying to convey.
"Mark: I don't see Novikov's self-consistency principle as being an "addition" to GR; it's just a tautology: if you apply so many constraints to a system of physical equations such that there is no solution, then there's no solution. And if the solutions correspond to "reality", then it must be impossible to impose those inconsistent constraints in the first place. This must be true for *any* physical theory; not just GR."
I just gave a lecture to our engineering students on the different rates at which Earth-based clocks and orbiting clocks run according to the Schwarzschild soln (the GR corrections are now relevant to GPS satellite technology). We can use the vacuum solutions of GR to find the trajectories of small objects where "small" means their stress-energy tns (ST tns) doesn't appreciably affect the vacuum soln, of course. When we put an object into motion along one of these vacuum trajectories, the object follows the trajectory as one would expect since the object doesn’t change the spacetime curvature alg the trajectory.
Now, suppose I solve Einstein’s eqns and find a self-inconsistent CTC in some vacuum soln (these solns exist). Do I have a soln of GR that REALLY possesses a self-inconsistent trajectory? Yes. Do I have a soln of GR that possesses SELF-INCONSISTENCY? No, because while it’s true that a small object won’t change the CTC in question, technically speaking, the RHS of Einstein’s eqns (ST tns) must be divergence-free b/c the LHS is divergence-free (Einstein tns). So, technically speaking, no matter how small the object’s ST tns, it MUST be divergence-free in order to claim that you have a soln of GR. But, of course, I can’t construct the ST tns for a self-inconsistent situation (duh). So, GR does NOT possess self-inconsistency, even though it DOES possess self-inconsistent CTCs.
Hopefully you’ve jumped ahead of me and anticipate the question, “What if I have a soln with self-inconsistent CTCs, instantiate it physically and then place a small object on the self-inconsistent CTC?” This may or may not be a soln of GR. [Keep in mind that the object is small so it does not affect the existence or structure of the trajectory proper.] If the object is dry ice or something that evaporates before getting to the self-inconsistent region, we have a GR soln (like putting GPS satellites on free fall geodesics about Earth) and we know what will happen. If the object is a bowling ball, we don’t have a GR soln and we don’t know what will happen. So, what happens in that case? Do you see that the answer to that question lies outside of GR, even though GR clearly prompted it by saying this trajectory exists and gives me no reason why I can’t at least START an object thereupon. [This nicely captures the clash of relativity’s BW with dynamical experience.]
I’m just wondering if your approach suggests how GR might be corrected/augmented to answer this question. In the quote above I infer that you believe, as do I, that it must be impossible to realize this situation so SOMETHING must preclude it, i.e., make it impossible, not merely improbable. Per our discrete approach there is no empty spacetime, so the problematic trajectory really DOES NOT EXIST – the GR approximation is not accurate in saying that empty trajectories “exist” (although, trajectories in vacuum solns can be realized (made real, made to exist) via a small, divergence-free ST tns – but, no ST tns means no trajectory). What do you say?
Mark
view post as summary
Michael Silberstein wrote on Dec. 5, 2008 @ 18:41 GMT
Dear Ken,
Sorry to double team you here, but below is a passage from Halpern's essay which supports Mark's claim about Novikov's self-consistency principle being an add-on to GR.
"To combat such conundrums several proposals were suggested. Hawking formulated the “Chronology Protection Conjecture” as an attempt to forbid backward time travel based on the laws of physics [11]. Igor Novikov took a different tact and proposed a self-consistency principle
that permitted past-directed travel as long as it was fully consistent with what already had transpired [12]."
Cheers,
Michael
Ken Wharton wrote on Dec. 12, 2008 @ 22:18 GMT
Cristi: I guess we're in agreement on most of my earlier points. I'm trying to incorporate a finite-duration measurement myself (at least for non-destructive measurements), but so far the closest I've come is to allow the exact interaction time to be part of the overall solution space, with its own probability distribution. For more details you'll have to wade through arXiv:0706.4075.
John: I'm afraid I don't understand your question. I will say that retreating from "flow" and "motion" to a more general "change" can't explain anything fundamental about time, because the very concept of "change" *relies* on our primitive notion of time to even make sense. (What could change mean without time?) I'll continue to argue that the best way to get rid of these primitive temporal notions and focus on the physics is to use a block universe framework.
Mark and Michael: Your points are well taken, and serve to remind me that I've been mentally inhabiting a block universe for so long that I forget most people don't think that way. From the traditional "time-evolve the initial boundary conditions" perspective, it's absolutely correct that something like this would appear mysterious and in need of an additional postulate to prevent paradoxes.
I guess my revised point is that *any* consistent block-universe perspective (mine or yours) can deal with this problem almost trivially, without additional postulates. To recap how my particular type of model would solve this problem, one would impose external boundary conditions on both the space-time region in question and on the particle itself, but the precision at which one can impose all of those boundaries is limited by the uncertainty principle. The probability of any given outcome is then directly related to the number of acceptable solutions to the boundary-value problem. If some particular solution (say, the particle going around the loop) isn't self-consistent, then it's not a solution, and the probability of that outcome will be exactly zero. Simple as that.
We've moved far enough off-topic here we should probably retreat to email if you're not happy with such an answer... and I'll "see" you both soon over at your own essay thread!
Ken
John Merryman wrote on Dec. 13, 2008 @ 01:27 GMT
Ken,
Humor me for a moment and reconsider a reality in which change and motion are acceptable. The arrow of time goes from what comes first, to what comes second. For the observer, past events proceed future ones, so we observe time as going from the past to future. On the other hand, these events are first in the future, then in the past, so their arrow goes the opposite direction. Throughout history, in fact the very description of the narrative construct we call history, the understanding of time is of the first arrow. That events proceed along this universal path, whether Newton's absolute time, or Einstein's relative time, from past to future.
Yet the only reality ever experienced is of the present. So lets examine the consequence of viewing reality as a fixed present consisting of energy in motion, thus causing change and as each arrangement described by this energy is replaced by the next, these events go from future potential to past circumstance. Therefore past and future do not physically exist because the energy to manifest all such events is only manifesting one moment at a time.
So rather than a fundamental dimension, time becomes an emergent description and consequence of motion, similar to temperature. Temperature, as a scalar average of motion, doesn't exist if we only consider singular motion, but only emerges when measuring a mass of activity. So time, as a sequencing of units of motion, doesn't effectively exist if we cannot define a progression. It is just quantum fuzziness. The present can't be a dimensionless point either, since it is a description of motion and would only be dimensionless if all motion has stopped, so, like temperature, the measurement becomes fuzzy when examined closely.
Whether time proceeds along some dimension from past to future, or is caused by the progression of events from future to past, might seem semantic, yet consider the consequences; If time is that dimension moving toward the future, we need to explain how it deals with potentialities. Either we go with multi-worlds, in which all potentials are taken, or block time, where the potentials are illusionary and it is fundamentally deterministic. Now if we view it from the other direction, where time is the events moving from future potential to past circumstance, the collapsing wave of probabilities makes sense, since it is only energy in motion and time is simply an emergent description of the process, not some fundamental dimension.
What is primitive is the narrative assumption that time is a linear projection from the past into the future.
Lawrence B. Crowell wrote on Jan. 1, 2009 @ 14:13 GMT
Might it be that instead of there being a retrocausality of wave functions that they configure themselves at one time according to the a future time as determined by a block structure?
Lawrence B. Crowell
Ken Wharton wrote on Jan. 5, 2009 @ 03:28 GMT
John, Thanks for your comments, but I don't really have much to add to my previous response to you. I just don't see how time can be said to emerge from a picture that starts with a primitive concept of change or motion, because those concepts require an even more primitive concept of time to even make sense. Such an approach is therefore doomed to being a circular argument.
Lawrence, I think I agree with what you're trying to say, but the way that you said it is technically wrong. When you use the phrase "at one time", surely you don't literally mean "at one particular temporal coordinate", because you're talking about a block structure that spans some range of time. Instead, I'm guessing that you're talking about some wavefunction that finds a global solution in a block universe framework, which is exactly what I'm arguing for in this essay. But to say this happens "at one time" or "all at once" is flatly incorrect -- the solution spans many time coordinates, not just one. (The key is to avoid temporal language entirely when thinking in a block universe framework, or else you fall into the trap of imagining some meta-time that is *not* included in the block universe.)
Ken
John Merryman wrote on Jan. 5, 2009 @ 17:57 GMT
Ken,
You are right that it is primitive, but physics is about understanding the basics. Motion doesn't exist without time, because time is units of motion. Just as collective motion doesn't exist without temperature, as temperature is averaging of motion.
Time as a dimension doesn't accord the physical reality of the present any precedence over the physical non-existence of the past and future. You may not have a problem with that, but I like my understanding of reality to accord with reality. Therefore I view time as the series of events which go from being in the future to being in the past, created and consumed by the process which is the present. Not as a non-dynamic dimension along which we exist. Yes, time is relative. If you speed up the motion, time speeds up, just as temperature increases. It is only when you are assuming some fundamentally static dimension that this seems illogical.
Login or
create account to post reply or comment.