If you are aware of an interesting new academic paper (published in a peer-reviewed journal or posted on the arXiv), a conference talk (at an official professional scientific meeting), an external blog post (by a professional scientist) or a news item (in the mainstream news media) that you think might make an interesting topic for an FQXi blog post, please contact us at forums@fqxi.org with a link to the original source and a sentence about why you think the work is worthy of discussion. Please note that we receive many such suggestions, and while we endeavour to respond to them, we may not be able to reply to all of them.

Please also note that we do not accept unsolicited posts and we cannot review, or open new threads for, unsolicited articles or papers. Requests to review or post such materials will not be answered. If you have your own novel physics theory or model that you would like to post for further discussion among the FQXi community, please add it directly to the "Alternative Models of Reality" thread, or to the "Alternative Models of Cosmology" thread. Thank you.

Contests Home

Previous Contests

**What Is “Fundamental”**

*October 28, 2017 to January 22, 2018*

*Sponsored by the Fetzer Franklin Fund and The Peter & Patricia Gruber Foundation*

read/discuss • winners

**Wandering Towards a Goal**

How can mindless mathematical laws give rise to aims and intention?

*December 2, 2016 to March 3, 2017*

Contest Partner: The Peter and Patricia Gruber Fund.

read/discuss • winners

**Trick or Truth: The Mysterious Connection Between Physics and Mathematics**

*Contest Partners: Nanotronics Imaging, The Peter and Patricia Gruber Foundation, and The John Templeton Foundation*

Media Partner: Scientific American

read/discuss • winners

**How Should Humanity Steer the Future?**

*January 9, 2014 - August 31, 2014*

*Contest Partners: Jaan Tallinn, The Peter and Patricia Gruber Foundation, The John Templeton Foundation, and Scientific American*

read/discuss • winners

**It From Bit or Bit From It**

*March 25 - June 28, 2013*

*Contest Partners: The Gruber Foundation, J. Templeton Foundation, and Scientific American*

read/discuss • winners

**Questioning the Foundations**

Which of Our Basic Physical Assumptions Are Wrong?

*May 24 - August 31, 2012*

*Contest Partners: The Peter and Patricia Gruber Foundation, SubMeta, and Scientific American*

read/discuss • winners

**Is Reality Digital or Analog?**

*November 2010 - February 2011*

*Contest Partners: The Peter and Patricia Gruber Foundation and Scientific American*

read/discuss • winners

**What's Ultimately Possible in Physics?**

*May - October 2009*

*Contest Partners: Astrid and Bruce McWilliams*

read/discuss • winners

**The Nature of Time**

*August - December 2008*

read/discuss • winners

Forum Home

Introduction

Terms of Use

RSS feed | RSS help

*Posts by the author are highlighted in orange; posts by FQXi Members are highlighted in blue.*

RECENT POSTS IN THIS TOPIC

**Kamilla Kamilla**: *on* 4/10/16 at 21:55pm UTC, wrote I wanted to thank you for this excellent read!! I definitely loved every...

**John Merryman**: *on* 1/5/09 at 17:57pm UTC, wrote Ken, You are right that it is primitive, but physics is about...

**Ken Wharton**: *on* 1/5/09 at 3:28am UTC, wrote John, Thanks for your comments, but I don't really have much to add to my...

**Lawrence B. Crowell**: *on* 1/1/09 at 14:13pm UTC, wrote Might it be that instead of there being a retrocausality of wave functions...

**John Merryman**: *on* 12/13/08 at 1:27am UTC, wrote Ken, Humor me for a moment and reconsider a reality in which change and...

**Ken Wharton**: *on* 12/12/08 at 22:18pm UTC, wrote Cristi: I guess we're in agreement on most of my earlier points. I'm...

**Michael Silberstein**: *on* 12/5/08 at 18:41pm UTC, wrote Dear Ken, Sorry to double team you here, but below is a passage from...

**Mark Stuckey**: *on* 12/5/08 at 2:44am UTC, wrote Dear Ken, Thanks for your reply. I want to press one point b/c I don't...

RECENT FORUM POSTS

**Joe Fisher**: "Today’s Closer To Truth Facebook page contained this peculiar..."
*in* Dissolving Quantum...

**Georgina Woodward**: "Just shutting up and calculating won't do. The steps are; correctly..."
*in* Space-time from Collapse...

**Joe Fisher**: "Today’s Closer To Truth Facebook page contained this peculiar..."
*in* Dissolving Quantum...

**Georgina Woodward**: "Specifically identifying and naming the issue is a significant advance. It..."
*in* Space-time from Collapse...

**Steven Andresen**: "Anybody got the inside word on the theme for this years essay contest? ..."
*in* Alternative Models of...

**john smith**: "It's a new thing for my knowledge I am looking for the same I recently..."
*in* Neutrino mysteries,...

**Elina Williams**: "Technology is becoming an integral part of our life. From handheld devices..."
*in* Manipulating the Quantum...

**Jena Somerhalder**: "Hello! I must say this post is very interesting for reading. I found there..."
*in* Time in Physics & Entropy...

RECENT ARTICLES

*click titles to read articles*

**Dissolving Quantum Paradoxes**

The impossibility of building a perfect clock could help explain away microscale weirdness.

**Constructing a Theory of Life**

An all-encompassing framework of physics could help to explain the evolution of consciousness, intelligence, and free will.

**Usurping Quantum Theory**

The search is on for a fundamental framework that allows for even stranger links between particles than quantum theory—which could lead us to a theory of everything.

**Fuzzballs v Black Holes**

A radical theory replaces the cosmic crunchers with fuzzy quantum spheres, potentially solving the black-hole information paradox and explaining away the Big Bang and the origin of time.

**Whose Physics Is It Anyway? Q&A with Chanda Prescod-Weinstein**

Why physics and astronomy communities must take diversity issues seriously in order to do good science.

FQXi FORUM

November 13, 2018

CATEGORY:
The Nature of Time Essay Contest (2008)
[back]

TOPIC: Lessons from the Block Universe by Ken Wharton [refresh]

Our time-asymmetric intuitions make it difficult to be objective when considering the nature of time. But these difficulties can be overcome by using the framework of the "block universe", where every event is mapped onto a static, four-dimensional structure. In this perspective, time is represented as a spatial dimension, so the block universe can never "change"; there is no additional time dimension for such a concept to even make sense.

This essay argues that the block universe is by far the best framework for physical theories, as general relativity is simply incompatible with any alternative. The only part of physics that does *not* fit into such a "block" picture is quantum theory, as it was not originally developed in a block-universe framework. But far from implying that the block universe is incorrect, I argue that we can instead use lessons from the block universe to reconstruct quantum theory in a manner compatible with general relativity.

The essay then outlines how this might be accomplished. A block-universe quantum wavefunction must be represented in four-dimensional space-time, so the usual higher-dimensional "configuration space" is critically examined. The block universe view reveals that the extra information encoded in these higher dimensions is not actually needed, because not all possible measurements actually occur on any given system. The need to "discard" the excess information in turn implies that every quantum system must solve a four-dimensional boundary value problem. Interestingly, this approach also solves other outstanding interpretational problems in quantum theory, including the "collapse" of the wavefunction. Taking this research path would require a radical revision of nearly all aspects of quantum theory, but it also promises to reshape our understanding of the nature of time.

**Author Bio**

Ken Wharton is a physics professor at San Jose State University. After attending Stanford (BS, Physics, 1992) and UCLA (PhD, Physics, 1998), he joined the Department of Physics and Astronomy at SJSU in 2001. While originally an experimental laser physicist, he is now a full-time quantum theorist actively pursuing the research program outlined in this essay. He has also been known to occasionally publish "hard" (scientifically accurate) science fiction stories, including a novel that won the Special Citation for the 2001 Philip K. Dick Award.

**Download Essay PDF File**
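As an aside on the abstract's configuration-space point (this illustration is mine, not the essay's): a wavefunction for N particles is a function on 3N-dimensional configuration space, whereas a classical field in the block universe lives in ordinary 4D space-time. Simply counting grid points shows how quickly the former outgrows the latter:

```python
# Toy dimension-counting sketch (not from the essay).  A wavefunction
# for N particles depends on 3N coordinates; a space-time field depends
# on just 4, no matter how many particles are involved.

def grid_points(dims, samples_per_dim=10):
    """Samples needed to tabulate a function of `dims` real variables."""
    return samples_per_dim ** dims

for n in (1, 2, 10):
    config = grid_points(3 * n)   # configuration-space wavefunction
    block = grid_points(4)        # 4D space-time field, independent of N
    print(f"N={n}: configuration space {config:.0e} vs space-time {block:.0e}")
```

The exponential gap is why "discarding" the excess configuration-space information, as the essay proposes, is such a substantive move.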

"...there's no objective way to distinguish an initial boundary condition from a final boundary condition without resorting to our time-asymmetric intuitions that don't apply in a block universe."

Nice.

This idea of "introcausality" just may hold the key to the preservation of continuous function physics in a way tractable to machine-computing algorithms, i.e., finite methods.

Intriguing insight, with a clearly defined research path. Thanks, Ken.

Tom

Hi, Tom. Ken's arXiv:0706.4075 was mentioned to me on Friday by David Miller in Sydney, so finding this FQXi essay hours later was a surprise to me, both in timing and, by comparison, in content. In response to your comment, I guess I don't yet see that Ken's research path is clear.

Hi, Ken. I'm sorry to say that I have all sorts of trouble with details of your essay. I've been doing...

view entire post

Hi Peter. You write "Hi, Tom. Ken's arXiv:0706.4075 was mentioned to me on Friday by David Miller in Sydney, so finding this FQXi essay hours later was a surprise to me, both in timing and, by comparison, in content. In response to your comment, I guess I don't yet see that Ken's research path is clear."

It's clear to me.

Ken is quite correct that quantum mechanics did not evolve from a block universe model. The relativistic block universe came to us mathematically complete; quantum mechanics was knitted from experimental results. The addition of field theoretical quantum physics (e.g., your model) attempts to restore continuous function analysis to discrete phenomena.

I doubt that the David Miller you mention is the Karl Popper protege and scholar at Warwick U., U.K., with whom I am acquainted, but if he were, he should appreciate Ken's bold conjectural approach toward this problem. I know I do, and although Ken can speak for himself, if it's philosophical intentions you demand, I stand firmly in the Popper camp.

The research direction in Ken's work that I find clear--as I stated--is the possibility of strict computability, i.e., of an algorithm to model discrete phenomena that explains why the universe appears to obey continuous functions.

Tom

Tom, I agree that QM didn't evolve from a block universe model. I think, and I think you accept in your comment, that relativistic quantum fields do, now, largely adopt a block world ground. You know, but Ken presumably doesn't yet, that I work with continuous models not because it's how I think the world must be, but only because /I/ find it convenient, as of now, to do so, even though the finite number of finite accuracy measurements that we can make and record cannot possibly justify a continuous model.

I guess the David Miller I mentioned appreciates something about Ken's work, at least in relation to mine, because he was reminded of Ken's work, and suggested it to me, upon seeing my FQXi essay, which I had asked him and a few others in Sydney to look at. This is a David Miller who wrote "Realism and time symmetry in quantum mechanics", Phys. Lett. A222, 31-36(1996). His web-page in Sydney is at http://www.usyd.edu.au/time/people/miller.htm.

I appreciate the empiricist sentiment that citing Popper claims, but most Physicists and Philosophers of Physics are influenced enough by the devastating mid-century critiques of positivism that it's important to know in what ways they accommodate those critiques. Claiming to be Popperian no longer adequately informs us of your point of view.

I perhaps fail clearly to discern Ken's research path more because of the pairing of Ken's FQXi essay with his arXiv:0706.4075 than because of either of them taken separately. I worry that a block world structure is not a sufficiently strong guiding principle by itself for constructing a new mathematics, and I don't see clearly what other mathematical or physical principles Ken advocates.

Great essay, Ken. As you know from our meeting at Perimeter this fall, we agree on the use of a blockworld for fundamental physics. I’ve two comments/questions:

“Doesn't this imply that we need to come up with a more expansive view of space-time that is somehow compatible with both quantum theory and relativity? Balderdash. … GR is the correct tool to ask questions about space and time.”

GR has (at least one) temporal pathology, namely closed time-like curves (CTCs) allowing for self-inconsistency, e.g., a particle looping around a CTC segment so that it strikes itself before it entered the CTC segment, thereby keeping it from entering the segment to begin with. How do you propose GR be modified to rectify the existence of such CTCs and how is this incorporated in your formalism?

“So far, the closest approach to the block universe is the ‘de Broglie-Bohm Interpretation’.”

The Relational Blockworld (Foundations of Physics 38, No. 4, 348 – 383 (2008), quant-ph/0510090) is consistent with your argument that blockworld time be made compatible with quantum physics. The main difference between your approach and RBW is that RBW is fundamentally probabilistic so we don’t have “to reinvent every single piece of quantum theory in a block universe framework.” On the contrary, quantum physics as it stands makes perfect sense in RBW. [Our essay will be posted this week.]

A BW kindred spirit,

Mark

"Gradually the conviction gained recognition that all knowledge about things is exclusively a working-over of the raw material furnished by the senses. ... Galileo and Hume first upheld this principle with full clarity and decisiveness." --(Albert Einstein, Ideas and Opinions)

Hello Ken,

You write, "Our time-asymmetric intuitions make it difficult to be objective when considering the...

view entire post

Hi Ken,

This is in answer to your question how RBW differs from your view. First, see our essay, soon to be posted. Second, read the following: “Why Quantum Mechanics Favors Adynamical and Acausal Interpretations such as Relational Blockworld over Backwardly Causal and Time-Symmetric Rivals,” in a focus issue of Studies in the History and Philosophy of Modern Physics on time-symmetric approaches to quantum mechanics edited by Huw Price, Volume 39, Issue 4, pp. 732-747. M. Silberstein, M. Cifone and M. Stuckey. I'll try to attach it.

The general answer, however, is this: while, like you, we take blockworld (BW) as an essential feature of interpreting QM, we don't need to revamp QM, e.g., replace the Schrodinger equation with the Klein-Gordon equation. In the aforementioned paper we argue that retrocausal accounts of QM do not take acausal and adynamical thinking seriously enough, and that their retrocausal devices amount to little more than a veiled assertion that in the BW the outcomes of QM experiments are already "there." For example, even Price admits that all such causal talk (retro or otherwise) is merely perspectival. So the first problem is how to invoke BW in a non-trivial explanatory fashion. We show how to do this with an adynamical and acausal explanation that is fundamental to any dynamical explanation of QM and involves BW in an essential fashion. Second, we show that the experimental set-up known as the quantum liar experiment (QLE) is fatal for any purely dynamical time-like or retrocausal account that purports to save locality, while RBW has no problem with it.

In addition, RBW fully resolves the measurement problem and is fully compatible with special relativity (SR) as it is local while being non-separable and requires no FTL influences or action-at-a-distance. Perhaps most importantly of all, as the essay will elaborate, RBW leads to a completely unique solution to the problem of quantum gravity with profound implications for the various problems of time.

One last minor point. I think you might want to sharpen your claim that QM and BW are inherently incompatible. You seem to think that BW ENTAILS that there is only one outcome for every experiment in M4, while the Hilbert space of QM demands otherwise. But surely this isn't true; after all, Saunders and other Everettians defend the "QM block world" wherein all the outcomes exist in a BW setting. It's true that they must explain why it appears that there are only 3 spatial dimensions, or why these 3 dimensions of space emerge from the more fundamental Hilbert space, but there are many such programs. Furthermore, the Everett interpretation squares perfectly with SR and locality. The burden of establishing comparative advantage is especially high for you, given your need to radically revise QM.

Another kindred BW spirit.

Michael

attachments: SHPMP557.pdf

Tom,

Thank you for the kind words; I'm glad you found the essay stimulating. As for your discussion with Peter as to whether there is a "clear research path", you're right that certain paths forward are certainly clear, and he's right that this essay (and the arXiv paper) don't exactly make it clear *which* research paths I'm advocating. There are a lot of paths forward, and I'm still not certain which ones are most promising. After all, when going all the way back to 1927 and changing all these fundamental assumptions, the amount of work that needs to be done just to recover known experimental results is truly daunting. (Of course, I have my opinions on how best to proceed, but more on that some other time...)

Concerning your interest in the computability aspect, there's both good news and bad news. The good news (that you point out) is that there's actually something to compute: systems of well-defined equations and well-defined boundary conditions, with solutions of easy-to-interpret classical field values at given points in space-time. The bad news is that by imposing different portions of the boundary conditions at different times, the usual computational technique of starting with the initial solution and then incrementally calculating subsequent time-steps will no longer work. The whole 4D system needs to be solved globally, like a 3D spatial boundary problem. And when one can't solve the equations exactly, it's not at all clear how to proceed computationally. (Ideally, I'd like to "push" the computational uncertainties toward the center of the 4-volume, away from the boundaries.) Any insight on this issue would certainly be appreciated.

Best,

Ken
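Ken's computational point above can be made concrete with a minimal sketch (my toy illustration, not his formalism): a 2D "block" with one time axis and one space axis, with data imposed on *both* the initial and final slices, relaxed everywhere at once. Laplace's equation stands in for the real field equations; a forward time-stepper could never honor the final-slice condition.

```python
import numpy as np

# Toy global boundary-value solve over a 2D "block": axis 0 plays time,
# axis 1 plays space.  Data is fixed on BOTH the initial and final time
# slices (plus the spatial edges), so the interior cannot be found by
# marching forward from t=0; the whole block must be relaxed at once.
nt, nx = 40, 40
u = np.zeros((nt, nx))
u[0, :] = 1.0    # "initial" boundary data
u[-1, :] = 0.0   # "final" boundary data, imposed at a *different* time
u[:, 0] = u[:, -1] = 0.0

for _ in range(5000):  # Jacobi relaxation toward the Laplace solution
    u[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1]
                            + u[1:-1, :-2] + u[1:-1, 2:])

# Every interior value is now determined jointly by the entire boundary,
# a discrete analogue of the 4D boundary value problem Ken describes.
```

Note the contrast with time-stepping: no slice here is computed "before" any other, which is exactly the feature that makes the approximation question Ken raises nontrivial.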

Peter,

Thanks for your interest -- and no, I certainly don't consider it rude to ask me about my ontology! (More on this in a sec.) By the way, I had already read your recent arXiv paper, and I had been planning to introduce myself once I read up on random fields. This week I'll head over to your own essay and see what I can find out...

>I s'pose anyone who works in QFT would be...

view entire post

Hi Mark,

I must apologize for not having written since we met last month; I have your papers on top of a stack of must-read items, but haven't yet been able to devote the necessary time to them this semester. Soon, I promise.

>GR has (at least one) temporal pathology, namely closed time-like curves (CTCs) allowing for self-inconsistency, e.g., a particle looping around a CTC segment so that it strikes itself before it entered the CTC segment, thereby keeping it from entering the segment to begin with.

I think quantum effects save the day here, for two different reasons: one is that we need to replace classical particles with fields, so there's no such thing as an "all or nothing" trajectory (didn't Feynman work out an example like this with a light switch?). The other is that there's no way to prepare the initial state with sufficient accuracy to cause this precise dilemma. In fact, as I see it, the logical necessity of the uncertainty principle is that it prevents exactly this sort of paradox. (And these paradoxes would no longer require CTCs if one takes an introcausal perspective where the future is always affecting the past.)

>The main difference between your approach and RBW is that RBW is fundamentally probabilistic so we don’t have “to reinvent every single piece of quantum theory in a block universe framework.”

I wasn't trying to lump in RBW in with the "established" interpretations; more on RBW after you post your essay.

But I will say that I've given a great deal of thought to how a theory can be "fundamentally probabilistic" and still work in a block universe. I do think it can be done, but not in the way that standard QM's Born Rule is probabilistic: those are *outcome* probabilities, which is a concept of dubious validity in a block universe. The key, I think, is to define probabilities over parameters that are unknown but well-defined. Then, once you know everything, all those probabilities can converge to 1 or 0 (in a block universe, any given event either happens or it doesn't). This is also deeply consistent with a Bayesian interpretation of probability, where probability is just a measure of belief based on available information, not anything fundamental that is somehow "out there" in the static block universe.

Cheers,

Ken

I must apologize for not having written since we met last month; I have your papers on top of a stack of must-read items, but haven't yet been able to devote the necessary time to them this semester. Soon, I promise.

>GR has (at least one) temporal pathology, namely closed time-like curves (CTCs) allowing for self-inconsistency, e.g., a particle looping around a CTC segment so that it strikes itself before it entered the CTC segment, thereby keeping it from entering the segment to begin with.

I think quantum effects save the day here, for two different reasons: one is that we need to replace classical particles with fields, so there's no such thing as an "all or nothing" trajectory (didn't Feynman work out an example like this with a light switch?). The other is that there's no way to prepare the initial state with sufficient accuracy to cause this precise dilemma. In fact, as I see it, the logical necessity of the uncertainty principle is that it prevents exactly this sort of paradox. (And these paradoxes would no longer require CTCs if one takes an introcausal perspective where the future is always affecting the past.)
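(To illustrate the "no all-or-nothing trajectory" point, here is a toy self-consistency sketch of my own -- the functions are invented for illustration, not taken from GR or from the Wheeler/Feynman papers:)

```python
# Toy CTC self-consistency condition (illustrative assumption, not a GR
# calculation). Let s in [0, 1] be how hard the particle strikes its
# earlier self; the strike deflects the earlier trajectory, which in turn
# fixes the strength of the strike. A consistent history requires s = f(s).

def all_or_nothing(s):
    # Binary interaction: any solid strike fully prevents entry into the
    # CTC segment, so no strike occurs -- the grandfather-paradox rule.
    return 0.0 if s > 0.5 else 1.0

def glancing(s):
    # Continuous interaction: a partial strike only partially deflects
    # the earlier trajectory, so the response varies smoothly with s.
    return 1.0 - 0.8 * s

def iterate(f, s=0.5, n=200):
    # Search for a self-consistent strength by fixed-point iteration.
    for _ in range(n):
        s = f(s)
    return s

s = iterate(glancing)
print(abs(s - glancing(s)) < 1e-9)        # True: a consistent history exists
s = iterate(all_or_nothing)
print(abs(s - all_or_nothing(s)) < 1e-9)  # False: the binary rule just oscillates
```

By the intermediate value theorem, any continuous response f: [0,1] → [0,1] has a fixed point; the all-or-nothing rule, being discontinuous, need not, which is exactly why the paradox requires an "all or nothing" interaction.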

>The main difference between your approach and RBW is that RBW is fundamentally probabilistic so we don’t have “to reinvent every single piece of quantum theory in a block universe framework.”

I wasn't trying to lump RBW in with the "established" interpretations; more on RBW after you post your essay.

But I will say that I've given a great deal of thought to how a theory can be "fundamentally probabilistic" and still work in a block universe. I do think it can be done, but not in the way that standard QM's Born Rule is probabilistic: those are *outcome* probabilities, which is a concept of dubious validity in a block universe. The key, I think, is to define probabilities over parameters that are unknown but well-defined. Then, once you know everything, all those probabilities can converge to 1 or 0 (in a block universe, any given event either happens or it doesn't). This is also deeply consistent with a Bayesian interpretation of probability, where probability is just a measure of belief based on available information, not anything fundamental that is somehow "out there" in the static block universe.
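(A minimal numerical sketch of that convergence, my own toy with an assumed 80%-reliable observation channel: the event value is fixed "out there," and the Bayesian credence is driven to 1 or 0 as information accumulates:)

```python
import random

random.seed(0)
truth = True       # the fixed, well-defined fact in the static block
belief = 0.5       # observer's prior credence that the event happened
p = 0.8            # each noisy observation reports the truth with prob. 0.8

for _ in range(50):
    report = truth if random.random() < p else (not truth)
    # Bayes' rule for a binary hypothesis with a noisy binary report
    lt = p if report else 1 - p          # P(report | event happened)
    lf = 1 - p if report else p          # P(report | event didn't)
    belief = belief * lt / (belief * lt + (1 - belief) * lf)

print(belief > 0.9)  # True: credence has converged toward the fixed fact
```

Nothing in the block has changed during the loop; only the observer's information, and hence the probability, has.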

Cheers,

Ken

Dr. E,

From your post, and from other very similar posts you've written on other threads, I take it that we are coming at this issue from diametrically opposite philosophical perspectives: you're railing against the block universe, while the block universe is central to my thinking. Apparently the detailed arguments for such a view in my paper have not swayed you, and your post has not swayed me. Perhaps we'll just have to agree to disagree.

Best,

Ken

Hi Michael,

I'm looking forward to your essay... I'll try to carefully read through your papers this week as well.

>...their retrocausal devices amount to little more than a veiled assertion that in the BW the outcomes of QM experiments are already "there."

Wait a sec -- surely if you are using a block picture you must agree that the outcomes are, in a timeless sense, "already there"? I hope your point here is that other approaches merely assert the outcome without giving any tool to determine that outcome's relative likelihood. If so, I have such a tool: a probability measure on the entire hypersurface boundary; it's in the arXiv paper.

>Saunders and other Everettians defend the "QM block world" wherein all the outcomes exist in a BW setting.

I guess I merely dismissed such a picture in my essay without going into details... But I refuse to accept that any Everettian picture is compatible with a block universe until I see their version of general relativity. It would have to explain exactly how all these universes are connected together, something they seem to avoid pinning down. (Actually, I admit that I still probably wouldn't accept it, even then, because it would still violate the guiding principle I spelled out at the end of my earlier response to Peter.)

To me, Everett's Many Worlds Interpretation is the poster child for how awkward it can be to extrapolate non-block-universe concepts to their logical conclusions. Best to start off with block-universe-compatible concepts in the first place.

Cheers,

Ken

Interesting responses to all your commenters. I feel clearer, anyway.

"what's the difference between classical fields, random fields, and quantum fields"

Between the first and the second, classical fields don't work well with probability: if we introduce thermal fluctuations, when we measure the field it will almost certainly be discontinuous (akin to the discontinuous paths of Brownian motion). To accommodate modern physics experiments, however, we *have to have probability* in the mix, which I would /prefer/ to have well-defined. Continuous random fields are the nicest mathematics for introducing probability into a block world of classical fields (at least, I think there's no contest from stochastic methods, such as are used in Stochastic Electrodynamics, in GRW-type reduction mechanisms, or in Langevin-type equations). Perhaps, although here I am out of my mathematical area, my methods could be called a block world approach to the Fokker-Planck equation.

Continuous random fields are very close to quantum fields, but they differ in being commutative rather than non-commutative at time-like separation. That of course leads to a different measurement theory, which we certainly have to discuss carefully, but so much stays the same that we can feel relatively comfortable with the transition from QFT to continuous random fields. This seems a strong merit for random fields as a mathematics that moves us away from QFT, even if it's only a transition to a better mathematics for fundamental physics.
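(The "almost certainly discontinuous" point about thermally fluctuating classical fields can be illustrated with a one-dimensional stand-in -- my own sketch, not Peter's construction: a Brownian path has increments of size ~√dt, so its difference quotients blow up as the grid is refined, unlike any smooth classical configuration:)

```python
import math
import random

random.seed(1)

def max_difference_quotient(n):
    """Sample a Brownian path on an n-point grid of [0, 1] and return the
    largest |df/dt| -- a crude measure of how rough the sampled path is."""
    dt = 1.0 / n
    worst = 0.0
    for _ in range(n):
        step = random.gauss(0.0, math.sqrt(dt))  # thermal increment ~ sqrt(dt)
        worst = max(worst, abs(step) / dt)
    return worst

coarse = max_difference_quotient(100)
fine = max_difference_quotient(10_000)
print(fine > coarse)  # refining the grid makes the path rougher, not smoother
```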

x,y,z,t are coordinates in QFT and for continuous random fields. The field is an operator-valued distribution, but QFT and random fields are set against a classical manifold. That changes if one moves to the mathematics of non-commutative geometry, but that's not the standard model.

I feel ambivalent about asserting a block world ontology for future events, insofar as I cannot experiment on the future, confined as I am to my psychological present, but a model has to model, aka predict, something about the future.

I'm also looking forward to the RBW essay.

Thanks Ken!

You write, "From your post, and from other very similar posts you've written on other threads, I take it that we are coming at this issue from diametrically opposite philosophical perspectives: you're railing against the block universe, while the block universe is central to my thinking. Apparently the detailed arguments for such a view in my paper have not swayed you, and your...

Thanks for your response, Ken. Hope you’re willing to continue this thread until I understand your position.

“I think quantum effects save the day here, for two different reasons: one is that we need to replace classical particles with fields, so there's no such thing as an "all or nothing" trajectory (didn't Feynman work out an example like this with a light switch?). The other is that there's no way to prepare the initial state with sufficient accuracy to cause this precise dilemma. In fact, as I see it, the logical necessity of the uncertainty principle is that it prevents exactly this sort of paradox.”

I’m talking about classical objects, so the de Broglie wavelengths are much smaller than the objects themselves. You’re not suggesting that such objects be modeled as, and exhibit, wave characteristics, are you?

Regarding your second point, are you claiming that a classical object won’t follow the self-inconsistent CTC simply because that path is highly improbable? If so, what makes it more improbable than any other? What happens when an object follows this improbable path? Or, do you mean impossibility rather than improbability? If so, what makes it impossible?

“(And these paradoxes would no longer require CTCs if one takes an introcausal perspective where the future is always affecting the past.)”

So, are you saying GR needs to be augmented with a self-consistency principle? How is it realized physically? Would I feel a mysterious force pushing the ball out of my hands as I’m about to start it on the self-inconsistent path? Would I suddenly change my mind, asking myself later, “Gee, why didn’t I release the ball?”

Thanks again for your patience,

Mark

Ken,

You write "The whole 4D system needs to be solved globally, like a 3D spatial boundary problem. And when one can't solve the equations exactly, it's not at all clear how to proceed computationally. (Ideally, I'd like to "push" the computational uncertainties toward the center of the 4-volume, away from the boundaries.)" I am in full accord.

I envision the computational possibility for a sorting algorithm to perform strongly polynomial time calculations of least path, least energy between t and t' based on your probability calculations of future boundary conditions at t', for any arbitrarily chosen scale. To explain:

My ICCS 2007 paper, necsi.org/events/iccs7/papers/740473b577c92da06ccd77fad70c.pdf, proposes that the flow of information for a random field of complete probable future states to a partially ordered present is indistinguishable--as you have also concluded--from the flow of information past to future.

How about an n-dimensional, 2-point boundary value problem in which the path t to t' is maximally efficient, least action? I compare the classical 2-point boundary, 6-dimensional problem of landing a rocket on the moon in shortest path at least fuel cost, with a universal control system in which negative feedback from the future informs the present state. Gravity is, in fact, just such a universal negative feedback system. In other words, negative feedback informs the present, positive feedback informs the future, and stasis--or neutral feedback--is the aggregated smoothly continuous property of the complex system on the large scale.

As a result, what we call "the present," at an arbitrarily chosen frozen moment of time, is the least of all possible moments. Your block universe is therefore preserved without sacrificing the dynamical properties of an evolving system, and accounting for the closed hypersurface boundary conditions.

Peter, you write "Claiming to be Popperian no longer adequately informs us of your point of view." Fair enough! The philosophy to which I particularly refer is what Popper called "metaphysical realism." (Objective Knowledge; Realism & the Aim of Science.) I think Ken's proposal meets the criteria; future boundary conditions are necessarily metaphysical but not beyond comprehension and indirect measurement.

Tom

Hi Peter,

You may be "ambivalent" about treating the future the same as the past, but I argue it's this precise ambivalence that has led to so many problems when it comes time to reconcile QM with GR. Even though it's counter-intuitive, we need to force ourselves to treat the past and the future on the same footing. After all, *eventually* the future will be past, and we shouldn't have to treat those events differently in our equations once that happens. (Granted, learning about uncertain values makes them more certain, but that sort of thing equally applies to uncertain values both in the past and the future.)

I'll email you with some thoughts concerning random fields and probability... I've recently become interested in a possible overlap between stochastic fields and this two-time boundary framework, and would like to better understand if there's any connection with your own research -- because as you say, there are some common threads between our ideas. (On the other hand, I'm concerned that you're treating probability as a physical substance rather than just a consequence of uncertainty. That doesn't work in a static block universe, because those "real" probabilities must somehow *change* to become certain outcomes.)

Cheers,

Ken

Hi Mark,

I *am* suggesting that everything is really classical fields, but this is no stranger than a QFT theorist suggesting that everything is really quantum fields -- it's just a question of when you're allowed to approximate those fields as classical objects. (And no fair giving me a far-out scenario, and then appealing to common sense to prevent me from using fields! :-) After all,...

Tom,

Thanks for your thoughts on the matter... I'll need to think about how a "sorting algorithm" might do the job. The problem with continuous, classical fields vs. classical particle paths is that the number of options to sort would seem to be much larger in the case of fields. (And don't forget, the field has to consistently solve a set of differential equations throughout the 4-volume. And those are Euler-Lagrange equations that already minimize the action, so that sort of minimization principle doesn't get you anything extra in this case.)
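(The situation where the field must consistently satisfy its differential equations throughout the 4-volume, given data on the boundary, has a familiar low-dimensional stand-in -- my analogy, not my actual scheme: Laplace's equation on a square with values pinned on the *entire* boundary, relaxed until every interior point is simultaneously consistent with all the edges, rather than marched forward from one of them:)

```python
# 2D stand-in for a globally constrained field: Laplace's equation on a
# square, with field values pinned on the whole boundary ("initial" edge
# at 1.0, "final" edge at 0.5, sides at 0.0). The interior cannot be
# marched forward from one edge alone; every point must agree with all
# four boundaries at once.

N = 21
u = [[0.0] * N for _ in range(N)]
for j in range(N):
    u[0][j] = 1.0        # "initial-time" boundary data
    u[N - 1][j] = 0.5    # "final-time" boundary data constrains the bulk too

for _ in range(2000):    # Jacobi relaxation toward the global solution
    new = [row[:] for row in u]
    for i in range(1, N - 1):
        for j in range(1, N - 1):
            new[i][j] = 0.25 * (u[i + 1][j] + u[i - 1][j]
                                + u[i][j + 1] + u[i][j - 1])
    u = new

center = u[N // 2][N // 2]
print(0.3 < center < 0.45)  # True: the center is shaped by *both* time-like edges
```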

Cheers,

Ken

Hi Ken,

I am glad to see that you also have a contribution here.

Have you seen my essay, which is also about the block universe/time? Now there are two of us.

What could be even more interesting to you is my latest paper

http://xxx.lanl.gov/abs/0811.1905

which also discusses the probabilistic interpretation of the Klein-Gordon equation in a block-universe spirit.

Ken,

You write "Thanks for your thoughts on the matter... I'll need to think about how a "sorting algorithm" might do the job. The problem with continuous, classical fields vs. classical particle paths is that the number of options to sort would seem to be much larger in the case of fields. (And don't forget, the field has to consistently solve a set of differential equations throughout the 4-volume. And those are Euler-Lagrange equations that already minimize the action, so that sort of minimization principle doesn't get you anything extra in this case.)"

Right. That's why finitely probable particle paths in a nonlocal quantum mechanical system are not the same as infinite possible paths in a local classical system. Suppose (and I do) that differential equations are not the only or even the best mathematics to model continuous functions, under an assumption that time is n-dimensional continuous. Then, insofar as gravity is time dependent, dissipative energy over hyperspatial manifolds in an infinitely self similar system restricts particle paths locally to a set confined to the 4D manifold defined by your two boundary points t and t'. A sorting algorithm (which implies strongly polynomial time solutions) applies to problems in such self-assembled phenomena as protein folding, where the final configuration is known but the path of the process through space is not. Sorting the paths by energy differential between the two boundary points to the stable state is a least energy solution (thus my suggestion of an n-dimensional, 2-point boundary value model).

The key concepts here, besides the assumption of an n-dimensional continuous time metric (n > 4), are 1) infinite self-similarity, which obviates a boundary between classical and quantum domains; and 2) removing the problem from the spherical volume to the flat hypersurface, where maps are mathematically simpler and better behaved.
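(A minimal sketch of sorting candidate histories between fixed endpoints by energy -- my own toy illustration of the idea, not the algorithm in the ICCS paper; the energy function is invented: enumerate the distinct step orderings of a short lattice path between two fixed boundary points, score each, and sort:)

```python
from itertools import permutations

def energy(heights):
    # Hypothetical energy: accumulated height plus a cost per direction change.
    e = sum(heights)
    e += sum(1 for a, b in zip(heights, heights[1:]) if a != b)
    return e

def candidate_paths():
    # All distinct orderings of two "up" steps and two "flat" steps: every
    # path starts at height 0 and ends at height 2 (fixed boundary points).
    for steps in sorted(set(permutations((0, 0, 1, 1)))):
        heights, h = [0], 0
        for s in steps:
            h += s
            heights.append(h)
        yield steps, heights

ranked = sorted(candidate_paths(), key=lambda p: energy(p[1]))
best_steps, best_heights = ranked[0]
print(best_steps)  # (0, 0, 1, 1): the least-energy ordering defers the climb
```

The endpoints are fixed for every candidate; only the route between them is sorted, which is the sense in which this is a 2-point boundary value search rather than an initial-value one.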

Do hope we can continue a dialogue. Thanks.

Tom

Hi Ken. I think we use mathematical models that have effective ways to treat asymmetry, rather than insisting that there is no asymmetry.

For me, there are three layers of modeling, (1) lists of finite data of finite accuracy from experience, raw data such as Gregor Weihs can still send you as about 1.5GB from his Bell-EPR violating experiments running up to 1998, (2) statistics of...

Hi Ken,

“I found the Feynman reference -- it was a very similar example addressed in Wheeler/Feynman's 1949 paper (not the 1945 one). Check it out -- they conclude that all these paradoxes rely on an "all-or-nothing" sort of interaction, but once you allow continuous interactions (say, a glancing blow due to a slightly-misaligned trajectory through a CTC) there's always a...

Dear Ken,

Congratulations on your nicely written tutorial on the conceptualization of time as a 4-D block! Your exposition points out the most common pitfalls in representing/understanding frozen time.

1. “Beyond Copenhagen, there are several other established interpretations, some of which are explicitly inconsistent with the block universe (one of them postulates many universes).”

I developed a “world theory” which easily provides a block view for the MWI and standard QM (only that we don’t necessarily have Lorentz invariance). On the other hand, Penrose presents a general relativistic spacetime able to split into many worlds (he splits them along lightlike 3d-surfaces; if you are interested, I will look for the article).

2. Your discussion of the wavefunction’s discontinuous collapse can be related to my solution, in which I replace this discontinuity with “delayed initial conditions”, bringing quantum mechanics back into the block-time view.

My Smooth QM is deterministic (but compatible with free will), and, contrary to Bohm’s theory, it relies only on the evolution equation (Schrödinger for purified states, Liouville - von Neumann for mixtures), and does not require hidden variables other than the initial conditions of the evolution PDE. These I called "delayed initial conditions". I find some similarity with your solution, in that both of them look like retro-causation. I understand that you solved the incompatibility between initial/final conditions by using the Klein-Gordon equation; I solve it by moving the discussion to the entanglement between the observed state and the initial measurement device. Another important difference: my delayed initial conditions are partial, and spread in spacetime, not just at the beginning and end; they are "caused" by the measurements. I do not want to detail my theories further on your discussion thread, since it is appropriate to talk about your work here. I just pointed out some connections.

Best wishes,

Cristi Stoica

“Flowing with a Frozen River”,

http://fqxi.org/community/forum/topic/322

Congratulations for your nicely written tutorial on the conceptualization of time as a 4-D block! Your exposition points out the most common pitfalls in representing/understanding the frozen time.

1. “Beyond Copenhagen, there are several other established interpretations, some of which are explicitly inconsistent with the block universe (one of them postulates many universes).”

I developed a “world theory” which easily provides a block view for the MWI and standard QM (except that we don’t necessarily have Lorentz invariance). On the other hand, Penrose presents a general relativistic spacetime able to split into many worlds (he splits them along lightlike 3d-surfaces; if you are interested, I will look for the article).

2. Your discussion of the discontinuous collapse of the wavefunction can be related to my solution, in which I replace this discontinuity with “delayed initial conditions”, bringing Quantum Mechanics back into the block-time view.

My Smooth QM is deterministic (but compatible with free will), and, contrary to Bohm’s theory, it relies only on the evolution equation (Schrödinger for purified states, Liouville–von Neumann for mixtures), and does not require hidden variables other than the initial conditions of the evolution PDE. These I called "delayed initial conditions". I find some similarity with your solution, in that both of them look like retro-causation. I understand that you solved the incompatibility between initial/final conditions by using the Klein-Gordon equation; I solve it by moving the discussion to the entanglement between the observed state and the initial measurement device. Another important difference: my delayed initial conditions are partial, and spread throughout spacetime, not just at the beginning and end; they are "caused" by the measurements. I do not want to detail my theories further on your discussion thread, since it is appropriate to talk about your work here; I just wanted to point out some connections.

Best wishes,

Cristi Stoica

“Flowing with a Frozen River”,

http://fqxi.org/community/forum/topic/322

Professor Wharton,

"After all, *eventually* the future will be past, and we shouldn't have to treat those events differently in our equations once that happens. (Granted, learning about uncertain values makes them more certain, but that sort of thing equally applies to uncertain values both in the past and the future.)"

From a layman's perspective, this is the problem I see with "block time." Yes, if time is a fundamental dimension proceeding from the past into the future, block time does make sense, but the reality is the present, with time flowing by it from future potential to past circumstance. Which is more fundamental, the earth rotating, or the linear progression of days? I would argue time is a consequence of motion, rather than the basis for it.

Hrvoje, Tom, and Peter: Since your recent posts are getting a bit off-topic, let's move these discussions to email for now. (Hrvoje and Peter -- I already owe you responses to your latest emails, but probably won't get to them until next week. Tom, feel free to contact me at wharton(.at.)science.sjsu.edu.)

Mark: I don't see Novikov's self-consistency principle as being an "addition" to GR; it's just a tautology: if you apply so many constraints to a system of physical equations such that there is no solution, then there's no solution. And if the solutions correspond to "reality", then it must be impossible to impose those inconsistent constraints in the first place. This must be true for *any* physical theory; not just GR. If you ask "What keeps me from imposing that many constraints?", the answer is simply that those constraints are self-inconsistent. You might as well ask why I can't both impose a net force on an object and also impose that its velocity remains constant.

Now, if your question boils down to which *sorts* of constraints one is allowed to impose on physical equations, without overconstraining the system... you're getting into exactly the questions that I am considering. You should also read Steve Weinstein's essay in this contest (and related recent arXiv post) for some very interesting insights. (Also, thanks for your detailed response to my questions on your own essay thread; I'll get to that as soon as I can, probably next week.)

Best,

Ken

Cristi,

Thanks for your kind comments. I think you might find a lot of connections between your research and the approach of Larry Schulman (I cited his book as a reference in my essay). He also is trying to "nudge" the wavefunction into a system that can match two-time boundary conditions.

I like certain aspects of your essay very much -- particularly getting away from this "instantaneous" aspect of measurement that many people seem over-reliant on -- but I just wanted to comment that I think it's important to treat measurement and preparation on the same footing. After all, any preparation process could also serve as a non-destructive measurement of a yet-earlier preparation. So if you're going to use diagrams that make a clear distinction between the preparation and the measurement, I'd suggest showing that the final measurement might *also* have a time-duration to it, and draw the figure in a way that shows the process you envision might repeat itself over and over.

Of course, I would also urge you to consider that there might be other, hidden variables that get changed over the duration of the measurement, while the aspects that are actually measured stay constant throughout the measurement process. And going to a relativistic picture naturally gives you exactly the right number of extra parameters to make this work. See my arXiv paper (0706.4075) if you're interested in how this might work...

Cheers!

Ken

John,

You write: "...reality is the present, with time flowing by it..."

Yes, I realize that's how most people see matters. And that's exactly why I made such a concerted attempt in this essay to try to convince the reader that such a picture just doesn't make sense when it comes to physics.

But here's another point I didn't go into in much detail. The analogy of time "flowing" is dangerous because the very word "flowing" (and the general concept of motion) is meaningless without a prior concept of time. Given our primitive concepts of space and time, flow and motion both make sense. The danger comes in when one tries to make an *analogy* between, say, the flow of water and the "flow of time". I tried to make the point in the essay that it's a terrible analogy, because now instead of flow velocity = distance/time, one ends up with the nonsensical notion of "time velocity" = time/time, which isn't anything meaningful at all.

We do have primitive notions of space and time (for more on this, see "The Stuff of Thought" by Steven Pinker) -- the question is how to overcome these intuitive concepts so that we can think about physics objectively. And the best way to do this is to move to a static block universe. If my essay didn't convince you, at least give Huw Price's book a try (Time's Arrow and Archimedes' Point). It's by far the best generally-accessible, non-trivial book on time that you'll find.

Best,

Ken

Dear Ken,

“I'd suggest showing that the final measurement might *also* have a time-duration to it, and draw the figure in a way that shows the process you envision might repeat itself over and over.”

Thank you for the suggestions. I totally agree with you. In fact, in a more detailed description, in the original article “Smooth Quantum Mechanics” (http://philsci-archive.pitt.edu/archive/00004199/), I specified it:

“After each observation, the quantum system gets entangled with the measurement device. Thus, even if the system is found in a precise state by the measurement, the entanglement with the measurement device makes its state to be again undetermined. The next measurement selects again an initial condition, to specify the state of the observed system. But now the system gets entangled with the measurement apparatus used for the last observation, and the cycle continues.”

Unfortunately, because of the length limitation, I omitted the description of this cycle from the essay, as well as from the version of my SQM paper that I cite in the essay (the second one, http://philsci-archive.pitt.edu/archive/00004344/).

And you are right: a picture with this cycle would help a lot.

Thank you,

Cristi Stoica

“Flowing with a Frozen River”,

http://fqxi.org/community/forum/topic/322

Professor Wharton,

Maybe I shouldn't have used the word "flow." Especially since your view of time is that it exists as a static higher dimension, so let me put this another way; Does the rotation of the earth turn tomorrow into yesterday?

It's not that I view the "present" as a "point" in time, since I view time itself as an abstraction, similar to temperature. My argument is that there is simply what might best be described as "energy" and as the arrangements of this energy change, each arrangement is replaced by the next, so this progression of events goes from future potential to past circumstance. So the only "flow" is an attribute of the energy.

Dear Ken,

Thanks for your reply. I want to press one point because I don't know that you appreciate the "problem" I’m trying to convey.

"Mark: I don't see Novikov's self-consistency principle as being an "addition" to GR; it's just a tautology: if you apply so many constraints to a system of physical equations such that there is no solution, then there's no solution. And if the...

Dear Ken,

Sorry to double team you here, but below is a passage from Halpern's essay which supports Mark's claim about Novikov's self-consistency principle being an add-on to GR.

"To combat such conundrums several proposals were suggested. Hawking formulated the “Chronology Protection Conjecture” as an attempt to forbid backward time travel based on the laws of physics [11]. Igor Novikov took a different tack and proposed a self-consistency principle that permitted past-directed travel as long as it was fully consistent with what already had transpired [12]."

Cheers,

Michael

Cristi: I guess we're in agreement on most of my earlier points. I'm trying to incorporate a finite-duration measurement myself (at least for non-destructive measurements), but so far the closest I've come is to allow the exact interaction time to be part of the overall solution space, with its own probability distribution. For more details you'll have to wade through arXiv:0706.4075.

John: I'm afraid I don't understand your question. I will say that retreating from "flow" and "motion" to a more general "change" can't explain anything fundamental about time, because the very concept of "change" *relies* on our primitive notion of time to even make sense. (What could change mean without time?) I'll continue to argue that the best way to get rid of these primitive temporal notions and focus on the physics is to use a block universe framework.

Mark and Michael: Your points are well taken, and serve to remind me that I've been mentally inhabiting a block universe for so long that I forget most people don't think that way. From the traditional "time-evolve the initial boundary conditions" perspective, it's absolutely correct that something like this would appear mysterious and in need of an additional postulate to prevent paradoxes.

I guess my revised point is that *any* consistent block-universe perspective (mine or yours) can deal with this problem almost trivially, without additional postulates. To recap how my particular type of model would solve this problem, one would impose external boundary conditions on both the space-time region in question and on the particle itself, but the precision at which one can impose all of those boundaries is limited by the uncertainty principle. The probability of any given outcome is then directly related to the number of acceptable solutions to the boundary-value problem. If some particular solution (say, the particle going around the loop) isn't self-consistent, then it's not a solution, and the probability of that outcome will be exactly zero. Simple as that.

We've moved far enough off-topic here that we should probably retreat to email if you're not happy with such an answer... and I'll "see" you both soon over at your own essay thread!

Ken

Ken,

Humor me for a moment and reconsider a reality in which change and motion are acceptable. The arrow of time goes from what comes first to what comes second. For the observer, past events precede future ones, so we observe time as going from past to future. On the other hand, these events are first in the future, then in the past, so their arrow goes in the opposite direction. Throughout history, in fact in the very description of the narrative construct we call history, the understanding of time is of the first arrow: that events proceed along this universal path, whether Newton's absolute time or Einstein's relative time, from past to future.

Yet the only reality ever experienced is the present. So let's examine the consequence of viewing reality as a fixed present consisting of energy in motion, thus causing change; as each arrangement described by this energy is replaced by the next, these events go from future potential to past circumstance. Therefore past and future do not physically exist, because the energy to manifest all such events is only manifesting one moment at a time.

So rather than a fundamental dimension, time becomes an emergent description and consequence of motion, similar to temperature. Temperature, as a scalar average of motion, doesn't exist if we only consider singular motion, but only emerges when measuring a mass of activity. So time, as a sequencing of units of motion, doesn't effectively exist if we cannot define a progression. It is just quantum fuzziness. The present can't be a dimensionless point either, since it is a description of motion and would only be dimensionless if all motion has stopped, so, like temperature, the measurement becomes fuzzy when examined closely.

Whether time proceeds along some dimension from past to future, or is caused by the progression of events from future to past, might seem a semantic distinction, yet consider the consequences: if time is that dimension moving toward the future, we need to explain how it deals with potentialities. Either we go with many-worlds, in which all potentials are taken, or block time, where the potentials are illusory and it is fundamentally deterministic. Now if we view it from the other direction, where time is the events moving from future potential to past circumstance, the collapsing wave of probabilities makes sense, since it is only energy in motion, and time is simply an emergent description of the process, not some fundamental dimension.

What is primitive is the narrative assumption that time is a linear projection from the past into the future.

Might it be that, instead of there being a retrocausality of wave functions, they configure themselves at one time according to a future time as determined by a block structure?

Lawrence B. Crowell

John, Thanks for your comments, but I don't really have much to add to my previous response to you. I just don't see how time can be said to emerge from a picture that starts with a primitive concept of change or motion, because those concepts require an even more primitive concept of time to even make sense. Such an approach is therefore doomed to being a circular argument.

Lawrence, I think I agree with what you're trying to say, but the way that you said it is technically wrong. When you use the phrase "at one time", surely you don't literally mean "at one particular temporal coordinate", because you're talking about a block structure that spans some range of time. Instead, I'm guessing that you're talking about some wavefunction that finds a global solution in a block universe framework, which is exactly what I'm arguing for in this essay. But to say this happens "at one time" or "all at once" is flatly incorrect -- the solution spans many time coordinates, not just one. (The key is to avoid temporal language entirely when thinking in a block universe framework, or else you fall into the trap of imagining some meta-time that is *not* included in the block universe.)

Ken

Ken,

You are right that it is primitive, but physics is about understanding the basics. Motion doesn't exist without time, because time is units of motion, just as collective motion doesn't exist without temperature, because temperature is an averaging of motion.

Time as a dimension doesn't accord the physical reality of the present any precedence over the physical non-existence of the past and future. You may not have a problem with that, but I like my understanding of reality to accord with reality. Therefore I view time as the series of events which go from being in the future to being in the past, created and consumed by the process which is the present, not as a non-dynamic dimension along which we exist. Yes, time is relative: if you speed up the motion, time speeds up, just as temperature increases. It is only when you are assuming some fundamentally static dimension that this seems illogical.

Login or create account to post reply or comment.