





CATEGORY: What's Ultimately Possible in Physics? Essay Contest (2009) [back]
TOPIC: On the applicability of quantum physics by George F. R. Ellis [refresh]

Author George F. R. Ellis wrote on Oct. 2, 2009 @ 11:59 GMT
Essay Abstract

Tony Leggett has suggested that quantum theory cannot be applied to complex macroscopic objects. This essay supports that idea by giving two specific examples of complex systems where this is true, because of the essential nature of the quantum measurement process, which cannot be described by standard quantum theory (none of the alternatives proposed in the end get round this limitation, in practical terms). I then place this result in the larger context of the ubiquitous occurrence of top-down causality in complex systems, which is the key process whereby genuine complexity emerges from the underlying physics. The implication is that the ability of physics to comprehend the dynamics of complex systems, such as life, is strictly limited: physics underlies and strongly constrains what happens, but in the end does not determine the unique outcome that actually occurs. This is determined by autonomous emergent higher level dynamics.

Author Bio

George Ellis is a relativist and cosmologist living in Cape Town. Recently he has been considering the way in which physics underlies the functioning of complex systems, including human life.

Download Essay PDF File

Leshan wrote on Oct. 3, 2009 @ 16:33 GMT
Dear George Ellis,

To teleport a macroscopic body, we must transform it into a quantum object; in effect, quantum theory must be applied to complex macroscopic objects for teleportation. For this purpose we must cut off all interactions between the body and its environment by creating absolute isolation. I propose enveloping the body with a closed hole surface.

Can quantum theory be applied to complex macroscopic objects using this method?

Sincerely, Leshan


Lawrence B. Crowell wrote on Oct. 4, 2009 @ 18:06 GMT
I thought your article was insightful and points to a serious problem with the foundations of physics. I think this is germane to problems of Ising models and lattice gauge systems in particular. I will just say that I have written in my essay:

where I look at how the cosmological constant is set according to a quantum phase transition. The argument stems in part from black hole complementarity, but it also leads to the matter of quantum phase transitions. This is a situation where quantum fluctuations are strong enough in a low-temperature system that the Euclideanized version of time acts as the temperature in determining the order of the system, t ~ ħ/kT. The ordering has to do with the F_4 fluxes on D7-branes, which determine the cosmological constant. The flux of the 4-form on the brane is a combinatorial structure, and the computational (quantum computational) nature of this “network” turns out to be NP-complete. Take a look at Abhijnan Rej's essay

for more details. So this leads to an Ising spin-like structure or statistics of states which settles on a quantum critical point, or a renormalized value that deviates from that.

This seems to my mind to lead to a twist on the issue of complex networks. It seems to me that the occurrence of the classical domain is connected to the occurrence of the feedback structure. It is at this point the quantum system enters into decoherence. Quantum computers only run bounded quantum probability algorithms, which encompass polynomial time algorithms. This then suggests there is some level of algorithmic complexity where quantum systems fail to function properly and the system becomes increasingly unstable to decoherence. This is I think tied to the mass-gap, for massless particles such as photons are remarkably stable against decoherence. It is massive particles and systems which are the most vulnerable to decoherence.

Thanks for your thoughts here,

Cheers LC


Eckard Blumschein wrote on Oct. 7, 2009 @ 15:21 GMT
Dear George Ellis,

I recall your essay on time as rather reasonable. My essay quotes a book by Schulman in a context, which suggests that his strange frontier of physics is simply based on a mathematical flaw.




George Schoenfelder wrote on Oct. 9, 2009 @ 20:04 GMT
Dear Dr. Ellis,

I have read your superb essay On the applicability of quantum physics. It is an excellent summary of the current state of the art of quantum mechanics and its inability to connect with the macro world. I also respect your efforts that led to the Templeton Prize. My lifetime motivation to study physics, engineering, and molecular biology has been mainly spiritual, but with no hocus-pocus.

Clearly, your work shows something fundamental is missing. Whatever is missing is either at the “top” or “bottom.” The problem today is there are many issues and many mechanisms, some top-down and some bottom-up. I think you would agree that one missing mechanism would be better than many. It would be best if it were 1) already in the empirical record, 2) could bridge the spiritual gap without hocus-pocus, and 3) was consistent with quantum computation, quantum mechanics, entanglement, relativity, measurement, QED, embryogenesis, etc.

I contend that quantum computation, as nature uses it, is the fundamental “missing variable.” This natural quantum computation is a network of all atomic computational systems. This natural network occurs in “hidden time.” Kurakin resolves the double-slit experiment via hidden time. As Kurakin says, “The whole Universe makes the choice” of which atomic detector “measures,” not the photon per se, without violating Bell's theorem (Kurakin, 2004).

In my FQXi essay section 6, I summarize my proposed solution to the dilemmas your essay described. It presents one relatively simple mechanism that is both top-down and bottom-up. It is top-down in that nature’s use of quantum computation is all encompassing and in that sense is spiritual. It is bottom-up because it is at the quantum level of proteins and where free will resides. It is 100% scientific.

I hope you find my essay interesting and would be honored by any comments and suggestions.


George Schoenfelder


Kurakin, Pavel V. (2004). Hidden variables and hidden time in quantum theory. Keldysh Institute of Applied Mathematics, Russian Academy of Sciences.


Jeffrey Nicholls wrote on Oct. 10, 2009 @ 05:51 GMT
Dear Prof Ellis,

First, I must admit to being a 'fan' of yours since reading 'large scale structure . . .

Second, as you might see from my essay, I like the idea of the hierarchical network structure you propose.

It seems to me that networks provide a natural framework for understanding how deterministic systems (modelled by Turing machines) can become non-deterministic when connected into a network, since one machine interrupting another can send it off on an entirely different trajectory (cf. Turing's 'oracle machine').

One can then see superposition as an algebraic representation of a network in which many different states are stored in the memories of different machines. When one connects to the network, one may 'observe' these different machines. So we may look at the internet as a superposition of files ('state vectors').

Further, we may model the onset of determinism in a dynamic system by the halting of a Turing machine in a particular state which is not visible to the observer while that machine is running. This is our normal experience when using a computer and waiting for it to write some output.

The behaviour of a layered network is constrained both by the users of the network and by the software available within the network, thus accommodating the top-down and bottom-up approaches.

Finally, it seems to me that quantum mechanics becomes rather intuitive when we look at it in terms of the conversations in the bar rather than the hard-sphere dynamics of the pool table.

All the best,

Jeffrey Nicholls


Terry Padden wrote on Oct. 10, 2009 @ 09:42 GMT

I agree with your conclusion that complex entities cannot currently be addressed by established physics because of "Top Down" issues. A couple of points:

1. Physics has 2 sides: the empirical experimental side and the theoretical side expressed in some formalism (logic & maths). By "physics" you effectively mean modern "mathematical" theoretical physics. Your conclusion applies to the inadequacy of existing logic and maths, the formalisms side of physics.

2. The top-level entity need not be biological etc. It can be an ordinary physical object with "emergent" "Top Down" properties. Two Nobel physicists (Anderson, "More is Different", and Laughlin, "A Different Universe") have written extensively on this. "Top Down" and "Emergent" are effectively synonyms.

3. The fundamental issue is the failure of our best formalisms to provide the tools we need to handle "Emergence". Our Logic and Maths are simply not good enough. Although they are very good, they limit science to Reductionism.

4. The problem is not confined to the maths. There are major deficiencies in the tools of logic currently available.

This issue is one of the 10 points in my essay which substantiate a similar conclusion to yours. Only I emphasise the limit on physics is a consequence of deficient formalisms.


Narendra Nath wrote on Oct. 10, 2009 @ 14:17 GMT
Dear George,

yours is an excellent attempt to work out the scope of present day Physics where one may attempt to explain complex processes outside Physics to be understood in a fundamental way using quantum and classical physics together. Nice job done as it may enable some one to work out a novel way out of our present ' confinement '. Let me introduce an obscure term ' consciousness '...



Arjen Dijksman wrote on Oct. 12, 2009 @ 09:32 GMT
Dear Professor Ellis,

Thank you for these original thoughts about how macroscopic behavior emerges from complex assembled systems. Focusing on nonlinearity, rather than on decoherence, makes sense, although both provide valid explanations. The thermostat and adaptive selection are good examples. Do you have other examples that are nearer to the quantum world? I was thinking of nonlinear optics. If the field intensity (= photon intensity) exceeds a given threshold, media respond non-linearly, as if above a certain degree of complexity one-to-one interactions no longer govern the physical behavior of the system. The simple interactions are 'buffered' by delayed interactions, which induce possibilities for feedback. These delayed interactions must be taken into account in the dynamical description.

I also appreciated your appendix, which is a very good condensed account of the principles of quantum measurement. Just after equation (4), personally, I would moderate the formulation in order to be closer to experimental physics: "Immediately after a measurement the state of the system is known to be a specific eigenstate, and any immediate further measurements will give _eigenstates and eigenvalues very close to the first result_." The effective results are indeed always subject to experimental uncertainty. By assumption, we consider these results to be the same as immediately before, but I don't know of any experiment that validates this assumption.

By the way, I promote the FQXi contest on my twitter profile and my blog. Would you mind if I post quotes of yours, linking to your essay?




Tejinder Singh wrote on Oct. 12, 2009 @ 14:04 GMT
Dear Arjen,

Hello. I wanted to add a few remarks to the interesting observations you make above. Decoherence by itself cannot explain measurement, and must be accompanied by the many-worlds interpretation. This is because while decoherence destroys interference amongst alternatives, it preserves superposition amongst them, because it works in the framework of the standard linear quantum theory. Many-worlds branching is needed, so that we perceive only one branch, and hence it appears to us as if superposition has broken down.

On the other hand, nonlinearity can in principle explain wave function collapse by itself, without having to invoke many worlds.




Darwin wrote on Oct. 14, 2009 @ 15:46 GMT
I might not get it, but:

1. Isn't it the case that quantum feedback (which goes under the name of "quantum control") is quite an active area of research? So far as I can see, there is nothing to prevent feedback such as that in a thermostat from being accommodated within QM. So far as I know, experiments have been done on this (in quantum optics) and it works nicely, again according to the usual rules of Q.M.

2. Nonlinearity in QM: there are plenty of nonlinear Schrödinger equations in Q.M. They appear when there is interaction between the components of the system. The resulting macroscopic wavefunction (e.g. Gross-Pitaevskii) obeys the same rules when it comes to measurement, for example. And this is why, just by adding nonlinearity, you cannot get rid of the conceptual problems of Q.M.

Sorry, I just don't understand why nonlinearity is a problem for standard Q.M.: once interaction is properly taken into account (such as in second quantization) it emerges naturally.


Author George F. R. Ellis wrote on Oct. 16, 2009 @ 04:47 GMT
Thank you all for the many posts. I regret that due to pressure of work I can't reply in detail to all of them, except to say: yes, please refer to my essay in a blog if you wish. However, I must respond to the interesting queries by Darwin. I will do so in two parts, so that my reply is not too long.

Quantum control

In any quantum control process, the result of a measurement is used to determine the future evolution of the system. Such devices can be constructed and function beautifully; but just like the Copenhagen interpretation of any ordinary measurement, their operation can only be understood through a mixture of classical and quantum theory, because the measurement part of the operation can’t be described by the Schroedinger equation. This is true even when continuous measurement takes place, because (see arXiv:quant-ph/0611067) such measurements involve first a unitary interaction between the system and an auxiliary system, and then a von Neumann measurement of the auxiliary system. It is the latter event that is not described by standard quantum theory. The theory of quantum feedback control shows how the effect of the measurement changes the master equation for the system state (equations (4) and (5) in arXiv:quant-ph/9912107), but not how the measurement takes place. There is, to be sure, a phenomenological equation for the measurement process (equation (7) in arXiv:quant-ph/9912107), but that equation is not linear: hence it does not satisfy the superposition principle and is not derivable from the Schroedinger equation. This is where the non-quantum aspect of the process is implicitly introduced.
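The two-step account of continuous measurement cited here — a unitary coupling to an auxiliary system, then a von Neumann measurement of that auxiliary — can be illustrated with a few lines of numpy. This is an editorial sketch (the CNOT coupling and qubit states are illustrative choices, not taken from the essay or the cited papers); note that only the final projection step falls outside unitary evolution.

```python
import numpy as np

# System qubit in the superposition (|0> + |1>)/sqrt(2); ancilla in |0>.
a, b = 1 / np.sqrt(2), 1 / np.sqrt(2)
joint = np.kron(np.array([a, b]), np.array([1.0, 0.0]))  # 4-dim joint state

# Step 1 (unitary): a CNOT correlates the ancilla with the system basis.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
joint = CNOT @ joint  # still pure, still unitary: [a, 0, 0, b]

# Step 2 (NOT unitary): von Neumann projection of the ancilla onto |0>,
# followed by renormalisation -- the step no Schroedinger evolution supplies.
P0 = np.kron(np.eye(2), np.diag([1.0, 0.0]))
projected = P0 @ joint
prob = float(projected @ projected)   # Born-rule probability, ~0.5
post = projected / np.sqrt(prob)      # collapsed state, ~[1, 0, 0, 0]
```

The projection-and-renormalise step is nonlinear in the state, which is exactly why it cannot be folded into the unitary part of the description.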

Author George F. R. Ellis wrote on Oct. 16, 2009 @ 04:54 GMT

The non-linear Schroedinger equation:

From Wikipedia: “In theoretical physics, the nonlinear Schrödinger equation (NLS) is a nonlinear version of Schrödinger's equation. It is a classical field equation with applications to optics and water waves. Unlike the Schrödinger equation, it never describes the time evolution of a quantum state. It is an example of an integrable model.” The reason it does not describe the time evolution of a quantum state is that it does not obey the superposition principle, which is central to standard quantum theory (“Quantum superposition is the fundamental law of quantum mechanics. It defines the collection of all possible states that an object can have.”). Hence, for example, it does not describe either interference or entanglement.
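For reference, the cubic NLS takes the form (in dimensionless units; the normalisation here is a conventional choice, added for concreteness):

```latex
i\,\frac{\partial \psi}{\partial t}
  = -\frac{1}{2}\,\frac{\partial^{2} \psi}{\partial x^{2}}
  + \kappa\,|\psi|^{2}\,\psi .
```

The cubic term is what breaks superposition: for two solutions $\psi_1, \psi_2$ one has $|\psi_1+\psi_2|^{2}(\psi_1+\psi_2) \neq |\psi_1|^{2}\psi_1 + |\psi_2|^{2}\psi_2$, so the sum of two solutions is in general not a solution.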

In any case, when proposing any equation as a description of a physical system, you have to show it actually correctly predicts the dynamics of the system. Neither the linear nor the non-linear Schroedinger equation describes either a feedback control system or adaptive selection, neither of which obeys the superposition principle. Taking the former case, suppose we could describe the temperature by a quantum state |Temp>. The dynamics of a feedback control loop are such that for an initial state |Temp_1>, we find |Temp_1> --> |Temp_goal>, where |Temp_goal> is the desired output temperature. For a different initial state |Temp_2>, we again find |Temp_2> --> |Temp_goal>. Superposition requires that if input A_1 produces response X_1 and input A_2 produces response X_2, then input (A_1 + A_2) produces response (X_1 + X_2). This simply does not happen when we consider a feedback control system. You can't describe what is happening by writing down a Schroedinger equation for |Temp>, so standard quantum theory does not apply to such feedback systems.
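The superposition test described here can be run as toy code (an editorial sketch; `GOAL`, `feedback`, and the numbers are hypothetical illustrations, not the essay's notation):

```python
# Toy thermostat: an idealised feedback controller drives ANY initial
# temperature to the set-point. Names and numbers are illustrative.
GOAL = 20.0

def feedback(temp):
    # Whatever the input state, the feedback loop returns the goal.
    return GOAL

# A superposition-respecting (linear) map L must satisfy
#   L(x1 + x2) == L(x1) + L(x2),
# mirroring "input (A_1 + A_2) produces response (X_1 + X_2)".
t1, t2 = 15.0, 25.0
lhs = feedback(t1 + t2)             # 20.0
rhs = feedback(t1) + feedback(t2)   # 40.0 -- additivity fails
```

The constant map is the extreme case of the point being made: any dynamics that forgets its initial condition in favour of a goal state cannot be additive, hence cannot be generated by a linear Schroedinger-type evolution.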

Darwin wrote on Oct. 18, 2009 @ 10:08 GMT
Thanks for the answer. Let me start by saying that I appreciate you taking the time to answer this. I think the problem you touched upon is really important. One (unimportant) terminology issue: you refer to "standard" Q.M. as if it excluded the measurement postulate. The "standard" Q.M. we learn (or some of us teach) at universities usually includes the measurement postulate.

If I...



T H Ray wrote on Oct. 18, 2009 @ 14:20 GMT
George Ellis,

Thanks for an extraordinarily clear and well written exposition on the weaknesses of quantum mechanics and the strengths of the alternative--complex systems science.

That laterally-distributed control (Y. Bar-Yam, New England Complex Systems Institute)--vs. the conventional hierarchical view--explicitly defines varieties of change in a time-dependent complex network, makes for a powerful explanation of the world's apparent nonlinear order. We are informed that order is self-organization with feedback.




George Ellis wrote on Oct. 20, 2009 @ 04:30 GMT
Darwin, the points you make are very relevant and to try to do them justice I'll give a three part response.

There is a profound hiatus at the core of quantum theory: the process of measurement, projection of the state vector to an eigenstate, cannot be described by the standard quantum theory process of unitary evolution of the state vector e.g. due to the Schroedinger equation. There are broadly speaking four attempts at resolution: (1) the Copenhagen interpretation, namely macro objects are not governed by unitary evolution, nothing more needs be said; (2) one needs to add on to the theory a second kind of evolutionary process, characterized as collapse of the wave function, and then try to determine rules for when and how this happens; (3) the Everett route of assuming a continually branching wave function, and associated “many worlds”; (4) one claims decoherence solves the problem.

I won’t repeat here my arguments as regards the last two approaches; please see my essay in that regard (I just repeat here that decoherence gets rid of entanglement but does not give a unique physical measurement outcome). The point I want to make here is that if this issue is mentioned at all in treatises on quantum theory, the postulate of the measurement process is added on at a late stage as an optional extra; it is not presented as a key part of the standard theory (see e.g. Isham’s book). But most books on quantum theory don’t even mention that there is a problem. Here’s an exercise for you: take a random selection of textbooks on quantum theory (not popular books: that’s a different story) and see if the measurement process is mentioned in the index. I’ve tried it; less than 30% of quantum textbooks even mention the issue, and it is not mentioned at all in almost every book on quantum field theory (Wald’s book is the only one I have seen that does mention it). Now ask an average graduate student in the field what she or he can tell you about the problem. Most of them have not even heard of it. So for those who are properly steeped in the subject, your statement is largely true; but for most of them, it is not. It's not a part of the subject as taught to most physics students.
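The parenthetical remark — that decoherence removes interference but does not select a unique outcome — can be made concrete in a short numpy sketch (an editorial illustration, assuming ideal dephasing in a fixed basis):

```python
import numpy as np

# Pure state (|0> + |1>)/sqrt(2): its density matrix has off-diagonal
# (interference) terms.
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi)            # ~[[0.5, 0.5], [0.5, 0.5]]

# Full decoherence in this basis: the off-diagonal terms are driven to
# zero, modelling the loss of interference with the environment.
rho_dec = np.diag(np.diag(rho))     # ~[[0.5, 0.0], [0.0, 0.5]]

# What remains is a 50/50 *mixture*: interference is gone, yet no unique
# outcome has been selected -- that residual gap is the point at issue.
probs = np.diag(rho_dec)            # ~[0.5, 0.5]
```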


George Ellis wrote on Oct. 20, 2009 @ 04:44 GMT

So is the problem any worse in the case of networks of interactions than it is in the standard measurement problem? Yes, I think it is. See arXiv:0904.4483 by Chiribella et al for a nice presentation of the growing field of quantum networks, obtained by assembling a number of elementary quantum circuits into a network and resulting in quantum channels. Does that not counter my essay? No it does not, because that theory only applies to networks with no cycles (see p. 5 of that paper). These can be represented by a directed acyclic graph, and so do not include the feedback loops I consider in my essay. The point is that in the case of those loops, one feeds the result of the measurement process back into the system in order to determine its future evolution. You can’t do that in terms of a theory involving unitary evolution alone, because that can’t handle the measurement issue. In the standard measurement issue per se, this problem does not arise: the measurement result resides in the macro world and does not get fed back into the micro world.

You can handle this conceptually in terms of a combined classical and quantum description broadly in the spirit of the Copenhagen interpretation, i.e. you add in to the system some elements not covered by standard unitary quantum evolution rules. So yes you can make the relevant models, but only by adding non-quantum elements into the theory. So we are back at the issue of how that can possibly make sense, if macro objects are based in quantum physics at the micro level.


George Ellis wrote on Oct. 20, 2009 @ 04:50 GMT

Does this have any consequences? Not for most quantum theory and quantum field theory applications, which calculate energies of states and statistics of outcomes so successfully. But there is an area where it does matter: namely when people start talking of the wave function of the universe, and develop consequences of that idea, claiming inter alia that time is an illusion (see The End of Time by Julian Barbour, for example).

These theories assume that quantum mechanical ideas can be applied to the universe as a whole, using some form of unitary evolution and without any process of collapse of the wave function being included in the description. My essay says that won’t work, because feedback control loops exist in the real universe (e.g. in the human brain). To my considerable surprise this ends up supporting the Copenhagen view to some degree: it can be legitimate to suppose there are macro objects not described by standard quantum theory as exemplified by the Schroedinger equation.

If true, then a significant task is to determine all the cases where such arguments are applicable, and I have suggested that Darwinian selection is another such case (rather analogous to quantum field theory, which needs separate consideration). What others might there be? Additionally a key task for the future is to extend quantum theory to give some “quantum_plus” proposal that does give a satisfactory description of the measurement process. Maybe the issue will look different in the light of such future theories.


Tejinder Singh wrote on Oct. 20, 2009 @ 17:44 GMT
Dear George and Darwin,

I have been following the interesting discussion you are having here. May I draw your kind attention to my essay in this forum: there I argue that quantum theory by itself predicts a quantum_plus theory [which addresses the measurement process] when one notes that the dependence on an external classical time is an unsatisfactory feature of quantum theory. I predict that mesoscopic objects obey a mechanics which is neither classical nor quantum, and hopefully experiments in the coming decades will test this.




Lawrence B. Crowell wrote on Oct. 24, 2009 @ 12:45 GMT
What intrigues me is the possible role of gravity in this matter. I won't write maths here, but only give a description of something. Assume there is a wave function with some domain in space with a "support." This region of support is defined as where the wave function assumes its value and outside of it the wave function drops off to exponentially small values. We now consider this region of support as on a frame which is falling towards a radially symmetric gravity field, such as a black hole. Of course the Weyl tensor kicks in and distends this region into a more highly elliptical shape as the frame falls inwards. The wave function will then distend and it is not hard to do a Wigner quasi-probability calculation to show that the wave function becomes parametrically amplified or squeezed.

This seems to suggest a possible role for gravitation in this problem. As a caveat, I will say I have trouble with Penrose's idea of there being Planck-scale fluctuations in some R process during a measurement. Yet for quantum gravity and cosmology this apparent role of gravity in the squeezing of quantum states seems to be a possibility. The early measurement of the CMB and other aspects of cosmology indicate that the universe started in a uniquely low-entropy configuration. The small anisotropies of the universe would suggest that the averaged Weyl tensor in any local region was very small, at least after inflation.

With decoherence, the density matrix is reduced to its diagonal form, with the off-diagonal overlap elements reduced to zero, but there is no dynamical prediction of an outcome. I would then argue this reduction simply defines a macro-state for the entropy of the decoherence or measurement. Any possible real observation is equivalent, from an information perspective, to all others. What we lack is the information required to determine how an "actual" outcome obtains. We are faced with a sort of reduction of quantum probabilities to classical-like probabilities, and then in the real world with some sort of classical collapse. So the picture has two steps, or what might be thought of as a seam that is not elegant.

So we are faced with the prospect of there being some sort of "top-down" process which is involved with quantum outcomes, or for that matter the emergence of the macroscopic or classical world. Zurek has advanced the decoherence hypothesis to einselection to address this question. Your essay is another attempt to frame this question as well.

Cheers LC


Anonymous wrote on Oct. 27, 2009 @ 19:58 GMT
Do you agree that a summary of your essay’s logic is along these lines? Because the present state of the art of physics has only been able to measure a small fraction of atomic and molecular mechanisms at the bottom, it follows that a top-down mechanism may control what is missing down there?

Somebody should look.


Lawrence B. Crowell wrote on Oct. 30, 2009 @ 12:30 GMT
A plausible way of looking at this is with cloning. Quantum mechanics tells us that a quantum state can't be duplicated by unitary quantum processes or evolution. Yet from a classical perspective we can clone things: we can duplicate classical objects, and we can set up identically prepared quantum states. Admittedly we do this within some "error margin," but we can do it well enough. So one might posit there is some classical "cloning principle," which is a top-down rule that does not pertain to the quantum world.
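For concreteness, the linearity argument behind the quantum no-cloning theorem can be sketched as follows (the standard textbook derivation, added here as an illustration):

```latex
% Suppose a unitary U cloned arbitrary states: U(|\psi\rangle|0\rangle) = |\psi\rangle|\psi\rangle.
\begin{aligned}
U\big[(\alpha|0\rangle + \beta|1\rangle)|0\rangle\big]
  &= \alpha\,U\big(|0\rangle|0\rangle\big) + \beta\,U\big(|1\rangle|0\rangle\big)
   = \alpha|00\rangle + \beta|11\rangle, \\
\text{whereas cloning requires}\quad
  &\;(\alpha|0\rangle + \beta|1\rangle)(\alpha|0\rangle + \beta|1\rangle)
   = \alpha^{2}|00\rangle + \alpha\beta|01\rangle + \alpha\beta|10\rangle + \beta^{2}|11\rangle.
\end{aligned}
```

The two expressions agree only when $\alpha\beta = 0$, i.e. for basis states — so unitary evolution can copy classical (orthogonal) alternatives but not arbitrary superpositions.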

Cheers LC


Eckard Blumschein wrote on Nov. 5, 2009 @ 23:08 GMT
Dear George,

You are not the only one who ignores my objection, but you could be the first one to refute it.



