CATEGORY: Blog
TOPIC: Quantum theory escapes locality by accepting uncertainty
Blogger Oscar Dahlsten wrote on Aug. 17, 2014 @ 15:28 GMT
[Figure 1]
When a ballerina does a pirouette she must escape the friction of the ground in order to gain the freedom to move. (Figure 1: photo by Michael Garner, courtesy of English National Ballet.) She does this by restricting her contact with the ground to a single point. In a recent paper, my collaborators Andrew Garner and FQXi's Vlatko Vedral and I show that quantum theory escapes a fundamental constraint on movement in a very similar way: by accepting uncertainty. Quantum systems are associated with states, which encode the statistics of all possible future measurements. The collection of such states can be represented as a geometric shape. For the smallest possible quantum systems, single qubits (quantum bits), this shape is a sphere, called the Bloch sphere.
For example, think about a property of a qubit, such as its position: the qubit could be associated with two possible positions, A and B, say, or it could be in a fuzzy superposition, existing in both of these mutually incompatible states simultaneously before being observed. If it is in a superposition then, although experimenters cannot know with certainty which position they will find it in when they make a measurement, they do know the probability of each outcome. The Bloch sphere helps to visualise this odd feature and the probabilistic nature of quantum mechanics. In the example, a vector pointing to the north pole of the sphere could represent position A, while the south pole represents position B. (In a classical system, these would be the only two options available to a binary digit, or bit.) However, a qubit can also be represented by a vector pointing anywhere else on the surface of the sphere, corresponding to the fuzzy in-between states.
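To make the picture concrete, here is a minimal Python sketch of how a Bloch vector fixes the outcome statistics. The encoding of A and B as the poles follows the text above; the projector P_A and the example vectors are our illustrative choices, not anything from the paper.

```python
import numpy as np

# Pauli matrices: the three incompatible qubit measurements.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def state_from_bloch(rx, ry, rz):
    """Density matrix of the qubit state with Bloch vector (rx, ry, rz)."""
    return 0.5 * (I2 + rx * X + ry * Y + rz * Z)

def prob_at_A(rho):
    """Probability of finding the qubit at position A (the north pole)."""
    P_A = np.array([[1, 0], [0, 0]], dtype=complex)  # projector onto |A>
    return float(np.real(np.trace(P_A @ rho)))

print(prob_at_A(state_from_bloch(0, 0, 1)))    # 1.0: definitely at A
print(prob_at_A(state_from_bloch(0, 0, -1)))   # 0.0: definitely at B
print(prob_at_A(state_from_bloch(1, 0, 0)))    # 0.5: an even superposition
```

Any Bloch vector on or inside the sphere gives valid probabilities; the uncertainty principle is the statement that the vector cannot leave the sphere.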
[Figure 2]
The largest conceivable state space would actually be the cube circumscribing the sphere, as shown in Figure 2. The quantum state space is the sphere, but if there were no uncertainty principle every state in the outer cube would be allowed. In that case certain measurements could all have predictable outcomes at the same time, in violation of the quantum uncertainty principle.
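In symbols, using the expectation values E(X), E(Y), E(Z) of three incompatible measurements as coordinates (the same notation is spelled out further down this thread):

```latex
\text{sphere (quantum):}\quad E(X)^2 + E(Y)^2 + E(Z)^2 \le 1,
\qquad
\text{cube:}\quad |E(X)|,\ |E(Y)|,\ |E(Z)| \le 1 .
```

On the sphere, E(X) = ±1 forces E(Y) = E(Z) = 0, an uncertainty relation; the cube corner (1, 1, 1) has no such trade-off.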
One may ask why quantum theory is restricted to the sphere, and accordingly to having the uncertainty principle.
We came across an intriguing answer when thinking about how the cube state space would handle an interferometer. In an interferometer the particle or photon is first placed in a superposition of being in two places, and then operations are performed at each site. As soon as there are two different sites, fundamental locality restrictions come into play. In particular, we point out that if a system has probability 0 of being found at site B, then an operation at site B must leave the state of the system invariant. Otherwise we could achieve action at a distance. Contrary to some popular science depictions, quantum theory does not allow action at a distance. The universe would be almost inconceivably odd and complicated if action at a distance were possible: we would not be able to make a statement about an individual system without taking into account what happens everywhere else in the world.
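A toy numerical illustration of this invariance condition, with the two sites encoded as basis states; the local phase shift U_B stands in for an arbitrary operation at site B, and the numbers are arbitrary choices of ours:

```python
import numpy as np

# One particle, two sites: |A> = [1, 0], |B> = [0, 1].
psi_A = np.array([1, 0], dtype=complex)   # probability 0 of being found at B
U_B = np.diag([1, np.exp(0.7j)])          # operation acting only on the B amplitude

# Locality demands the state (hence all its statistics) stays invariant:
print(np.abs(U_B @ psi_A) ** 2)           # [1. 0.], unchanged

# For a superposition, the same operation does change interference statistics,
# as it should, since the particle can now actually be found at B:
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
print(abs(np.vdot(plus, plus)) ** 2)        # 1.0 before the operation
print(abs(np.vdot(plus, U_B @ plus)) ** 2)  # ~0.88 after it
```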
On the Bloch diagram, state transformations move points around, e.g. by rotating the shape. So, if one accepts that this locality restriction holds, it turns out that operations at site B must leave all points (states) on the lower plane of the cube invariant. It is as if the points were stuck by total friction between the shape and the lower plane. This puts the cube at a big disadvantage relative to the sphere: if the entire square face touching the ground is pinned, the whole cube gets stuck and no states can change.
But now imagine metamorphosing the cube into a sphere, or indeed something else with only one point on the lower plane, like how the ballerina goes up on one toe. Then the shape, with all the quantum states in it, can move. The quantum sphere has the advantage over the cube that it can rotate even if there is full friction with the lower (and/or upper) plane, just as the ballerina accepts the uncertainty of only having a point in contact with the ground in return for the ability to pirouette.
One may say that uncertainty, rather than being just limiting, liberates quantum states to change.
--
Oscar Dahlsten is affiliated with Oxford University. The paper appears in Nature Communications.
Joy Christian wrote on Aug. 17, 2014 @ 16:47 GMT
"Contrary to some popular science depictions, quantum theory does not allow action at a distance. The universe would be almost inconceivably odd and complicated if action at a distance were possible. We would not be able to make a statement about an individual system without taking into account what happens everywhere else in the world."
The above statements are grossly misleading, if not outright wrong. To be sure, one cannot send a signal violating special relativity with the so-called quantum non-locality, but nonetheless quantum theory is not a locally causal theory.
But here is some good news: quantum theory can be completed (in a manner espoused by Einstein) into a locally causal theory, as done, for example, in this paper and this simulation.
For a more complete discussion, see also this page of my blog, or this discussion of mine.
Thomas Howard Ray replied on Aug. 17, 2014 @ 17:48 GMT
Joy, I think in a certain technical sense, "action at a distance" -- nonlocal causality -- can be differentiated from nonlocality, without raising the issue of local causality at all.
As is well known to everyone here, I agree with you without reservation that quantum mechanics is not a locally causal theory. Events that are local and causal, however, do not obviate events that are non-local and non-causal (metaphysically real), which quantum theory would conventionally have us believe are locally real by normalization of the metric -- a mathematical kludge based in observation, with no theoretical support.
It is notable that the author invokes the cube around the Bloch sphere in order to instantiate the sphere -- mathematically, this admits linear superposition as foundational, carrying with it assumptions of quantum entanglement and a probabilistic measurement schema. These assumptions are sufficient, though not necessary -- and they entirely obviate a geometrical description of quantum foundations, by nothing more than fiat.
OTOH, a foundations model based on the topological generalization of the Euclidean sphere admits no such assumptions and no such measurement schema -- under the simple condition that the topology is simply connected. I think very few understand that your framework is truly analytical, classically-based and completely local realistic.
All best,
Tom
Blogger Oscar Dahlsten replied on Aug. 19, 2014 @ 14:15 GMT
Hi Joy,
When I say no action at a distance I mean it in the operational sense, i.e. that doing something on site A does not instantaneously lead to changes in probabilities of measurement outcomes on site B. (I have the phrase 'spooky action at a distance' in mind as the popular science description that is dangerous.)
The real vectors here represent the probabilities of outcomes of possible measurements. (They are a generalisation of the quantum density matrix.) The 'freezing' of one plane thus corresponds to the measurement statistics of the states in question being invariant under the given set of operations.
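To make the correspondence concrete: for a single qubit this generalised state vector is [E(X), E(Y), E(Z)], related to the density matrix by the standard parametrisation

```latex
\rho \;=\; \tfrac{1}{2}\bigl(I + E(X)\,\sigma_x + E(Y)\,\sigma_y + E(Z)\,\sigma_z\bigr),
\qquad
p(\pm 1 \mid W) \;=\; \tfrac{1}{2}\bigl(1 \pm E(W)\bigr),\quad W \in \{X, Y, Z\}.
```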
I moreover do think we would have serious problems with predicting anything if there were action at a distance in this operational sense.
Best,
Oscar
Blogger Oscar Dahlsten replied on Aug. 19, 2014 @ 14:18 GMT
Hi Tom,
Thanks for the post. I don't know Joy's approach but I think I agree with the spirit of what you say.
Best
Oscar
Thomas Howard Ray replied on Aug. 19, 2014 @ 17:03 GMT
Hi Oscar,
Thanks for sticking around to follow up on the comments. Another forum participant with whom I communicated privately wrote: "I found the paper related to the post that I commented on very well written and accessible, unlike so many research articles that are understandable only by specialists."
I agree and couldn't have said it better myself.
Best,
Tom
Joy Christian replied on Aug. 19, 2014 @ 19:47 GMT
Thanks, Oscar. I appreciate your response.
I have a different view, however, which Tom and Jonathan have already tried to explain.
Blogger Oscar Dahlsten replied on Aug. 19, 2014 @ 20:37 GMT
Thank you Tom for the encouraging words.
Joy, I am not sure what exactly we are disagreeing about :)
Joy Christian replied on Aug. 19, 2014 @ 21:03 GMT
Hi Oscar,
Here is what bothers me about your statements quoted above. I appreciate the practical significance of your results, but they seem to be at odds with the fundamental fact that quantum mechanics is not a locally causal theory. This fact has of course been well known since 1935---i.e., since EPR formalized their famous argument in terms of locality, reality, and completeness of quantum mechanics.
Now you say that by no action-at-a-distance you mean "doing something on site A does not instantaneously lead to changes in probabilities of measurement outcomes on site B." This may be true for all practical purposes, but it cannot possibly be true in principle in the light of the EPR argument, unless you have "completed" your quantum theory in some way. But as far as I can see you are using the standard, orthodox quantum theory. ???
Best,
Joy
Blogger Oscar Dahlsten replied on Aug. 20, 2014 @ 10:06 GMT
Hi Joy,
Thanks for the reply.
I do mean for all practical purposes, in the strict sense that in the quantum-theoretic description the density matrix is invariant. As in standard QM, I am assuming that the density matrix encodes all measurement probabilities as well as is possible.
To be more concrete: in the case of two systems localised to different regions, one demands that the density matrix representing the reduced state on A is invariant under operations on site B (this is the standard 'non-signalling' condition). In the case here of one system that can in principle be in different sites, we demand that, when the system has probability 1 of being at site A, operations on site B leave the density matrix representing the system's state (or its generalisation, in the case that we are not necessarily dealing with quantum theory) invariant.
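A numerical check of the first (two-system, non-signalling) case, as a minimal sketch; the entangled state and the rotation applied on B are arbitrary choices of ours:

```python
import numpy as np

def reduced_state_A(rho, dA=2, dB=2):
    """Partial trace over B of a density matrix on A (x) B."""
    return np.trace(rho.reshape(dA, dB, dA, dB), axis1=1, axis2=3)

# Maximally entangled two-qubit state (|00> + |11>)/sqrt(2):
psi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

# An arbitrary unitary applied to B alone:
th = 1.234
U_B = np.kron(np.eye(2), np.array([[np.cos(th), -np.sin(th)],
                                   [np.sin(th),  np.cos(th)]]))
rho_after = U_B @ rho @ U_B.conj().T

# The reduced state on A, and hence every measurement probability on A,
# is unchanged: no signalling.
print(np.allclose(reduced_state_A(rho), reduced_state_A(rho_after)))  # True
```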
Best wishes
Oscar
Joy Christian replied on Aug. 20, 2014 @ 12:06 GMT
Thanks, Oscar. I now understand what you are doing :)
Thomas Howard Ray replied on Aug. 20, 2014 @ 12:20 GMT
"I do mean for all practical purposes in the strict sense that in the quantum theory description the density matrix is invariant."
Oscar, I think that's exactly where quantum mechanics sneaks in its hidden (or at best ignored) assumptions of the equally-likely hypothesis and perfect information on an inherently uncertain measure space.
The probability density matrix of classical statistical mechanics, where we are certain of the behavior of ensembles of particles -- because we are certain of the boundaries of random behavior distributed continuously (as, e.g., in Brownian motion) -- does not transport smoothly to a space of inherent uncertainty. It is therefore arbitrary to apply the assumption of equally likely measurement outcomes based on perfect information, because we don't have perfect information of the quantum mechanical measure space.
A well defined space of complete measurement functions, OTOH, imposes an extra degree of freedom that allows a continuous function model without the need to assume perfect information. Observed outcomes are pairwise correlated, nonrandomly and continuously, independent of the behavior of the particle ensemble.
Best,
Tom
Thomas Howard Ray replied on Aug. 20, 2014 @ 14:32 GMT
David Mermin's "Ithaca Interpretation" of quantum mechanics underscores how entrenched the idea of a correspondence between the QM probability density matrix and the notion of mathematical realism is:
" ... these are my Six Desiderata for an interpretation of quantum mechanics:
(1) Is unambiguous about objective reality.
(2) Uses no prior concept of measurement.
(3) Applies to individual systems.
(4) Applies to (small) isolated systems.
(5) Satisfies generalized Einstein-locality.
(6) Rests on prior concept of objective probability.
To persuade you that my aspirations are not made entirely of fluff, let me next digress to tell you about two elementary theorems of quantum mechanics that seem only recently to have been noticed."
I would encourage one to recognize the difference between mathematically constructing an "interpretation" of an observed physical event (e.g., the two-slit experiment) and constructing a mathematical theory of continuous functions in a clearly defined measure space.
When I tried to debate this point with Richard Gill here, he would never concede that even the idea of a defined measure space is relevant. I expect that he, like Mermin, believes "Einstein locality" to refer to the entire universe -- which is quite trivial, since Einstein always averred that all physics is local. That a locally causal quantum theory could never be realistic is proven by Bell-Aspect; but why would one want to give up physical and metaphysical realism for mathematical realism?
A mathematically complete theory of quantum configuration space -- that is, 1 to 1 correspondence of events between quantum configuration space and physical space -- is both local and realistic.
Thomas Howard Ray replied on Aug. 20, 2014 @ 14:35 GMT
Link to Mermin paper: https://archive.org/stream/arxiv-quant-ph9609013/quant-ph9609013_djvu.txt
Blogger Oscar Dahlsten replied on Aug. 27, 2014 @ 21:43 GMT
Hi Tom,
I can see why this locality principle made you think of the Mermin paper. I had a quick read of it (I had seen it quite a while back but understand it much better now). I ended up using http://arxiv.org/pdf/quant-ph/9609013v1.pdf to see the equations.
I like all his desiderata, apart from the frequent use of the idea of a 'system', which is arguably not so well defined.
The two theorems seem to refer to interesting but known things: the first is about the fact that mixed states correspond to reduced states of pure states, and the second about what is now commonly called local tomography, the idea that quantum states can be reconstructed from the probabilities of local measurements and their correlations.
Best wishes
Oscar
Anonymous replied on Aug. 28, 2014 @ 12:05 GMT
Hi Oscar,
I agree with you.
Mermin wades in right away with what I think is the basic problem with quantum theory -- the need to apply a "sensible interpretation." Classical theories have sensible boundary conditions, sensible measure spaces, sensible results. They don't beg interpretation, because they are based in experience rather than in mathematical Hilbert-space formalism.
Einstein always viewed the universe as one locality with self-limiting sub domains. He never considered that the universe could be incomplete, requiring a subjective judgment to interpret and complete it.
After eqns (1) & (2), Mermin asks us to consider whether there is an objective difference between density matrices of opposite probability states. I think this only invokes the "equally likely" hypothesis of probability theory applied to the density matrix regardless of outcome, and therefore gives the appearance of objective probability.
He alludes to Popper's propensity interpretation of probability; however, Popper's idea of objective probability does not include the equally-likely hypothesis. To understand Popper's constructive logic, one has to understand Alfred Tarski's correspondence theory of truth -- which allows a 1-to-1 correspondence of statement to event. Popper takes this framework into an analytical model; in a 2006 conference paper (see 4.0-4.3) I exploited this analyticity to show how probable states are correspondent and continuous with definite mixed states (singular events), independent of the axiom of choice (which is implied by the frequency interpretation of events in an n-dimension Hilbert space).
I do hope this productive dialogue continues!
All best,
Tom
Thomas Howard Ray replied on Aug. 28, 2014 @ 18:41 GMT
By the way, I think there is some significant relation between the self-interaction of Mermin's eqns (3) and (4) and Popper's equations G and H (4.1 in my paper).
Mermin's pair of equations that assume two non-interacting states of even parity (-- or ++) neglects that a pair of states of odd parity (+- or -+) interacts alternately with the same classical probability.
However, as Popper shows, only a single state G of the pair G and H has any way -- even in principle -- to change, i.e., to deviate from the expected result.
I think that's what makes Oscar's conclusion of state changes based on geometry so important. The necessary added degree of freedom is built into the topology; when the topology includes a point at infinity, Joy Christian's framework logically follows.
Mermin writes: "If objectively real internal properties of an isolated individual system are not to depend on what is done to another non-interacting system, then there can be no difference between these two realizations (i.e., eqns 3 & 4) of the density matrix W."
True enough. However:
The relations only commute in even parity. Christian's noncommutative, nonassociative pairs (generalized quantum correlations) can be shown falsifiable; i.e., truly objective and independent of the experimenter's expectations. That is what Popper's "objective probability" actually means.
Steve Agnew wrote on Aug. 17, 2014 @ 17:56 GMT
The good thing about quantum action is that any number of complete or even mostly complete basis sets are solutions to the reality of the Schrödinger equation. The bad thing about quantum action is that any number of complete or even mostly complete basis sets are solutions to the reality of the Schrödinger equation.
Just as the Heisenberg approach is equivalent to Dirac's approach, science can and does argue endlessly about which basis set or approach or interpretation is better. These kinds of arguments are really not all that useful, since action is all about the Schrödinger equation, not really about which basis set you choose or which approach or which interpretation. But the arguments about basis sets go on and on and on...
After all, it is not even necessary to consider action in space as a priori, since matter action as exchange is a complete basis for quantum action without space. Locality is simply a convenient and very intuitive representation of phase for the quantum action of matter waves, and there are any number of equivalent ways to deal with the issue of locality in quantum action.
The really important goal in all of this is not to find a better basis set for quantum action; the goal for science is to find a gravity exchange force that scales in a very nice way from that of the quantum action of charge force.
Thomas Howard Ray replied on Aug. 17, 2014 @ 18:44 GMT
" ... the goal for science is to find a gravity exchange force that scales in a very nice way from that of the quantum action of charge force."
I agree, Steve. And wouldn't that necessarily entail a field theory in a continuum of spacetime over an n-dimension Hilbert space model of probabilistic measure?
Thomas Howard Ray replied on Aug. 17, 2014 @ 20:47 GMT
Jeez, I should have said "rather than ... an n-dimension Hilbert space ..." instead of "over," which has obvious and unintended mathematical implications.
Steve Agnew replied on Aug. 17, 2014 @ 22:29 GMT
I would say not necessarily. Science has a great field theory for charge force in spacetime that fails for gravity force. Stringy guys add extra dimensions, quantum loopists add little twirly thingys everywhere, multiversey guys seem to explain anything and nothing at the same time, and arithmetics forgo pdf's and just redefine everything with equations and constants everywhere.
I would say that this arena is definitely a mess, and it is no wonder that science has not figured this stuff out amid such a cacophony. I think that you could explain just about anything with an n-dimensional Hilbert space... we could also just assign a constant for n particles in the universe and be done with it.
Some say that models must first of all be falsifiable, and that is quite important; but really models must first of all be useful, and new models must be much more useful than old models in order to even be considered. Fringe physics necessarily, and rightly so, has an uphill battle, and new models must show utility over all else.
Thomas Howard Ray replied on Aug. 17, 2014 @ 23:24 GMT
Steve, I apologize for flubbing the words, but you are applying the interpretation I corrected in my second post, and not answering the actual question.
Steve Agnew replied on Aug. 17, 2014 @ 23:33 GMT
Well okay, but science still has a very good field theory that works really well for charge force. Why is there no quantum gravity? All I am suggesting is that it is the structure of spacetime that is the impediment to a quantum gravity.
Space and time are not independent and are affected by matter and action in complementary ways. It appears possible to effect a quantum gravity with just matter and time and of course quantum charge stuff never really needs space either.
Thomas Howard Ray replied on Aug. 17, 2014 @ 23:47 GMT
"All I am suggesting is that it is the structure of spacetime that is the impediment to a quantum gravity."
Not in an extradimensional model, it isn't.
Blogger Oscar Dahlsten replied on Aug. 19, 2014 @ 14:25 GMT
Hi Steve and thanks for the post.
Just to note that these different geometrical shapes do not correspond to different interpretations but rather to different physical theories, in the sense that they would allow for different statistics: e.g. the cube would have measurements that cannot be performed at the same time (incompatibility) but do not obey an uncertainty relation. It is very similar (and related) to how one could envisage systems that violate Tsirelson's upper bound on non-locality, an operational notion; these would then not be quantum systems (as Tsirelson's bound takes only quantum theory as its assumption).
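For reference, the standard CHSH numbers behind that remark (textbook values, quoted here only for context):

```latex
|S| \;\le\; 2 \;\;\text{(local hidden variables)}, \qquad
|S| \;\le\; 2\sqrt{2} \;\;\text{(quantum, Tsirelson)}, \qquad
|S| \;\le\; 4 \;\;\text{(non-signalling alone)}.
```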
I think this framework is actually very interesting for considering quantum gravity from an operational starting point and we are doing some research in that direction.
Robert H McEachern wrote on Aug. 17, 2014 @ 18:01 GMT
"One may ask why quantum theory is restricted to ... having the uncertainty principle."
Quantum Theory has an uncertainty principle because it employs Fourier Transforms to describe systems. Since the uncertainty principle is a property of the Fourier Transform, it is inevitable that any system described via Fourier Transforms will also exhibit the uncertainty principle. Regardless of whether or not the uncertainty principle is a property of the system itself, it is a property of the chosen means of describing systems; hence it will inevitably exist within the description of the system.
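A quick numerical illustration of that Fourier property, using a Gaussian pulse because it saturates the time-bandwidth bound sigma_t * sigma_f >= 1/(4*pi); the grid and width are arbitrary choices:

```python
import numpy as np

sigma_t = 2.0
t = np.linspace(-50, 50, 2**14)
dt = t[1] - t[0]
pulse = np.exp(-t**2 / (4 * sigma_t**2))   # |pulse|^2 is Gaussian with std sigma_t

spectrum = np.fft.fftshift(np.fft.fft(pulse))
f = np.fft.fftshift(np.fft.fftfreq(len(t), d=dt))

def weighted_std(x, w):
    """Standard deviation of x weighted by (unnormalised) w."""
    p = w / w.sum()
    mu = (x * p).sum()
    return np.sqrt(((x - mu)**2 * p).sum())

sigma_f = weighted_std(f, np.abs(spectrum)**2)
print(sigma_t * sigma_f)   # ~0.0796
print(1 / (4 * np.pi))     # 0.07957...: the Gaussian sits right at the bound
```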
Rob McEachern
Christophe Galland replied on Aug. 17, 2014 @ 20:14 GMT
I agree that uncertainty relationships are intrinsic to Fourier transforms, yet in quantum mechanics uncertainty seems to be a more fundamental feature, independent of the calculation tool.
In the paper, the key property of the operators that is used is not uncertainty per se, but non-commutativity.
Thomas Howard Ray replied on Aug. 17, 2014 @ 20:38 GMT
" ... the key property of the operators that is used is not uncertainty per se, but non-commutativity."
That's a great point. Non-commutative quantities imply uncertainty in a precise manner. If you don't mind indulging me, the attached work in progress explains why I think so.
attachments:
1_The_CHSH_result_is_free_of_context.pdf
Robert H McEachern replied on Aug. 17, 2014 @ 23:19 GMT
"yet in quantum mechanics it seems to be a more fundamental feature independent of the calculation tool"
It may indeed "seem" that way, but it is not.
It is fundamental if and only if there is a single particle being observed. But as soon as one attempts to observe more than one particle, as in any attempt to observe an "interference" pattern, then it is not fundamental at all.
The quantity that is fundamental to an observation is the absolute limit on an observation's recoverable information content: one cannot recover less than a single bit of information from an observation and still claim to have made an "observation".
In the case of a single-particle observation, Shannon's Capacity for the amount of recoverable information, equaling one bit, yields the uncertainty principle. But when more than one identical particle can be measured, it is possible to recover more than one bit of information, and hence "violate" the uncertainty principle. But using Fourier Transforms is an inappropriate means for obtaining such results.
Rob McEachern
Blogger Oscar Dahlsten replied on Aug. 19, 2014 @ 14:32 GMT
Hi Rob,
Thank you for the post. Indeed, once you assume the quantum formalism, the uncertainty relation follows. So the question we are concerned with is more why the quantum formalism should hold -- why is quantum theory the way it is?
Incidentally, I agree that if two measurement bases are related by the Fourier transform they obey a kind of maximal uncertainty relation, though it may be of interest to note that there is a more general set of transforms relating bases for which this is also true; see e.g. http://en.wikipedia.org/wiki/Mutually_unbiased_bases
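For the qubit, here is a short check that the three Pauli eigenbases are pairwise mutually unbiased, i.e. that every cross-basis overlap squared equals 1/d = 1/2 (the basis vectors are the standard ones):

```python
import numpy as np

Zb = [np.array([1, 0]), np.array([0, 1])]                        # Z eigenbasis
Xb = [np.array([1, 1]) / np.sqrt(2), np.array([1, -1]) / np.sqrt(2)]
Yb = [np.array([1, 1j]) / np.sqrt(2), np.array([1, -1j]) / np.sqrt(2)]

for B1, B2 in [(Zb, Xb), (Zb, Yb), (Xb, Yb)]:
    print([round(abs(np.vdot(u, v))**2, 3) for u in B1 for v in B2])
# each line prints [0.5, 0.5, 0.5, 0.5]
```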
Best wishes,
Oscar
Robert H McEachern replied on Aug. 19, 2014 @ 16:04 GMT
"once you assume the quantum formalism the uncertainty relation follows."
You do not have to assume anything at all about QM to get the uncertainty principle. If you trace it back to its origins in information-theoretic considerations, the uncertainty principle simply claims that there is a "smallest possible amount of recoverable information" -- a single bit of information -- from any information-bearing signal.
That single-bit limit is the entire significance of the uncertainty principle; every other property associated with the principle can be derived from that fact.
Rob McEachern
Jonathan J. Dickau replied on Aug. 19, 2014 @ 16:33 GMT
I think the point here is the reverse, Rob...
Specifically, it should be "once you assume the uncertainty relation, the quantum formalism follows." I argue that the dynamic here is uncertainty = the freedom to vary, and that variation then leads to discrete outcomes. The neat thing in their setup is that uncertainty is put in first, and definite outcomes follow because there is a point of contact between the sphere of potentials and the cube of actual outcomes on each face. In my view, and possibly theirs, the information-theoretic definition arises from the underlying geometry of the physical considerations in the quantum and classical domains, respectively.
All the Best,
Jonathan
Robert H McEachern replied on Aug. 19, 2014 @ 17:04 GMT
"I think the point here is the reverse"
I agree. My point is that THAT point is wrong.
"once you assume the uncertainty relation..."
My point is that one does not need to assume any such thing.
The uncertainty principle is a "fact", not an assumption, precisely because it is nothing more than a peculiar way of stating that it is not possible to recover less than one bit of information while making a measurement (and still have made a "measurement").
"then variation leads to discrete outcomes"
No. Information is inherently discrete. Whenever one attempts to recover any information from a channel that contains only a small amount of information, it is inevitable that one is going to recover "discrete" amounts of it.
"In my view, and possibly theirs, the information theoretic definition arises from..."
In my view, that is incorrect. It can easily be shown that Shannon's Capacity Theorem is simply the result of counting the number of bits required to reconstruct a time-limited, band-limited, finite-signal-to-noise-ratio signal from a set of discrete samples; it simply equals the number of samples required to recreate the signal, multiplied by the number of bits per sample required to preserve the SNR.
Setting Shannon's Capacity equal to one bit of total recovered information yields the uncertainty principle. It has nothing to do with QM, geometry, or anything other than simply counting the number of bits required to reconstruct a signal that encodes only one bit of information - the least possible amount.
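In symbols, the counting described here comes out as the familiar Shannon-Hartley form; this reconstruction and its conventions are ours, not a quotation of Rob's derivation:

```latex
C \;=\; \underbrace{2BT}_{\text{Nyquist samples}} \times \underbrace{\tfrac{1}{2}\log_2\!\bigl(1 + S/N\bigr)}_{\text{bits per sample}}
\;=\; BT \log_2\!\bigl(1 + S/N\bigr),
\qquad
C = 1 \;\Rightarrow\; BT \;=\; \frac{1}{\log_2(1 + S/N)} .
```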
Rob McEachern
Jonathan J. Dickau replied on Aug. 19, 2014 @ 17:10 GMT
Well then Rob,
I see you are starting from what I would like to conclude from yet more basic assumptions, and we are in disagreement about what is foundational. So be it. What is your opinion on the point Oscar makes about MUBs? I think understanding that a range of bases exists, all of which can be mutually unbiased or internally independent, is a crucial point here. Are we agreed on this?
All the Best,
Jonathan
Blogger Oscar Dahlsten replied on Aug. 19, 2014 @ 20:32 GMT
Hi Rob and Jonathan,
Thanks for the further posts.
Rob's statement about the uncertainty principle following from one bit is very reminiscent of 'A foundational principle for quantum theory' by A. Zeilinger (http://link.springer.com/article/10.1023%2FA%3A1018820410908#page-1), which is part of a strand of thought that is often traced back to Weizsaecker. There are also recent papers by Zeilinger and Brukner, Dakic and Brukner, as well as others in this direction.
There are some subtleties here. One can envisage systems where two or more bits are encoded but only one can be decoded. (This is called a random access code.) The cubic state space corresponds to a scenario where three bits can be encoded (the values of the three measurements involved: X, Y and Z), but only one can be decoded, because it turns out that once one measures one of them the state is reset to be consistent with the new measurement outcome, which is independent of the values of the other bits.
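As a deliberately crude illustration, here is a toy Python model of that scenario. It implements only the reset rule just described, with outcome probability (1 + E)/2 for a state vector of expectation values; everything else about the cube is ignored:

```python
import random

class CubeState:
    """Toy cube state: expectation values E(X), E(Y), E(Z), each allowed to be +/-1."""
    def __init__(self, ex, ey, ez):
        self.e = {'X': ex, 'Y': ey, 'Z': ez}   # three bits encoded at once

    def measure(self, w):
        # Outcome +1 with probability (1 + E(w))/2.
        outcome = +1 if random.random() < (1 + self.e[w]) / 2 else -1
        # Reset: consistent with the new outcome; the other expectations go to 0.
        self.e = {k: (outcome if k == w else 0) for k in self.e}
        return outcome

s = CubeState(+1, +1, -1)
print(s.measure('Z'))   # always -1: whichever bit is read first comes out perfectly
print(s.measure('X'))   # +1 or -1 uniformly: the unread bits are destroyed
```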
So the general idea that only one bit should be transferable by a qubit is somewhat subtle; one has to be precise about what one means.
Your statement about the information capacity being one bit implying the uncertainty principle is quite precise. I think it should be roughly correct, because there is something called Nayak's bound, which follows from Holevo's bound, and which I believe is what you are referring to. Holevo's bound is often phrased as saying that you cannot put more than one classical bit in a qubit. Nayak's bound says that you cannot encode an arbitrary number of bits, provided that Holevo's bound holds, even if you only need to be able to read one bit out.
So for example the cubic state space would indeed violate Holevo's bound. I do not know whether it can be turned into a tight statement that 'the' uncertainty relation follows from Holevo's bound; I would guess there is a small gap.
It is also important to note that one can envisage hypothetical experiments where the uncertainty principle does not hold, as in the cubic state space. I would therefore insist that this is an extra principle, postulate, or whatever one wants to call it. We do not want to assume the uncertainty principle or the similar 'one-bit' principle, but rather to consider why/if it should exist.
Best wishes,
Oscar
Robert H McEachern replied on Aug. 19, 2014 @ 21:04 GMT
Jonathan,
"yet more basic assumptions"
What is more basic than the "assumption" that a "single bit of information" cannot be resolved into smaller quantities of information? One cannot achieve "sub-single-bit" accuracy. That is all that the uncertainty principle states.
While I certainly agree that there are many different sets of basis functions, other than those used in Fourier Analysis, that could be used, my own opinion is that using any such "clever" mathematical techniques is the very source of the problem, not the solution.
The whole point of using such techniques is to minimize the requirement for using a priori information about potentially better models (parametric models).
For example, orthogonal functions, like Fourier Transforms, can "represent" frequency-modulated signals (or any other) perfectly. But they are useless when it comes to demodulating such signals; that is, they are useless for maximizing the recovery of the encoded information. That is why communications engineers do not use orthogonal functions in the information-recovery portions of high-information-content systems.
If one wishes to maximize the amount of information recovered from a system (a stated goal of QM measurement theory), then non-parametric, "basis function" based transforms and representations are not the way to go. Such techniques certainly have their uses. Unfortunately for QM, maximizing information recovery from measurements is not one of those uses. And to make matters worse, their usage has resulted in an almost complete lack of any intuitive understanding of what is actually, physically, happening.
Rob McEachern
Robert H McEachern replied on Aug. 19, 2014 @ 21:36 GMT
Oscar,
"One can envisage systems where only one bit can be both encoded and decoded but actually two or more bits are encoded."
Encoding more bits than can be decoded, has never been much of a problem. The problem has always been to encode the maximum number of bits that a channel can support, and then decode every one of them, with no bit errors. In the past few decades, modern communications signaling techniques have come quite close to achieving that long-sought goal.
"The cubic state space corresponds to a scenario where three bits can be encoded (the values of the three measurements involved, X, Y and Z), but only one can be decoded because it turns out that once one measures one of them the state is reset"
That is similar to decision-directed decoding in comm signals.
But quantum states, like spin, may not actually encode three bits. That is an assumption, based on the fact that the mathematics of spinors uses multiple components to describe the relation between the observed and the observer. But all the actual observations seem to be consistent with only a single bit actually being encoded into the observable; the spinors do not describe the observable, they describe the relationship between the observer and the observable. The two are not the same. Why assume that they are?
"Your statement about the information capacity being one bit implying the uncertainty principle is quite precise."
Thanks, I try to be precise; even accurate.
"which follows from Holevo's bound which I believe is what you are referring to."
No. As I stated, it is easy to derive Shannon's Capacity Theorem simply by considering the Nyquist criterion for how many samples are required to reconstruct a time-bandwidth-limited, continuous signal from a set of discrete samples, together with how many bits per sample are required to preserve the SNR. Then merely setting the Shannon limit to a single bit yields the uncertainty relation.
"It is also important to note that one can envisage hypothetical experiments where the uncertainty principle does not hold"
It does not hold in cases involving more than a single particle. Multiple particles in the same state, such as those used in interference experiments, are synonymous with a high SNR, which enables "violations" of the uncertainty principle. FM radio transmissions have exploited that fact for 80 years.
Rob McEachern
Akinbo Ojo wrote on Aug. 17, 2014 @ 18:07 GMT
North and South poles are relative positions. Somewhere (A) and Nowhere (B) are absolute positions. From cosmology, our universe seems to tell us that both states, A and B, can be occupied, viz. 'before' the Big Bang (Nowhere), the current epoch with 'Somewhere' increasing in size, and a Big Crunch back to Nowhere (equal to non-existence).
I am not a professional Quantum Machinist, but if it can be contemplated on this blog and stated that "For example, think about a property of a qubit, such as its position: the qubit could be in two different positions, A and B", why must the choice of positions be only relational and not absolute? If a geometric system as big as the Universe, which is itself a collection of all that exists, including '...the smallest possible quantum systems, single qubits (quantum bits)', can occupy the two absolute positions A and B, how much more a qubit? Does anything prevent a qubit from disappearing to Nothing (Nowhere) and appearing from Nothing (Somewhere)? Are virtual particles not said to behave similarly?
Akinbo
Blogger Oscar Dahlsten replied on Aug. 19, 2014 @ 14:42 GMT
Hi Akinbo and thanks for the post.
I think you touch on some very profound issues. I did not understand all your questions, but I note that in the paper in question we describe the experiments, and the two positions, from the perspective of some given observer who is the same every time the experiment is run. But it may be very interesting to see what one can get out of applying consistency conditions between different observers: that they should all agree on the measurement statistics (objective reality) even if they disagree on what to call the different positions involved.
Best
Oscar
Akinbo Ojo replied on Aug. 21, 2014 @ 09:40 GMT
Thanks Oscar for your reply. Not just 'profound'. In my opinion, foundational in keeping with the original motivation of FQXi, not the current dabbling into politics and social science.
If you do not understand some of the questions, I in turn can understand your non-comprehension. It appears 'bizarre', but till it can be falsified logically or experimentally it must be on the table, especially when it resolves many of the paradoxes confronting physics and philosophy.
Your commendable experiment is a relational one. But if you wish to go beyond that later in your research, to the absolute, some of the questions I raised are recommended to you.
If the Universe contains 'qubits', and there was a time when the Universe had no positional properties and did not exist, then there must also have been a time when the qubits you theorize about had no position and did not exist (i.e. 'nowhere', call it state 0). Now your qubits exist and are 'somewhere', call it state 1. At the end of a big crunch, your qubits again go back to state 0.
Why must these possible states, 0 and 1, that can be occupied be only of cosmological significance? What stops such changes of position/state of individual qubits, from 0 to 1 and 1 to 0, from occurring even today, if it has happened before? That is my question.
Perhaps you have heard of the Alcubierre drive for hyperfast travel, where 'spacetime' in the direction of travel disappears and changes to nothing, while that in the opposite direction expands. While it is extremely speculative that we can travel faster than light, is there any contrary mathematical or physical evidence that, while you are even just walking about in your room, space is not changing from 'somewhere' to 'nowhere' in the direction of your motion (between you and the wall), while in the opposite direction 'nowhere' is changing to 'somewhere', between you and the wall with which you were originally flush, for instance? If unfalsifiable, then the cosmological evidence (of nowhere changeable to somewhere and somewhere changeable to nowhere) suggests that 'As it was in the beginning, so it is NOW and ever shall be'. Amen!
This may be a distraction for your current endeavor but falsify the proposition when you are done with the current one.
All the best,
Akinbo
Thomas Howard Ray wrote on Aug. 17, 2014 @ 23:20 GMT
To make a point that mathematically astute readers already know, that I nevertheless think deserves singling out:
Comparing the Bloch model to a framework of generalized Euclidean spheres (the subject of topology) -- be reminded that regardless of the "cubifying" of the Bloch sphere, it is still, in Euclidean terms, a 2-sphere, i.e., a three-dimensional object. The faces of the cube represent the fundamentally 2-dimensional nature of the Hilbert-space calculus; a point in that complex space is represented by a line of real and imaginary parts. So, like a die imprinted with six values, each value for every face tangential to the observer, we can be sure of the unseen correlated value opposite the observer, because the relationships between pairs of numbers on opposite sides of the die are constant.
A die contains what mathematicians call perfect information; i.e., because the relationships between pairs are constant, we can perfectly predict (by the law of large numbers) the outcomes for rolls of a die or a pair of identical dice. There are very definite quantum numbers on the closed interval [0,1] that discretely predict the number of times the observed result will come up over a sufficient number of throws of the dice.
Quantum mechanics uses this local information schema to extrapolate perfect information to the entire universe of quantum configuration space, by a mathematical process of normalization. Points in superposition are parametrized into a real line, leading to the result that no quantum measurement is both local and realistic, which drags along the rest of the nonclassical quantum mechanical baggage.
Joy Christian's topological framework does away with the need for perfect information -- either actual or imagined. Quantum correlations are measured as continuously correlated points of the Hopf fibration, dependent on a coordinate-free initial condition (obtained by dichotomic variables). All the measurement information is local, with no boundary between quantum and classical domains on the simply connected parallelized 3-sphere. The framework is in full accord with relativity and the rest of classical physics.
Jonathan J. Dickau replied on Aug. 18, 2014 @ 12:23 GMT
Regarding 'cubifying' the sphere...
Some of this is sorted out by David M. Keirsey here: Section 2.1.3 on 'complicating' measures, and also the following section (2.1.4) on how we define measure, number, and space.
Keirsey has been working independently on developing information-theoretic measures, and this work would probably be of interest. Notably, it brings up a point I've made several times already: with hypercubic measures, which I think includes Hilbert spaces, the assumption is that volume increases with n, the number of dimensions, whereas with a hyperspherical measure there is a volume and area peak.
It would appear that the rule pertaining to quantum mechanics must be the one for higher-dimensional spheres, rather than cubes, and that the difference is often overlooked or ignored for the sake of linearization. As Keirsey explains, the Ricci flow maps out the areas and extent of disagreement, or frustration, between the two measures -- round and flat. But what Dahlsten writes about above is intuitively obvious to me. My thought is that this work by Dahlsten, Garner, and Vedral is only an advancement because people are so strongly invested in a sort of cubism in the math of physics.
All the Best,
Jonathan
Thomas Howard Ray replied on Aug. 18, 2014 @ 15:36 GMT
Thanks, Jonathan -- this should spark some important discussion. As an aside, I hadn't realized that this Keirsey is the son of the late David W. Keirsey of "Temperament Sorter" fame. Having been curious for a while, I took the test and (not surprisingly) came up INTP, my career as a technical writer being one of the top 10 career choices of the type. Impressive.
Back to the mathematics, after digesting.
Blogger Oscar Dahlsten replied on Aug. 19, 2014 @ 14:54 GMT
Hi Tom and Jonathan,
Just to emphasise that these shapes correspond to different measurement statistics being allowed. Consider for simplicity systems that have three measurements that are incompatible, in the sense that by assumption we cannot measure them simultaneously. We may label these X, Y and Z, in analogy with the quantum Pauli operators, and take their outcomes to be +1 or -1. The state vector is then represented as [E(X), E(Y), E(Z)], where E(.) is the average of (.). (We choose that arbitrary ordering of the letters for certain reasons.) Now quantum theory says that, for pure states, E(X)^2 + E(Y)^2 + E(Z)^2 = 1. If you think about it, that gives you a sphere as the state space, and moreover the operational meaning is that of an uncertainty relation: if E(X) = 1, say, then X is predictable, but then E(Y) = E(Z) = 0, which means that +1 and -1 are equally likely for those measurements; they are uniformly random. Knowing X means Y and Z are unknown.
However, the cube allows e.g. [1, 1, 1]: all three can be predictable at the same time. It does not respect the uncertainty principle.
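A quick numerical check of that constraint (random pure state, so run it a few times):

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

v = np.random.randn(2) + 1j * np.random.randn(2)   # random pure qubit state
v /= np.linalg.norm(v)

E = [float(np.real(np.vdot(v, P @ v))) for P in (X, Y, Z)]
print(sum(e**2 for e in E))            # 1.0 (up to rounding): on the sphere
print(sum(e**2 for e in (1, 1, 1)))    # 3: the cube corner [1, 1, 1] lies well outside
```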
Another point you may find interesting is that one can get higher-dimensional spheres in other theories, notably quaternionic quantum theory, where the analogue of the qubit state space is a 5-dimensional sphere.
Best wishes
Oscar
Jonathan J. Dickau replied on Aug. 19, 2014 @ 15:37 GMT
Thanks Oscar,
I think the key point is that the uncertainty is identified with the freedom to vary, without which there is no action, and that this ranges over a sphere. The point Tom and I keep making is that some of this underlying geometry is the true invariant, while any probabilistic measure is only a projection of how variations on a sphere of variability map onto the presumed cube of probabilistic outcomes. So stating that the sphere touches the cube at a point becomes a way of saying that the outcome for any given face can only be a 1-or-0 measurement, while the quantum wavefunction varies over the entire sphere but can only be detected via a point of contact on the cube.
We are more focused on the continuous reality that subsumes the probabilistic measures -- which are necessarily a hypercubic projection -- within the sphere of variability or uncertainty that contains the quantum reality. This echoes what has come out of my correspondence with Dieter Zeh as well. That is, the fundamentals of QM are more crucial to understand, in order to apprehend reality, than are its utilitarian interpretations. But by stating what should be obvious, that uncertainty is a necessary precursor to variation, you are doing the rest of the scientific community a great service.
All the Best,
Jonathan
Thomas Howard Ray replied on Aug. 19, 2014 @ 18:20 GMT
Thanks so much, Oscar. You write " ... to emphasise that these shapes correspond to different measurement statistics being allowed ..."
That's just the issue, isn't it? One can construct the statistics such that what one discovers objectively is like a Penrose triangle -- an object that looks connected and coherent in 2 dimensions, though it cannot possibly be constructed in 3 dimensions. It's the same with the case that " ... one can get higher dimensional spheres in other theories, notably quaternionic quantum theory where the analogue of the qubit is a 5-dimensional sphere ..." because what looks coherent in 4 dimensions (giving the illusion of a time parameter) decoheres in 5 dimensions. It just blows up at the dimension limit.
An analytical model of continuous functions has clearly prescribed boundary conditions that converge on a precise solution. It does not assume a boundary that cannot be constructed in the higher dimension where the limit is only believed to exist. To prove that the limit exists involves infinite regression; i.e., adding infinite dimensions, which generates infinite proofs.
The most important theorem in modern topology -- the Poincaré Conjecture, now proven by Grigori Perelman via the Ricci flow program of Richard Hamilton, based on William Thurston's geometrization conjecture -- completes our general knowledge of the n-dimension set of Euclidean spheres. They are simply connected; every loop can be contracted to a point.
This property of simple connectivity -- trivial fundamental group -- wedded to the limit of division algebras which admit factorizability (octonions), is what allows Joy Christian to construct a measurement framework in the 8-dimension (7-sphere) space which can be shown physical in the 4-dimension (quaternion) space of the parallelized 3-sphere.
Point is, Joy's construction requires not a single ad hoc assumption. It is mathematically complete.
All best,
Tom
John Brodix Merryman wrote on Aug. 19, 2014 @ 11:39 GMT
As a sort of meta-observation, might it be worth considering the dichotomy of information and energy? Because information is static, we assume it must have some Platonic permanence, yet when dynamic conditions are described statically, the potential information goes to infinity. Our neural functions are designed to extract and process information, so this creates an infinite feedback loop; but if we want to put it in context, we need to acknowledge that dynamical basis.
Writing this on a phone, so pardon run on sentences.
John Brodix Merryman replied on Aug. 19, 2014 @ 12:21 GMT
PS,
While this might not be the focus of particular QM debates, it very much goes to the relationship of order and chaos/complexity, and from there to explaining various political dynamics and the breakdown of civil and social order occurring around the world, which will eventually affect even academia.
That being the inherent expansion of energy and consolidation/contraction of order. Which would also explain why these debates invariably go to very focused points of conflict.
Blogger Oscar Dahlsten replied on Aug. 19, 2014 @ 14:59 GMT
Hi John,
Ah, defining energy and information, this is worthy of another post and debate at least.
Best
Oscar
John Brodix Merryman replied on Aug. 19, 2014 @ 18:41 GMT
Thanks Oscar,
Given the fact that energy transmits information and, conversely, information defines energy, it is surprising the relationship doesn't elicit more discussion.
As biological organisms, we have evolved a central nervous system to process information, and the digestive, respiratory and circulatory systems to process energy. While there is an obvious intellectual bias toward consideration of information, and energy tends to get dismissed as "undefined," it is evident that the physical properties of energy very much set the limits of what can be done with information.
Vladimir Rogozhin wrote on Aug. 20, 2014 @ 07:52 GMT
Hi Oscar,
"The crisis of representation and interpretation" (T.Romanovskaya) in quantum mechanics - the crisis of the philosophical foundations of the QM and all fundamental physics. It is true, the way to overcome the crisis - is a further deepening of the Geometry, but rather in the "origin of Geometry" (E.Husserl) and the dialectical- ontological unification of matter, search for
the absolute foundations of physics and knowledge, the absolute generating structure. Necessary to consider limiting (absolute, unconditional) state of matter: absolute motion (rotation, "vortex", discretuum) + absolute rest (linear state, continuum)) + absolute becoming (absolute wave -"figaro" of states = discretuum + continuum).
Then geometrized basis of QM: "sphere" + "cube" + "cylinder". Each limit (absolute, unconditional) state of its way - the absolute vector, the vector of the absolute state. This "triangle" of absolute states of matter - the ontological representation of the triune foundation - "origin of geometry", the beginning of physics, the beginning, framework and carcas of knowledge. This is what David Gross calls - "general framework structure" (
D.Gross, an interview "Iz chego sostoit prostranstvo-vremya/What is in the space-time) the same for the QM and for GM. Today QM and GM are parametrical theories without ontologic justification.
All best,
Vladimir
Thomas Howard Ray wrote on Aug. 27, 2014 @ 15:33 GMT
This thread is too important to let wither and die.
I think Oscar's parting statement is quite profound:
"One may say that uncertainty, rather than being just limiting, liberates quantum states to change."
The subtraction of a few words, and the addition of one other, however, changes the meaning from uncertain to determinate:
One may say that uncertainty, being self-limiting, liberates quantum states to change. How do we justify this statement? -- Oscar already has done so:
"On the Bloch diagram, state transformations move points around, e.g. by rotating the shape. So, if one accepts that this locality restriction holds, it turns out that operations on site B must leave all points (states) on the lower plane of the cube invariant. It is like the points are stuck by total friction between the shape and the lower plane. As a result the cube has a big disadvantage over the sphere because if the entire square face touching the ground is restricted, then the whole cube gets stuck and no states can change.
"But now imagine metamorphosing the cube into a sphere, or indeed something else with only one point on the lower plane, like how the ballerina goes up on one toe. Then the shape, with all the quantum states in it, can move."
The 'metamorphosis' of the cube into a sphere is what defines the S^2 topology; a generalized Euclidean cube does not differ topologically from the sphere of ordinary 3-dimensional space.
What Oscar describes as an extra degree of freedom for quantum states to change -- the single point that the ballerina exploits -- is the identical added degree of freedom that exists one dimension higher, on S^3, where a point at infinity compactifies the real numbers by which quantum events are recorded.
In more technical terms, Joy Christian's framework of parallelized 3 sphere shows why the point of topological self limitation guarantees quantum correlations with no boundary between quantum and classical domains -- the co-domains are relative at every scale.
Thomas Howard Ray replied on Aug. 27, 2014 @ 18:07 GMT
The co-domains are covariant, I mean.
Vladimir Rogozhin wrote on Aug. 28, 2014 @ 10:39 GMT
Overcoming the "crisis of understanding" in fundamental physics is possible only through the introduction of an ontological standard of justification in addition to the empirical standard. It requires broadening and deepening the limits of knowledge through a deeper ontological interpretation and representation of the experimental data, to form the ontological framework, carcass and foundation of knowledge. "Quantum" is not an ontologically grounded concept; it is a parametric concept. Accordingly, "quantum theory" is not an ontologically grounded theory, and is not fundamental in the full sense of the word.
The information revolution requires the introduction into the conceptual structure of the Universum of new ontological concepts that represent the deepest meanings of the "LifeWorld". Otherwise, "uncertainty" will hold fundamental knowledge "by the throat", first of all physics and cosmology. This is indicated by the results of the fifth FQXi contest. The winner of the fourth FQXi contest, Robert Spekkens, concluded: "Rest in peace kinematics and dynamics. Long live causal structure!" But where is it, this "causal structure" with ontological justification? I offered "The Absolute generating structure". But where is the open competition of alternative, ontologically and empirically based "general framework structures" of the Universum and knowledge that will enable us to overcome the ontological uncertainty?
Han Geurdes wrote on Sep. 15, 2014 @ 11:34 GMT
Dear Oscar,
Thanks. I really like the connection you make between the ballerina and quantum non-locality and indeterminacy. You apparently seem to refer to a "Swan Lake" type of dance and not, for instance, to "Le Sacre du Printemps". I might be mistaken there too. Modern dance isn't exactly similar to a Tchaikovsky ballet. But OK.
My question to you relates back to an arXiv paper of mine, http://arxiv.org/abs/1409.0740, and/or (much more elaborated) to my loophole CHSH paper, Results in Physics 4 (2014), 81-82. I apologize for this self-referring approach; however, I see no other way.
If we have Swan Lake escapes from determinism, then why can one arrive at the quantum correlation from Bell's correlation formula (maybe that formula is far too primitive to describe what is needed?), and why is it then possible to see that CHSH holds a probability loophole? The arXiv paper shows a numerical argument for the derivation of a.b = cos(phi_a - phi_b) from Bell's formula.
I think it is time to realize that lhv and go-away-lhv are perhaps both wrong. The lhv camp served to show the way, however.
It is concrete mathematical incompleteness that is bothering us here. Discussions in our quarters often degenerate into ordinary exchanges of nothing between, most of the time, adversaries with a hearing problem. Some of the adversaries also show a kind of "Gilles de la Tourette" type of response to criticism of a favourite theorem. I am -- and probably Joy also is -- sorry for all the inconvenience.
For your convenience, to kill me off instantly, I attach the RinP paper (although it is very densely written, one can verify the argument easily).
attachments:
1-s2.0-S2211379714000254-main.pdf