It has recently been demonstrated that
Quantum Correlations can be Produced Classically with detection efficiencies higher than supposedly possible for any non-quantum system. (Note: The paper reports double-detection efficiencies (0.72) rather than the more commonly reported conditional detection efficiencies. For the model presented, the latter is equal to the square root of the former: sqrt(0.72) = 0.85)
In the attached figure below, it can be observed that the classical and quantum curves are intimately related: the quantum correlation curve is simply the scaled first harmonic in the Fourier series defining the classical correlation curve. This relationship is purely mathematical, independent of either quantum or classical physics. The classical curve consists of a series of discrete, odd harmonics with rapidly diminishing amplitudes; the quantum correlation curve is simply the first harmonic of this series.
This suggests that the quantum correlation curve is obtained by a process that merely filters out (fails to detect) the particle-pairs that contribute to the upper harmonics in the classical curve, and then re-normalizes (scales) the result to make the correlation peaks equal to +/- 1. Such re-normalization is "built into" the expression for computing the quantum correlations: the denominator is set equal to the number of particle-pairs detected.
Since the peak of the classical curve is pi/2 and the peak of the first harmonic is 4/pi, this suggests that the re-normalization, corresponding to the double detection efficiency, must equal the ratio (4/pi)/(pi/2) = 8/pi^2 = 0.81, resulting in a conditional detection efficiency of sqrt(0.81) = 0.90. This indicates that a classical process should be capable of perfectly duplicating the quantum correlation curve with detection efficiencies of 90%, higher than anything reported in supposedly "loophole free" Bell-inequality tests.
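Here is a minimal numerical sketch of that harmonic argument (an illustration only, not code from the paper), assuming the classical correlation is a triangle wave peaking at +/- pi/2, as the odd-harmonic decomposition above implies: compute the first Fourier harmonic and verify that the rescaling factor is 8/pi^2 = 0.81, i.e. a conditional efficiency of about 0.90.

```python
import numpy as np

# Assumed classical correlation: a triangle wave in the detector-angle difference theta,
# with peaks of +/- pi/2. arcsin(cos(theta)) is exactly such a triangle wave.
theta = np.linspace(0, 2 * np.pi, 100_000, endpoint=False)
classical = np.arcsin(np.cos(theta))

# First Fourier (cosine) harmonic of the classical curve.
a1 = 2 * np.mean(classical * np.cos(theta))

print(f"first-harmonic amplitude : {a1:.4f}   (4/pi   = {4 / np.pi:.4f})")
print(f"classical-curve peak     : {np.pi / 2:.4f}   (pi/2)")
ratio = a1 / (np.pi / 2)
print(f"re-normalization ratio   : {ratio:.4f}   (8/pi^2 = {8 / np.pi**2:.4f})")
print(f"conditional efficiency   : {np.sqrt(ratio):.4f}")
```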
So is "spooky action at a distance" just a grossly misunderstood classical phenomenon?
Rob McEachern
This would suggest that...
What we are seeing as quantumness is simply nature's truncation of (or failure to represent and/or propagate) the higher harmonics of the (classical) variational waveform via microscale dynamics.
All the Best,
Jonathan
Jonathan,
Actually, what it suggests is that nature's "identical particles" exhibit the same interaction behavior as identical submarines attempting to detect each other, and that this results in behaviors identical to "quantum tunneling" and "virtual particles".
If they cannot detect each other's existence, in an ocean of noise, then they can sail (tunnel) right past each other as though the other does not even exist, with no interaction whatsoever. But when they do detect each other, they sound general quarters, "ALL AHEAD FULL! DIVE! DIVE! DIVE!", and make such a disturbance that even a distant destroyer (observer) on the surface can detect the sudden appearance of the formerly undetectable "virtual" subs. But if the subs subsequently lose contact (the ability to detect a single bit of information) with each other, then they return to running silent, running deep (not interacting), and they disappear back into the ocean of noise from which they first materialized; and the distant observer is left to wonder if they were ever really there.
Rob McEachern
Jonathan J. Dickau replied on Apr. 11, 2017 @ 01:19 GMT
That's a cool image..
Silent running and non-interacting until spotted. Hmm. I must think on this.
Regards, JJD
Robert H McEachern replied on Apr. 11, 2017 @ 12:23 GMT
Jonathan,
It all results from behaviors being driven by a single bit of information: unlike more familiar classical interactions, it is all or nothing. If the required bit cannot be detected, then it triggers no response whatsoever. But if it is detected, it triggers an a priori established response. It must be a priori, since a suitable behavior cannot be deduced from a single bit.
While you are thinking on this, think also about a
One Time Pad in which each bit is manifested as one of the coins (matched filters) described in the above paper: each coin in the message sequence must be matched with another in the pad, at the exact A PRIORI KNOWN phase angle, in order to recover the underlying message with few, if any, bit errors.
It is foolhardy to attempt to recover the underlying message by using random coin phase angles, since that will result in large numbers of bit-errors. But that is exactly what Bell-type tests do; leaving physicists to wonder why they cannot decode the underlying significance of their own experiments. Spooky ERRORS at a distance.
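To make the analogy concrete, here is a toy simulation (my own illustration, not the coin model from the paper): each bit is a noisy tone with a hidden phase, detected by a matched filter. With the a priori known phase angles the bit-error rate is essentially zero; with random phase angles roughly half the bits come out wrong. The waveform and noise level are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200, endpoint=False)

def send_bit(bit, phase, amplitude=1.0, noise_sigma=0.7):
    """Encode one bit as a phase-shifted tone buried in Gaussian noise (illustrative)."""
    sign = 1.0 if bit else -1.0
    return sign * amplitude * np.cos(2 * np.pi * 3 * t + phase) + rng.normal(0.0, noise_sigma, t.size)

def detect_bit(signal, assumed_phase):
    """Matched-filter detection: correlate against a template at the assumed phase angle."""
    template = np.cos(2 * np.pi * 3 * t + assumed_phase)
    return np.dot(signal, template) > 0.0

n_bits = 2000
message = rng.integers(0, 2, n_bits).astype(bool)
pad_phases = rng.uniform(0, 2 * np.pi, n_bits)      # the shared "one-time pad" of phase angles
received = [send_bit(b, p) for b, p in zip(message, pad_phases)]

# Case 1: the receiver knows the pad (a priori known phase angles).
decoded_known = np.array([detect_bit(s, p) for s, p in zip(received, pad_phases)])
# Case 2: the receiver guesses random phase angles (what Bell-type tests do, per the analogy).
decoded_random = np.array([detect_bit(s, rng.uniform(0, 2 * np.pi)) for s in received])

print("bit-error rate, known phase angles :", np.mean(decoded_known != message))
print("bit-error rate, random phase angles:", np.mean(decoded_random != message))
```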
Rob McEachern
Jonathan J. Dickau replied on Apr. 12, 2017 @ 16:10 GMT
This is starting to make sense Rob..
The detection process has incorporated a kind of hysteresis effect, where it gets stuck in place once it assumes one value or another. I will examine the 'One time pad' reference and comment further later.
All the Best,
Jonathan
Robert H McEachern replied on Apr. 12, 2017 @ 18:54 GMT
Jonathan,
After reading about the one-time pad, read about double and triple Stern-Gerlach experiments (attached below), where measurements are either repeated with A PRIORI known phase angles, or not. Then it should do more than just start to make sense. In the former case, you recover the one and only bit value that is present within the object being observed. In the latter, you observe the result of measuring only the random noise that is also present within the object being observed, since the detection process produces "no signal" output when the detector is orthogonal to the polarization axis.
Rob McEachern
attachments:
Stern-Gerlach_experiments.jpg
Colin Walker replied on Apr. 12, 2017 @ 23:27 GMT
Hi Rob
Treating the quantum correlation as a problem in communications and signal processing is a novel approach, which has clearly allowed you to produce some interesting results. Signal, noise and bit rate, related through Shannon's theorem, are fundamental to the concept that quantum correlations are associated with processes that provide one bit of information per sample.
In your coin image model, the "ocean of noise" is an essential consideration, which seems to be missing from quantum mechanics, aside from when noise is studied explicitly. Far from being a nuisance, noise of this sort is more like a resource.
I would be suspicious of some system that could produce a detection rate close to the maximum you calculated - it would have to approach perfection in making selections to eliminate higher harmonics, while leaving the lowest harmonic untouched. That would be spooky!
Apart from the larger question you ask, classical models like yours are being used for simulating quantum computation. I tried to speed up your coin model by reducing it to a signal vector plus a noise vector, but that likely involved some over-simplification.
The noisy vector model is simple enough to allow a probabilistic treatment of threshold crossings, thus avoiding time-consuming Monte Carlo trials. Some notes on the vector model have been posted at
sites.google.com/site/quantcorr. There is also a file with C functions for calculating correlations based on the "geometric probability" of the noise vector crossing a threshold.
The classical concepts you employ seem so powerful, and reasonable, I tend to agree that there must be some deep involvement in quantum phenomena which has been overlooked.
Colin
Robert H McEachern replied on Apr. 13, 2017 @ 00:53 GMT
Colin,
"Treating the quantum correlation as a problem in communications and signal processing is a novel approach..." all too true, unfortunately, even though Information Theory is now 70 years old. In
My 2012 FQXi essay, I noted that "In one of the great scientific tragedies of the past century, “Modern Physics” was developed long before the development of Information Theory."
"it would have to approach perfection..." That is what error detection and correction coding is all about. Shannon proved (at least in the case of a multi-bit signal) that it should always be possible to create such a code, resulting in perfect detection, right up to the Shannon limit. The final generation of telephone modems (before they became obsolete) came pretty close.
"...making selections to eliminate higher harmonics, while leaving the lowest harmonic untouched..." In another context, this is exactly what raised-cosine filters and square-root-raised-cosine filters are all about: placing nulls at discrete points, to completely eliminate unwanted components, while completely preserving the desired fundamental, and without requiring an impossibly sharp filter cut-off.
"there must be some deep involvement in quantum phenomena which has been overlooked" Exactly. I believe the fact that Shannon's very definition of a "single bit of information" turns out to be the Heisenberg Uncertainty Principle, is that overlooked item: you cannot make multiple, independent measurements, on a single bit of information.
A second thing that has been overlooked is that:
The "squared" Fourier transforms at the heart of the mathematical description of every wave-function are equivalent to the mathematical description of a histogramming process, which is why the process yields probability estimates (the Born Rule) – regardless of the nature (particle or wave) of the entities being histogrammed. In other words, the math only describes the histogramming of observed events, not the nature of the entities causing the events, as has been assumed in the standard interpretations of quantum theory.
When you compute a Power Spectrum, you get the total energy accumulated in each "bin". And when the input arrives in discrete quanta, dividing the MEASURED energy in each bin by the energy per quantum enables one to INFER the number of received quanta in each bin. There is no INTERFERENCE pattern. Rather, there is an INFERENCE pattern. And if you compute the Power Spectrum of a double (or single, or triple...) slit's geometry, you get the famous INFERENCE pattern: independent of either quantum or classical physics. Particles/waves striking the slits merely act like radio-frequency carriers. All the information content within the INFERENCE pattern is spatially modulated onto those carriers by the slit geometry. In other words, the pattern is a property of the slits themselves, not of the things passing through the slits.
A third and related overlooked item is that QM only describes the statistics of DETECTED entities. It does not describe undetected entities at all. That is why it is unitary: probabilities will always add to unity when you only compare them to observed counts that have been normalized by the number of DETECTED entities.
Rob McEachern
Olivier Serret replied on Aug. 2, 2017 @ 10:53 GMT
Hi Rob,
You write
'So is "spooky action at a distance" just a grossly misunderstood classical phenomenon?' I have the same question. Maybe will you be interested by this article on EPR paradox vs. Bell's inequality:
http://file.scirp.org/pdf/JMP_2015103010590224.pd
fIt deals about a classical explanation, based on the distinction between
- the angles of the set up (alpha, beta)
- and the polarization measures (a, b)
Robert H McEachern replied on Aug. 2, 2017 @ 12:48 GMT
Hi Olivier,
I'll take a look at your paper.
You might find my comment about
Bell Tests, Schrödinger's coin, and One-Time Pads interesting in this context.
Also, note that all Bell-type theorems and experiments only deal with the EPR-B paradox rather than the much more general, original EPR paradox. In other words, Bell only deals with David Bohm's version of the paradox, which is restricted to observables with only two observed states, like spin-up and spin-down. But the original EPR paradox deals with the Heisenberg Uncertainty Principle: why can't CONTINUOUS variables (like position and momentum), which can take on any value in the classical realm, be simultaneously measured in the quantum realm? The fact that Shannon's Capacity, when evaluated at number-of-bits-of-information = 1, turns out to be identical to the Heisenberg Uncertainty Principle provides the answer as to why that happens in general, and not just in the restricted Bohm version of the paradox.
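One way to read that last claim, in schematic form (the detailed argument is in my paper; take the particular form written here as an assumed sketch, with the exact constant depending on how duration and bandwidth are defined): write the Shannon-Hartley capacity of a measurement of duration Delta-t and bandwidth Delta-f, and demand exactly one recoverable bit.

$$ C = \Delta t \, \Delta f \, \log_2\!\left(1 + \frac{S}{N}\right) \ \text{bits}, \qquad C = 1,\ \frac{S}{N} = 1 \;\Rightarrow\; \Delta t \, \Delta f = 1 $$

That is the minimum time-bandwidth product for recovering a single degree of freedom in Fourier analysis; rescaling via E = hf gives it the Heisenberg form, with Delta-E times Delta-t on the order of h.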
Rob McEachern
Robert H McEachern replied on Apr. 6, 2019 @ 21:27 GMT
For some further insights into the nature of "Quantum Weirdness", see my
"Socratic Dialog" with Tim MaudlinRob McEachern