
If you are aware of an interesting new academic paper (that has been published in a peer-reviewed journal or has appeared on the arXiv), a conference talk (at an official professional scientific meeting), an external blog post (by a professional scientist) or a news item (in the mainstream news media), which you think might make an interesting topic for an FQXi blog post, then please contact us at with a link to the original source and a sentence about why you think that the work is worthy of discussion. Please note that we receive many such suggestions and while we endeavour to respond to them, we may not be able to reply to all suggestions.

Please also note that we do not accept unsolicited posts and we cannot review, or open new threads for, unsolicited articles or papers. Requests to review or post such materials will not be answered. If you have your own novel physics theory or model that you would like to post for further discussion among the FQXi community, please add it directly to the "Alternative Models of Reality" thread, or to the "Alternative Models of Cosmology" thread. Thank you.




CATEGORY: Blog
TOPIC: “Spookiness” Confirmed by the First Loophole-free Quantum Test

FQXi Administrator Zeeya Merali wrote on Aug. 26, 2015 @ 19:05 GMT
Hensen et al, arXiv:1508.05949
Spookiness, it seems, is here to stay. Quantum theory has been put to its most stringent “loophole free” test yet, and it has come out victorious, ruling out more common sense views of reality (well, mostly). Many thanks to Matt Leifer for bringing this experiment -- by a collaboration of researchers in the Netherlands, Spain, and the UK -- to my attention (arXiv:1508.05949).

A few years ago, I wrote a feature for Science about the quest to close loopholes in quantum entanglement experiments, with a number of groups around the world vying to perform the perfect test. ("Quantum Mechanics Braces for the Ultimate Test.") In that article, I quote quantum physicist and FQXi member Nicolas Gisin saying: “This race is on because the group that performs the first loophole-free test will have an experiment that stands in history.”

We may now have a winner.

The test is a version of an experiment set out in the 1960s, by Irish physicist John Bell. He came up with a way of working out whether nature was really as spooky as it seems on the quantum level, or if a more common sense explanation was possible. The “sensible” view of the world, in this case, is taken to be “local” and “realistic.” “Local,” in this context, means that information cannot travel between objects faster than the speed of light, so instantaneous communication is impossible. “Realistic” means that the properties of particles are set before they are observed, and are not affected by measurements made on them. By contrast, quantum theory says that prior to measurement, particles can exist in a murky superposition state where their properties are not clearly defined; it’s only upon measurement that their properties click and become well-defined. And quantum theory allows two entangled particles to become linked in such a way that when a measurement is performed on one (breaking it out of superposition, and clicking it into a well-defined state), the properties of its entangled partner will likewise become defined, instantaneously — no matter how far apart they are separated.

Bell suggested that experimenters should entangle a string of particles and measure how well their properties match up. He derived a theorem showing that the common sense view of the world (local realism) can only account for correlations between the particles up to a certain limit. If experiments measured a violation of that bound, then the common-sense view would have to be given up in favour of the spooky quantum one.
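Bell's bound in its most-tested form, the CHSH inequality, caps a particular combination of correlations at 2 for any local realistic model, while quantum mechanics predicts up to 2√2 ≈ 2.83. A minimal sketch of that gap, using the singlet-state correlation E(a, b) = −cos(a − b) and the textbook-optimal measurement angles (illustrative choices, not the settings of any particular experiment):

```python
import numpy as np

# Singlet-state prediction for the correlation between the +/-1 outcomes
# when Alice measures along angle a and Bob along angle b.
def E(a, b):
    return -np.cos(a - b)

# CHSH combination: local realism bounds |S| by 2.
a1, a2 = 0.0, np.pi / 2              # Alice's two settings
b1, b2 = np.pi / 4, 3 * np.pi / 4    # Bob's two settings

S = abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))
print(S)  # 2*sqrt(2) ~ 2.828, exceeding the local-realist bound of 2
```

Assigning fixed ±1 outcomes to all four settings in advance, however those assignments are distributed, can never push |S| past 2, which is exactly the bound the Delft experiment reports violating.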

Those experiments were first carried out in the 1970s and, more famously and strictly, in the 1980s, and have been performed many times since, always coming down on the side of quantum theory. This has convinced most physicists that the world truly is bizarre on tiny scales.

But all experiments have loopholes, and to get a truly definitive result, these need to be closed. One such loophole is the “detection loophole”. In many Bell tests, experimenters entangle photons and then measure their properties. The trouble is photons zip about quickly, and often simply escape from the experiment before being detected and measured. Physicists can lose as many as 80 per cent of the photons in their test. That means that experimenters have to make a ‘fair sampling’ assumption that the ones that they *do* detect are representative of the ones that have gone missing. For the conclusions to be watertight, however, you really want to keep track of all the subjects in your test.
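What "fair sampling" buys can be seen in a toy simulation: if each photon is detected with some fixed efficiency, independently of the hidden state and the measurement settings, then the correlation estimated from the small detected subsample matches the correlation over all pairs. The hidden-variable model below is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Toy local hidden-variable model: a shared angle lam fixes both outcomes.
lam = rng.uniform(0, 2 * np.pi, n)
A = np.sign(np.cos(lam))          # Alice's +/-1 outcome at her setting
B = np.sign(np.cos(lam - 0.5))    # Bob's +/-1 outcome at his setting

# Fair sampling: each side detects with 20% efficiency, independent of lam
# and of the settings, so only ~4% of pairs are jointly detected.
detected = (rng.random(n) < 0.2) & (rng.random(n) < 0.2)

full = np.mean(A * B)                      # correlation over every pair
sub = np.mean(A[detected] * B[detected])   # correlation over detected pairs
print(full, sub)  # nearly identical: the detected subsample is representative
```

The assumption fails, and the loophole opens, precisely when detection probability is allowed to depend on the hidden state or the settings.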

It is easier to keep hold of entangled ions, which have been used in other experiments. The catch there, however, is that these are not often kept far enough apart to rule out the less spooky explanation that the two entangled partners simply influence each other, communicating at a speed that is less than the speed of light, during the experiment. This is known as the “communication loophole” or the “locality loophole.”

In the new paper by Hensen et al, the authors describe measuring electrons with entangled spins. The entangled pairs have been separated by 1.3 km, to ensure that they do not have time to communicate (at a speed slower than the speed of light) over the course of the experiment.

They cleverly use a technique known as "entanglement swapping" to tie up both loopholes, combining the benefits of photons (which can travel long distances) with those of electrons (which are easier to monitor). Their electrons are placed in two different labs, 1.3 km apart. The spin of each electron is then entangled with a photon, and those two photons are fired off to a third location, where they are entangled with each other. As soon as the photons are entangled, BINGO, so too are the two original electron spins, seated in vastly distant labs. The team carried out 245 trials of the experiment, comparing entangled electrons, and report that Bell’s bound is violated.

From their paper:

”Our experiment realizes the first Bell test that simultaneously addresses both the detection loophole and the locality loophole. Being free of the experimental loopholes, the setup can test local realist theories of nature without introducing extra assumptions such as fair-sampling, a limit on (sub-)luminal communication or the absence of memory in the setup. Our observation of a loophole-free Bell inequality violation thus rules out all local realist theories that accept that the number generators timely produce a free random bit and that the outputs are final once recorded in the electronics. This result places the strongest restrictions on local realistic theories of nature to date.”

As a test of the foundations of reality, for most physicists, these experiments dot the i’s and cross the t’s. It seemed unlikely, given the other Bell tests performed so far (even with their loopholes), that quantum theory would be found wanting in a loophole-free test. That’s because the earlier experiments were so different from each other, and had such different weaknesses, that nature would have had to be cunning in quite different and particular ways in each one to keep fooling us into thinking quantum theory was correct, if it is not. But it is important, nonetheless, to test quantum theory to its limits. After all, you never know.

There are also huge practical applications. A major motivation, as I explain in the Science feature, is that loophole-free Bell tests are an essential step towards ‘device-independent quantum cryptography’: creating a security system so tight that you could trust it even if you bought it from your worst enemy.

Such a device would go beyond those quantum cryptographic systems that are already in place, which use entanglement to create “unhackable” keys. In those systems, you share a string of entangled particle pairs between two parties (the sender and receiver) and they each independently perform measurements of their set of particles to generate a matching string of 0s and 1s to make up a key that only they should know. If a hacker tries to eavesdrop on the system, their presence will disrupt the quantum key, alerting the legitimate users and raising an alarm.

Those systems are fine, assuming you really have been sold a quantum cryptographic system. But an unsuspecting buyer could be tricked by a hacker purporting to sell a genuine quantum cryptographic device, who actually just gives them a black box, preprogrammed with a string of 0s and 1s that she’s set up beforehand. The user would be none the wiser.

To get around this, in 1991, Artur Ekert came up with the idea for a device that had to verify its quantum credentials using a Bell test at the same time as generating the key, so the user would know that it was working correctly, and was genuinely using a quantum process to produce the key. But such “device independent quantum cryptography” can only be trusted if the Bell tests are watertight. As Gisin told me for the Science piece, “It’s unlikely that nature is so malicious that it conspires with the apparatus to hold back particular photons just to fool us into thinking that quantum mechanics works,” but, a “hacker—by definition—is malicious enough to exploit the detection loophole to fool us into thinking that a quantum process has taken place.”
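The shape of Ekert's idea can be sketched with simulated singlet statistics. Here the correlation law P(A = B) = (1 − cos(a − b))/2 is the quantum prediction for a singlet pair, but the specific settings and the split between key rounds and test rounds are illustrative simplifications, not the actual protocol parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated singlet pair: outcomes are +/-1 with P(A == B) = (1 - cos(a-b))/2,
# reproducing the quantum correlation E(a, b) = -cos(a - b).
def measure_pair(a, b):
    A = rng.choice((-1, 1))
    same = rng.random() < (1 - np.cos(a - b)) / 2
    return A, (A if same else -A)

# Each party picks a setting at random. The settings pi/4 and pi/2 appear in
# both lists, so matched rounds yield perfectly anticorrelated bits for the
# key, while mismatched rounds can be pooled into a CHSH check of the device.
alice = (0.0, np.pi / 4, np.pi / 2)
bob = (np.pi / 4, np.pi / 2, 3 * np.pi / 4)

key = []
for _ in range(5000):
    a, b = rng.choice(alice), rng.choice(bob)
    A, B = measure_pair(a, b)
    if a == b:
        key.append((A + 1) // 2)   # Bob recovers the same bit by flipping B
print(len(key))  # roughly 2/9 of the 5000 rounds become key bits
```

In a real device-independent scheme the mismatched-setting rounds feed a CHSH estimate; a value above 2 certifies that the box really is doing something quantum, which is why the test has to be loophole-free.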

There is still another way that nature could be tricking us in quantum tests. It seems a bit outlandish, but it’s possible that experimenters are somehow being manipulated into measuring certain properties in tests and not others, distorting the results. This is sometimes called the “freedom-of-choice” loophole. Last year, I wrote about a fun experiment that used light from distant quasars to help experimenters choose what measurements to make in the lab — in an attempt to rule out the possibility that the experimenters’ choices were being mysteriously biased by stuff in the experiment itself. That article appeared in Nature, “Cosmic Light Could Close Quantum Weirdness Loophole”.

The authors touch on remaining loopholes at the end of their paper:

“Strictly speaking, no Bell experiment can exclude the infinite number of conceivable local realist theories, because it is fundamentally impossible to prove when and where free random input bits and output values came into existence. Even so, our loophole-free Bell test opens the possibility to progressively bound such less conventional theories: by increasing the distance between A and B (testing e.g. theories with increased speed of physical influence), using different random input bit generators (testing theories with specific free-will agents, e.g. humans), or repositioning the random input bit generators (testing theories where the inputs are already determined earlier, sometimes referred to as “freedom-of-choice” ). In fact, our experiment already excludes all models that predict that the random inputs are determined a maximum of 690 ns before we record them, because the inequality is still violated for a much shorter spin readout.”

(Updated to include my write-up about this experiment for Nature, 27 August 2015: Quantum 'spookiness' passes toughest test yet. With comments from FQXi members Nicolas Gisin, Anton Zeilinger, and Matt Leifer.)

this post has been edited by the forum administrator


FQXi Administrator Zeeya Merali wrote on Aug. 26, 2015 @ 19:07 GMT
FQXi member Richard Gill (who commented on the Nature story for me) has also written up an account of the new work, including his own contribution here.


Dan B Cohen replied on Aug. 27, 2015 @ 20:30 GMT
Nobel laureate Gerald Edelman wrote a series of technical books about human consciousness, beginning with The Mindful Brain (1978). His theories of consciousness remain dominant in psychology and neuroscience. He summarized his viewpoint in an article, "Naturalizing consciousness: A theoretical framework."

Edelman held that human consciousness is a product of brain function. To prove this scientifically, he proposed a theory to account for the properties of consciousness and provided a framework for the design and interpretation of experiments.

To place consciousness within a biological framework requires a theory based on a set of evolutionary and developmental principles which provide a unifying account of conscious phenomena. He acknowledged that to succeed in solving the mystery of consciousness using scientific investigation required rejecting strange physics and any extra-physical assumptions.

With "spooky action at a distance" confirmed by credible physicists, it is now reasonable to pick up the other end of the stick in regard to solving the "hard problem of consciousness?" That is, can a theory of consciousness that requires the rejection of strange physics be credible? Or does the uncertainty principle and quantum entanglement need to be considered and accounted for?


Joy Christian replied on Jan. 22, 2016 @ 03:21 GMT
Richard Gill's key paper is based on elementary mathematical mistakes, and has been comprehensively debunked on PubPeer.


John R. Cox replied on Jan. 23, 2016 @ 16:29 GMT

I think what you are up against is the long-established use of spherical trig in complex analysis, which mistakes Topology for simply a different way of extracting a result from a (Cartesian) pre-existent, complicated evolution-of-rotations, spherical co-ordinate system. What Hamilton apparently recognized was that all geometric measure, including chirality, could be reduced to first principles mathematically simply by an initial choice of either a Right-Handedness or a Left-Handedness. That gets overwhelmed by all the probabilistic arguments, when properly approaching any reconciliation between the two systems firstly requires recognition that Topology is NOT simply 'a different take' on results, but a different, complete, holistic measurement system. Compatible if carefully done, but different. Thanks for the baptismal, jrc

this post has been edited by the author since its original submission


Fred Diether wrote on Aug. 26, 2015 @ 22:34 GMT
It is great that we have further experimental validation that the predictions of quantum theory are correct for the EPRB type scenarios. But this also confirms former member Dr. Joy Christian's classical local realistic model.

So this experiment doesn't really have anything to do with confirming or denying of local models. Bell was simply wrong.

post approved

Anonymous replied on Aug. 27, 2015 @ 00:15 GMT
Thanks, Fred.

It is really sad to see how brainwashed the entire physics community has become about the absurd notion of "non-locality", and how they continue to spread the false propaganda here at FQXi about the so-called "theorem" by Bell --- which, as we know, has been discredited for many years. I guess mysticism sells and politics is more important than truth.

But as you point out, the experiment under discussion is more naturally and rationally understood as confirming my manifestly local-realistic model for the EPR-Bohm correlation, as we have been discussing here and here.




Joy Christian replied on Jan. 13, 2016 @ 06:12 GMT
I now have a more detailed response to Richard Gill's challenge to my proposed macroscopic experiment to test Bell's nonsensical claims.

I am waiting for Gill to prove that he is a "gentleman" and pay up 10,000 Euros to me.


Joy Christian replied on Jun. 9, 2016 @ 10:09 GMT
Here goes one of the most over-hyped "loophole-free" experiments to rubbish bin.

And the flaw in the experiment is not even subtle. It violates the no-signalling principle for heaven's sake:

"We analyze the data from the loophole-free CHSH experiment performed by Hensen et al, and show that it is actually not exempt of an important loophole. By increasing the size of the sample of event-ready detections, one can exhibit in the experimental data a violation of the no-signaling principle with a statistical significance at least similar to that of the reported violation of the CHSH inequality, if not stronger."

You can find the paper here.

And you can find my own latest argument against Bell's so-called theorem here.


Thomas Howard Ray wrote on Aug. 27, 2015 @ 13:32 GMT
There's a gigantic loophole built into the experiment. The assumption of linear superposition brings along with it other assumptions: entanglement, and a probability measure.

None of these are observable.

On the other hand, assume a nonlinear model, and the quantum jump phenomenon is included by default, as a consequence of square integrability, which makes the quantum jump observable. This solves the problem of an " ... infinite number of conceivable local realist theories ... " because the square integrable requirement commands a finite domain, fully compatible with special relativity without renormalization, and including the time parameter in a natural way.

The failure of entanglement-based quantum computing hints at why measures on an infinite undefined space will never be conclusive. The structure of spacetime precedes measurement.

this post has been edited by the author since its original submission

post approved

Thomas Howard Ray replied on Aug. 27, 2015 @ 16:16 GMT
Here's what I mean, abstracted from my paper on a metric space created by parallel sequences of Sophie Germain primes. The sequences are self-limiting. Therefore the space is finite though unbounded.

Point is, every event can in principle be indexed, doing away with the prior assumption of probability.

attachments: Table_3.pdf


Don Limuti replied on Sep. 21, 2015 @ 20:46 GMT
Hi Thomas,

Your first sentence says it all (IMHO)

"There's a gigantic loophole built into the experiment. The assumption of LINEAR SUPERPOSITION ...."

This is the source of a lot of misunderstandings.

Appreciate your post,

Don Limuti


Thomas Howard Ray replied on Sep. 22, 2015 @ 01:40 GMT
Thanks, Don. I would rather appreciate the experiment for what it is, not for the overreaching claims of what it pretends to be.

The extraordinary amount of unquestioning press astounds me.


Sabri Al-Safi wrote on Aug. 27, 2015 @ 14:13 GMT
A couple of pedantic (but nevertheless important) quibbles.

"Local" doesn't necessarily have anything to do with information. A more rigorous definition might be that the physical state of affairs in one location is not immediately influenced by a deliberate intervention (e.g. a choice of measurement) in a separate location. Bohmian mechanics is an example of a theory that is explicitly non-local, but in which instantaneous communication is still impossible.

"Realistic" is slightly more ambiguous, but it tends to mean that physical systems have objective states which determine the outcome statistics of any given measurement on that system. This has little to do with how properties are affected by measurements: I suppose collapse models would be an example of a realistic theory in which performing a measurement actually does alter the outcome statistics of potential future measurements.


FQXi Administrator Zeeya Merali replied on Aug. 27, 2015 @ 18:29 GMT
Thank you for this Sabri.


Richard Gill replied on Aug. 31, 2015 @ 07:33 GMT
There's quite a debate (and no consensus is emerging) about whether or not one can separate the concepts of locality and realism. My take on this is the following: looking at theories, one calls them local or not depending on what one takes as being "real". For instance, in QM, if you take the wave function to be real, then QM is nonlocal. Bohmian mechanics exactly reproduces the predictions of QM, and is a non-local theory, since the guiding waves are taken to be "real". The many-worlds interpretation makes QM local by denying the reality of collapse. All the branches of the wave function coexist for eternity; measurement outcomes don't actually happen ... or they are some kind of illusion.

What is rather interesting is the recent work of Pawlowski et al (2009) on more refined notions of information causality. It appeared in Nature and I think rightly so: this was quite a fundamental breakthrough. We know that QM is compatible with relativistic causality: in the CHSH / EPRB setup Alice can't learn what Bob is doing. There is "no-signalling". Pawlowski et al looked at higher-order versions of no-signalling. First-order information causality is: suppose Bob now starts using classical means to transmit one bit of information per trial of the CHSH experiment to Alice. Does this, together with their participation in the experiment, enable Alice to learn what Bob is doing? In other words, she tosses her coins and does her measurements and observes her outcomes, and every time Bob also sends one bit of information (whatever he likes). Answer: precisely because QM satisfies the Tsirelson bound CHSH


Thomas Howard Ray replied on Aug. 31, 2015 @ 10:37 GMT
Well, these are all loopholes.


Thomas Howard Ray wrote on Aug. 27, 2015 @ 18:37 GMT
Okay, what's going on? Are the theorists so insecure that they entertain no competing theory of origins?

Or else let them defend their claim of "no loopholes". If a probability argument is required -- prove it.


FQXi Administrator Zeeya Merali replied on Aug. 27, 2015 @ 18:42 GMT
Apologies about the missing posts Tom. I was looking forward to reading what you had written. I will try and get them back and find out what's happening.

We also seem to be having a site glitch that stopped me posting earlier. I think that's a separate issue.


Richard Gill replied on Aug. 28, 2015 @ 12:24 GMT
No (experimental) loopholes means that the experimentalists adhered rigorously to the experimental protocol set out in Bell (1981) "Bertlmann's socks", see the text around Figure 7. Here's the figure in question: (sorry I gave it the wrong name).

Here's Bell's (1981) text: "You might suspect that there is something specially peculiar about...



Richard Gill replied on Aug. 28, 2015 @ 12:37 GMT
Sorry, end of my posting got lost, adjacent < < got interpreted as html tag.

Here's the rest:

"We can arrange that c delta < < L, where c is the velocity of light and L the length of the box; we would not then expect the signal at one end to have any influence on the output at the other, for lack of time, whatever hidden connections there might be between the two ends.

"Sufficiently many repetitions of the experiment will allow tests of hypotheses about the joint conditional probability distribution P(A,B|a, b) of results A and B at the two ends for given signals a and b. Now of course it would be no surprise to find that the two results A and B are correlated, i.e., that P does not split into a product of independent factors: P(A,B|a,b) != P1(A|a)P2(B|b). But we will argue that certain particular correlations, realizable according to quantum mechanics, are locally inexplicable. They cannot be explained, that is to say, without action at a distance."


FQXi Administrator Zeeya Merali wrote on Aug. 27, 2015 @ 18:41 GMT
OK, a lot of posts seem to be vanishing from this thread...At least two from Fred, one from Tom, and one from Akinbo (I think) and one from me.

I will try and get these restored.

In the meantime, can I request that people do not delete posts simply because they do not agree with them? The abuse button is there so people can report inappropriate language or spam. Please do not abuse the abuse button!

Thank you.


Thomas Howard Ray replied on Aug. 27, 2015 @ 18:45 GMT
Thank you, Zeeya. I think that a claim of "no loopholes" deserves at least a proof that foundational research must start with assumptions of linear superposition and probability.

Especially when the alternative leads to different conclusions.


Fred Diether replied on Aug. 28, 2015 @ 01:00 GMT
Thanks Zeeya. They really should change the behavior of the "report post..." link. On my forum if someone reports an existing post, it doesn't get deleted right away. It warns a moderator to take a look at it. Most forums work that way.


FQXi Administrator Zeeya Merali replied on Aug. 28, 2015 @ 13:44 GMT
Hi Fred,

We originally had it set up so that posts were not deleted immediately, as you recommend. That was fine, for the most part. Unfortunately, we had a couple of threads that got completely out of hand, with thousands of posts per thread, including comments that were personally rude and insulting to other posters (and to people who were not even involved in the conversation) and some that were libellous.

We tried asking people to be polite, but (some) paid no attention. In the end, we had to just step in and become very heavy handed. We were unhappy about that. And as you can see, it's other, polite posters who are still paying the price for that. But it does at least mean that we do not risk having insulting and libellous messages sitting on the site for any significant length of time.

I should also add that one of the reasons we've been particularly slow to reinstate the missing comments is that this post brought in a huge amount of traffic to the site, which actually caused it to slow down massively yesterday (and today). That actually prevented the person who monitors removed posts from going back in and retrieving them...the site just kept freezing. So yesterday (and today) we've been trying to fix that too.

this post has been edited by the forum administrator


Jason Zsiba wrote on Aug. 27, 2015 @ 21:47 GMT
I had to create an account, login and reply to this to address this part of this article, "a “hacker—by definition—is malicious enough to exploit the detection loophole to fool us into thinking that a quantum process has taken place.” "

Hackers are most definitely not malicious by definition. A hacker is several times more likely to care about a solid encryption scheme than say a government agency who'd rather we all use the password "abc123". Hackers are the ones testing such systems for exploits and making them better. Without them, you'd have no clue if your box was compromised in the first place.

As a resource and platform for discussing scientific subjects that are oftentimes reported on TERRIBLY by the general media, I would think that this site would shy away from using the term Hacker like it is used in the movies or on Fox News.


FQXi Administrator Zeeya Merali replied on Aug. 28, 2015 @ 15:51 GMT
Hi Jason,

I appreciate your comment. In the original Science article that I pulled Gisin's quote from I think (or I hope) it was clear from the context that he was using "hacker" to mean someone with criminal intent who is trying to steal someone's details for personal gain. He runs a quantum cryptography company, so he is keen to lock out such attempts to steal data. His use of the word "malicious" was I think to contrast the fact that such people have a deliberate intent to cause harm to users, whereas nature would not have such an 'intent' to disguise its underlying laws.

However, there are also professional hackers who are hired to test systems' security, and people who hack systems to expose nefarious goings-on by organisations. So I take your point that you can argue with the "by definition" part of his comment.


George Humphrey wrote on Aug. 28, 2015 @ 11:09 GMT
Missing decimal point?

In the article it states that the entangled electrons are separated by 1.3 km, and then a little further on, it states that the photons are separated by 13 km.

I see no reason why this cannot be the case, but I would like to know if a decimal point was dropped from the second distance number.

Were the entangled photons separated by 1.3 km as well as the electrons or were they actually 13 km apart?

(the abstract for the paper By Hensen, et al, does not reveal this. )


Richard Gill replied on Aug. 28, 2015 @ 12:20 GMT
1300 meters, or 4000 feet. Since light travels one foot in one nanosecond, Alice and Bob have each 4000 nanoseconds = 4 microseconds to toss a coin, press a button setting a measurement angle, get an outcome +/-1 and store it in a computer file.

Hope I did the math right.
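For anyone who wants to check the arithmetic, here is a short Python sketch (the numbers are the ones quoted above; the 4000 feet and 4 microseconds in the comment are round approximations of roughly 4265 feet and 4.3 microseconds):

```python
# Light-travel time between labs separated by 1.3 km, as in the Delft setup.
SPEED_OF_LIGHT_M_PER_S = 299_792_458

separation_m = 1300
separation_ft = separation_m / 0.3048          # one foot is 0.3048 m
travel_time_s = separation_m / SPEED_OF_LIGHT_M_PER_S

print(f"{separation_ft:.0f} feet, {travel_time_s * 1e6:.2f} microseconds")
```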

report post as inappropriate

FQXi Administrator Zeeya Merali replied on Aug. 28, 2015 @ 13:37 GMT
Hi George,

As Richard says, it is indeed meant to be 1.3 km; that was just a typo, since corrected. Unfortunately, it took me a while to correct it because -- happily -- so many people have been visiting the page to read the post that it temporarily crashed the site, and the post would not accept edits.

I have also now corrected the name of the first author, which is Hensen. There's also a Hanson involved, who is the group leader, and I managed to combine their names to create a third fictitious person in my post "Henson".

report post as inappropriate

Richard Gill wrote on Aug. 28, 2015 @ 12:30 GMT
If you want to criticise this experiment, you should focus on the fact that they had N = 245 and hence observed S = 2.4 with an estimated standard error of 0.2: a two-standard-deviation departure from local realism. Such an extreme result could occur entirely by chance with probability 0.025 (one in forty). In fact, if you are paranoid and don't trust the usual statistics of iid observations, it turns out that local realism could do this well with probability 0.039.

What we now need is replications of this experiment, preferably in other labs. Or that the experiment is done with N ten times larger so that we get a three times smaller standard error and hence a 6 standard deviation departure from local realism. But it is very very difficult to keep everything working as it should for longer time periods. The experiment we are talking about now ran for a week. You can be sure that if they could have kept it running and everything stable for 10 weeks, they would have done so.
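The arithmetic in this comment can be sketched in a few lines of Python, assuming the usual iid normal approximation (the Gaussian tail comes out at about 0.023, close to the "one in forty" quoted above; the "paranoid" 0.039 figure comes from a different, more conservative bound and is not reproduced here):

```python
import math

S = 2.4            # observed CHSH value
se = 0.2           # estimated standard error
local_bound = 2.0  # the CHSH limit for local realism

z = (S - local_bound) / se                    # 2 standard deviations
p_normal = 0.5 * math.erfc(z / math.sqrt(2))  # one-sided Gaussian tail, ~0.023

# Ten times more trials shrinks the standard error by sqrt(10), about 3.16,
# turning the same departure into a roughly 6-sigma result.
z_bigger = (S - local_bound) / (se / math.sqrt(10))
print(f"z = {z}, p = {p_normal:.3f}, z with 10x trials = {z_bigger:.1f}")
```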

report post as inappropriate

FQXi Administrator Zeeya Merali replied on Aug. 28, 2015 @ 13:38 GMT
I think you're spot on there Richard. I'm glad that you've posted that comment to Nature's website too.

this post has been edited by the forum administrator

report post as inappropriate

Fred Diether replied on Aug. 28, 2015 @ 19:26 GMT
Yep, peer review and replication would be good to have for this experiment, as some flaws may already have been found. However, this experiment really has nothing to do with local realism and everything to do with confirming the QM prediction, −a·b, for EPRB-type scenarios.

report post as inappropriate

Richard Gill replied on Aug. 29, 2015 @ 05:01 GMT
The result was S = 2.4 (s.d. = 0.2), not 2.828... They did not actually find the EPRB correlations!

They found entanglement but not maximum entanglement.

report post as inappropriate

John R Dixon wrote on Aug. 31, 2015 @ 19:21 GMT
Does this rule out the source of the emissions predicting (imperfectly) the detector settings? If I understood the paper correctly, both are based on "quantum" processes. Perhaps this "large" separation wasn't large enough to prevent the three quantum processes from becoming "coupled" somehow, enough for the source to predict the detector settings with enough accuracy to give quantum correlations. See my paper, which I think demonstrates this is a possibility. This would leave another loophole still open.

report post as inappropriate

Richard Gill replied on Sep. 1, 2015 @ 06:27 GMT
That's the point of the "paranoid" p-value of 0.039. Sure it could be that way but if so, then a fairly unlikely event has taken place.

report post as inappropriate

John R Dixon replied on Sep. 1, 2015 @ 15:58 GMT
I'm actually wondering if this setup can be decisive with any sample size. That is, I wonder if the detector settings can be imperfectly predicted by "nature" with enough accuracy to give the quantum correlations. Maybe nature makes forecasts, and some physical processes (in particular this experiment's way of setting the detectors) are easier for nature to forecast than others. I have read of two published Bell experiments which went against quantum mechanical predictions. That is still at the level of anecdote, and maybe a flaw in the setup systematically collapsed the wavefunction too soon in those two experiments. But what about publication bias: how many experiments with findings contrary to quantum mechanics were not reported because the experimenters concluded "we must have had a flaw in the setup that systematically collapsed the wavefunction too soon"? In sum: perhaps nature is making forecasts to keep quantum correlations, but this breaks down under more complex processes. That could be investigated with more "complex" ways of setting the detectors.

report post as inappropriate

Richard Gill replied on Sep. 2, 2015 @ 06:38 GMT
John, that was the whole point of my work, which subsequently got taken up and refined by the physicists and has now even been used in this experiment.

Yes: this setup can be decisive (for all practical purposes) as long as the sample size is large enough. There always remains a chance that any result was achieved purely through chance! But we can make that chance as small as we like by taking the sample size large enough. Tell me what chance you would consider "decisive" and I will tell you how big to make the experiment.
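That offer can be illustrated with a Python sketch that inverts the calculation: given a target chance, how many trials would be needed, assuming the standard error shrinks like 1/sqrt(N) from the observed run (N = 245, S = 2.4, s.e. = 0.2). This is illustrative only, not the analysis used in the actual paper:

```python
import math

def trials_needed(target_p, s_obs=2.4, se_obs=0.2, n_obs=245, bound=2.0):
    """Rough number of trials for a target one-sided normal tail probability,
    assuming the standard error scales as 1/sqrt(N) from the observed run.
    Illustrative only."""
    # Find z such that P(Z > z) = target_p, by bisection on the Gaussian tail.
    lo, hi = 0.0, 20.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if 0.5 * math.erfc(mid / math.sqrt(2)) > target_p:
            lo = mid
        else:
            hi = mid
    z_needed = (lo + hi) / 2
    # The observed departure is (s_obs - bound); scale N so the standard
    # error shrinks enough to make that departure z_needed sigmas.
    scale = z_needed * se_obs / (s_obs - bound)
    return math.ceil(n_obs * scale ** 2)

print(trials_needed(0.001))  # trials for a ~1-in-1000 chance
```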

report post as inappropriate

Richard Gill wrote on Sep. 1, 2015 @ 06:26 GMT
Here's an attempt to explain the principles of the new experiment and compare it with the usual ones. In terms of two variants of "the Bell game".

Probably it is impossible to explain this to science journalists. I think I explained it once to an audience of Buddhists and neuroscientists, but I had an hour, and there was a very lively discussion. …...

view entire post

report post as inappropriate

Thomas Howard Ray wrote on Sep. 5, 2015 @ 11:14 GMT
Why is the independence of mathematics from physics important? And how does Bell's theorem fail the test?

Mathematics is an art. There are things that exist in mathematics that do not exist in local physical reality. It's a sure bet that if non-local action is required to uphold a theory, that the physical picture is incomplete.

Consider mathematically complete theories,...

view entire post

report post as inappropriate

Steve Dufourny replied on Sep. 5, 2015 @ 11:33 GMT
Hello Mr Howard Ray,

:) I recognise your taste for maths over physics. Don't forget it is a tool which, when well utilised, explains our physical universal sphere in 3D :)


report post as inappropriate

Steve Dufourny replied on Sep. 5, 2015 @ 13:48 GMT
Mr Howard Ray and fqxi,

I believe the hacking is from LinkedIn, but from whom, I don't know.

Tom, do you have some mathematical algorithms to stop them, please? It is tiring, you know.

Best Regards

report post as inappropriate

John R. Cox replied on Sep. 6, 2015 @ 01:54 GMT

Always a good and necessary point, that you distinguish the formality of math as its own truth, but that physical phenomena and experiment must agree, and the math must then also match those results.

Your last: "In other words, in a time dependent evolution that sees time as a random walk, an event follows the path of time as encoded in the structure of space."

Like a spider spinning a thread as it moves toward a destination through one of those playhouse 'ball-pits'. Only let's use a bunch of different-size balls to mimic the physical reality of interstellar space, with each ball being a field associated with a star or planet, or asteroid etc. And to make it more like a picture of reality, let's put air jets in the floor and sides so the balls will be continually jostling around. Enter our woebegone spider just trying to get to the other side, taking the straightest path of least resistance it can find at any given moment. Spider silk is, per circular mil, stronger than steel. And Minkowski's 'many spaces' relativity is those field-ball volumes.

The advantage of topological modeling is that it can encapsulate not only the spacetime in one measure space, but incorporate those balls of field and the dynamics of their interactions. It always seems curious to me that QM keeps rejecting Minkowski when the very zero point center it seeks is also found from the surface curvature, as per GR, of those field balls. And given that finding center from the surface of a sphere is as probabilistic as finding center of a broken stud bolt with a center punch, the quants should be happy with it. Oh well... :-) jrc

report post as inappropriate

Steve Agnew wrote on Sep. 7, 2015 @ 19:22 GMT
Bell’s theorem takes the simple notions of the phase coherent superpositions of quantum states and makes them into really complex notions of nonlocal realism of gravity objects and intuition. This experiment is a perfect example of the all too human tendency to make something simple complex. The experiment is both difficult to describe and equally difficult to understand and all it really does...

view entire post

report post as inappropriate

Steve Dufourny replied on Sep. 8, 2015 @ 08:23 GMT
Hello Mr Agnew,

Interesting point of view. Vanity of vanities, dear scientist; and so humility permits you to increase your karma inside a beautiful sphere in spherisation. To be or not to be, that is the question, after all ...


report post as inappropriate

Thomas Howard Ray replied on Sep. 10, 2015 @ 17:39 GMT
It's easy to get wound around the axle with black hole thermodynamics, John. Let's see if we can simplify it a bit. Separating the particle dynamics from the wave function gives us a non-linear picture. That is the Unruh effect. How it connects to topology, quantum computing and Bell's theorem is matter of tracing the function to its origin -- and that's where we find the...

view entire post

post approved

Steve Agnew replied on Sep. 12, 2015 @ 19:03 GMT
Quantum futures are never certain and that uncertainty is the basis of free will and free choice. The spooky neural phase between two people is what bonds them with compassion or separates them with selfishness and conflict.

Although we accept the coherent phase between two people and therefore know that that coherence links their actions as attractive or conflicting, it seems spooky to many to have two particles also linked with phase coherence. When something happens to one particle, we then know something about the other, even across the universe.

Why this simple fact of phase coherence is layered into such complexity is a mystery to me, but obviously not to most others.

report post as inappropriate

Steve Dufourny wrote on Sep. 10, 2015 @ 18:10 GMT
Copenhagen helps us :)




If hidden variables exist, then they are above our relativity, I insist.

The gravitational ether can imply hidden variables, but not with bosons !!!

De Broglie–Bohm has forgotten the Schrödinger equation.

It is not possible with bosons !!! I doubt it even with gravitons, with their smallest spherical volume and their velocities faster than c.

Bohr and Copenhagen, help us, because they confound computing, simulations and our deterministic realism !!!

These Bell inequalities are not violated in reality.

report post as inappropriate

Steve Dufourny replied on Sep. 10, 2015 @ 18:29 GMT
The wave functions are badly understood simply because the Einsteinian ether is badly interpreted.

Maths must have their limits, and physical laws too, even for our gravitons and bosons.

Reversibility also has its own limits.

The wave-particle duality is rational even with a pilot wave; please, Copenhagen, help us.

It is totally the same with the information and its codes !!!

We arrive at the protoconsciousness ... AND THE GRAVITATIONAL WAVES FROM THE MAIN CENTRAL COSMOLOGICAL SPHERE !!! But all this story is purely deterministic !!!

report post as inappropriate

Steve Dufourny replied on Sep. 10, 2015 @ 18:31 GMT
Furthermore, I am not sure that the thermodynamics purely linked with bosonic encodings can be correlated rationally, speaking of the black-body energies !!!

Irritating, but so true !!!

report post as inappropriate

Steve Dufourny replied on Sep. 10, 2015 @ 18:40 GMT
The pilot wave, I would like it well, but ... MEASUREMENTS ... EMPIRICISM ... AND DETERMINISM, please.

Feynman, help us. Bohr, help us ... Copernicus, help us !!!

report post as inappropriate

Vladimir F. Tamari wrote on Sep. 14, 2015 @ 16:23 GMT
Dear Zeeya, I admire your openness to new concepts; here is one, not completely new but in need of reviving, as I hope you will come to agree:

Imagine that we are living in a universe where all interactions are local, causal and linear, and that there is no probability inherent in nature there. Suppose that in that universe there is no photon particle, no duality. In other words when a...

view entire post

report post as inappropriate

Stefan Weckbach wrote on Sep. 15, 2015 @ 17:26 GMT
Hi Vladimir,

looks like a nice attempt to save determinism. I think that to test your hypothesis, one should in a first step calculate the probability of how often such a detector clicks without a "photon" intentionally being fired onto it.

In a second step, with the now-known frequency of such a detector firing under the mentioned circumstances, one could recalculate the data for already-executed Bell experiments, assigning the detectors' biases, and see if the result violates the Bell inequality used.

One thing I ponder when thinking about your hypothesis is why only atoms that are part of a detector should be able to absorb some "photon" energy. How can there be photon waves in a dark laboratory that hit a detector in the face without already having been absorbed by the atoms of other stuff?

report post as inappropriate

Vladimir F. Tamari replied on Sep. 16, 2015 @ 10:39 GMT
Thanks Stefan

Indeed, a quantitative analysis of the expected clicks in "my" scenario is necessary. Of course this scenario has always been called the semi-classical explanation, but the whole notion and the term were eclipsed by Born, Bohr, Copenhagen and the 'shut up and calculate' school of thought. However, my notions of physics start out as mechanistic images or models, and in some cases...

view entire post

report post as inappropriate

Stefan Weckbach replied on Sep. 16, 2015 @ 20:48 GMT
Hi Vladimir,

thanks for your response. I am open to all possibilities, despite the fact that I surely have my own preferences. I cannot judge Eric Reiter's achievements, so I must leave open for myself what he claims to have obtained. But surely it would be nice if there were much more money at institutions like FQXi to revisit some already-settled questions by funding the needed experiments, and be it only for the sake of ruling out these possibilities. Presumably, other experimenters have already dismissed Eric Reiter's results by having identified something within his procedure that looks to them like a serious experimental or theoretical mistake, and are therefore not interested in experimentally testing Reiter's claims. I cannot judge whether anyone is definitely right or wrong here, but I appreciate all efforts to make science progress further.



report post as inappropriate

Vladimir F. Tamari replied on Sep. 17, 2015 @ 03:25 GMT
Hi Stefan

As they say, great claims need great proof. My understanding is that nobody has taken the trouble to repeat Reiter's lone experiments to prove or disprove his claims; on his website he has actively invited such tests. From my own experience, anyone claiming an interpretation of QM or of Relativity different from the one put forward by the establishment is ignored, or worse, for example by being prevented from posting research on arXiv. And it is an establishment defending its hard-won reputation, tenured positions, grants and a whole panoply of excuses not to rock the boat by studying and adopting revolutionary ideas. That is why I was hoping that FQXi, a site purporting to study fundamental questions, would sponsor and encourage such experimentation and the theoretical questioning that leads to it. Alas, nobody wants to rock the boat on FQXi.

This is nothing new of course and innovators have faced such obstacles throughout history.



report post as inappropriate

Steve Dufourny wrote on Sep. 15, 2015 @ 17:59 GMT
After the USA said this about spookiness at a distance, China did also, on Live Science. I discussed this just now with Mr Clouston on LinkedIn.

It is not possible with photons; so how is it possible to say this? That respects neither SR nor the standard model.

How is it possible that so many scientists focus on these violations of our fundamentals? It is not possible with photons. Sometimes I don't understand this planet.


report post as inappropriate

Steve Dufourny replied on Sep. 15, 2015 @ 18:07 GMT
After all, it is the same with our broken symmetries. Let's take a CP violation: it is not really against the standard model, even during a disintegration.

It is just a change of sense of rotation permitting a particle to change its charge. So??? It is just a question of rational interpretation.


report post as inappropriate

Eric Stanley Reiter wrote on Sep. 16, 2015 @ 07:29 GMT
A much simpler test of quantum mechanics has been publicly available for 10 years. Please see my well-documented working experiments to demonstrate the failure of quantum mechanics in general. It is a beam-split coincidence test that defies QM chance. A photon should go one way or the other at a beam-splitter, but I show it going both ways at rates like 20 times chance. I do it with gamma-rays to lay to rest the photon model. I also defy chance with alpha-rays to lay to rest the always-applicable massive particle.

See my website.

Also, ask me for my SPIE paper; the abstract is linked from my website.

Also, I entered this material in a past FQXi essay.

Thank you. Eric Reiter

report post as inappropriate

Steve Dufourny replied on Sep. 16, 2015 @ 09:59 GMT
Hello Mr Reiter,

Here is my line of reasoning.

We cannot go faster than c, due to our SR. A spookiness at a distance is not possible, because we use the electromagnetic waves correlated with c, and these waves also are under SR.

If it were with gravitational spherical waves correlated with dark matter, it would be possible, but the problem is that we do not utilise this technology. Indeed, we are still searching for these particles and the correlated waves.

Let's take a CP violation seen in a disintegration, for example: it is not really a violation but just a change of sense of rotation, so simply giving a different charge. It is just a transformation of the sense of rotation: an electron into a proton, or matter into antimatter. It is not a real violation.

We cannot create gravitational waves going faster than c, in logic; the only thing possible is that we can utilise them, but the problem is how, and I have never seen this discovery yet. A photon cannot be under these gravitational waves, due to their volumes, if I link with my equations.

The pilot wave or the ether are relevant, like the universal gravitational waves from the main central universal BH sphere. But the problem is that it is with black particles, so not with photons. I doubt that these photons encode this dark matter; it is more logical to say that it is the nuclei which encode, as in the line of reasoning with bosons.

That is why it is not possible with photons, even if they encode dark matter, because there we still have a photon under SR.

It is simply logical, in fact.


report post as inappropriate

Steve Dufourny replied on Sep. 16, 2015 @ 10:22 GMT
But I have forgotten a parameter: the economy and the investments, of course.

So I am able to understand that sometimes a lab or a team of searchers needs investments. I'd say even that they are obliged to publish, or to be in competition on the global economic scale. But can we utilise money like that? The real question is there; indeed, the global state of our planet is so sad in a pure altruistic analysis. The funds, unfortunately, are not infinite. Or perhaps it is time to change our global economic system. That would permit continuing with competitions towards a kind of von Neumann equilibrium. But at this moment it is not the case, it seems to me, humbly speaking. It is just a general global analysis.

I am not against the competitions, me; I am just against the waste of time and money, simply.


report post as inappropriate

Akinbo Ojo replied on Sep. 16, 2015 @ 11:13 GMT
Hello Eric,

Just to give more encouragement and kudos to your work. I believe that in the fullness of time its importance will be revealed. I have been following your work, and we exchanged correspondence some time ago. I will read your paper and probably exchange correspondence directly by email.

The importance of your work to me is in reassuring my belief in the wave model for...

view entire post

report post as inappropriate

Vladimir F. Tamari wrote on Sep. 17, 2015 @ 04:07 GMT
Hi Eric

Very happy to see your message, and look at your new SPIE presentation. Very impressive, thorough theoretical and experimental work that needs serious study.

I was particularly glad to see the slide entitled "particles don't diffract", with its theoretical proof, because, decades ago, using different reasoning, I came to the same conclusion. I wondered: what makes a photon explode into photonettes and spread as a Huygens wavelet the moment it arrives at an imaginary point in the aperture plane? My 1987 streamline diffraction theory described a wave spreading out in all directions; no place for particles there.

Some good news: Gerard 't Hooft, the Nobel prizewinner, last year published a paper arguing for the possibility of physics as cellular automata, bypassing regular QM.

Best wishes


report post as inappropriate

Vladimir F. Tamari wrote on Sep. 17, 2015 @ 06:31 GMT
Dear Akinbo

You are quite right that Space = the ether (or the aether, as it was called in Maxwell, Lorentz and Hertz's time). All of those mentioned took seriously the concept of a medium for light to be transmitted in. Einstein unnecessarily eliminated the need for the ether in his Special Relativity paper by the too-clever notion that time and space expand and contract, not that the speed of light changes... Anyway, in his way of thinking it made sense, because a particle-photon can fly in a vacuum without a medium. He reached the right results by the wrong conceptual models...



report post as inappropriate

Akinbo Ojo replied on Sep. 17, 2015 @ 14:56 GMT
Hi Vladimir,

Thanks for your comment.

I didn't see Eric's slide entitled "particles don't diffract", with its theoretical proof, that you referred to. I would like to see it.

There were quite a number of interesting abstracts at the conference.

I am a bit confused about Eric's alpha particle behaviour. Did he say electrons are waves and not particles?



report post as inappropriate

Vladimir F. Tamari replied on Sep. 20, 2015 @ 02:33 GMT
Dear Akinbo

I am glad you are taking a serious interest in Eric's work. The slide can be seen via the red link "Lecture Slides" on Eric's website.

I am not an expert on particles, but alpha particles are not electrons. In any case, according to de Broglie's dictum, each particle has an associated wave-field.

Eric is saying gamma-rays (high-energy electromagnetic photons) are not particles; hence particle-wave duality does not exist; hence the basis of Born's quantum probability is explained: Schrödinger's equation refers to a real wave in a real medium, and there is no spookiness in QM. This is a major experimental discovery!

With best wishes


report post as inappropriate

Steve Dufourny wrote on Sep. 17, 2015 @ 10:30 GMT
I believe that several people utilise the names of serious thinkers to create confusion.

I agree that 't Hooft's works on the weak and electromagnetic interactions are relevant. The chromodynamics is relevant indeed; that is why he received the Nobel prize for his research in quantum mechanics.

He correctly utilised the correspondence for the correct calculations of meson masses, in one space, one time !!! His works on QCD are rational. That is why he pondered his magnetic monopoles.

If now people confound his works with the spookiness at a distance, then where are we going ???

Never would he say that electromagnetism can imply these extrapolations, because he understands SR and its laws. If people utilise his name, then respect his works. Please see that he respects SR. His works permit classifying QCD and its phases, but all these phases respect SR and the standard model.

It is the same with the works of Mr Witten. Please respect their works on our fundamentals. When you name them, respect their line of reasoning.

A lot of people confound a lot of things, in my humble opinion. I ask myself how it is possible to mix everything without real general determinism. A real anti-deterministic soup, that is all.


report post as inappropriate

Steve Dufourny replied on Sep. 17, 2015 @ 10:33 GMT
That said, I respect your team, Stefan, Eric and Vladimir. The medium, is it that?

report post as inappropriate

Steve Dufourny replied on Sep. 17, 2015 @ 10:39 GMT
The aether is not with electromagnetic waves; when are you going to understand that you confound the waves !!! Utilise the gravitational waves and black particles, please, with the spherical volumes; all will be clearer for your spookiness at a distance.

Be sure! I don't understand why you want these electromagnetic waves not to respect SR. This line of general reasoning is not rational, so why do you insist? It is bizarre. We are not in a science-fiction film, it seems to me.

report post as inappropriate

Steve Dufourny replied on Sep. 17, 2015 @ 12:02 GMT
It is how we use the complex numbers and the correlated physical fundamentals. So the conclusions about the phase waves can be misunderstood if the physical limits are not inserted.

The information phase needs a real rationality, as with our genomic encoding, for example. pi, i, -i, 0 are pure imaginaries. So the problem is not with the pseudoscalar geometry, but with the conclusions implied when we analyse the charge, the sense of rotation. I am persuaded that the spherical volumes are important for phases of encoding. Fourier can help, in my humble opinion. If the superposition of waves can be in phases of encoding, considering the spherical volumes and the velocities of rotation, it becomes relevant for the sortings and superimposings of encoding, considering the external waves and the intrinsic waves (quantum entanglement). This logic is for electromagnetism, but it can be extrapolated to the gravitational waves and the black particles and their encoding in nuclei. The 3D is essential for the respect of SR, the standard model and general relativity. The volumes of spheres produced, cosmologically speaking, are thus relevant for the phases of encoding when we consider that electromagnetism is produced by stars, and gravitational waves by the BH cosmological spheres.

It is relevant if the harmonic analysis is inserted for the phases of encoding of evolution in spherisation, with an entropic increase correlated with this mass also increasing.

It is relevant for the mathematical explanations of the encoding of bosons and gravitons also, if the spherical volumes and their orbital and spinal rotations are inserted with determinism, as in our genomic system.

report post as inappropriate

Steve Dufourny wrote on Sep. 20, 2015 @ 14:54 GMT
There is nothing new or bizarre about our anisotropy. It is just a tool known in several technologies. It simply permits correcting errors. It is not necessary to discuss this for a long time.

It is just a tool.

report post as inappropriate

Steve Dufourny replied on Sep. 20, 2015 @ 15:04 GMT
It is just a property permitting the correct calculation of several things in several centres of interest (geology, electricity, optics, and this and that).

Why so? Just a loss of time, that is all; and the direction is not the problem, like a rotation.

report post as inappropriate

Don Limuti wrote on Sep. 21, 2015 @ 05:18 GMT
Hi Zeeya,

Lots of strong opinions on this one. I have one also:

What is going on?  John S. Bell theorized that no "hidden variable" could exist that would not make quantum mechanics "grossly non-local". So a hidden variable would not help explain the quantum mechanical result. Alain Aspect went on to demonstrate the violation of Bell's inequalities and said: "The experimental violation of Bell's inequalities confirms that a pair of entangled photons separated by hundreds of metres must be considered a single non-separable object — it is impossible to assign local physical reality to each photon."

Bell and Aspect came to a very interesting conclusion but it did not answer Einstein’s objection about information traveling faster than the speed of light. And we are left in the interesting position that everybody is right and something is wrong. That something is uncertainty and superposition.

My strong belief is that this discussion on the reality of QM will never be resolved until the uncertainty principle and superposition are corrected.


Actually, compared to a thread I saw a few years ago this one is very civil.

And Hi to all my friends at FQXi

Don Limuti

report post as inappropriate

Steve Dufourny replied on Sep. 21, 2015 @ 09:46 GMT
hello from Belgium,

It is cool that you are there also


report post as inappropriate

Joe Fisher wrote on Sep. 21, 2015 @ 14:12 GMT
The real unique Universe is infinite. There is no way finite spookiness could exist, so your absurd claim to have found some is typical abstraction-addicted white male theoretical physicist's codswallop.

Joe Fisher, Realist

report post as inappropriate

Steve Dufourny replied on Sep. 21, 2015 @ 16:41 GMT
Mr Fisher,

You must develop, you must explain; you always repeat the same words. Please develop, scientifically speaking.

If you are a realist, please be a realist in developing.

report post as inappropriate

Don Limuti replied on Sep. 21, 2015 @ 20:35 GMT
Hi Joe and Steve,

Glad to see you both. FQXi threads are still alive and well. I hope to enter the next essay contest (if it is a good topic) and hope to see you both there.

Steve: I hope things are tutti fantastico!

Joe: You caught me thinking again. I find I engage in this "spooky" activity on occasion.

"I think, I think I am, therefore I am, I think."

Don Limuti (I think)

report post as inappropriate

Steve Dufourny replied on Sep. 22, 2015 @ 09:51 GMT
Hello Don,

I am well, thanks. For the contest, my English is not good enough.

Happy to see you again


report post as inappropriate

Teresa Mendes wrote on Oct. 3, 2015 @ 17:00 GMT
What bothers me is a lie told too many times.

As Lenin once said: "A lie told often enough becomes the truth".

There is no "loophole free Bell test" successful experiment violating Local Realism.

If you check Hensen et al.'s protocol you will realise that there is biased sampling.

If you want to see a comprehensive and detailed discussion on this subject, check this debate at Scott Aaronson's blog, and make up your own mind:

Initial post: Bell inequality violation finally done right

Discussion (from #117 to #143), with all major Bell's experiments addressed: begins here

Please, don't just read the titles and the conclusions of those articles and never bother to check if they really prove what they claim. This is just one more article pretending to violate Local Realism, with a misleading title, like all the others "successful" violations of Local Realism.

report post as inappropriate

Steve Dufourny replied on Oct. 3, 2015 @ 17:51 GMT
Hello Ms Mendes,

I am happy to see one other person liking Bell's theorem. Determinism, after all, is the torch of truth.

Empirically yours.

report post as inappropriate

Steve Dufourny replied on Oct. 3, 2015 @ 19:11 GMT
I totaly agree about a realistic paradigm of our physics.

It is essential to respect what are our foundamental équations and their nmeaning.The rule of teachers is to show the rational road.The local realism is so important.The creatiity or the free will or the imaginations can be empiric also.It is even still more wonderfull to encircle the steps of entropy in the quantum sense and the cosmological sense.The spirituality can be rational also respecting the intrinsic laws of our universe.We evolve and the équations, theorems,postulates can be imrpoved in a pure dterministic road.It is not necessary to break our laws.It is not necessary to violate our accepted foundamental physical laws.The invariances of lorents, the CPT symmetry,the bell'inequalities ,or this or that..... don't need an irrational break.We can extrapolate our équations and models but in respecting theiruniversal meaning.

Let's take a simple example from SR. We cannot go faster than c with photons or bosons. Why, then, do some people try to see a photon, a boson, an electromagnetic wave faster than c? It is not rational. If you take a graviton, which is not a boson (it must be classed under another rule), and if we extrapolate with its probable linear velocity, very important as a function of its spherical volumes, then we can suppose that it is faster than c. But that is not against SR, because we are not using a photon or a boson but a graviton under the laws of gravitational spherical waves. It is another logic than SR, but this logic simply respects it. Local realism at all scales can thus be empiric and be a real universal paradigm. Teachers at universities, centers or labs must always respect the fundamental laws, equations, models and theorems accepted by the international science community. The rest is vain after all.

report post as inappropriate

Teresa Mendes replied on Oct. 3, 2015 @ 19:58 GMT
Thank you Steve for the thumbs up and complimentary comments.

You are right. People should accept those fundamental laws of Physics - engineers do - but I don't see anything wrong when scientists try to break those limits. I think it's their job to push the limits.

The problem, today, is that the "laws" of Quantum Mechanics are the "accepted" new laws and anyone trying to question them is seen as "heretic".

For example, I think it would be logical and honest, knowing that all the previous Bell experiments were inconclusive (that is what loopholes are all about), to have a rule requiring those authors to change their articles' titles, because most scientists in Kuhn's "normal" phase just read the title and conclusions, since they are only interested in getting validation.

Aspect et al.'s 1982 experiment, the most influential experiment in the physics community validating Quantum Mechanics and entanglement, has the following title: "Experimental Realization of Einstein-Podolsky-Rosen-Bohm Gedankenexperiment: A New Violation of Bell's Inequalities".

Does any specialist in Quantum Foundations believe that that title is true?

I believe those titles are just misleading the future generations.

this post has been edited by the author since its original submission

report post as inappropriate

En Passant wrote on Oct. 6, 2015 @ 01:30 GMT
I will only say this once, and I will only say it here. (Frankly Dear...)

Correlations were observed. A cause for those was postulated. But it needed a name. In Western language, a thing has attributes, and can cause things to have effects. At least that is how we formulate things so they make sense to us.

So we saw these correlations, and we thought they had to be caused by something. So we gave it a name: entanglement. That does not give it existence. No one defined how it actually works. How does entanglement take effect? Explain the mechanism by which the putative entanglement works. No one can do that, and it is, in plain English, "hogwash."

This whole enterprise of QM is nonsense. Yes, its formulas work (thanks to the genius of its formulators), but their ideas and analogue explanations are dead wrong.

There is no entanglement (unless you can explain its precise mechanics). It is just an excuse for things we don't understand.

I could tell you much more, but I am an alcoholic, and I am barely struggling to get along.

report post as inappropriate

En Passant replied on Oct. 6, 2015 @ 02:29 GMT
OK, I got a bit more energy (for inexplicable reasons).

When people observed certain correlations, they reasoned that there had to be a "thing." After all, everything is caused by a "thing" (ain't it?).

A thing has attributes and effects (at least in our language). So, in the absence of understanding why certain correlations were observed, we posited a certain thing that caused it, and had to give it a name "entanglement." There is no possible way to observe "entanglement" (it is entirely inferred), but giving it a name makes it believable.

Quite simply, a correlation was observed, and in our way of thinking there had to be a cause. We never observed that cause, but we gave it a name ("entanglement"). It is a fiction.

For those who believe in it, please give the exact mechanics by which it works. How does it couple two (or more) elements together, and how does it exert its effect across infinite distances in zero time? If you cannot answer these questions, then your idea of entanglement is just a "word" to let you live with the paradigm you like.

Let's face it. There is no experimental support for the idea of "entanglement." It is just a magic word to explain away something for which you do not have a full understanding.

Of course there is no such thing as entanglement. This is just so preposterous that I am doubting my own sanity of being smarter than the best brains in physics. You are better than I am. So get to work, and figure it out for real.

report post as inappropriate

Don Limuti replied on Oct. 6, 2015 @ 05:50 GMT
Hello En Passant,

I agree with your very interesting post.

When you say a thing has attributes, you touch on something that is fundamental to Hindu philosophy, and I remembered a most remarkable lecture. You will not need any alcohol for this lecture by Jay Lakhani. I highly recommend it.

I would also recommend my website, and in this case it would probably be a good idea to have a few shots ready :)

Don Limuti

report post as inappropriate

Richard Gill replied on Oct. 6, 2015 @ 09:55 GMT
Well, now there is experimental evidence for the real-world existence of a phenomenon (violation of Bell inequalities in a loophole-free experiment) which can be predicted from the mathematical framework of quantum mechanics. We associate it with a mathematical feature of this framework which we choose to call "entanglement". Since quantum mechanics cannot be reduced to a primitive physical picture (of billiard balls hitting one another), which we call local realism, our little brains do not "understand" it. There is a mathematical theorem (Bell's theorem) which says precisely that: the quantum mechanical framework cannot be reduced to a primitive picture of billiard balls moving about and hitting one another.

Maybe one problem is that we insist on using words like "entanglement". What's in a name?

And why should we have the arrogance to demand that our little brains can "understand" everything, when "understand" means no more than "reduce to a picture of billiard balls bouncing off one another"? Fortunately our brains are good enough to allow us to use mathematics to go beyond the limitations of our earth-bound imagination. Our brains evolved on this planet, and they have satisfied their "purpose" pretty well, since there are a lot of us about now and we have almost destroyed our planet. There is no reason why they should ever feel comfortable when confronted head-on by physical and mathematical evidence that the universe cannot be understood in a simplistic way.
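The Bell's-theorem point above can be illustrated numerically. The CHSH quantity is S = E(a,b) + E(a,b') + E(a',b) - E(a',b'): any local hidden-variable rule keeps |S| <= 2, while the quantum singlet-state prediction E(a,b) = -cos(a-b) reaches 2*sqrt(2). A minimal sketch in Python, with a toy "sign of cosine" hidden-variable rule of my own choosing (illustrative only, not any particular experiment's model):

```python
import math
import random

def lhv_outcome(setting, lam):
    # deterministic local rule: the outcome depends only on the
    # local setting and the shared hidden variable lam
    return 1 if math.cos(setting - lam) >= 0 else -1

def lhv_correlation(x, y, n=100_000):
    # Monte Carlo estimate of E(x, y) under the toy local model
    total = 0
    for _ in range(n):
        lam = random.uniform(0, 2 * math.pi)  # shared hidden variable
        total += lhv_outcome(x, lam) * lhv_outcome(y, lam)
    return total / n

# standard CHSH settings (this toy rule happens to saturate the
# classical bound S = 2 at these angles)
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, -math.pi / 4

S_lhv = (lhv_correlation(a, b) + lhv_correlation(a, b2)
         + lhv_correlation(a2, b) - lhv_correlation(a2, b2))

# quantum singlet prediction: E(x, y) = -cos(x - y)
E = lambda x, y: -math.cos(x - y)
S_qm = E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)

print("local hidden variables: S =", S_lhv)   # close to 2, never beyond
print("quantum prediction:   |S| =", abs(S_qm))  # 2*sqrt(2), about 2.83
```

Swapping in any other deterministic local rule changes the estimated correlations but never pushes |S| past 2; only the quantum prediction exceeds the bound.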

this post has been edited by the author since its original submission

report post as inappropriate

Teresa Mendes wrote on Oct. 6, 2015 @ 13:16 GMT

You can say as many times as you want that "there is experimental evidence for the real world existence of a phenomenon (violation of Bell inequalities in a loop-hole free experiment) which can be predicted from the mathematical framework of quantum mechanics".

In the seventies everybody said the same about Clauser's experiment.

In the eighties, the same with Aspect's.

On and on, teachers tell students. TV documentaries are made, there are thousands of animated videos on YouTube, and scientific publications and media republish opinions ...

As Lenin once said: "A lie told often enough becomes the truth".

Just go and look at each one of these experiments - and you know it is not true, Local Realism was NOT rejected on those experiments.

Even regarding this last experiment, the Delft experiment: how can you claim the violation of Local Realism, when what you have is biased sampling? So you can claim, again, the existence of this prediction of a theory, named entanglement, just to justify to investors the millions spent on technologies, like quantum computers, that are supposed to work in our world? That is what the Delft team did, and they have a US$50 million investment from Intel Corp.

Which quantum computer? Have you seen the protocol that they are using, for instance at D-Wave? Ansmann's! Did Ansmann et al. claim to have violated Local Realism? Yes, they did!

Do you mind looking at it again? Do you care to explain, knowing that they got to that conclusion using the Korotkov inequality, why that inequality has different limits on the negative and positive sides? Doesn't it look "strange" to you? Korotkov's inequality is wrong due to a calculus mistake.

Perhaps the scientific community needs to better check its foundations and stop doing so much propaganda.

report post as inappropriate

Steve Dufourny replied on Oct. 6, 2015 @ 14:24 GMT
Happy to see your words. I thought that I was crazy :)

What a world, dear Ms Mendes.

report post as inappropriate

Teresa Mendes replied on Oct. 6, 2015 @ 14:44 GMT

You have to understand that this is a "classical" fight, as predicted by Thomas Kuhn, in scientific revolutions.

Each side will always find criteria to support their view. Changing paradigm is a personal choice.

I don't think anybody can win anything by discussing emotionally. There is no need to escalate emotions.

This discussion has to be a rational one. We need to create enough "reasonable doubt" in the minds of physicists. They will have to make that decision on their own, and that will be hard, because funding, careers and peer pressure are in the middle.

I do think an alternative local realist theory is already out there. It could be yours. But in "normal" science no one will listen to you. There has to be a paradigm shift so that Local Realist theories can be discussed, analysed, experimentally tested and purged, or not, as a normal process in science.

Changing paradigm is where the fight is needed - alerting people that there are two scientific viable paradigms - Quantum Mechanics and Einstein's Local Realism.

[It is always nice to have Einstein on our side :) ].

We will need all those beautiful and extremely well trained minds of all physicists in the world to overcome the revolutionary phase and get to a more fruitful normal phase again under a new paradigm.

I would recommend, for now, that we focus on this issue - entanglement, Bell's inequalities and experimental tests, and avoid dispersing.

Here is where the scientific revolution begins.

What do you think?

report post as inappropriate

Steve Dufourny replied on Oct. 6, 2015 @ 15:47 GMT
I understand, and I agree.

I just want to test and experiment with my theory and equations; I just want to evolve and to complete my works on a rational road respecting Copenhagen. I began to share it on the net more than 8 years ago. I improve my works and I study at the same time. I think, in all humility, that my general work is relevant. I don't see another general logic than this...

view entire post

report post as inappropriate

Teresa Mendes wrote on Oct. 6, 2015 @ 15:15 GMT

why are posts disappearing from this discussion?

Were they considered inappropriate?

They are still listed on the left column ...

(one of Richard Gill's, and my answer)

attachments: FQXI_desapeared_posts.tiff

report post as inappropriate

Thomas Howard Ray replied on Oct. 6, 2015 @ 15:27 GMT
They're visible to me.

report post as inappropriate

Teresa Mendes replied on Oct. 6, 2015 @ 15:54 GMT

Richard Gill's post that begins with "Teresa, every time (245 times, in fact) [...]"?

And my answer: "Richard, I read the article, so did you [...]"

report post as inappropriate

Thomas Howard Ray replied on Oct. 6, 2015 @ 16:46 GMT

report post as inappropriate

Don Limuti wrote on Oct. 6, 2015 @ 23:03 GMT
Hi Teresa,

1. I have all your posts, nothing has mysteriously disappeared ....yet.

2. Overall I prefer Einstein to Bohr, but both viewpoints have problems.

3. I love this bash of viewpoints... and the non-deterministic physicists have gone off the deep end of the pool (IMHO).

4. And the deterministic physicists (me) have a very hard row to hoe. And it has nothing to do with Bell's work. The problem is more basic than that:

Kant: The thing in itself is unknown and unknowable by the categories of the mind.

We know an electron or a car by its attributes, and we can argue endlessly about the attributes,

And yet the electron and car in itself are unknowable by direct experience. The only reality about a car or an electron is agreement about the attributes (observables).

In spite of this I think the future holds great promise for the determinists.

Don L.

report post as inappropriate

Anonymous replied on Oct. 7, 2015 @ 00:47 GMT
Hi Don

Nice topics! Let me answer email-style ...


>1. I have all your posts, nothing has mysteriously disappeared ....yet.

[Yes ... I know ... I feel like a "blond". Duhh...]

I'm glad you have read them all. Did you like them? Anything to add?

Did you follow the pointers? J. Especial's article. Ansmann's...

view entire post

report post as inappropriate

Teresa Mendes replied on Oct. 7, 2015 @ 00:51 GMT
[Grrrrr ... I'm not anonymous !]

Cheers, Teresa

report post as inappropriate

Don Limuti replied on Oct. 8, 2015 @ 21:45 GMT
Hi Teresa,

Thanks for your reply. Sorry for my delayed response.

1. I followed your references as well as I could. I did not find them clear or conclusive (my lack of background).

However, your comment that the experimental results were "cherry picked" is very important to this whole discussion.

2. Yes, QM is just a theory and it is evolving.

3. Your...

view entire post

report post as inappropriate

Anonymous wrote on Oct. 8, 2015 @ 04:26 GMT
Isn’t it curious that the originators of QM (theory) were into Eastern mysticism? So they combined mathematics with an explanation that would defy its own explanation.

Let’s at all times keep in mind the distinction between what we think is a representation of the universe and what it actually is. The latter is not known, other than through the former.

We can only know things via things we already know. How can you start from zero?

The proponents of magic (i.e. entanglement) are driven by the motive to look at things we cannot explain, meaning that we cannot explain them in terms we already understand. They enjoy that, and in the evolution of man (equal to woman) this may have been an adaptive trait (to marvel at “things”).

But the kind of magic that most QM interpreters advocate is no longer adaptive. It needs to be consciously abandoned. Tom Ray has some useful ideas, and his treatise on probability is pretty much equal to Einsteinian insight. It's just that it will not get the attention or recognition it deserves. I am still thinking about it, and it will take me some more time to understand it. Obviously I am a slow learner, but I make no apologies (it's the best I can do).

report post as inappropriate

Thomas Howard Ray replied on Oct. 8, 2015 @ 13:54 GMT
Thank you, anonymous poster.

That was actually the point of my comparing the predictions of the I Ching with quantum computing.

It isn't that one is necessarily technology and the other magic. They are equal parts technology and magic, equally effective in navigating through life. And your analogy to adaptive behavior is spot on. Quantum theory is not a bona fide theory, nor is quantum computing based on entanglement a bona fide technology. They are purely empirical strategies.

If we are to reach the next level of understanding, you're absolutely right -- we must abandon incomprehensibility rather than (as Richard Gill urges) embracing it. Time is on our side.

Thanks for a most insightful post.

attachments: Ray_FQXi_essay_rev3final.pdf

this post has been edited by the author since its original submission

report post as inappropriate

John R. Cox replied on Oct. 8, 2015 @ 15:25 GMT

Nope, t'was not I that posted that comment, but I wish I'd said it! :-> jrc

report post as inappropriate

Thomas Howard Ray replied on Oct. 8, 2015 @ 15:39 GMT
Wow -- do you know what this means? It means I have more than one supporter. :-) Will edit.

report post as inappropriate

En Passant wrote on Oct. 10, 2015 @ 16:55 GMT
Well let’s discuss epistemology.

At its most basic, it is just a way of knowing how to deal with the world. It is based on paradigms we already understand.

It may be incorrect, or too simple. But it is just a way of using what we already know, and applying it to future events.

What the proponents of most QM theories espouse is that there is a certain magic, and we dare not...

view entire post

report post as inappropriate

Steve Dufourny replied on Oct. 10, 2015 @ 17:28 GMT
I like the S3 space, Hilbert and Minkowski. It is interesting to see our local realism under the laws of the uncertainty principle. :)

report post as inappropriate

Steve Dufourny replied on Oct. 10, 2015 @ 17:53 GMT
I ask me how I can insert these Tools.

The correct serie of spherical volumes

The correct 3D polytop


The QCD improved

Planck lenght

An the correct geomtrical algebras respecting localrealism and uncertainty principle.

My two équations explaining gravitation with the spherical volumes of dark matter.

Bell's innequalities dance still and always around the spheres

report post as inappropriate

Anonymous wrote on Oct. 11, 2015 @ 16:06 GMT
Hi En Passant, Don Limuti, Steve Dufourny (and all other Local Realists)

You are Local Realists. Good. Let me try to convince you to join the scientific revolution movement.

As Realists you believe the world exists and has specific properties that are independent of us humans; but that doesn't mean that if you don't understand them the world will disintegrate. It means we still don't...

view entire post

report post as inappropriate

Richard Gill replied on Oct. 18, 2015 @ 13:58 GMT
Dear Teresa

I believe you are mistaken to say that the Delft experiment is inconclusive because of biased sampling. The decision as to whether or not any particular pair of NV (Nitrogen-vacancy) measurements is included in the sample is made *before* the two measurement settings are chosen and the two measurements are made on those two particular diamond defects.

The fact that we do...

view entire post

this post has been edited by the author since its original submission

report post as inappropriate

Richard Gill replied on Oct. 18, 2015 @ 14:01 GMT
[Sorry the last part of that last post by me is missing because of an arrow being misread as an html tag]

Before rejecting the Delft experiment out of hand, one should understand the rationales behind the traditional experiments with a layout like Alice < - Source - > Bob, and the "event-ready detectors" experiments with a layout like Alice - > Sink < - Bob. Classically, we can create correlation between what we see at Alice's and Bob's places by conditioning on a common source. This is well understood. But we can also create correlation between what we see at Alice's and Bob's places by conditioning on what we see at a common sink. The principle works both for classical correlation and for quantum correlation (entanglement) -- that is to say, the mathematical principle works.

I am not assuming that quantum entanglement "exists", I am just saying that post-selection by conditioning on results of a joint measurement at a central location is simply a device for creating an ensemble of correlated bipartite systems. In the Delft experiment, *all* systems created in this way are measured, there is no "selection bias".

If there is any problem with this experiment, it is simply that the number of systems created in the experiment was so small (245) that a sceptical person can still argue that local realism could also generate results like those observed. About one in forty times you could get, by chance, something as extreme as was actually observed under local realism.

Let's wait and see if the experiment can be replicated by the same group or others. Personally, I doubt that the Delft experimenters just had a (one in forty) lucky break.
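The "one in forty" point above can be sketched with a toy binomial calculation. Reading each event-ready trial as a round of the CHSH game, local realism caps the per-trial win probability at 3/4, so the chance of seeing an unusually high win count is a binomial tail probability. The counts below are illustrative placeholders, not the published Delft data:

```python
from math import comb

def binom_tail(n, k, p):
    # P(X >= k) for X ~ Binomial(n, p): the chance that a local-realist
    # source, capped at win probability p per trial, scores k or more wins
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n = 245       # number of event-ready trials, as in the Delft run
p_lr = 0.75   # maximum CHSH-game win rate under local realism

# hypothetical observed win counts; the classical mean is n * p_lr ~ 184,
# and the tail probability shrinks rapidly as k climbs above it
for k in (184, 192, 199):
    print(k, binom_tail(n, k, p_lr))
```

With only 245 trials the tail never becomes astronomically small, which is exactly why replication (or longer runs) settles the question more convincingly than any single dataset.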

this post has been edited by the author since its original submission

report post as inappropriate

John R. Cox replied on Oct. 18, 2015 @ 19:01 GMT
"Bell envisaged in 1981 particles emitted at the central location..."

which assumes that, whether the experimenter is working with "light particles" (photons) or sub-atomic particles, the entire energy content of that closed-system particle exists in an undefined volume treated as a zero-dimensional point. That allows an infinite, unbounded, non-real probability space. How does that NOT bias any choice of measurements?

Does 'counterintuitive' also extend to the rules of logical form in arguments? I think in the rules of rhetoric that would qualify as an 'amphiboly'.

"She walked into my life/ with those cold and evil (eyes? - lies?)/ and with the length of her mind/ she dark the sun." B. Leadon & G. Clark - old Country and Western song 'She Dark the Sun'.

rock on, jrc

report post as inappropriate

Georgina Woodward wrote on Oct. 24, 2015 @ 19:10 GMT

A cup is suspended between two observers who are facing each other at different sides of a room, though they could be more distantly separated. An experiment is devised whereby the orientation of the cup is such that one observer can see the interior of the cup, which is red and the other can see the bottom of the outside which is black. In this way the em radiation emitted...

view entire post

report post as inappropriate

Georgina Woodward replied on Oct. 25, 2015 @ 00:47 GMT
Lorraine, All

if you take a look at the macroscopic analogy in my previous post it is possible to see that the probability of a 100% certain outcome for the second observer's newly generated sensory output becomes known as soon as the first observer's newly generated output is identified by someone aware of the 'entanglement'.

Without knowledge of the entanglement the probability...

view entire post

report post as inappropriate

Luca Valeri replied on Oct. 25, 2015 @ 11:45 GMT

I'm afraid it is not enough to create pairwise perfect correlations and constrain the observability of different attributes locally. This would be like creating a number of Bertlmann's socks and destroying the other socks you have brought with you after having watched one. That would not break the Bell inequality. You would have to remove the very possibility of the other observer observing anything at all. And this cannot be done with local interactions in a realistic picture of the world.
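This claim can be checked by brute force: every deterministic local strategy assigns fixed outcomes to both of Alice's settings and both of Bob's, and Bertlmann's socks (perfect anti-correlations carried from the source) are just a probabilistic mixture of such strategies. A short sketch enumerating all of them:

```python
from itertools import product

# every deterministic local strategy: Alice's pre-set outcomes for her
# two settings (A0, A1) and Bob's for his (B0, B1), each in {-1, +1}
values = []
for A0, A1, B0, B1 in product([-1, 1], repeat=4):
    # CHSH combination for this strategy
    S = A0 * B0 + A0 * B1 + A1 * B0 - A1 * B1
    values.append(S)

# every one of the 16 strategies gives exactly S = +2 or S = -2
print(sorted(set(values)))
```

Since a mixture of strategies can only average these values, any sock-like classical correlation, however perfect, satisfies |S| <= 2; breaking the inequality requires something that is not such a mixture.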


report post as inappropriate

Lorraine Ford replied on Oct. 25, 2015 @ 21:51 GMT
Georgina, Luca,

I believe that physics’ majority opinion is correct: there IS “spooky action at a distance” which defies explanations which can be compared to 2 views of a cup/Bertlmann’s socks. Quantum entanglement is real and not an illusion.

This entanglement seemingly results in a law-of-nature relationship that exists between particle spin information, no matter what the separation distance of the 2 particles. This seems to confirm that space itself (i.e. all particle spatial-type information like distance, velocity and momentum) is not a barrier to law-of-nature relationship because space itself is merely the OUTCOME of law-of-nature relationship.

To me, this further seems to imply that gravity is likely to be due to a pure uncomplicated law-of-nature relationship between particle mass information, and not due to complicated graviton interaction events.



report post as inappropriate

Lorraine Ford wrote on Oct. 24, 2015 @ 22:17 GMT
Georgina, Teresa, Don and others,

Laws-of-nature ARE “spooky action”, no matter what the distance.

Seemingly laws-of-nature decree that PARTICLES cannot travel faster than light. But there is seemingly no law-of-nature that says that the LAW-OF-NATURE RELATIONSHIPS themselves have a speed or distance restriction on them. I.e. “influence” could potentially “travel faster than light” i.e. be instantaneous.

It all depends on what exactly is the nature of laws-of-nature.



report post as inappropriate

Steve Dufourny replied on Oct. 27, 2015 @ 12:25 GMT
Hello Ms Ford,

I agree, if we consider a gravitational aether from the central sphere. In logic, if my equation is correct of course, the velocity of the gravitons produced is very high, and the correlated waves also. There exists, then, a kind of instantaneity, but it does not affect our photons, bosons, and perhaps even our fermions. If a spooky action at a distance exists, it is with this gravitational aether instead of a luminous Einsteinian aether. That is why a spooky action at this moment is not possible: we are too far from 10^-35 m, and on the cosmological scale also. We still don't know what the quantum dark-matter spheres are that encode these gravitons produced by the cosmological BHs, in logic, if my humble reasoning is correct, considering the encodings of gravitons by our quantum uniqueness. The gravitons produced by the main central universal BH sphere are, it seems, instantaneous. A photon cannot pass c, but a graviton, in logic, yes. It becomes relevant to consider the correlated entropy, treating the graviton like a quantum of gravitational energy with a kind of heat. There is relevance when we consider the number of gravitons and their velocities. Even if one graviton has a weak gravitational energy, their number permits other steps of energy more important than our nuclear forces. It is logical, due to the number of gravitons encoded and their different spherical volumes. The main universal codes are thus gravitational, not the bosonic informations, which are complementary in fact.


report post as inappropriate

Don Limuti wrote on Oct. 25, 2015 @ 00:18 GMT
Some Comments,

Luca's example on probabilities in dice rolling is quite interesting. I would rephrase it a bit as follows:

-A die is tossed a million times and 6 comes up each time. A million dollars is bet on the next roll. The bet is between a 6 being rolled and any other number (1, 2, 3, 4, 5) being rolled. There are two bettors: a QM expert and a practical person who needs the money...

view entire post

report post as inappropriate

Georgina Woodward replied on Oct. 25, 2015 @ 01:08 GMT
Hi Don, good point about the influence of the universe. There is a very small probability that the die will land and stay on an edge. I have witnessed this in a game where any dice roll would have won the game for my opponent. I pointed this out; the die was cast but never fell to show a number face up. I don't know what exactly was the cause, most likely some irregularity of the surface on which it was cast. It was amazing, and I now wish that I had ascertained the exact cause and, if it was the die itself, kept it to see it perform the same trick again. Still, it can happen, and so the probability isn't precisely 1 in 6.

I think after a million tosses coming out the same even an unreasonably confident expert would suspect some kind of experimental error : )

report post as inappropriate

Luca Valeri wrote on Oct. 25, 2015 @ 12:17 GMT
Hi Don,

Thanks for your precious feedback from Oct. 23, 2015 @ 20:20 GMT. I agree with you that "real" is very problematic, since we do not have direct access to reality, but only through our observations (and theories). And I can fully understand that you prefer to use "deterministic" instead.

However, we have access to properties only through experiment, and some experiments destroy the very conditions for observing a complementary observable. For example, we need both the starting position and the momentum to predict the future of a particle. But since we cannot know both, because the measurement of one makes knowledge of the other impossible, we can only make probability forecasts. So if you want to maintain determinism, you'll also have to maintain some sort of realistic picture, saying that determinism refers to the never-simultaneously-observable properties position and momentum, so that these properties (hidden variables) exist independently of whether they are observable or not.
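For reference, the quantitative form of this trade-off is Heisenberg's relation for position and momentum:

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
```

Any preparation that narrows the spread \(\Delta x\) necessarily widens \(\Delta p\), which is why only probability forecasts remain available.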


report post as inappropriate

Don Limuti wrote on Oct. 25, 2015 @ 15:39 GMT
Hi Luca,

I appreciate the feedback. Your insight into probability was key.

About the linkup of momentum and position:

1. I like to think of velocity instead of momentum. There is a simple relationship between them but velocity is more intuitive to me.

2. I believe QM has made a monumental blunder by accepting Heisenberg's uncertainty principle. It fails to account for...

view entire post

report post as inappropriate

Luca Valeri replied on Oct. 28, 2015 @ 10:59 GMT
Dear Don,

Since I prefer Bohr to Einstein in what concerns QM - although I think we have to take Einsteins concerns seriously and have to reply to them - allow me to comment on your post from my personal view.

In a theory yet to be developed, I consider that physical concepts/properties are defined by the possible operations/measurements that one can perform on an object. Ultimately I hope that concepts and measurement operations can be derived from symmetry operations.

So, from the fact(?) that a measurement device cannot store all the information of a system, it seems natural to start from information/probability relations and then go on to unitary (information-conserving) processes. Putting informational relations first and deriving the causal structure as unitary transformations solves, I think, a lot of interpretational problems that come from the opposite direction, which tries to derive probabilities or randomness from the deterministic Schrödinger equation.

I like your hopping picture. It seems to me it tries to solve the problem of the potential continuity of contingent properties/concepts (like space) versus the discreteness of facts or phenomena. But I think I did not really understand your hopping theory.

And I also prefer velocity to momentum. However, momentum is linked to space translation as a symmetry transformation, and as such to a complementary operation on space. But as I said, I have to work this out. It is also linked to your wavelength lambda (how did you write your lambda in your post?).

I hope we'll find the time to continue this discussion another time in another forum, where it fits more.



report post as inappropriate

Don Limuti replied on Oct. 28, 2015 @ 22:36 GMT
Hi Luca,

1. I just copied and pasted the lambda "λ" from an online search. You can just copy and paste the one above in quotes.

2. Sometimes naming things is useful. λ-hopping is an invented term I use to indicate how quantum mechanical objects (things that can produce interference patterns) move discontinuously.

3. A wikipedia entry states that: Later in his life, Bell expressed his hope that his work would "continue to inspire those who suspect that what is proved by the impossibility proofs (of local variables) is lack of imagination."

The missing imagination in the proofs is their latching onto "missing CONTINUOUS local variables" as the problem. Imagination is required to think of this variable as a DISCONTINUOUS local variable.

I believe the variable that is missing in QM is an unsuspected aspect of velocity. And on the quantum level I believe the nature of velocity is discontinuous and not uncertain as standard QM would have it. Discontinuous is deterministic while probabilistic is not deterministic. AND of course experiments are needed to see if this λ-hopping concept has any merit.

4. I think we agree and disagree. Let's keep up the conversation... perhaps in the next essay contest. FQXi is hinting (via its grant program) that it will be about the "nature of the observer". This could be quite interesting.

Yours in certainty,

Don Limuti

report post as inappropriate

Frank Martin DiMeglio wrote on Nov. 9, 2015 @ 22:57 GMT


report post as inappropriate

Frank Martin DiMeglio wrote on Nov. 9, 2015 @ 22:58 GMT

report post as inappropriate

Joy Christian wrote on Nov. 13, 2015 @ 16:42 GMT
According to the following paper, published yesterday, the much hyped "loophole free" experiment done by the Delft group and discussed in this blog is deeply flawed, because it violates the no-signaling condition respected by quantum mechanics:

debunking the hype

I would advise thinking twice before trying to "spook."

report post as inappropriate

Thomas Howard Ray replied on Nov. 14, 2015 @ 12:08 GMT
Thanks for the link, Joy. I'm sorry to see the Delft group victimized by its own overreach.

report post as inappropriate

Thomas Howard Ray replied on Nov. 14, 2015 @ 13:22 GMT
We find that once again, the failure to account for the behavior of time is the stumbling block opposing a "loophole free" experiment.

Bednorz concludes:

" ... quantum mechanics may have some strange nonclassical features such as violation of time-reversal symmetry [33] but not necessarily violates local realism. We have demonstrated that the latter can be reconciled with quantum mechanics by making two amendments in the theoretical description: (A) joint measurement description for all choices simultaneously, (B) introduction of many worlds – the actual system is multirepresented. Contrary to the original idea, the worlds are not splitting (their number is constant) and interact locally and weakly microscopically but strongly macroscopically, making them similar in the observable reality."

I tried to get point (A) across to Richard Gill. The fact remains that in a time-dependent model, experimenters do not physically have the freedom to choose either of two settings at the same time. It's irrelevant that Alice and Bob are independent of each other and have two choices each; they are pair-correlated in spacetime.

I don't think you would agree with point (B), although I do. This may be the next big experimental research area.

Mostly, though, I think your mathematical framework (attached) for 3-sphere dynamics is correct.

attachments: Disproof_of_Bells_Theorem_page_7_of_arXi.pdf

report post as inappropriate

Rick Lockyer replied on Nov. 15, 2015 @ 16:11 GMT
Tom wrote: "Mostly, though, I think your mathematical framework (attached) for 3-sphere dynamics is correct.

attachments: Disproof_of_Bells_Theorem_page_7_of_arXi.pdf"

Really? There is no need for the limits s->a and s->b. The claim that A and B equal +/-1 and are each the product of two bivectors implies that these pairs are parallel or anti-parallel outright, not merely in the limit. The dummy variable s in s->a is not the same dummy variable as the s in s->b, since Alice and Bob have free choice of a and b. It is then blatantly obvious that one cannot assume the two L's in s can be combined.

Get it??

report post as inappropriate

Joy Christian wrote on Nov. 20, 2015 @ 03:00 GMT
Hi Tom,

Thanks for linking my paper above and for your comments. In the light of that one-page paper I have now substantively expanded my "local causality" paper which you might find interesting:

See especially the new, simplified local-realistic derivation of the EPR-Bohm correlation on page 8.



attachments: Local_causality.pdf

this post has been edited by the author since its original submission

report post as inappropriate

Thomas Howard Ray replied on Nov. 20, 2015 @ 14:40 GMT
Hi Joy,

Thanks. I have no doubt that I will find the revision interesting!

And I want to thank you for introducing me to Adam Bednorz's work. Besides demolishing the "no loophole" myth, he echoes in an earlier paper (Local Realism in Quantum Many Worlds) what I have been trying to get across to R. D. Gill, without success:

"Local realism is trivially correct in classical mechanics, because it itself provides the desired local process. This is no longer obvious in quantum mechanics, because all we have are detection results with no direct construction of the process."

The pieces are falling in place.



this post has been edited by the author since its original submission

report post as inappropriate

Thomas Howard Ray replied on Nov. 20, 2015 @ 19:51 GMT
Elegantly done, Joy.

The simplicity of the FRW model applied to the quantum scale is a breakthrough. It underscores the point that cosmological initial conditions are self-similar to an arbitrary moment, K = 0 (K a constant). From an internet source:

"If K = 0, the space part (t = const.) of the Robertson–Walker metric is flat. The 3-metric (the space part of the full metric) is that of ordinary Euclidean space, with the radial distance given by ar. The spacetime, however, is curved, since a(t) depends on time, describing the expansion or contraction of space. It is often said that the 'universe is flat' in this case, though if the universe is understood as the four-dimensional spacetime (as opposed to a spatial slice), 'spatially flat' would be more correct."
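For concreteness, the line element the quote describes is the standard flat (K = 0) Robertson–Walker metric, added here for reference:

```latex
ds^2 = -c^2\,dt^2 + a(t)^2\left[\,dr^2 + r^2\left(d\theta^2 + \sin^2\theta\,d\phi^2\right)\right]
```

Each spatial slice t = const. is Euclidean, while the time dependence of the scale factor a(t) is what curves the full four-dimensional spacetime.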

Particle pair correlation is encoded in the cosmological initial condition. "More specifically, we have presented a local, deterministic, and realistic model within such a Friedmann-Robertson-Walker spacetime which describes simultaneous measurements of the spins of two fermions emerging in a singlet state from the decay of a spinless boson." (Christian attachment)

this post has been edited by the author since its original submission

report post as inappropriate

Steve Dufourny replied on Nov. 23, 2015 @ 10:50 GMT
Interesting, but the local realism is not really respected. The fermions at this Big Bang, you say... protons and their stabilities are relevant considering the main primordial codes! But for a real understanding of what the metric of evolution is, you must insert finite numbers for gravitons and photons, which are encoded on the spacetime evolution. The spherical volumes become a universal key. The parallelization of the system of uniqueness, quantum and cosmological, must be rational, respecting the 3D and the 4D evolution in time with an increase of mass and entropy. Mass and expansion/contraction must be analysed with the spherical volumes and the mass. It permits calculating the maximum volume before the contraction; it lies in the future.


report post as inappropriate

Joy Christian wrote on Dec. 7, 2015 @ 23:48 GMT
The much hyped "loophole free" experiment discussed in this thread has come under fire from a different direction as well, with a serious charge that the "violation" of Bell inequalities was achieved in the experiment only by a post-selection of the experimental data --- i.e., only by a deceitful manipulation of the data: see PubPeer. Apparently such manipulation of data is not considered fraudulent by the followers of Bell.

report post as inappropriate

Fred Diether replied on Dec. 8, 2015 @ 01:40 GMT
That doesn't really matter. We already know that the "loophole" business is a fraud anyway, since Bell's theory is junk physics. All these experiments do is validate the predictions of QM for the EPR scenarios, which we accept to be true anyway.
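For reference, the quantum prediction these experiments validate can be sketched in a few lines: the singlet correlation E(a, b) = −cos(a − b) pushes the CHSH combination to 2√2, beyond the local-realist bound of 2. (The angles below are the standard maximizing choice; this is an illustration added here, not part of the original post.)

```python
import numpy as np

# Quantum-mechanical singlet prediction: E(a, b) = -cos(a - b).
def E(a_deg, b_deg):
    return -np.cos(np.radians(a_deg - b_deg))

# CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b'),
# with detector settings chosen to maximize |S|.
a, a_prime, b, b_prime = 0.0, 90.0, 45.0, 135.0
S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)
print(abs(S))  # ≈ 2.828 = 2*sqrt(2), exceeding the local-realist bound of 2
```

Whatever one makes of the loophole debate, this 2√2 value (the Tsirelson bound) is the number every such experiment is compared against.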

report post as inappropriate

Thomas Howard Ray wrote on Dec. 8, 2015 @ 00:15 GMT
Thanks for the link, Joy. What fun this is getting to be. ;-)

report post as inappropriate

Eckard Blumschein replied on Dec. 13, 2015 @ 16:10 GMT

You are right: left and right, like right and wrong, exclude each other and are in so far pairwise mutually related. However, in reality, measured distance r and measured delay after an event are reasonably attributed to only one sign, while Cartesian coordinates x, y, z of space and t of time refer to arbitrarily chosen references and may therefore also be negative.

I argue that physics should relate to measurement of distance and delay rather than to unwarranted arbitrariness which may be confusing:

Fourier transformation and application of analytic continuation on the traditionally xyzt-based spacetime of Relativity correspond to the antisymmetry sin(-t) = -sin(t) when crossing t = 0. As far as I know, quantum theory assumes the more obviously unnatural mirror symmetry cos(t) = cos(-t).


report post as inappropriate

Anonymous replied on Dec. 15, 2015 @ 15:09 GMT

" ... in reality, measured distance r and measured delay after an event are reasonably attributed to only one sign, while Cartesian coordinates x, y, z of space and t of time refer to arbitrarily chosen references and may therefore also be negative."

I think you've put your finger on why Joy Christian's topological framework is revolutionary. It eliminates the fundamental assumption of randomness at a foundational level. QM random choice is invested in the experimenter -- Joy's random choice is invested in spacetime, and so cannot be arbitrary. It's between two orientations that have to be chosen simultaneously -- impossible for experimenters.

Pairwise correlation resolves the contradiction caused by mixing heads and tails. It can't happen in a simply connected topological model, because there is no distinguishing near and far ...

"I argue that physics should relate to measurement of distance and delay rather than to unwarranted arbitrariness ..."

Actually, it is your argument that introduces arbitrariness by way of singularly privileging an initial condition. Then left and right cannot be correlated. Take a pairwise value (+ +) or (- -) as the initial condition -- and the condition is fixed -- relativity teaches us that time is relative to the speed of light.

report post as inappropriate

Thomas Howard Ray replied on Dec. 15, 2015 @ 15:10 GMT
Anonymous was me.

report post as inappropriate

Lorraine Ford wrote on Dec. 9, 2015 @ 01:14 GMT
The only way to represent the discontinuous aspects of the physical outcomes of quantum decoherence is via new, one-off mathematical equations. Something truly new at the fundamental level is only representable (by us human beings) as new mathematical equations, e.g. (1) new initial versions of laws-of-nature at the beginning of the universe, (2) the outcomes of quantum decoherence. Clearly, this type of creativity is the nature of reality.

Why do men always believe that the universe is SO UTTERLY BORING that it can be fully specified/represented via an initial finite set of equations?

Obviously physics will keep trying, but the fact is that a fully specifiable system is a boring system. Full stop.

report post as inappropriate

Eckard Blumschein replied on Dec. 21, 2015 @ 04:30 GMT

Isn't spookiness just a human admission of a discrepancy between observation and expectation?

Your word "one-off" doesn't look spooky to me; may I guess it is just a typo and you actually meant "on-off"?

When I prefer saying "the ear cannot know which reference point t=0 at midnight people commonly decided to define" or "the ear cannot hear something that did not yet happen", these statements are not meant literally.

My seeming teleology is rather intended to appeal to common sense. Everybody can easily apply his trust in causality by imagining himself in the position of the ear.

However, I would like to warn against trusting in less elementary beliefs, such as a God who doesn't play dice, or in a beginning of the universe.


report post as inappropriate

Lorraine Ford replied on Dec. 21, 2015 @ 22:33 GMT

Re “spookiness”:

I think that experimental physicists know what they are doing, and that they are not interpreting their results incorrectly, or making other mistakes. I think that it is clear that what is called “spookiness” is non-local influence that does not involve waves or particles carrying that influence, and does not involve superluminal communication. I...


report post as inappropriate

Fred Diether replied on Dec. 22, 2015 @ 18:32 GMT

Fortunately, Joy Christian has proved that your nonsense above is just that: nonsense. Do you think it unreasonable that, upon singlet-state creation of pairs of particles, Nature has a 50-50 chance of the pair as a system being either right- or left-handed? Quite frankly, it is just plain common-sense logic as a physics postulate. Thus the "spookiness" mystery is solved simply and very elegantly. Quantum entanglement is just an illusion.

report post as inappropriate

Lorraine Ford wrote on Dec. 21, 2015 @ 03:24 GMT
Re my post on Dec. 19, 2015 @ 22:30 GMT:

Just to be clear: what is seen in biology is not new physics. Biology clearly involves the same old physics, but with new “higher-level” categories of information, and new higher-level information relationships that link the new categories to existing information categories, so that there is an unbroken chain of context for information going right back to the fundamental law-of-nature information relationships. Except for the sheer complexity involved, seemingly the new information relationships would also be representable (by us humans) as mathematical equations, just like the fundamental laws-of-nature are.

(This is why so-called “digital information” is not information at all: it has no unbroken chain of context that links it right back to fundamental law-of-nature information relationships. Only those special human beings who understand the special digital code can decipher the “digital information” symbols, thereby converting the symbols into experienced information. The same can be said for written or spoken words: they are not in themselves information - they only represent information.)

With biology, it is clear that information is subjectively apprehended by living things, and it is clear that as no new physics is occurring, information must also be similarly subjectively apprehended by non-living things: particles, atoms and molecules. This is what I meant when I said that a “correct version” of fundamental physics must look to biology for clues as to the actual nature of reality.

report post as inappropriate

Lorraine Ford wrote on Dec. 31, 2015 @ 00:23 GMT

You seem to be saying that there is: an “object reality” i.e. the really-true solid physical reality that actually exists; and an “image reality” i.e. a mere image of, or point of view on, the actual underlying really-true physical reality. Your “image reality” seems to be exclusively human, implying that new physics would be required to explain it. But I would say that none of this is so.

Also, you lack a hypothesis about the nature of laws-of-nature and numbers. So I would guess that you also believe that abstract entities exist, and also believe that it is not necessary to include these abstract entities in your theory of reality; and you also have no explanation of how these abstract entities might interact with physical reality. Sorry if I sound harsh.

But I would say that, although the universe clearly comprises innumerable things, from particles to atoms to living things, there is NO underlying objectively true PHYSICAL reality: actual physical reality, including laws of nature and numbers, is formed out of subjective information relationships experienced by the things. And except for the most basic particles (strings?), the things themselves are formed out of subjective information relationships.

Though physics would seemingly prefer the flattened "object reality" scenario, where special explanations or hidden variables take care of any relativity and quantum "aberrations", actual experimental results do not seem to confirm such a scenario. Relativity and quantum mechanics do not agree with a flattened "object reality" scenario. The more easily mathematically modelled "object reality" that many who post to this website hope for seems to be merely wishful thinking.

report post as inappropriate

Georgina Woodward replied on Dec. 31, 2015 @ 02:46 GMT
Hi Lorraine,

There are substantial bodies made of atoms, and there are the subatomic particles from which such things are made. There is electromagnetic radiation. Those premises are not too controversial; there is evidence to support them. Objects are seen via receipt and processing of electromagnetic radiation that provides information. The output that is seen is not itself the substantial objects made of atoms.

Only if the observer is a human is that process human biology. It could instead be an inorganic reality interface that intercepts the EM radiation and produces a different kind of output. It is a point of view, as which EM radiation is intercepted is relative to the location and motion of the reality interface or the observer.

I don't know why you are calling Object reality flattened. It (source objects and EM information) is the source of all of the different individual relative outputs. It is the relative output that is limited, as it is formed only from the information that is received, not from all of the information that exists in the environment.

report post as inappropriate

Lorraine Ford replied on Dec. 31, 2015 @ 15:13 GMT

Can you explain in more detail what you mean by the word “information”? Is information zeros and ones?

Your semi-definition of “an observer” would seem to mean that any and every subset of adjacent interacting particles could be considered to be “an observer”. Is that all there is to “an observer”?

Does your explanatory scheme include abstract entities? Can we just assume that numbers and laws-of-nature are abstract entities that can be safely excluded from any explanatory scheme?

Re my “calling Object reality flattened”:

There is an assumption that physical reality has a very conventional structure, e.g. an n-dimensional reality containing particles with interrelated informational properties that can be mathematically represented. But relativity and quantum mechanics show that reality cannot have the conventional structure that you and many others seem to suppose that it does.

report post as inappropriate

Georgina Woodward replied on Dec. 31, 2015 @ 19:53 GMT
Hi Lorraine, thank you for your questions.

By information I meant potential sensory data (psd). That is the phrase I usually use, as "information" has different meanings to different people. Photon information (psd) is frequency and intensity (number of photons); those are important determinants of whether a particular photoreceptor cell passes on a signal or not.

A reality interface interacts with potential sensory data and produces an output that is in most cases of a different kind from the input. A camera receives light but produces a digital file or a chemically altered, exposed film emulsion. The reality interface has no understanding of what it has produced. The human reality interface is the visual system. Activity within the visual cortex of the brain is linked to memory stored in other parts of the brain, providing association with such things as names and learned attributes, allowing object recognition and understanding. Object recognition sets the observer apart from a mere reality interface. The differentiation is important, as it shows that consciousness is not required to select the input that will later form a known reality, refuting the "consciousness causes collapse" idea of QM.

report post as inappropriate

Robert H McEachern wrote on Aug. 30, 2016 @ 15:33 GMT
“Spookiness” Confirmed as a Misunderstood Classical Phenomenon

Rob McEachern

this post has been edited by the author since its original submission

report post as inappropriate

Login or create account to post reply or comment.

Please enter your e-mail address:
Note: Joining the FQXi mailing list does not give you a login account or constitute membership in the organization.