If you are aware of an interesting new academic paper (that has been published in a peer-reviewed journal or has appeared on the arXiv), a conference talk (at an official professional scientific meeting), an external blog post (by a professional scientist) or a news item (in the mainstream news media), which you think might make an interesting topic for an FQXi blog post, then please contact us at forums@fqxi.org with a link to the original source and a sentence about why you think that the work is worthy of discussion. Please note that we receive many such suggestions and while we endeavour to respond to them, we may not be able to reply to all suggestions.

Please also note that we do not accept unsolicited posts and we cannot review, or open new threads for, unsolicited articles or papers. Requests to review or post such materials will not be answered. If you have your own novel physics theory or model that you would like to post for further discussion among the FQXi community, please add it directly to the "Alternative Models of Reality" thread, or to the "Alternative Models of Cosmology" thread. Thank you.


FQXi FORUM
November 26, 2020

ARTICLE: Dissolving Quantum Paradoxes

Peter Morgan wrote on Aug. 31, 2018 @ 23:30 GMT
"Once you start thinking of an agent as a quantum system in his or her own right, says Renner, things get complicated." The initial conditions get complicated, but the equations of motion might and presumably will still be simple, as they are in classical physics. Planck's constant, that Lorentz invariant scale of action, is still the one that rules them all when it comes to measurement incompatibility and quantum fluctuations, come whatever else may.

I look forward to them finding ways to finesse that, as if to make Planck's constant locally look as if it's smaller, even while in principle it's still a universal constant.



Robert H McEachern wrote on Sep. 1, 2018 @ 23:05 GMT
"One of the foundational insights of quantum theory is that, just by observing a system, you change it."

That is not an insight. It is a delusion. Like the famous cat, a coin is neither heads nor tails until some observer decides to "call it", but that act of observation did not change the state of the coin: it does not collapse into a one-sided coin. The observation merely changed the observer's mental state that models the coin, not the coin itself.

Rob McEachern

Georgina Woodward replied on Sep. 4, 2018 @ 06:44 GMT
Linguistically, the live cat and the dead cat are both "the cat" object. The live cat, though, has functioning aerobic respiration and many processes occurring in the body that rely upon that biochemistry. The dead cat is not respiring; many processes are not functioning because of that, and other biochemistry, such as autolysis (the breakdown of cells), is happening. The live and dead cat are not the same object if the biochemistry is considered. Linguistically, the broken and intact poison flask are both "the flask" object, yet their topology is very different. Shards of glass are different objects from the intact flask if topology is considered. Linguistically, the decayed and non-decayed radioactive particle are both "the particle". However, if an alpha or beta particle is lost, they are not the same object anymore. Different objects can't be in a state of superposition; only different states that might be observed pertaining to the same object can be. So the thought experiment is not a good analogy.


Georgina Woodward replied on Sep. 4, 2018 @ 21:47 GMT
If the structure and chemistry of the particle before and after decay are considered, they are different objects. Yet linguistically both are referred to as "the particle", so they seem to be the same object. This may seem a bit pedantic, but I think the use of language is failing to clearly categorize the objects as different things, rather than as the same thing in different observable states, before and after the radioactive decay that releases the poison has happened.



Anonymous wrote on Sep. 4, 2018 @ 21:25 GMT
Infinitely trolling with a shallow running crank bait is what's visible.



Brian B wrote on Oct. 23, 2018 @ 01:08 GMT
"Let’s say we would like to decide whether there’s really a superposition of the dead cat and living cat," says Renner, returning to the Schrödinger’s cat paradox. "If we want to do that we have to control the system extremely well," in particular the wavefunction, says Renner. That level of control would require an exquisitely precise clock—one that might be impossible to build.

If Renner and del Rio can show that such a precise quantum clock is a physical impossibility, that would mean that there is no way to discriminate between a superposition and a mixture in a "macroscopic" object like a cat, and the difference between the two states would lose its meaning. "Then the distinction between superpositions and mixtures is just a mathematical curiosity without a 'physically existing' counterpart," says Renner. "The paradox would dissolve."

A few questions:

1) Are they arguing that we cannot know time accurately due to the time energy uncertainty principle? If so then can they use an observable that commutes with time?

2) What is the theoretical limit of an optical frequency comb?

3) Bell's inequalities tell us there are no local hidden variables, but they do not rule out nonlocal hidden variables. If the collapse of a wavefunction is a nonlocal process, like measuring the spin between two entangled particles, then how does that impact our understanding of causality and time? Is there a second, higher speed limit for a collapsing wavefunction, or must it be instantaneous to prevent violating the conservation of angular momentum?

4) Why is everyone obsessed with quantum gravity?

Robert H McEachern replied on Oct. 23, 2018 @ 17:20 GMT
Brian,

A few answers:

1) Are they arguing that we cannot know time accurately due to the time energy uncertainty principle? If so then can they use an observable that commutes with time?

They seem to be arguing that the uncertainty principle is the ultimate limit, but perhaps other physical circumstances limit what can be done, even before the uncertainty principle limit is reached.

2) What is the theoretical limit of an optical frequency comb?

The limit of all observables is given by Shannon's Capacity Theorem, which, in the case of the Heisenberg uncertainty principle, reduces to the statement that every set of measurements must contain one or more bits of information; if you have failed to extract even a single bit of information from within all the data bits comprising your set of measurements, then you have failed to make anything worthy of being called a measurement.
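As a hedged illustration of the capacity limit invoked here (the numbers and function names are my own, not from the post), the Shannon-Hartley theorem bounds the information any noisy measurement channel can deliver:

```python
import math

def shannon_hartley_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Channel capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

def max_bits_per_measurement(bandwidth_hz: float, snr_linear: float,
                             duration_s: float) -> float:
    """Upper bound on the information one measurement of the given
    duration can extract from the channel."""
    return shannon_hartley_capacity(bandwidth_hz, snr_linear) * duration_s

# A measurement over 1 Hz of bandwidth, lasting 1 s, at unity SNR,
# can deliver at most one bit of information:
print(max_bits_per_measurement(1.0, 1.0, 1.0))  # → 1.0
```

On this reading, a "measurement" whose bound B*T*log2(1+S/N) falls below one bit conveys nothing deserving the name.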

3) Bell's inequalities tell us there are no local hidden variables, but they do not rule out nonlocal hidden variables. If the collapse of a wavefunction is a nonlocal process, like measuring the spin between two entangled particles, then how does that impact our understanding of causality and time? Is there a second, higher speed limit for a collapsing wavefunction, or must it be instantaneous to prevent violating the conservation of angular momentum?

Bell's inequality is derived from the false assumption that something ELSE always remains to be measured, after the first measurement of an entangled pair has been performed. But that is obviously false when the entity being measured manifests only a single bit of information - the Heisenberg limit. In this peculiar case, not only are there no hidden variables, there are no variables (plural) at all - there is only one bit.

4) Why is everyone obsessed with quantum gravity?

Because, like oil and water, gravity and quantum theories do not mix, but everyone thinks that they should be modified so that they do.



Anonymous wrote on Oct. 24, 2018 @ 15:53 GMT
R. McEachern,

Very good and succinct answers, 1 through 4.





Lee Bloomquist wrote on Dec. 8, 2018 @ 00:49 GMT
Our understanding of wave function collapse uses the language of "standard analysis," which is not a good language for talking about existence - which is the problem here. Instead of a "limit", which necessarily involves statements about numbers on a real number line - which therefore say that before the "limit" can exist, there must first exist this number line stretching in front of...

Robert H McEachern replied on Dec. 9, 2018 @ 20:03 GMT
"The particle jumps from trajectory to trajectory in those computer generated graphics in Bohm and Hiley’s book." But there remains no good reason to suppose that any such jumping happens in reality.

"When it happens to hit the detection screen, who knows what trajectory it would have been following?" No one, precisely because no one ever even attempted to follow it.

"The distribution follows the Schrodinger equation." Precisely, because, rather than attempting to follow anything along any trajectory, the equation, together with the Born Rule, merely describes the detection statistics that can be observed by a set of stationary detectors, sitting wherever the equation happens to specify. An animal trap does not reveal the path the animal took to arrive at the trap.

The problem arose when Schrodinger switched from using a single equation/wavefunction to describe a single particle's trajectory, to using the same, single equation/wavefunction to describe something (he knew not what) about ALL particles simultaneously. This does not work correctly, precisely because the latter enables the "jumping" in the solution (entirely due to noise), which does not correspond to any phenomenon in the real world; the "cost function" being minimized (least-squared error) behaves differently (produces a very different solution to the equation, one enabling "jumping") in the case of ANY noise in ANY measurement, or ANY error in ANY potential used in the equation. It will drive ALL noise and ALL errors to zero, via the "jumping".

Rob McEachern



Eckard Blumschein wrote on Dec. 10, 2018 @ 04:28 GMT
How should one interpret a "wave function collapse"? Robert McEachern has, to me, the most convincing answer: the many delusive worlds of wave-function models turn out to be conceptually different from the single, reasonably assumed, obvious reality. We don't need non-standard analysis to understand this and related weirdness.

Having looked at Robert's PowerPoint presentation, I would just criticize his naive use of the Fourier transformation, with integration over time from minus infinity to plus infinity. Shannon understood that the definitely real, unchangeable past is essentially different from the many predictable and influenceable possible futures, which are permanently collapsing as time goes on. Really elapsed (past) time is not delusive.

Every human so far was born from exactly one woman and one man, no matter whether or not his family tree is known. Theoretically he has a huge number of ancestors after millions of generations. However, the hypothetical family tree of his grand-grand-grandchildren will collapse as wave functions do.

Robert H McEachern replied on Dec. 10, 2018 @ 15:28 GMT
Eckard,

"I just criticize his naive use of Fourier transformation with integration from minus infinity to plus infinity over time."

Perhaps it is not quite as naive as it appears. As you know, physics seeks to discover and model predictable phenomena. So, if perfect predictions are possible, a finite-duration set of observations would enable perfect predictions of both the past and the future. So instead of just integrating over the finite duration of the actual measurements, one could integrate over the infinite-duration predictions made by the model. This is why the process works for predictable phenomena, and why there is an "unreasonable effectiveness of mathematics" when applied to perfectly predictable phenomena. It is also why mathematics is rather less effective when applied to unpredictable phenomena. The latter is what Shannon's theory is all about.
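A minimal numerical sketch of this point (my own construction, not from the thread): for a perfectly predictable signal, a model identified from a finite observation window extrapolates exactly, so integrating the model over all time is harmless. Here a pure 3 Hz cosine is observed for one second, its frequency recovered from the FFT peak, and the model then predicts the signal far outside the window:

```python
import numpy as np

fs = 64.0                                  # sample rate, Hz
f_true = 3.0                               # the (perfectly predictable) phenomenon
t_obs = np.arange(0, 1, 1/fs)              # finite, 1-second observation window
x_obs = np.cos(2*np.pi*f_true*t_obs)

# Identify the model from the finite record: pick the dominant FFT bin.
spectrum = np.abs(np.fft.rfft(x_obs))
freqs = np.fft.rfftfreq(len(t_obs), d=1/fs)
f_est = freqs[np.argmax(spectrum)]         # recovered frequency

# The model now "predicts" the signal at a time far outside the window.
t_far = 10.0
prediction = np.cos(2*np.pi*f_est*t_far)
error = abs(prediction - np.cos(2*np.pi*f_true*t_far))
print(f_est, error)                        # → 3.0, ~0
```

An unpredictable signal (e.g. one carrying broadband noise) breaks this: the finite window no longer pins down an infinite-duration model, which is where Shannon's finite-information accounting takes over.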

Rob McEachern


Eckard Blumschein replied on Dec. 11, 2018 @ 17:29 GMT
Rob,

PREdiction of the past would imply a reversed direction of time. Claude Shannon was not naive. He didn't accept Laplace's determinism in place of common sense. He meant that the past is known in principle but cannot be changed, while the future can be influenced but is not known for sure.

I consider you one of the few who understand that reality is quite different even from the best theory. If one integrates as if there were no fundamental difference between past and future, then one ignores that the restriction to a limited number of Laplacean initial conditions implies the loss of a perhaps infinite amount of unconsidered influences.

In the case of analysing measured data, measured future data are not yet available.

Perfect PREdictions may only seem possible to the extent that one feels safe excluding unseen erratic influences. In other words, perfectly predictable phenomena belong to models, not to reality.

Maybe you mistook my criticism. I didn't blame you personally, but Laplace, Fourier, and current mainstream physics. Integration either from minus infinity to t=0, in the case of analysis, or from t=0 to plus infinity, would not ignore the conceptual difference between the past and the future of reality. Both of these half-sided integrals are likewise infinite.

In contrast to Wigner, I don't see an unreasonable effectiveness of mathematics. Decomposition into Fourier components is equivalent to decomposition into Cosine components, even if this looks stunning.

Of course, we certainly agree:

Actual measurements imply additional deviations from reality: They are restricted to finite duration and to a finite number of sampled data.

And Shannon's theory contradicts Laplace's determinism of the future.

Eckard


Robert H McEachern replied on Dec. 12, 2018 @ 14:57 GMT
Eckard,

As you may recall from my 2012 FQXi essay, I make a big distinction between computational models and physical models of reality. I think things like Fourier transforms are very useful as computational models/tools. But, like you, I believe they are not good physical models. The actual, physical processes occurring in the world, are not based on any infinite, orthogonal functions. Attempting to interpret them as if they are, is a long-standing problem.

Rob McEachern



Eckard Blumschein wrote on Dec. 15, 2018 @ 09:34 GMT
Rob,

In the 2013 contest "It from Bit or Bit from It?" I wrote an essay, "Shannon vs. Wheeler", where I put the question: "Did Alan Oppenheim improve John Tukey's (real-valued) cepstrum?" Do you agree that the correct answer is no, and that it may relate to your approach to explaining the quantum paradoxes?

Eckard



T.H.Ray wrote on Dec. 16, 2018 @ 02:39 GMT
Speaking of a play within a play:

https://www.researchgate.net/publication/326380733_simultaneity



Eckard Blumschein wrote on Jan. 14, 2019 @ 08:24 GMT
Georgina,

"page 10" of which paper? Einstein's 1905, one out of Klingman's, or Crothers'?

"The confirmation of the Crothers refutation for the special theory of relativity" by Colin James III 2018 has only one page.

EB

Eckard Blumschein replied on Jan. 14, 2019 @ 08:28 GMT
By the way, I didn't give the viXra link because I intended to mention APS.



Amrit Srecko Sorli wrote on Jan. 15, 2019 @ 18:00 GMT
Popper is demolishing Higgs

attachments: Poppers_demolition_of_Higgs.pdf


