CATEGORY: Questioning the Foundations Essay Contest (2012)
TOPIC: Is Quantum Linear Superposition an Exact Principle of Nature? by Angelo Bassi, Tejinder Singh, and Hendrik Ulbricht
Author Tejinder Pal Singh wrote on Aug. 22, 2012 @ 14:20 GMT
Essay Abstract: The principle of linear superposition is a hallmark of quantum theory. It has been confirmed experimentally for photons, electrons, neutrons, atoms, for molecules having masses up to ten thousand amu, and also in collective states such as SQUIDs and Bose-Einstein condensates. However, the principle does not seem to hold for positions of large objects! Why, for instance, is a table never found in two places at the same time? One possible explanation for the absence of macroscopic superpositions is that quantum theory is an approximation to a stochastic nonlinear theory. This hypothesis may have its fundamental origins in gravitational physics, and is being put to test by ongoing experiments on matter-wave interferometry.
Author Bio: Angelo Bassi works on foundations of quantum mechanics and has a Ph.D. in Physics from the University of Trieste. After completing post-docs at ICTP and Ludwig Maximilian University he joined the University of Trieste as faculty. Tejinder Singh is Professor at the Tata Institute of Fundamental Research in Mumbai. His research interests are in quantum gravity and the measurement problem. Hendrik Ulbricht received his Ph.D. in the surface-science experimental group of Gerhard Ertl, and after completing post-docs at Vanderbilt and Vienna he is now Reader at Southampton University, where he leads an experimental effort on matter-wave interferometry and quantum nanophysics.
Download Essay PDF File
Edwin Eugene Klingman wrote on Aug. 23, 2012 @ 05:23 GMT
Dear Drs. Bassi, Singh, and Ulbricht,
I enjoyed your essay and the fact that you use real numbers for particle masses and slit dimensions. I was somewhat disappointed to see the emphasis on GRW 'theory', which, as a "phenomenological modification of quantum theory", is as ugly as sin and also violates energy conservation, in addition to requiring two new unexplained universal constants that seem to show up nowhere else (unlike G, c and h). But I was glad that you noted "beyond doubt" that there should be deeper principles underlying this radical approach (though none are known).
You suggest, with Penrose and others, that gravity is responsible for the absence of macroscopic superpositions. Although you note that the GRW/CSL approach is non-relativistic and efforts to relativize it have failed, still you see collapse as "instantaneous and non-local", per Bell.
You are left with a century-old prediction that fails at macroscopic dimensions and a phenomenological model that can only be characterized as 'ugly'.
Although you've invested quite a bit in this model you do suggest that it is possible that linear superposition is a wrong assumption and that something radical is needed. I invite you to read my essay,
The Nature of the Wave Function, for a gravity-based approach that radically departs from the century old assumption of superposition and collapse. I hope you find it interesting, and invite your comments.
Best of luck in the contest,
Edwin Eugene Klingman
Author Tejinder Pal Singh replied on Aug. 23, 2012 @ 12:04 GMT
Dear Dr. Klingman,
We refer to collapse models as phenomenological modifications of quantum theory precisely because they lack the degree of universality and beauty typical of theories such as classical mechanics or electromagnetism. Therefore the comment "as ugly as sin", apart from not being a scientific comment, does not come as a surprise. Moreover, phenomenological models contain phenomenological constants, which are expected to be justified by the underlying theory. It has always been like this in the history of science. So there is not much to be surprised about in the introduction of two new constants.
However, in spite of being phenomenological models, collapse models have some important scientific merits.
1. They provide a consistent resolution to the measurement problem of quantum mechanics. One may not like the way the problem is solved, but it is a consistent solution. And this is important.
2. They achieve what many people thought was impossible beforehand: to reconcile the linear evolution as given by the Schroedinger equation and the collapse of the wave function into a single dynamical principle. The new dynamics is compatible with all known experimental facts, and moreover explains why we see the world the way we see it. Again, one may not like the explanation, but it is a consistent explanation.
3. They are the only mathematically well-formulated theory that makes predictions different from those of standard quantum mechanics. Having models against which quantum mechanics can be tested experimentally is of paramount importance for devising novel experiments and, ultimately, for the development of physics.
4. They suggest a direction to look at, for unfolding the underlying theory of nature. The direction might eventually prove wrong, but it is valuable that one has a clear direction to follow.
Regarding the energy non-conservation, take as an example a particle in a gas. Its energy is not conserved, because the particle moves to thermal equilibrium. No one however is shocked by this fact. The non-conserved energy goes to the bath; the overall energy is conserved. In collapse models the same thing happens: the non-conserved energy goes to the noise responsible for the collapse. When we have the underlying theory (the analog of classical mechanics for explaining the behavior of a particle in a gas), we will also restore energy conservation.
Regarding non-locality, the issue is simple. The violation of Bell inequalities is an experimental fact. Within the framework of a theory (or model...) containing just the wave function and its dynamics and nothing else, the only way to comply with the violation of Bell inequalities is to have a superluminal collapse of the wave function.
We look forward to reading your essay.
Regards,
Authors
Edwin Eugene Klingman replied on Aug. 23, 2012 @ 18:26 GMT
Dear Dr. Singh,
I certainly agree that 'ugly as sin' is not a scientific characterization, and that all phenomenological models (MOND, etc.) share this feature to some degree. I further agree with your reasons for taking GRW seriously, so it really is just a case of "one may not like the explanation". I do like your explanation for 'restoring' energy conservation.
I also agree that "The violation of Bell inequalities is an experimental fact". It is the assumptions underlying Bell's inequality that I believe to be wrong. Because this topic has been discussed in great detail on other threads, I will not clog your space here. Thank you for agreeing to read my essay, I look forward to your comments.
Best,
Edwin Eugene Klingman
Daniel Sudarsky replied on Aug. 28, 2012 @ 01:20 GMT
Dear colleagues,
I found the essay quite stimulating. Congratulations.
Regarding the issue at hand here, I view very favorably the idea that Quantum Gravity could be behind the modifications that look effectively like CSL.
In that case I think the issue of energy conservation should be considered in a rather different light. In General Relativity there are no generic laws of energy conservation (energy becomes a well defined concept only in rather special situations, such as space-times with time-like Killing fields). The only thing that is relevant in a general context is the conservation of the energy-momentum tensor (i.e. it should have zero divergence). But that, again, relies on physics taking place in a well defined space-time metric. What will be the form of whatever is left of such notions in a situation where the metric is ill defined, or fuzzy, or fluctuating, is, I believe, anybody's guess.
Author Tejinder Pal Singh replied on Sep. 3, 2012 @ 13:16 GMT
Thank you Daniel. We broadly agree with your viewpoint.
Authors
John Merryman wrote on Aug. 23, 2012 @ 15:48 GMT
Tejinder,
"In order to describe dynamical evolution the theory depends on an external classical time, which is part of a classical spacetime geometry."
The point I make in my entry is that the classical timeline is an effect of the underlying dynamic, such that it isn't the present moving from past to future, but the changing configuration of what is, that turns future into past. Not the earth traveling the fourth dimension from yesterday to tomorrow, but tomorrow becoming yesterday because the earth rotates. Duration is simply what happens within the present, between events, not a property external to the present. This makes time an emergent effect of action, similar to temperature. So without an external dimension of time, there can be no dimensionless point on this timeline. A frozen point in time would be equivalent to a temperature of absolute zero; the complete absence of motion. It would be like trying to take a picture with the shutter speed set at zero. The result would not be a frozen moment in time, but nothing. Which means an entity, micro or macro, cannot be isolated from its action. That macroscopic table only exists because of the dynamic activity sustaining it. So, effectively, the particle is actually just a really small wave.
Consider this exchange from an interview with Carver Mead:
"So how did Bohr and the others come to think of nature as ultimately random, discontinuous?
They took the limitations of their cumbersome experiments as evidence for the nature of reality. Using the crude equipment of the early twentieth century, it's amazing that physicists could get any significant results at all. So I have enormous respect for the people who were able to discern anything profound from these experiments. If they had known about the coherent quantum systems that are commonplace today, they wouldn't have thought of using statistics as the foundation for physics.
Statistics in this sense means what?
That an electron is either here, or there, or some other place, and all you can know is the probability that it is in one place or the other. Bohr ended up saying that the only statements you can make at the fundamental level are statistical. You cannot grasp the reality itself, only probabilities related to it. They really, really, wanted to have the last word, and the only word they had was statistical. So they made their limitations the last word, saying, "Okay, the only knowledge that there is down deep is statistical knowledge. That's all we can know." That's a very dangerous thing to say. It is always possible to gain a deeper understanding as time progresses. But they carried the day.
What about Schrodinger? Back in the 1920s, didn't he say something like what you are saying now?
That's right. He felt that he could develop a wave theory of the electron that could explain how all this worked. But Bohr was more into "principles": the uncertainty principle, the exclusion principle--this, that, and the other. He was very much into the postulational mode. But Schrodinger thought that a continuum theory of the electron could be successful. So he went to Copenhagen to work with Bohr. He felt that it was a matter of getting a "political" consensus; you know, this is a historic thing that is happening. But whenever Schrodinger tried to talk, Bohr would raise his voice and bring up all these counter-examples. Basically he shouted him down.
It sounds like vanity.
Of course. It was a period when physics was full of huge egos. It was still going on when I got into the field. But it doesn't make sense, and it isn't the way science works in the long run. It may forestall people from doing sensible work for a long time, which is what happened. They ended up derailing conceptual physics for the next 70 years."
.........
"So early on you knew that electrons were real.
The electrons were real, the voltages were real, the phase of the sine-wave was real, the current was real. These were real things. They were just as real as the water going down through the pipes. You listen to the technology, and you know that these things are totally real, and totally intuitive.
But they're also waves, right? Then what are they waving in?
It's interesting, isn't it? That has hung people up ever since the time of Clerk Maxwell, and it's the missing piece of intuition that we need to develop in young people. The electron isn't the disturbance of something else. It is its own thing. The electron is the thing that's wiggling, and the wave is the electron. It is its own medium. You don't need something for it to be in, because if you did it would be buffeted about and all messed up. So the only pure way to have a wave is for it to be its own medium. The electron isn't something that has a fixed physical shape. Waves propagate outwards, and they can be large or small. That's what waves do.
So how big is an electron?
It expands to fit the container it's in. That may be a positive charge that's attracting it--a hydrogen atom--or the walls of a conductor. A piece of wire is a container for electrons. They simply fill out the piece of wire. That's what all waves do. If you try to gather them into a smaller space, the energy level goes up. That's what these Copenhagen guys call the Heisenberg uncertainty principle. But there's nothing uncertain about it. It's just a property of waves. Confine them, and you have more wavelengths in a given space, and that means a higher frequency and higher energy. But a quantum wave also tends to go to the state of lowest energy, so it will expand as long as you let it. You can make an electron that's ten feet across, there's no problem with that. It's its own medium, right? And it gets to be less and less dense as you let it expand. People regularly do experiments with neutrons that are a foot across.
A ten-foot electron! Amazing
It could be a mile. The electrons in my superconducting magnet are that long.
A mile-long electron! That alters our picture of the world--most people's minds think about atoms as tiny solar systems.
Right, that's what I was brought up on-this little grain of something. Now it's true that if you take a proton and you put it together with an electron, you get something that we call a hydrogen atom. But what that is, in fact, is a self-consistent solution of the two waves interacting with each other. They want to be close together because one's positive and the other is negative, and when they get closer that makes the energy lower. But if they get too close they wiggle too much and that makes the energy higher. So there's a place where they are just right, and that's what determines the size of the hydrogen atom. And that optimum is a self-consistent solution of the Schrodinger equation."
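A rough numerical gloss on Mead's point that confining a quantum wave raises its energy: the textbook particle-in-a-box ground state, E = h^2/(8 m L^2), sketched below in Python with illustrative container sizes (these values are not from the interview).

    # Ground-state energy of an electron confined to a box of size L.
    # Shrinking the box raises the energy; a centimetre-scale "container"
    # (a piece of wire) costs the electron almost nothing.
    h   = 6.626e-34    # Planck constant [J s]
    m_e = 9.109e-31    # electron mass [kg]
    for L in (1e-10, 1e-9, 1e-2):   # atom-sized, nm-sized, cm-sized
        E = h**2 / (8 * m_e * L**2)
        print(f"L = {L:.0e} m  ->  E = {E:.2e} J")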
Author Tejinder Pal Singh replied on Sep. 3, 2012 @ 13:19 GMT
Thank you for your comments, John.
Authors
Domenico Oricchio wrote on Aug. 23, 2012 @ 17:41 GMT
This is only a chat in a blog, not a complete theory: I read your article and started thinking about your interesting essay.
I think that, by the third law of thermodynamics, there exists a single quantum wave function for a macroscopic object: a superconductive apple (a superconductive Newton's apple) is a quantum object that produces tunneling, double-slit interference and other quantum effects.
The problem with a real apple is the unattainability of absolute zero, because of the phonon oscillations (thermal absorption); is it possible to build materials transparent to phonons, like diamond, or glasses, or irregular-lattice materials (only apple-shaped)?
Do there exist in the Universe macroscopic objects near absolute zero?
I think that a black dwarf is a macroscopic object with a single quantum function that emits like a single neutron in a gravitational field (hypothetical observable quantum jumps of the black dwarf).
Regards,
Domenico
Author Tejinder Pal Singh replied on Sep. 3, 2012 @ 13:21 GMT
Thanks for your comments, Domenico.
Authors
Ted Erikson wrote on Aug. 23, 2012 @ 20:56 GMT
Tejinder:
Way over my head, but interesting. Does nature recognize fancy mathematics? My essay is perhaps overly simplified, but addresses the real problem of physics: wherein lies "consciousness"? Very murky, but emergentism (growth) and panpsychism (memory) are suggested properties that align them with probabilities of a 1-D, 2-D, and 3-D geometric world. See:
To Seek Unknown Shores
http://fqxi.org/community/forum/topic/1409
Author Tejinder Pal Singh replied on Sep. 3, 2012 @ 13:22 GMT
Thanks, Ted. We look forward to reading your essay.
Authors
Michael Lee wrote on Aug. 25, 2012 @ 07:34 GMT
Dear authors,
There is little in your essay about decoherence theory as an explanation of why macroscopic superpositions are so difficult to achieve. What do you think about this theory and its recent experimental tests that confirm this framework?
I would be happy about some answers.
Best regards,
Michael Lee
Author Tejinder Pal Singh replied on Aug. 27, 2012 @ 18:00 GMT
Hello Michael,
You may kindly want to have a look at the article by Stephen Adler at
http://arxiv.org/abs/quant-ph/0112095
and Section I of our review article
http://arXiv.org/abs/arXiv:1204.4325
B. V. Oman wrote on Aug. 27, 2012 @ 14:34 GMT
A clearly written essay, which, unlike many others proposed here, concerns physics and not science fiction.
One may like these models or not, but one has to acknowledge that they give a logical and fully consistent explanation of the failure of the superposition principle for macroscopic objects.
B. V. Oman
Author Tejinder Pal Singh replied on Sep. 3, 2012 @ 13:18 GMT
Thank you, Dr. Oman.
Authors
Anonymous wrote on Sep. 1, 2012 @ 17:32 GMT
Interesting essay. Nice job.
Author Tejinder Pal Singh replied on Sep. 3, 2012 @ 13:15 GMT
Member George F. R. Ellis wrote on Sep. 2, 2012 @ 13:22 GMT
Dear authors
I really like your essay and the way that it tackles a crucial issue for present day physics that so many choose to sweep under the carpet. You suggest "1. Given a system of n distinguishable particles, each particle experiences a sudden spontaneous localization (i.e. collapse of position state) with a mean rate lambda, to a spatial region of extent r_C." I agree that this needs investigation. But my own view would be that this would be very likely to depend on the local context, in much the same way that state vector preparation does (see here for a discussion). Thus the rate lambda would be environmentally dependent. Penrose's idea is one way that this dependence might occur; but it could be that it is a far more local effect than that (i.e. on the scale of the measuring apparatus).
George Ellis
Author Tejinder Pal Singh replied on Sep. 3, 2012 @ 13:13 GMT
Dear George,
Thank you for liking our essay, and for your interesting viewpoint.
What you suggest might very well be the case, for a consistent theory of spontaneous wave function collapse. However, as you know, it is not what is assumed to happen in collapse models such as CSL. There, the collapse rate lambda is a uniquely fixed constant, which does not depend on anything. If collapse models were a fundamental theory, it would play the role of a new constant of nature. The equations of motion then tell you that, when you have a system of particles, the collapse rate of the center of mass scales with the size of the system. This scaling seems to be something like the contextuality feature you propose. But the value of lambda remains always the same.
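A toy sketch of this scaling in Python (the quadratic amplification law for constituents localized within r_C, and the GRW-type value of lambda, are standard textbook simplifications, not the full CSL mass-density formula):

    # lambda is one fixed constant; the centre-of-mass collapse rate is
    # amplified with the number of constituents n (here taken ~ n^2).
    lam = 1e-16                      # GRW/CSL rate constant [1/s], assumed
    for n in (1, 1e6, 1e12, 1e18):   # single atom ... dust grain
        print(f"n = {n:.0e}: centre-of-mass collapse rate ~ {lam * n**2:.1e} /s")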
We are currently reading your detailed paper on quantum measurement mentioned above. Your essay here on top-down causation is fascinating. Do you have a picture of how corresponding mathematical models can be built, including specifically in the context of quantum measurement?
Regards,
Authors
Member George F. R. Ellis replied on Sep. 3, 2012 @ 16:40 GMT
Thanks for that. I don't yet have a mathematical model in the case of measurement: am thinking about it. The first step is to look at state vector preparation, which is an analogous non-unitary process, involving a projection operator depending on the local macro context. With that in place, the steps to a contextual measurement model - maybe with a new universal constant, as you say - may become clearer. But the essential comment is that the local measurement context may be the "hidden variable" (it's non-local as far as the micro system is concerned, so the non-locality criterion is satisfied). It's hidden simply because we don't usually take it into account.
George
Lawrence B Crowell wrote on Sep. 2, 2012 @ 18:17 GMT
Dear Angelo, Tejinder and Hendrik,
I think the criterion for this departure from linear QM may come with horizon scales. The de Broglie relation tells us the wavelength of a particle with momentum p is λ = h/p. If we use the momentum p = mc (thinking in a relativistic sense of p = E/c) we may estimate the wavelength for a Planck momentum particle, p = m_P c = 6.5x10^5 g cm/s, for m_P the Planck mass. The wavelength for such a particle is then 1.0x10^{-32} cm, which is close to the Planck length scale L_p = sqrt{Għ/c^3} = 1.6x10^{-33} cm.
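These numbers are easy to cross-check; a quick Python sketch using standard constant values (CGS units; just arithmetic, no new physics):

    h    = 6.626e-27   # Planck constant [erg s]
    hbar = 1.055e-27   # reduced Planck constant [erg s]
    c    = 2.998e10    # speed of light [cm/s]
    G    = 6.674e-8    # Newton constant [cm^3 g^-1 s^-2]

    m_P = (hbar * c / G) ** 0.5       # Planck mass, ~2.2e-5 g
    p   = m_P * c                     # Planck momentum, ~6.5e5 g cm/s
    lam = h / p                       # de Broglie wavelength, ~1.0e-32 cm
    L_P = (G * hbar / c**3) ** 0.5    # Planck length, ~1.6e-33 cm
    print(p, lam, L_P, lam / L_P)     # the final ratio is exactly 2*pi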
A quantum system is measured by a reservoir of states. The superposition of states in that system is replaced by entanglements with the reservoir of states. The standard measuring apparatus is on the order of a mole or many moles of atoms or quantum states. This then pushes the effective wavelength of this measurement, or maybe more importantly the time scale for the reduction of the measured quantum states, to an interval shorter than the Planck time T_p = L_p/c. This might mean that measurement of quantum systems, and associated with that the stability of classical states (the table not being in two places at once & Schrodinger's cat), involves this limit that is associated with quantum gravity.
John Wheeler discussed how there may be different levels of collapse. With gravity there is the collapse of a star into a black hole. He then said this may be related to the “collapse” (to use an over played buzzword) of a wave function. He said the dynamics of black hole generation and the problems with quantum measurement might well be related, or are two aspects of the same thing. It might also be pointed out there are theoretical connections between QCD and gravitation, where data from RHIC and some hints with the heavy ion work at the LHC, that gluon chains or glueballs have black hole (like) amplitudes similar to Hawking radiation.
In an ideal situation it might then be possible to maintain a system with around 10^{20} atoms in a quantum state. A reduction of such idealism may bring this down to a lower number. This does leave open the question of how the physics of superfluidity, superconductivity and related collective overcomplete or coherent states fits into this picture.
My essay is not related to this topic directly, though my discussions on replacing unitarity with modular forms and meromorphic functions could have some connection.
Good luck with your essay.
Cheers LC
Author Tejinder Pal Singh replied on Sep. 3, 2012 @ 13:25 GMT
Dear Lawrence,
1. You seem to subscribe to the idea that decoherence solves the measurement problem, if we interpret correctly what you write. We strongly object to the claim that decoherence alone provides a solution to the measurement problem. See [Adler's paper against decoherence] for a thorough criticism, which we think is convincing enough.
2. About John Wheeler's ideas: they are certainly very appealing, and there could be much truth in them. However, they have not so far been translated into consistent mathematical models. In our essay, we stick on purpose only to ideas which find application in well-defined mathematical models, like collapse models and trace dynamics. Moreover, collapse models make precise predictions, which can be tested experimentally. In this way, one has what we think is a perfect match between speculation, mathematical modeling and experimental analysis.
3. Regarding superfluidity, superconductivity and related collective overcomplete or coherent states: they can be very well described within collapse models, and the answer is that they behave as we see them behaving. In other words, collapse models do not predict a (too) different behavior for such collective phenomena, with respect to standard quantum mechanics. The reason is that these phenomena do not involve the *superposition of a large number of particles in two appreciably different positions in space*, the only type of superposition which is strongly suppressed by collapse models.
Regards,
Authors
Lawrence B Crowell replied on Sep. 5, 2012 @ 02:39 GMT
I don't think decoherence solves the measurement problem per se. It does indicate how superpositions of a quantum system are taken up by a reservoir of states in entanglements. This then reduces the density matrix of the system to a diagonal matrix whose entries correspond to probabilities. Decoherence does not tell us which outcome actually happens.
I framed this within the decoherence perspective. It seemed as if the criterion for this sort of nonlinear quantum physics would be met when state reduction occurs on a time scale comparable to the Planck time. This can happen for a system with approximately 10^{18} amu or proton masses. This might be the maximal size at which a system can have quantum properties.
Cheers LC
Robert H McEachern wrote on Sep. 3, 2012 @ 15:07 GMT
As noted at the beginning of your article:
"The principle of linear superposition...Along with the uncertainty principle, it provides the basis for the mathematical formulation of quantum theory." You then suggest that it might only hold as an approximation.
I view the problem rather differently. Fourier Analysis is the actual mathematical basis for quantum theory. Superposition and the uncertainty principle are merely properties of Fourier Analysis. In other words, they are not properties of the physical world at all, but merely properties of the mathematical language being used to describe that world. Even the well-known double-slit "interference pattern" is just the magnitude of the Fourier Transform of the slit geometry. In other words, the pattern exists, and is related to the structure of the slits, as a mathematical identity, independent of the existence of waves, particles, physics or physicists.
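A minimal numerical illustration of this claim (the slit width and separation below are arbitrary values, not from any experiment): the fringe pattern follows directly from a discrete Fourier transform of the aperture.

    import numpy as np

    # Far-field double-slit pattern = |Fourier transform of aperture|^2.
    N = 4096
    x = np.linspace(-1.0, 1.0, N)           # transverse coordinate (arb. units)
    slit_width, separation = 0.02, 0.2
    aperture = ((np.abs(x - separation / 2) < slit_width / 2) |
                (np.abs(x + separation / 2) < slit_width / 2)).astype(float)

    amplitude = np.fft.fftshift(np.fft.fft(aperture))
    pattern = np.abs(amplitude) ** 2        # cosine fringes under a sinc envelope
    print(pattern.argmax(), pattern.max())  # central peak sits mid-array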
For the better part of a century, physicists have been mistaking the attributes of the language they have chosen to describe nature for attributes of nature itself. But they are not the same thing.
Fourier Analysis, by design, is an extremely powerful technique, in that it can be made to "fit" any observable data. Hence it comes as no surprise that a theory based on it "fits" the observations.
But it is not unique in this regard. And it is also not the best model, in that it assumes "no a priori information" about what is being observed. Consequently, it is a good model for simple objects, which possess very little a priori information. On the other hand, it is, in that regard, a very poor model for human observers; assuming that it is a good one is the source of all of the "weirdness" in the interpretations of quantum theory.
Putting Fourier Analysis into the hands of physicists has turned out to be a bit like putting machine guns into the hands of children - they have been rather careless about where they have aimed it. Aiming it at inanimate objects is acceptable. Aiming at human observers is not.
Eckard Blumschein replied on Oct. 21, 2012 @ 04:02 GMT
Robert,
Let me object to "Superposition and the uncertainty principle are merely properties of Fourier Analysis."
The authors did perhaps decide to ignore such claims.
Eckard
Inger Stjernqvist wrote on Sep. 3, 2012 @ 15:18 GMT
Dear authors,
When reading your excellent essay, a (perhaps silly) question comes to my mind. You write: "Suppose one has prepared in a controlled manner a beam of very large identical molecules...". What I wonder is: mustn't there be an upper limit where the very large (hence complex) molecules can no longer be assumed to be positively identical? Might the lack of controlled identity be the limit where linear superposition no longer holds? Might it be a question of molecular complexity, rather than size/weight?
Best regards!
Inger
Robert H McEachern replied on Sep. 3, 2012 @ 18:26 GMT
Inger,
Your question is not silly at all. It is very near the heart of the issue. One need only go a little bit deeper to arrive at "the issue."
What is the significance of the particles all being "identical" in the first place? If they remain, forever identical, then they cannot change with the passage of time. If they cannot change with the passage of time, then they cannot store any information whatsoever, within their internal structure.
But a larger entity, constructed from a number of such identical particles, can store information, by the relationships (like distances) between them. Entities that store information can behave towards other entities in a "symbolic" manner, and not just a "physical" manner. Even a tiny virus particle has genetic information stored within it, that enables it to exhibit such "symbolic" behavior.
What is the significant difference between "symbolic" and "physical" behavior? It is this: in the latter, observed data measurements are treated as "real numbers", in the former, they are treated like "serial numbers." Real numbers have most significant and least significant digits. Serial numbers, like credit-card numbers do not; change one digit anywhere, and it symbolizes someone else's account number; introduce one genetic mutation, and it may code for a different protein.
All the "interpretations" of mathematical models in physics have assumed that entities only interact "physically." That is true for entities devoid of any information storage capacity, like subatomic particles. But it is not true of macroscopic entities, especially human observers. Physical behaviors can be viewed as encoded into the equations. But symbolic behaviors are coded into the initial conditions. By ignoring the exact initial conditions (the individual digits) of the information stored within complex entities, physicists have thrown the baby out with the bath-water.
All the supposed "weirdness" in the "interpretations" of quantum theory, derives from the fact that physicists have failed to take into account that human observers interact "symbolically" with their experiments, as well as "physically."
Inger Stjernqvist replied on Sep. 3, 2012 @ 21:46 GMT
Dear Robert,
Thank you very much for your enlightening reply! You gave me more than a hint of the role of information theory in physics, which I would like to follow up further. I entered this essay contest in order to have the opportunity to ask some silly questions of people who are more knowledgeable than me - and kind enough to answer. See, if you like, my essay "Every Why Hath a Wherefore".
You saved my day!
Inger
Author Tejinder Pal Singh replied on Sep. 6, 2012 @ 10:22 GMT
Dear Inger,
No, it is only the mass. If you take a look at some of the recent publications on molecule interferometry (Gerlich 2011, Nat. Comm. 2, 263), you can see that the molecules are already very complex. However one always finds the maximum predicted quantum visibility in interferometry experiments. So why is that so? First, what we observe is single-particle interferometry, otherwise it would hardly be a quantum experiment. Roughly speaking this means every particle interferes only with itself, and the particle is by definition identical to itself - that is what we mean when we say we probe quantum superposition.
So then you could argue that other properties of the molecule play a role: internal states such as rotation, vibration or the conformation of the molecules. But again we don't see any indication in the experiments that those properties influence the centre of mass motion. These internal molecular properties are simply not coupled to the motion of the particles. This means that in matter-wave experiments only the mass of the particle and its propagation speed are important. Both speed and mass define the de Broglie wavelength of the particle.
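As a hedged worked example of that relation, lambda = h/(m v) (the beam speed below is an assumed, typical value, not data from any particular experiment):

    h   = 6.626e-34        # Planck constant [J s]
    amu = 1.661e-27        # atomic mass unit [kg]
    m   = 1e4 * amu        # a 10,000 amu molecule, as in the abstract
    v   = 100.0            # assumed beam speed [m/s]
    lam = h / (m * v)      # de Broglie wavelength
    print(lam)             # ~4e-13 m, i.e. sub-picometre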
There is of course a dependency on the particle mass distribution, and that comes from the fact that you have to sum many single-particle interference events to observe a nice interference pattern, as for instance in Juffmann 2012 [Nature Nanoscience, 2012]. As the interferometer is sensitive to a narrow band of particle de Broglie wavelengths, one needs particles of almost the same mass to collect a nice interference pattern. This is taken care of by chemical purification of the molecules after synthesis and also by mass-selective detection with a mass spectrometer in the present experiment. But again this mass dispersion is not a fundamental limitation for molecule interference experiments; it is a technical issue. The question we ask with such experiments is whether there is a fundamental reason for the quantum to classical transition - something we cannot overcome by technology.
Regards,
Authors
Don Limuti wrote on Sep. 3, 2012 @ 16:48 GMT
Dear Authors,
You are tackling one of the elephants in the physics room, and as George Ellis commented above this one is hard to sweep under the rug.
Your solution of Continuous Spontaneous Localization [CSL] seems like a good idea to me.
My own work points to the Planck mass as the upper limit of all quantum phenomena including superposition. See my essay for two methods that show this. My essay is "An Elephant in the Room". This is a different elephant than yours (there are plenty of elephants to go around).
Here is a vague outline of an experiment that I believe can be performed that would correlate with your theory:
1. Choose a crystal like diamond to investigate. This is done because diamonds are considered to be a single molecule independent of the number of carbon atoms.
2. Create bins of diamonds with increasing numbers of carbon atoms up to the Planck mass.
3. Test these diamond bins via the University of Vienna for the property of interference.
4. I suspect that interference phenomena will gradually decrease with mass and will disappear at the Planck mass. This experiment (if it can be performed) should provide confirmation of your theory.
Good to see you in this contest.
Don Limuti
Author Tejinder Pal Singh replied on Sep. 6, 2012 @ 10:12 GMT
Dear Don,
The logic of your proposed experiment is basically what the experiments are aiming for: to increase the mass of particles in matter-wave experiments. However, it is technically very challenging to perform these de Broglie interferometry experiments. Problems include: the generation of intense beams of particles at slow speeds, the implementation of an appropriate interferometer to see interference patterns of molecules with higher and higher masses, and also the detection of single molecules with sufficient temporal and spatial resolution. On top of that, all has to be implemented under ultra-high vacuum conditions. Diamonds would be possible, but there are many other molecules and nanoparticles and clusters which have to be considered for such experiments. They have to be chosen depending on their special properties for beam generation, interferometry and detection. It is a huge puzzle with many experimental options. To give you an idea about the complexity and the influencing parameters which have to be considered for the experiment, see the experimental section of our recent review (Bassi et al. 2012, arXiv:1204.4325) and Hornberger 2012. It would be great to perform an experiment as you suggest, but it will take some time to work out all experimental options to find the optimal setup.
Your results about the Planck mass as the cut-off are intriguing. Curiously enough, as you know, the Planck mass is already essentially in the macro-regime. Various studies based on gravity-induced quantum-classical transition, as reviewed for instance in our above mentioned article, suggest that the transition happens at a mass a few orders of magnitude lower than the Planck mass. It would be interesting to try and understand why you get a different result.
Regards,
Authors
Anonymous replied on Sep. 9, 2012 @ 03:34 GMT
Hi Tejinder,
Can you point me to information on testing particles for interference (that I would understand)?
In the essay (http://www.fqxi.org/community/forum/topic/1403) I logically derive the Planck mass (via two methods) as the ultimate mass for a particle. This does not mean there are any particles in nature that can make it to this mass. I define particle as an object with mass that shows the property of interference. It does not surprise me that real particles never get close to the Planck mass.
This is why I was interested in diamonds. They are peculiar because they are hard crystals that are thought to be quantum mechanical at all sizes. I think they have a chance of getting close to the Planck mass.
Let me know what you think,
Thanks,
Don L.
Author Tejinder Pal Singh replied on Sep. 9, 2012 @ 19:40 GMT
Dear Don,
Please have a look at this review:
http://in.arxiv.org/abs/1109.5937
Don Limuti replied on Sep. 9, 2012 @ 21:09 GMT
Hi, Post above was by Don Limuti, and not anonymous. Time-out got me.
Don Limuti replied on Sep. 9, 2012 @ 21:58 GMT
Hi Tejinder,
Thank you. http://in.arxiv.org/abs/1109.5937 was very good. There is a lot of "art" and science in these measurements.
Here is the conclusion of my essay:
Particles can never be accelerated to "c" because they hit their respective Vmax values first and cannot be accelerated further. This is because particles are characterized by their Compton wavelength, and at Vmax the Compton wavelength has shrunk to the Planck length, as short as anything can get. The Lorentz contraction (1 - v^2/c^2)^{1/2} seems to indicate that the velocity v of a particle can be taken to c, but as shown in this essay it can only be taken to Vmax, just short of c. Say goodbye to the elephant.
At Vmax all particles:
a. Have the same Compton wavelength which is the Planck length.
b. Have the same mass which is the Planck mass.
c. Have a Lorentz contraction that is equal to m_0/m_P
d. Have a Schwarzschild radius that is two Planck lengths.
My contention is that quantum mechanics ends at the Planck mass. This does not mean we can find particles that have this super mass. This is why I am interested in diamonds and your essay which explores this most interesting mass zone from the Buckyball to the Planck mass. I suspect that a diamond with a mass below the Planck mass will show interference and a diamond above the Planck mass will not show interference.
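A small Python sketch of this Vmax argument as stated (the claim itself is the essay's conjecture, not established physics; the electron is just an illustrative choice):

    # A particle "tops out" when its relativistic mass reaches the Planck
    # mass, i.e. the contraction factor sqrt(1 - v^2/c^2) equals m_0/m_P
    # (item c in the list above).
    m_P = 2.176e-8     # Planck mass [kg]
    m_e = 9.109e-31    # electron rest mass [kg]

    contraction = m_e / m_P              # ~4.2e-23 for an electron
    # beta_max is so close to 1 that (1 - beta) underflows in floats;
    # use the series expansion 1 - beta ~ contraction**2 / 2 instead.
    one_minus_beta = 0.5 * contraction ** 2
    print(one_minus_beta)                # ~8.8e-46: Vmax sits just short of c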
Check out the logic for yourself: http://www.fqxi.org/community/forum/topic/1403
Thanks again,
Don L.
Alan Kadin wrote on Sep. 4, 2012 @ 11:39 GMT
Dr. Singh and Colleagues:
You ask an important fundamental question about quantum linear superposition. But implicit in that question is the assumption that linear superposition should be universal. Instead, I would suggest that linear superposition applies ONLY to primary quantum fields such as electrons and photons. Please see my essay "The Rise and Fall of Wave-Particle Duality", http://fqxi.org/community/forum/topic/1296. In this picture, Quantum Mechanics is not a universal theory of all matter, but rather a mechanism for generating localized particle properties from primary continuous fields, where these localized (but not point) particles then follow classical trajectories (as derived from the quantum equations). Composites of fundamental fields such as nucleons and atoms are localized composite objects WITHOUT wave properties of their own, and hence completely without linear superposition. Beams of neutrons or atoms do not require de Broglie waves for quantum diffraction from a crystal lattice, which instead reflects quantized momentum transfer between the beam particle and the crystal. Remarkably, this re-envisioned quantum picture is logically consistent and avoids quantum paradoxes. Even more remarkably, this interpretation seems to be virtually new in the history of quantum theory, although it could have been proposed right at the beginning. The FQXi contest would seem to be an ideal venue to explore such concepts, but this has drawn relatively little attention.
Thank you.
Alan M. Kadin, Ph.D.
Author Tejinder Pal Singh replied on Sep. 7, 2012 @ 09:35 GMT
Dear Dr. Kadin,
Thank you for your comments and for your intriguing essay. Experiments which perform matter-wave interferometry with atoms and molecules as large as fullerenes already establish their wave nature and the validity of superposition for them [e.g. please see arXiv:1204.4325]. We wonder how your proposal can be made consistent with these experimental results?
Regards,
Authors
Alan Kadin replied on Sep. 10, 2012 @ 23:53 GMT
Dear Authors:
Thank you for responding to my comment, but you have missed the key point which is at the heart of the quantum paradoxes. The quantum diffraction experiments (all referenced in my essay) are obviously correct, but their interpretation is based on an assumption that is incorrect. As described in my essay, and referenced to the work of Van Vliet, the scattering of a neutron requires a quantum transition of the crystal, which in turn requires a quantized momentum transfer to a degenerate phonon with momentum hG, where G is a reciprocal lattice vector. This gives rise to the classical wave diffraction result, but does NOT require an incident coherent wave. The same is true for an atom, molecule, or buckyball. They are all localized particles, not extended phase-coherent waves. (This is in contrast to electron and photon waves, which really are extended coherent waves with linear superposition.) I realize that this is heresy, but that is exactly the point of this FQXi essay contest - to question assumptions that no one ever questions. Please read my essay more carefully. I have taken great pains to explain everything clearly and consistently. I would be happy to discuss this offline, if that would be appropriate. My email is given in my bio.
Alan M. Kadin, Ph.D.
Eckard Blumschein replied on Sep. 11, 2012 @ 21:49 GMT
Dear Alan Kadin,
"The FQXi contest would seem to be an ideal venue to explore such concepts, but this has drawn relatively little attention."
Perhaps you meant the attention to your essay rather than to the contest. Make sure you have explained your remarkable result clearly, consistently, and understandably enough to persuade any unbiased reader. Your essay was the only one that I called more convincing than quantum logics, while everybody so far has called my essay overly critical.
I hope, those who read uncommon or even heretical ideas will memorize them and eventually be in position to judge independent of the crowd.
Concerning the discrete vs. analog or linear vs. non-linear issue, I would like to reiterate what I tried to draw attention to in the previous contest, where it went unnoticed among more than 400 posts:
Cosine or Fourier transformations are non-linear integral transformations that render a continuous function of (elapsed or anticipated as elapsed) time into a discrete function of (likewise positive) frequency and vice versa.
Regards,
Eckard
Armin Nikkhah Shirazi wrote on Sep. 5, 2012 @ 08:32 GMT
Dear Dr. Bassi and Dr. Singh,
It was a pleasure to meet you at the Quantum Malta conference and I am delighted to see that you have made what is in my view one of the two most important features of quantum theory the subject of your paper.
I agree with the belief that quantum superposition does not hold for macroscopic objects (but for different reasons which are outlined in my paper) and am glad that the predictions of CSL are being put to the experimental test. I just hope that it won't take 20 years, as you suggest in your paper, to test the theory in an adequate regime.
All the best,
Armin
Author Tejinder Pal Singh replied on Sep. 7, 2012 @ 09:37 GMT
Dear Armin,
Thank you for your comments, and good to see your essay here.
Regards,
Authors
Andrey Akhmeteli wrote on Sep. 9, 2012 @ 15:50 GMT
Dear Authors,
Thank you for an interesting essay.
Maybe I am missing something, but, on the face of it, there may be some contradiction between the following statements in your essay:
1). "When one considers doing such an interference experiment for bigger objects such as a beam of large molecules the technological challenges become enormous."
2)."However when we look at the day to day world around us linear superposition does not seem to hold! A table for instance is never found to be `here' and `there' at the same time. In other words, superposition of position states does not seem to hold for macroscopic objects. In fact already at the level of a dust grain, which we can easily see with the bare eye, and which has some 10
18 nucleons, the principle breaks down."
So if technological challenges are enormous for large molecules, one would think they are downright prohibitive for dust grains or tables, so the principle does not break down, but we just cannot solve the technological challenges to demonstrate it for such objects? The following analogy may be appropriate: we cannot demonstrate reversibility for large objects (e.g., when we break a vase), furthermore, thermodynamics is based on irreversibility, but that does not mean that reversibility fails for large objects.
Another remark. For what it's worth, I expect interference to exist for arbitrarily large objects. My reasoning is based on the following almost forgotten ideas of Duane (W. Duane, Proc. Natl. Acad. Science 9, 158 (1923)) and Lande (A. Lande, British Journal for the Philosophy of Science 15, 307 (1965)): the direction of motion of the electron in the interference experiment is determined by the momentum transferred to the screen, and this momentum corresponds to quanta (e.g. phonons) with spatial frequencies from the spatial Fourier transform of the matter distribution of the screen. So I tend to draw the following conclusion: when the mass of the incident particle increases, the momentum transferred to the screen remains the same, but the angle of deflection of the incident particle becomes smaller, as its momentum is greater. So the mass of the incident particle is in some sense an "external" parameter for the interference experiment.
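A sketch of this scaling with illustrative numbers (the grating period and the speeds below are assumptions, not values from the essay):

    # Duane-style rule: the screen/grating hands over a fixed transverse
    # momentum quantum dp = h/d, so the deflection angle theta ~ dp/p
    # shrinks as the incident momentum p = m*v grows.
    h = 6.626e-34                   # Planck constant [J s]
    d = 100e-9                      # grating period [m], assumed
    dp = h / d                      # quantized momentum transfer

    for label, m, v in [("electron", 9.11e-31, 1.0e6),
                        ("C60 molecule", 1.20e-24, 100.0)]:
        print(label, dp / (m * v))  # deflection angle [rad]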
Thank you
Best regards
Andrey Akhmeteli
Author Tejinder Pal Singh replied on Sep. 9, 2012 @ 19:34 GMT
Dear Andrey,
Thank you for your comments.
There is no contradiction actually. When doing an interference experiment with a large molecule, one overcomes the technological challenges to prepare an initial superposed state, and then essentially one waits and watches. If quantum theory is right, the superposition will last forever, an interference pattern will be seen, and indeed it will have been shown that the observed absence of superpositions in daily life is because of practical limitations. On the other hand, if CSL is right, then the superposed state which one has prepared after overcoming the technological challenges will not last forever, and interference will not be seen. This would mean that the absence of macroscopic superpositions is not because of technological challenges, but because of new fundamental physics to which quantum theory is an approximation.
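To make the contrast concrete, a toy sketch (the collapse rate used is purely illustrative, not a CSL prediction for any specific molecule): under quantum theory the fringe visibility of the prepared superposition stays constant, while under a collapse model it decays exponentially.

    import math

    gamma = 1.0                             # illustrative collapse rate [1/s]
    for t in (0.0, 1.0, 5.0):
        v_qm = 1.0                          # quantum theory: lasts forever
        v_csl = math.exp(-gamma * t)        # collapse model: decays away
        print(f"t = {t} s: QM visibility {v_qm:.2f}, CSL visibility {v_csl:.2f}")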
With regards,
Authors
Andrey Akhmeteli replied on Sep. 10, 2012 @ 03:34 GMT
Thank you for the explanation of your position.
Daniel Wagner Fonteles Alves wrote on Sep. 10, 2012 @ 00:06 GMT
Very interesting and clear. Your proposal seems to solve many very difficult problems at the foundation of quantum theory. I wish you luck in the contest, and above all, in the development of your research programme.
Best Regards
Daniel
Author Tejinder Pal Singh replied on Sep. 17, 2012 @ 12:52 GMT
Thank you for your kind comments, Daniel.
Authors
Eric Stanley Reiter wrote on Sep. 12, 2012 @ 07:52 GMT
Please:
I show by experiment in this essay contest how quantum theory is an approximation. My experiments refute the Born rule. A singly emitted gamma-ray should go one way or another at a beam splitter, but I show coincident detection exceeding chance. Similarly for an alpha-ray. This supports the Loading Theory, which was misrepresented and misunderstood for ~70 years, which is why no one considers it. There are two problems:
(1) There are accepted experiments that may have adjusted things to favor QM, and researchers have not looked for certain artifacts. A good example is macromolecule diffraction. I do not expect a macromolecule could load up. My experiments and analysis indicate the universe is not crazy and that macromolecules are real particles. But atoms can take on either a wave state or a particle state. My enhanced version of the Loading Theory can explain wave-particle duality up to at least atoms. Physicists may think a macromolecule is neutral, but it is easily charged. It is very likely that many experiments are looking at field-deflection effects. To further back my claims, I analyzed one of the Vienna experiments in my essay, and cite several anomalies that do not fit diffraction theory.
(2) The other problem is that my work is so sensational that you are not likely to take it seriously unless other physicists examine it. I have been offering to demonstrate to physicists for 10 years and have performed public demonstrations of the gamma-split experiment with little recognition. What I have is for-real and I go with full confidence to face any scrutiny. I made an offer to demonstrate to FQXI people in Brendan Foster's blog on the essay contest.
Please be careful: I do not need to be the one to say bad things about physicists who embrace quantum weirdness because they are invested in it. Now we have a good experimental reason to resolve the paradox instead of embracing it. The history that has misled generations of physicists is in my essay. We no longer need acts of desperation, like superluminal magical collapse of the wave function, etc.
Please see A Challenge to Quantized Absorption by Experiment and Theory. Also please see Ragazas' paper that supports the Loading Theory.
Thank you, Eric Reiter, September 12, 2012.
Author Tejinder Pal Singh replied on Sep. 17, 2012 @ 12:56 GMT
Thank you Eric for your remarks. We are trying to understand your essay and the one by Ragazas.
Regards,
Authors
Member Benjamin F. Dribus wrote on Sep. 14, 2012 @ 05:41 GMT
Dear Angelo, Tejinder, and Hendrik,
You present a very good idea, all the more so because of the very realistic possibility of experimental verification in the near future. I don’t know if it’s right, but the case you present for pursuing this direction is quite convincing. Indeed, I hope it’s wrong, because it would wreck some of my own ideas about quantum gravity! The universe is oblivious to such considerations, however. A few questions and comments:
1. Presumably this provides an arrow of time, since collapse is irreversible, but perhaps time in this sense fades out of the picture on the fundamental scale where the superposition lifetime becomes infinite?
2. I’m sure this has been addressed, but it seems that there might be some issues involving things like locality and “microscopic constituents” of “macroscopic systems.” Roughly speaking, how does a microscopic system “know” if it is supposed to preserve its own superposition or recognize that it is part of a larger system, which must collapse? One of the main points of the decoherence explanation of the measurement problem is that one must consider microstate, apparatus, and environment simultaneously. I am wondering how this all fits together.
3. You mention Adler’s view that it’s the wrong approach to quantize classical dynamics. This may be correct, but it seems to me that it is simply a choice of assumptions: does one start with the correspondence principle, in which case classical physics is viewed as a limit of quantum theory, or does one start with the superposition principle, in which case quantum theory is built up from classical alternatives? Perhaps the experiments you mention will settle this one way or the other.
4. I will have to look at your reference by Oreshkov et al., to see exactly what they mean by “order.” Again, this might wreck my own ideas if it is right.
5. I don’t expect that you will agree much with my own approach, but if you’re interested to see the motivation for my questions, my submission is here:
On the Foundational Assumptions of Modern Physics.
Thanks for the interesting read. Take care,
Ben Dribus
Author Tejinder Pal Singh replied on Sep. 17, 2012 @ 13:13 GMT
Dear Ben,
Thank you for your interesting and important comments.
1. The answer here depends on the stand one takes towards collapse models. If one considers the stochastic collapse dynamics as an intrinsic feature of nature, then collapse models define an arrow of time, given by the direction along which pure states become statistical mixtures. Since for material particles lambda is always finite, their dynamics always contains an arrow of time.
On the other hand, if one considers collapse models as phenomenological models emerging from an underlying theory (as the dissipative dynamics of a classical particle in a bath emerges from the underlying Newtonian dynamics), then there is no in-built arrow of time.
2. The behavior of a system (as in any theory) depends on its state, in this case on its wave function. If the wave function is entangled with a larger system, then it will evolve in a certain way (typically, with an enhanced collapse rate, as sketched below); if, on the other hand, the wave function is factorized from the rest of the world, it will evolve as an isolated system. Whether the system's wave function is entangled or factorized depends on the previous history of the system.
3. In our view, it is not a matter of choice whether to obtain quantum theory by quantizing a classical theory, or to derive it from an underlying theory. One cannot assume classical mechanics, define quantum mechanics from it, and then derive classical mechanics back again; that is not logical. We agree with you that if experiments show a departure from quantum theory in the mesoscopic regime, the need for an underlying theory will be strongly indicated.
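For orientation, here is a minimal sketch of the master-equation form behind answers 1 and 2, in the standard GRW/CSL notation (the symbols $\Gamma$ and $r_C$ and the $N\lambda$ amplification are the textbook ones, not expressions taken from our essay):
$$\frac{d}{dt}\langle x|\rho|y\rangle = -\frac{i}{\hbar}\langle x|[H,\rho]|y\rangle - \Gamma(x-y)\,\langle x|\rho|y\rangle, \qquad \Gamma(x-y) \to N\lambda \quad \text{for } |x-y| \gg r_C.$$
The Schrödinger term on its own is time-reversible; it is the stochastic damping term, with its rate amplified $N$-fold for $N$ entangled constituents, that supplies both the arrow of time and the enhanced collapse of large systems.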
Looking forward to reading your essay.
Regards,
Authors
Sreenath B N. wrote on Sep. 15, 2012 @ 16:22 GMT
Dear Dr. T P Singh,
Thanks for your beautifully written, enchanting essay. It contains up-to-date information on QM and its various versions. I, too, have my own version of QM; to learn about it, please go through my essay (http://fqxi.org/community/forum/topic/1543--Sreenath) and express your comments on it in my forum. It is a continuation of my last year's essay.
In your essay, you have expressed your views with crystal clarity and also proposed an experiment to verify them. However, I feel that there is currently a lot of confusion in distinguishing between the classical world and the quantum one. This confusion, it appears, has arisen from a failure to recognise the separate fundamental traits lying behind the two worlds. If we recognise this dichotomy, then the results of your experiment become obvious even before conducting it.
Best regards and good luck in the essay contest.
Sreenath
Author Tejinder Pal Singh replied on Sep. 17, 2012 @ 13:15 GMT
Thank you Sreenath, for your remarks; we look forward to reading your essay.
Authors
Patrick Alan Hutchinson wrote on Sep. 15, 2012 @ 18:56 GMT
Dear Tejinder Singh and collaborators
Thank you for your essay, which presents a lot of material completely new to me. We seem to be thinking on the same lines.
I tried to describe a possible form for the kind of nonlinear theory which you suggest. It starts from the notions that
- space-time may have an asymmetric metric $g$ and an asymmetric connection $\nabla$.
From the metric, one can derive (at least) two interesting algebraic features. One is a model of the complex numbers, so $i^2 = -1$. The other is an element $R$ of a Hopf algebra discovered by Dubois-Violette and Launer. $R$ satisfies the "Quantum Yang-Baxter" equation.
$g$ and $\nabla$ are constrained by insisting that
- $\nabla i = 0$ and $\nabla R = 0$.
A solution is a pair $(g, \nabla)$ for which
- the Yang-Mills functional is stationary under all small variations of $g$ and $\nabla$ for which $\nabla i$ and $\nabla R$ are stationary.
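Written compactly (taking the Yang-Mills functional in its usual trace form is my assumption; the intended functional may differ in details):
$$\delta \int_M \mathrm{tr}\left(F_\nabla \wedge \star F_\nabla\right) = 0 \quad \text{over variations } (\delta g, \delta \nabla) \text{ with } \delta(\nabla i) = 0 \text{ and } \delta(\nabla R) = 0.$$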
There are other variants of this model.
A "particle" is a basic solution, an eddy in the geometry of space-time.
I have no evidence that real physics is like this, but it seems to offer all the apparatus one would expect: variational calculus, Hopf algebras. It fits naturally with general relativity. Solutions $(g, \nabla)$ may form a smooth manifold whose tangent spaces are the Hilbert spaces of quantum mechanics. If so, it seems likely that superpositions of states are unstable, as you suggest.
I would be glad to hear any views you may have on all
this.
Best wishes
Alan H.
Author Tejinder Pal Singh replied on Sep. 26, 2012 @ 13:57 GMT
Thank you Alan. I left a short comment on your post.
Tejinder
Peter Jackson wrote on Sep. 17, 2012 @ 12:20 GMT
Angelo et al.
Fascinating essay. I disagree with the proofs of Bell's inequalities, but that does not affect the substance, and I agree optomechanics and trace dynamics are both consistent with my own fully mechanistic approach to causal unification, using a "radical rethink of how we comprehend quantum theory, and the structure of spacetime."
I suggest matter can be superposed in terms of additivity, i.e. fluids. Fine sawdust is additive, and at a larger scale 3 billion tables may be equally additive.
Have you considered superposition as long term macroscopic evolution subject to binding energy, so rigidity (viscosity) is the key variable?
And re the twin-slit molecular results: have you considered that molecules may propagate photons on surface interaction at the dense surface electron fine structure slit edges?
I've been discussing a simple causal re-interpretation of the measurement problem and the Copenhagen interpretation, based on the mechanism of detection as 'sampling' and modulation discussed in my essay. I hope you'll get a chance to read and discuss it.
Best wishes
Peter
Author Tejinder Pal Singh replied on Sep. 26, 2012 @ 14:09 GMT
Many thanks Peter. Hope to read your essay soon.
Regards,
Authors
Hoang cao Hai wrote on Sep. 19, 2012 @ 14:22 GMT
Dear authors,
Very interesting to see your essay.
Perhaps each of us is convinced that our own choice is right! That, of course, is reasonable.
So maybe we should work together to get the basic theoretical foundations clearly defined, as the most challenging intellectual task for all of us.
Why do we not try to start with a real challenge that is very close at hand and is a focus of interest for human science: the matter of mass and the Higgs boson of the standard model.
With your knowledge and reasoned belief, would you express an opinion on this matter:
Consider that mass is the expression of the impact force on material - so with no impact force, we do not feel the Higgs boson - similar to the case of weightlessness outside the Earth's atmosphere.
Does there need to be a particle with mass for everything to have volume? If so, then why does the mass of everything change when moving from the Earth to the Moon? Is the Higgs boson lighter because the Moon's gravity is weaker than the Earth's?
The LHC particle accelerator is used to "smash" particles until a Higgs boson is "ejected", but why can we see it only when we "smash", and not see it otherwise?
Can Higgs particles be "locked up"? If so, when they are "released", if we do not act on them with any force, how do we know whether they are "out" or not?
You should boldly give a definition of weight that you think is right for us to consider, or oppose my opinion.
Because in the process of research, the value of "failure" is similar to that of "success" for science. Must a correct theory be one without any wrong point?
Glad to see comments from you soon, because we still have too many of the same problems.
Regards!
Hải Caohoàng, of THE INCORRECT ASSUMPTIONS AND A CORRECT THEORY
August 23, 2012 - 11:51 GMT on this essay contest.
Thomas Howard Ray wrote on Sep. 25, 2012 @ 17:53 GMT
Tejinder,
You and your collaborators have done a superb job of explaining why continuous function physics is very much alive, even with all the success of quantum theory over the years.
Nice! Thanks for an essay I am sure to read a few more times.
I hope you get a chance to visit my essay site ("The Perfect First Question.")
Best,
Tom
Author Tejinder Pal Singh replied on Sep. 26, 2012 @ 14:01 GMT
Thank you for your kind remarks Tom. Could you possibly give a brief gist of your essay - meaning, what the key points are? I have tried reading it, but some pointers from you will be helpful.
Regards,
Tejinder
Anonymous replied on Sep. 27, 2012 @ 18:12 GMT
Thanks, Tejinder. I am honored that you ask.
The gist, following Wheeler ("it from bit") is that there is zero distance between a yes-no question and its answer, no matter how separated in time; in other words, a relativistic "finite and unbounded" reality does not have to be finite in time and unbounded in space -- general relativity suffers no loss of generality in a model finite in space and unbounded in time. The binary relation holds in either case.
This can be made rigorous:
For reasons explained in the essay, we assign positive-definite values + 1/2 to yes, + 1/4 to no; indefinite values of - 1/2 to not-yes and - 1/4 to not-no.
In a Bell-Aspect type experiment, + 1/2 + 1/4 = 3/4 describes the upper limit of probability (75%) that Alice and Bob will have correlated answers, when they decide on a cooperative strategy in advance. Random correlations will fall to the average of correlations, 1/2, by the pair of equations:
+ 1/2 + 1/4 - 1/2 + 1/4 = 1/2
- 1/2 - 1/4 + 1/2 - 1/4 = - 1/2
Because there is no negative probability, the prior assumption of a probability measure function adds a priori an extra sign, such that:
+ (+ 1/2) + 1/4 - 1/2 + 1/4 = 1/2
- (- 1/2) - 1/4 + 1/2 - 1/4 = 1/2
So -- as it should be -- assuming that nature is fundamentally probabilistic, the singular average 1/2 applies in both cases. Underlying this calculation is the implied assumption of probability theory: the hypothesis of equally likely outcomes. That's why the extra sign, which brings with it the additional implication of a non-orientable space.
The contest asks us to identify and question foundational assumptions -- one of the most persistent of these is probabilistic measure. When we eliminate that assumption, what's left is topological orientability, left hand and right hand. That's the reason that I say the distance between question and answer is zero -- because orientability implies an additional degree of freedom by which no matter if the initial answer is yes followed by no (3/4)or not-yes followed by not-not-no, the outcomes is zero. (Thus, zero distance between question and answer, a result that can only come of topological analysis, where distance carries a different meaning than in ordinary geometry.) Thus:
+ 1/2 + 1/4 - 1/2 + 1/4 - 1/2 = 0 (Left Hand, positive rotation in the plane)
- 1/2 - 1/4 + 1/2 - 1/4 + 1/2 = 0 (Right Hand, negative rotation in the plane)
The sign pair, + + and - -, are the same initial condition as the probabilistic model, except that the initial condition is compelled to be orientable, i.e., yes + no (or not-yes minus not-not-no which is sign-reverse equivalent) such that an equal number of measurement events in the orientable space as the probability space, gives a zero remote outcome, implying Right Hand and Left Hand variables.
So instead of a probability average 1/2 for quantum correlated events, dependent on observer orientation and implying an observer-created probabilistic reality, we get a classical continuous wave function, with no probability function at all. The range of continuous values of the wave function are dichotomous correlated discrete values -- binary -- just as Wheeler said.
Therefore:
Local, physical information of a remote measurement outcome is compelled to originate from a point at infinity, because it's that singularity which distinguishes the nonorientable measure space of R^3 from the topology of S^3, the orientable space that is the source of continuous binary variables.
Hope this helps. I was just going to post the "gist," but I couldn't stop myself, because I think the argument is quite elegant.
Best,
Tom
Thomas Howard Ray replied on Sep. 27, 2012 @ 18:13 GMT
Sorry, got logged off. The above is mine.
Georgina Parry wrote on Sep. 25, 2012 @ 19:16 GMT
Dear Angelo Bassi, Tejinder Singh and Hendrik Ulbricht,
I enjoyed your essay. It is very clearly written and accessible. I particularly like the thought-provoking questions you pose at the beginning, and where you end up, suggesting a possible need to reconsider the relationship of the wave function with space-time. It is good that you are able to propose detailed practical work that will support the presented idea. Good luck in the competition.
Author Tejinder Pal Singh replied on Sep. 26, 2012 @ 14:03 GMT
Thank you Georgina, for your kind comments.
Authors
Frank Martin DiMeglio wrote on Sep. 29, 2012 @ 18:30 GMT
Tejinder, the fact that gravity cannot be shielded, together with instantaneity, means that gravity and inertia are in fundamental equilibrium/balance, in keeping with balanced and equivalent attraction and repulsion. This means that inertia, gravity, and electromagnetism fundamentally enjoin and balance visible and invisible space, in keeping with fundamentally demonstrating F=ma.
The issues raised in your essay, and fundamental unification in physics, require all of this. Do you agree?
Author Tejinder Pal Singh replied on Oct. 1, 2012 @ 02:28 GMT
Dear Frank,
We have no issues with classical mechanics, of which the equation F=ma is a part. Our point is that quantum theory predicts that the classical world should behave in a certain way, and it apparently does not behave that way [superpositions are not seen in the classical world]. And in our view this calls for an [experimentally falsifiable] explanation. The explanation we discuss possibly involves gravity.
We do not have anything specific to say here about electromagnetism.
Authors
Frank Martin DiMeglio replied on Oct. 1, 2012 @ 02:58 GMT
Hi Authors, fundamental gravitational and inertial equivalency and balancing (both at half strength force) fundamentally proves and demonstrates F=ma. A MOST IMPORTANT POINT -- DO YOU AGREE? (As acceleration is fundamentally balanced and averaged as well.) You will need to show this, in conjunction with the important fact that BOTH gravity and electromagnetism enjoin and balance visible and invisible space. Now, all of this is consistent with instantaneity and the fact that gravity cannot be shielded. You will need to show this too. As you well know, light is known to be quantum mechanical in nature, so all of the above necessarily involves balanced and equivalent attraction and repulsion as well. This will give you fundamental/true quantum gravity. My essay discusses and proves all of this. Can you review and rate it please?
Peter Warwick Morgan wrote on Sep. 29, 2012 @ 22:49 GMT
It's not clear to me that you have adequately considered the possibility that the linearity of QM is conventional (or axiomatic, if you will)? The linearity of probability measures, for example, is /axiomatic/ for disjoint events. I suggest that our practice is to use operators to model the statistics of datasets that we obtain from experimental preparations and measurements /on the conventional assumption/ that the Born Rule determines expected values (and probabilities). If we were to use a different Rule, we would model given data using different density and measurement operators; unless there were other changes to the axioms, I suppose the relationship to probability theory might be rendered problematic, which we would presumably avoid. [Although it's extreme nitpicking, I note that QM is perhaps more properly called bilinear, insofar as expected values are linear both in the states and in the measurement operators. A given totality of experimental datasets has to determine both the density operators and the measurement operators. Quantum Theory as a whole is more than just Hilbert spaces and the Born rule, including as it does elaborations such as heuristics for choosing a Hamiltonian for a given Physical situation, but it seems to me that such overlays are not relevant here.]
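(For concreteness, the bilinearity referred to here is just the Born-rule pairing $\langle A \rangle = \mathrm{tr}(\rho A)$, which is linear in the density operator $\rho$ and in the measurement operator $A$ separately.)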
The rhetorical thrust of your final section's title, "WHY IS QUANTUM THEORY APPROXIMATE?" seems excessive, insofar as for any given finite and finite-accuracy experimental datasets we can in principle construct arbitrarily precise Hilbert space models for that data, particularly by using ever larger Hilbert spaces. I don't suppose that any other mathematics would be different in the respect of accuracy, albeit different types of models might be more or less parsimonious.
Since these are prejudices that I have held for some time against ideas such as you have expressed in your essay, I will welcome an effective rebuttal.
Peter Morgan.
Author Tejinder Pal Singh replied on Oct. 1, 2012 @ 14:00 GMT
Dear Peter,
Thank you for your comments. We whole-heartedly agree that linearity is axiomatic to quantum theory. Our point is: this axiom of linearity leads to a dynamical equation [the Schrodinger equation] whose predictions do not agree with what we see in the macroscopic world around us. The Schrodinger equation predicts that we should be able to observe linear superpositions of position states of macroscopic objects. Why then do we never observe such superpositions? This is the central question. [The quantum measurement problem, the origin of probabilities and the Born rule, while fundamentally important, are secondary to the afore-mentioned central question].
It may be that this central question can be answered without modifying quantum theory, or without altering its predictions. Thus it may be that macroscopic superpositions are not seen because of decoherence accompanied by the branching of the universe into many worlds. Or it may be that Bohmian mechanics underlies quantum theory.
We are instead considering an experimentally falsifiable answer to this central question, namely that the axiom of linearity has to be given up. To us, the fact that the assumed renunciation of linearity can be tested by ongoing experiments is of great importance. For the first time since the inception of quantum theory, we have concrete quantitative models which explain the absence of macroscopic superpositions, and whose empirical predictions differ from those of quantum theory, and which can be confirmed or ruled out in the laboratory.
We do not quite understand what you meant in your second paragraph. A stochastic non-linear modification of quantum theory gives experimental predictions which disagree strongly with predictions of quantum theory, in the mesoscopic and macroscopic regime. For instance, in a noise-free matter-wave interferometry experiment for particles having mass of million amu, if an interference pattern is not seen, this will certainly imply that quantum theory fails for this system. We do not see how the situation can be saved by enlarging the Hilbert space, keeping in mind that the experiment is disagreeing with the assumption of linearity.
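To put a rough scale on this, using the standard CSL amplification law for $N$ nucleons held within one correlation length (the numbers are only illustrative, not a result from our essay):
$$\Gamma_{\rm eff} \sim \lambda N^2, \qquad \tau_{\rm sup} \sim \Gamma_{\rm eff}^{-1},$$
so for a particle of $10^6$ amu ($N \sim 10^6$ nucleons) the superposition lifetime shrinks by a factor of order $10^{12}$ compared to a single nucleon, which is what brings the predicted breakdown within reach of matter-wave interferometry.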
Authors
Peter Warwick Morgan replied on Oct. 1, 2012 @ 16:39 GMT
Thank you for your reply, which I do find effective, even though my intuition directs me elsewhere. My comments in my second paragraph are too embedded in perhaps singular ideas about QM that I ought to have edited out and will not pursue here.
I'm more an EFT person than a ToE person, but I nonetheless find CSL-type modifications of QFT somewhat uncongenial, although I do accept them as effective models, when pursued with enough care and detail. I suppose I would somewhat prefer 't Hooft-type models as alternatives to QM, if an alternative is to be pursued, indeed there are a few published papers of mine on random fields.
FWIW, I am currently attempting to develop the consequences of other nonlinearities in QFT, without prejudice to your own intuitions. Higher interaction terms in a Lagrangian formalism, for example, are nonlinear in a sense, albeit they do not directly affect the axiomatic superposition of wave functions. Nonetheless, even the lowest order interaction term, the mass term, modifies interference patterns nontrivially, and the ways in which higher-order interactions modify interference patterns are not restricted purely to what the overall mass of an aggregate object may be. The experimental difficulty of being able to assert definitively that every possible effect of environmental decoherence on whatever interference pattern there may be has been adequately eliminated increases considerably when the aggregate object is large (though this way of putting the experimental situation accepts environmental decoherence as an explanatory move, which I think is awkward for your project; I see that on page 12 you explicitly require that decoherence has been "ruled out", which is, as just mentioned, difficult to ensure and even more difficult to be definitively certain of).
Regularization and renormalization (and the presence of infinite numbers of virtual particles, in the usual way of talking about perturbative QFT) complicate intuitive understanding of interacting QFT, which I take to underlie QM (perhaps too uncritically, but hey ho).
My own FQXi essay begins a discussion of a way of re-conceptualizing interacting QFT that I would hope might in time make it possible to discuss large aggregate objects more clearly (though that's obviously a large hope).
Best wishes,
Peter.
Vladimir Rogozhin wrote on Sep. 30, 2012 @ 10:15 GMT
Dear Dr's Bassi, Singh, and Ulbricht!
Physicists build their phenomenological model of the world while ignoring ontology. But besides the empirical standard for founding fundamental knowledge, an ontological standard of foundation is required. Quantum theory is an operative theory, not a conceptual one: a theory without an ontological foundation. Ontology leads dialectical thought to the view that the whole world is a Triune Superposition. What do you understand more broadly by your conclusion, «…radical rethink of how we comprehend quantum theory, and the structure of spacetime»? What is your model of the structure of Space-Time? Thanks for your doubts! Sincerely, Vladimir
Author Tejinder Pal Singh replied on Oct. 1, 2012 @ 14:12 GMT
Dear Vladimir,
By `rethink of how we comprehend quantum theory’ we meant that quantum theory is an approximation to a deeper theory, in the same spirit in which Newtonian mechanics is an approximation to special relativity, and Newtonian gravitation is an approximation to general relativity.
By `rethink of how we comprehend the structure of space-time’ we meant that space-time geometry as described by special and general relativity is an approximation to a more fundamental description of space-time. This is because an aspect of quantum theory such as instantaneous collapse of the wave-function might possibly be inconsistent with the way we are accustomed to describing space-time geometry.
We do not yet have at hand a complete description/understanding of what this underlying space-time structure might be. We are working at it, from various angles. For one possible line of thought, please see one of our recent papers
http://arXiv.org/abs/arXiv:1203.6518 [to appear in Foundations of Physics].
Regards,
Authors
Vijay Mohan Gupta replied on Oct. 2, 2012 @ 22:24 GMT
Dear Mr Singh,
How do you find 'Foundations of Physics'? Being electronic gives the papers better coverage. Or are hard-copy journals better for soliciting a response from the scientific community?
As I recall, in the 70s Foundations of Physics was one of the pioneering journals in physics.
Vijay Gupta
Proponent - Unary Law 'Space Contains Knergy'.
Peter Jackson wrote on Oct. 2, 2012 @ 12:34 GMT
Angelo, Tejinder, Hendrik.
A real mechanism for 'Continuous Spontaneous Localization' or its equivalent is discussed in my essay, which you hoped to read but may not have yet. CSL and the STR postulates emerge from the quanta, consistent with your prediction.
An extension towards curved space time then also emerges. I'd still be very interested in your views on my rather ontological construction.
I've now also looked through your recent arXiv paper, and think I agreed with the rather limited areas amongst the mathematics that I understood! There was more conceptual commonality with the foundations of my thesis than I'd expected.
Very best of luck in the final run in. I think all the final 35 and more will be of high quality.
Peter
doug wrote on Oct. 4, 2012 @ 02:26 GMT
Perhaps the difference is the speed of the particles. The smaller particles (electrons/photons/atoms/molecules/etc.) travel much faster. Therefore, they exhibit their spatial equivalency in accordance with CIG (www.CIGTheory.com), and can be "in two places at once" (actually, it is in as many places as the new spatial volume allows).
The large particles (apples/oranges/planets/etc.) do not travel as fast, exhibit little manifestation into new spatial quantities, and therefore show no "hallmark linear superposition of quantum theory".
Why do small particles travel faster than large ones, that's what I want to know!
THX
doug (www.CIGTheory.com) comments welcome
Sergey G Fedosin wrote on Oct. 4, 2012 @ 06:28 GMT
In case you do not understand why your rating dropped: as far as I can tell, ratings in the contest are calculated in the following way. Suppose your rating is $R_1$ and $N_1$ was the quantity of people who gave you ratings. Then you have $S_1 = R_1 N_1$ points. After that, someone gives you $a$ points, so you have $S_2 = S_1 + a$ points, and $N_2 = N_1 + 1$ is the total quantity of people who have rated you. At the same time you will have $R_2 = S_2 / N_2$ points. From here, if you want $R_2 > R_1$, there must be $(S_1 + a)/(N_1 + 1) > S_1 / N_1$, or $a N_1 > S_1$, or $a > R_1$. In other words, if you want to increase anyone's rating you must give him more points than the participant's rating $R_1$ was at the moment you rated him. From here it is seen that the contest has special rules for ratings, and from here comes the misunderstanding of some participants about what has happened to their ratings. Moreover, since community ratings are hidden, some participants are not sure how to increase the ratings of others and give them the maximum 10 points. But in that case the scale from 1 to 10 points does not work, and some essays are overestimated while others are dropped down. In my opinion this is a bad problem with the Contest rating process. I hope the FQXi community will change the rating process.
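A quick numerical check of this rule (the numbers are purely illustrative): suppose $R_1 = 5$ with $N_1 = 10$ voters, so $S_1 = 50$. A new vote of $a = 6 > R_1$ gives $R_2 = 56/11 \approx 5.09 > R_1$, while a vote of $a = 4 < R_1$ gives $R_2 = 54/11 \approx 4.91 < R_1$, even though 4 sits well above the bottom of the 1-10 scale.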
Sergey Fedosin
Sig wrote on Oct. 4, 2012 @ 11:51 GMT
AAA
BBB
C(ig) C(ig) C(ig)
Professor Feinstein: http://www.youtube.com/watch?v=pP3VAtGLQms
Yes - my ratings have dropped (don't know if they were ever up)
thx
doug
Cristinel Stoica wrote on Oct. 4, 2012 @ 18:43 GMT
Dear Dr. Singh, Dr. Bassi, and Dr. Ulbricht,
The problem of why superposition disappears at the macro scale, while at the micro scale it is ubiquitous, is wonderful, and I am fascinated by it as well*. If I understand well, your approach takes GRW to a higher level, offering a nice explanation for it, with QM as an approximation of a stochastic nonlinear theory. It seems to me compelling and elegant.
Dr. Singh, at the previous FQXi contest you commented on my research on singularities in GR. At that time I only had examples of singularities which behave nicely. In the meantime I was able to prove that this behavior is shared by stationary black holes, and by FLRW and more general big bang singularities. Moreover, they introduce a metric dimensional reduction, which may be a hope for regularizing quantum gravity. I review these results in my current essay,
"Did God Divide by Zero?".
Best wishes,
Cristi Stoica
_______________________________
* Here is my take on the problems of quantum mechanics:
"Global and local aspects of causality".
Eric Stanley Reiter wrote on Oct. 5, 2012 @ 00:24 GMT
Excellent essay. Forgive me for not seeing earlier how relevant it was. It seems Continuous Spontaneous Localization is very similar to what I call the Loading Theory. If we are talking about the same thing, my evidence for CSL dates from 2001. The works of your team are new to me, so I am studying them. Very encouraging. I think my experimental technique can measure your lambda time constant. For light (if CSL = LT) it is easy to measure. It may have been measured by Lawrence and Beams ~1928, as the average loading time for the photoelectric effect. You may also like how I re-interpret some fundamental constants as maximums; we are only able to measure their maximums. In the equations for famous experiments exhibiting duality, the constants (e, h, m) appear in ratios. We do not see that action, for example, has gone sub-quantum, because the ratio is conserved. I hope you liked and rated my
essay.
Thank you
Eric Reiter
Jin He wrote on Oct. 5, 2012 @ 11:36 GMT
You mainstreamers have controlled science for over 50 years. You, the mainstream, and Hawking have failed. The bad science is because of the top-down control by people like you. Why do you need money and fame from FQXi, where the authors are mostly jobless, mostly independent researchers, mostly viXra.org authors? Do you need money and fame by controlling the jobless???
I want to rate you 0!
Douglas wrote on Oct. 5, 2012 @ 12:17 GMT
Jim He,
If you are tired of top down physics, may we suggest www.CIGTheory.com - as bottom up as it gets.
It explains the described superposition problem by positing that matter turns spatial as it moves ( the faster the rate of travel, the more spatial the manifestation). It is based on a new interpretation of relativity. It is "Plenty Good"!
Bottoms Up!
(OK - maybe I should have left out the part about the alien beings)
I will try to be more professional.....
Did anyone watch my three stooges Dr. Feinstein post??
OK - need feed back on :
Can someone familiar with & allowed to post on Google finish my work?
I am trying to fit my theory into Interpretations of Quantum Physics: see -
http://en.wikipedia.org/wiki/Interpretations_of_quantum_mechanics
Please take the time to understand my theory & fill in the blanks below.
So far, as I understand physics, I have fit my theory into the following:
Interpretation : CIG Theory
Author(s): Douglas William Lipp
Deterministic? YES
Wavefunction real? YES
Unique history? What does this mean?
Hidden variables? NO ( no need for them - I think)
Collapsing wavefunctions? YES (actual - space back to particle)
Observer role? Must be breathing. (i.e. what does this mean)
Local? YES
Counterfactual definiteness? YES ( I think)
Other examples -
Ensemble interpretation (Max Born, 1926): Deterministic? Agnostic. Wavefunction real? No. Unique history? Yes. Hidden variables? Agnostic. Collapsing wavefunctions? No. Observer role? None. Local? No. Counterfactual definiteness? No.
Copenhagen interpretation (Niels Bohr, Werner Heisenberg, 1927): Deterministic? No. Wavefunction real? No(1). Unique history? Yes. Hidden variables? No. Collapsing wavefunctions? Yes(2). Observer role? None. Local? No. Counterfactual definiteness? No.
Bottoms Up!
THX
doug
doug replied on Oct. 6, 2012 @ 23:41 GMT
I think I'm getting there, let me know if you agree:
http://en.wikipedia.org/wiki/Interpretation_of_quantum_mechanics
definition (partial) from WIKI
Subject: QI CIG
1) COUNTERFACTUAL DEFINITENESS (CFD) is the ability to speak meaningfully of the definiteness of the results of measurements that have not been performed (i.e. the ability to assume the existence of objects, and properties of objects, even when they have not been measured)
YES
2) LOCAL - The principle of locality states that an object is influenced directly only by its immediate surroundings
YES
In the context of quantum mechanics, superdeterminism is a term that has been used to describe a hypothetical class of theories which evade Bell's theorem by virtue of being completely deterministic. Bell's theorem depends on the assumption of counterfactual definiteness, which technically does not apply to deterministic theories................... It is conceivable, but arguably unlikely, that someone could exploit this loophole to construct a local hidden variable theory that reproduces the predictions of quantum mechanics..............
YES = CIG is both deterministic & is CFD , thereby constructing a local hidden variable theory that predicts QM
3) COLLAPSING WAVEFUNCTIONS - In quantum mechanics, wave function collapse (also called collapse of the state vector or reduction of the wave packet) is the phenomenon in which a wave function—initially in a superposition of several different possible eigenstates—appears to reduce to a single one of those states after interaction with an observer. In simplified terms, it is the reduction of the physical possibilities into a single possibility as seen by an observer.
YES - A REAL COLLAPSE
4) HIDDEN VARIABLES (I love a good game of hide and seek) - Historically, in physics, hidden variable theories were espoused by some physicists who argued that the state of a physical system, as formulated by quantum mechanics, does not give a complete description for the system; i.e., that quantum mechanics is ultimately incorrect, and that a correct theory would provide descriptive categories to account for all observable behavior and thus avoid any indeterminism.
NO - with CIG, nothing needs to be hidden; the variable was found (MTS). EINSTEIN vindicated
5) WAVEFUNCTION REAL - A wave function or wavefunction is a probability amplitude in quantum mechanics describing the quantum state of a particle and how it behaves.
YES
7) DETERMINISTIC - Determinism is a philosophy stating that for everything that happens there are conditions such that, given those conditions, nothing else could happen. Different versions of this theory depend
YES GOD does not play dice
8) UNIQUE HISTORY -
YES - particle and wave dependent upon % "c" = explains Dark matter, Dark Energy, Red Shift Anomalies, Horizon Problem, Double Slit, Combines Spacetime Continuum with the Mass Energy Equation, more
9) OBSERVER ROLE -
YES - based on motion - assume stationary observer
If anyone knows or wants to try CIG (www.CIGTheory.com) and wants to verify my placement of CIG into the QM interpretation categories, please do so
THX
doug
Jin He wrote on Oct. 5, 2012 @ 13:45 GMT
I visited your website, and I found out that
the mainstream theories are not better than yours.
Therefore, you should be happy with your own life.
Jin He wrote on Oct. 5, 2012 @ 19:18 GMT
MAX PLANCK:
An important scientific innovation rarely makes its way by gradually winning over and converting its opponents; it rarely happens that Saul becomes Paul. What does happen is that its opponents gradually die out and that the growing generation is familiarized with the idea from the beginning.
Doug wrote on Oct. 6, 2012 @ 19:02 GMT
Jin He, (I got the name right this time)
Thank you for visiting my website.
Anonymous wrote on Oct. 11, 2012 @ 17:27 GMT
Tejinder et al.
Congratulations on your placing; with mine, that makes a top-10 finish two years running. You did say you hoped to read my essay soon, but you have not been able to respond yet, nor to my subsequent post of Oct 2nd. I will be very interested in your comments. I paste from that second post below;
"A real mechanism for 'Continuous Spontaneous Localization' or it's equivalent is discussed in my essay, which you hoped to read but may not have yet. CSL and the STR postulates emerge from the quanta, consistent with your prediction.
An extension towards curved space time then also emerges. I'd still be very interested in your views on my rather ontological construction.
I've now also looked through your recent arXiv paper, and think I agreed with the rather limited areas amongst the mathematics that I understood! There was more conceptual commonality with the foundations of my thesis than I'd expected."
Thanks, and very best wishes
Peter
Peter Jackson replied on Oct. 15, 2012 @ 16:30 GMT
Tejinder et al.
Thanks for your kind comment re intuition on my blog. To me the correct conception must come before the mathematics, as Wheeler said, and as proposed in many good essays here. But then more detailed mathematics than the basic logical functions I can produce is needed. For your convenience, I replied to your post there as follows;
"Thank you. I see Continuous Spontaneous Localization in inertial frame terms. It is then implemented by the mechanism of absorption of incoming waves approaching an electron or proton at relative c+v, but re-emission at the new local c of the particle. This was first found by Chandraseckara Raman in 1921 in the work leading to his 1930 Nobel Prize.
If n particles are at rest wrt each other, then they form a co-moving field or 'medium', which when dense enough may be a lens or photodetector (both only ever made of 'matter'). This medium is then kinetically and physically equivalent to a discrete inertial frame, and what is more, this implements local c in ALL cases.
You may have missed the mathematical description in the end notes: it derives c' = c via the inverse changes to lambda and f for waves entering the co-moving medium. It exposes an oft-forgotten case of Doppler shift, which is finding the new lambda of an observer's lens medium. If an observer is constituted of matter, then any delta f must be accompanied by a delta lambda. I propose this is the 'simple idea' that Wheeler predicted would be found. The proof is in its application in resolving multiple anomalies and paradoxes (and mathematically in the simple constant c = f lambda).
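Schematically, and taking a first-order Doppler factor purely for illustration (the exact factor in the essay may differ), the identity reads:
$$c' = f'\lambda' = \left(\frac{f}{1+v/c}\right)\big(\lambda\,(1+v/c)\big) = f\lambda = c.$$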
On that note; - No!, you misread: I do NOT 'accept special relativity'. Indeed the model proves most of it unnecessary, because (and but) the model derives the POSTULATES of SR directly from the quantum mechanism (i.e. CSL in both its 'continuous...' and 'constant...' meanings). But I'm not shouting a headline "SR is wrong!". It will be hard enough to persuade most physicists to consider observer lambda as a valid concept! The domain of validity of Cartesian co-ordinate systems for motion is also constrained. Do you agree that, too, will shock many! The error in SR (removing ANY background instead of just an ABSOLUTE one) will logically emerge with comprehension of the model.
9 pages is just half a glimpse, and I know a quick first read means missing most of the important elements, their implications, and their construction into the beautiful kinetic ontology unifying QM and relativity. I do hope you will read it again, or that a co-author will read it very carefully. I'm also interested in your view on the application of CSL in this case.
Very many thanks"
and best wishes
Peter
Jayakar Johnson Joseph wrote on Oct. 15, 2012 @ 14:58 GMT
Dear Tejinder Pal Singh,
As fundamental matter is described by string-like structures rather than point-like particles, wave-particle duality is not expressible in the
Coherently-cyclic cluster-matter paradigm of universe, and thus the phenomenon of superposition differs in this paradigm, in that the string length is imperative for describing the nature and variability of this phenomenon, and a different physical theory of observation is applicable.
With best wishes
Jayakar
paul valletta wrote on Oct. 16, 2012 @ 18:04 GMT
This is quite interesting. I have only glanced at the first pages, up to the first diagram. It is my belief that you may have missed a crucial factor: the observer plays a very important role; do our detecting brains not record superpositions via our consciousness?
Anonymous wrote on Oct. 17, 2012 @ 14:32 GMT
In spite of the fact that FQXi suggests topical questions for the contest, many authors simply ignore these questions. For example Rajna, Crowell, Barbour, Singh... and others offered their pet theories instead of answering the topical questions. Many other scientists (Dribus, Wharton, Amelino-Camelia...) offered only a few negligible 'wrong' assumptions for formal agreement, whereas the rest of the paper is filled with pet theories. Moreover, it seems that the authors who respect FQXi's questions are considered "crackpots" and "cranks" by those who ignore the topical questions.
In such a case, FQXi must do one of two things: 1) Since some authors ignore the topical questions, FQXi could prepare an essay contest without any questions. (And all authors will simply republish their pet theories here.) 2) The other option is that FQXi must NOT award essays that ignore the topical questions. If FQXi awards an essay that ignores the contest's questions, it will mean that FQXi does not respect its own rules and questions.
Also I should mention that the public and community voting are not able to establish the best essay, because most voters are not competent to judge the essays; moreover, some voters use fraud. FQXi must establish the best essay using experts, not public/community voting.
Author Tejinder Pal Singh replied on Oct. 17, 2012 @ 14:54 GMT
Dear Anonymous,
Thank you for your post. We beg to differ with you. The topical question is : “Which of our basic assumptions are wrong?”
Our essay proposes that there are reasons to believe that the long cherished assumption of universal validity of quantum linear superposition is wrong. So we do address the topical question!
What you call our `pet theory’ happens to be a possible alternative to this assumption and it explains the observed world. It is being tested by current experiments.
What is your scientific objection to what we say in the above few lines?
Also, if you are really convinced about what you have said in your post, you should not be hiding behind the cloak of anonymity but rather have the courage to name yourself :-)
Regards,
Authors
Steve Dufourny replied on Oct. 17, 2012 @ 15:11 GMT
Hello,
It is not the fault of FQXi; it is the fault of several incompetents inside a kind of team. It is not good, indeed. The sciences are not a game. The natural sciences are so important; business and the economy are a kind of under-science. If people confuse what the fundamental roads are, it is simply sad. But it is not due to FQXi; it is simply due to several persons creating confusion with their algorithms and bad tools. I believe that FQXi is a beautiful platform. The game of "known persons" is just a game of unconscious people. In fact, Mr Tegmark and Mr Aguirre have a responsibility: they must sort the members and optimize their algorithmic systems. They must also be rational and deterministic. Ideas must be shared with total transparency. Critiques must be transparent, and the exhausting publicity of irrational extrapolations is not necessary for a correct, universal, innovative platform. Strategies must be universal and respectful of universal values. The natural sciences can save this planet. It is not with extra-irrational extrapolations that this planet can be harmonized.
Several sockpuppets are not necessary, nor are strategies of lies. A general scientist cannot accept these behaviors. To be or not to be, that is the question, dear Jedis of The Sphere.
This planet is bizarre, but we evolve after all... fortunately, furthermore.
Regards
Steve Dufourny replied on Oct. 17, 2012 @ 15:14 GMT
It is true there, in fact. Put your name, dear anonymous. Are you a celebrity? :)
Eckard Blumschein wrote on Oct. 21, 2012 @ 06:05 GMT
Dear authors,
You see "a need for reconciliation between CSL induced localization, and the causal structure of spacetime dictated by special relativity."
If I understood your argumentation correctly, CSL is a starting point for an alternative theory that preserves all confirmed predictions of quantum theory; and the CSL model is a stochastic generalization of the nonrelativistic Schroedinger equation. In other words,
- quantum theories are not entirely wrong but incomplete, and
- CSL doesn't fit to a predefined global causal order.
Isn't a pre-defined order anyway at odds with mere probability?
See Fig. 1 of my essay.
Isn't the still assumed predetermined causal structure of spacetime unrealistic?
Schroedinger did indeed calculate the correct hydrogen spectrum without using SR which seems to require reconsideration too, at least in part.
Could quantum theories be based on cosine instead of Fourier transformation?
With CT there are conjugate pairs like time and frequency, radius and wave number, or position and momentum, and therefore uncertainty too.
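(The cosine transform meant here is presumably the standard one; this normalization is an assumption:
$$F_c(\omega) = \sqrt{\tfrac{2}{\pi}} \int_0^{\infty} f(t)\cos(\omega t)\,dt,$$
defined on the half-axis $t \ge 0$, so the conjugate pairs above arise without presupposing negative times.)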
Eckard Blumschein
Juan Ramón González Álvarez wrote on Oct. 25, 2012 @ 09:41 GMT
Dear Angelo Bassi, Tejinder Singh, and Hendrik Ulbricht,
The answer to your essay question is "no". Quantum linear superpositions are only valid as an approximation to an underlying stochastic nonlinear theory. I am glad to find so many points of agreement between our respective approaches; however, I would object to a few points.
First, I would object to the claim that the linear superposition principle is valid for fullerene molecules. The superposition principle applies to the electronic part but not to the nuclear framework. Precisely the non-applicability of the linear superposition principle to this molecular entity as a whole is the basis for the existence of a well-defined molecular shape, with each carbon atom vibrating around a well-defined position. In fact, as chemists know very well, the application of the linear superposition principle of quantum mechanics to the full molecule (electrons plus nuclei) is at odds with the experimentally well-tested concept of molecular shape.
It is very interesting that you discuss the CSL model, because your nonlinear equation (1) can be derived from the generalized Schrödinger equation in section (3) of my essay: in the specific case when there is only one Lindblad operator, for a suitable choice of that operator we recover your (1).
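For orientation, the textbook stochastic nonlinear equation of this family with a single self-adjoint Lindblad operator $L$ reads (this standard form is an assumption; the exact expression in my essay differs in generality):
$$d\psi_t = \left[-\frac{i}{\hbar}H\,dt + \sqrt{\lambda}\,\big(L - \langle L \rangle_t\big)\,dW_t - \frac{\lambda}{2}\big(L - \langle L \rangle_t\big)^2\,dt\right]\psi_t, \qquad \langle L \rangle_t = \langle \psi_t | L | \psi_t \rangle,$$
with $W_t$ a standard Wiener process.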
Precisely in my essay I emphasized how this kind of generalized nonlinear equation solves the old Schrödinger-cat paradox, because it prohibits a cat from being in a macroscopic superposition of a live cat and a dead cat. It was a delight to discover that you use (1) for essentially the same purpose.
A difference with the CSL approach is that the generalized nonlinear equation in my essay is derived from a more general Liouvillian approach. I would add that the explanation for the nonlinearity is not related to any hypothetical gravitational effect of the kind assumed by Penrose and others [*], but has a very different physical origin. However, it is true that the nonlinearities depend on the size of the system, and that is why cats are not found in superposition states.
Due to size limitations for this contest, I could not discuss all the details, but in my subsequent paper
positive phase space quantum mechanics (hereafter "PPPQM") I offer additional details and theorems of the Liouvillian approach mentioned in my essay. It is remarkable that this new formulation/interpretation of quantum theory has an alternative matrix representation (I am going to prepare a paper on this; hereafter "MQM"), and that we can find resonances with the trace dynamics mentioned in your essay. I will discuss some similarities but also fundamental differences.
In the general case, MQM deals with matrices whose elements are ordinary complex-valued functions obtained from the quantization rule in PPPQM. It is interesting that trace dynamics also deals with such matrices.
The non-commutative structure of the matrix phase space is here derived, as in trace dynamics.
A fundamental difference is that the general equations of motion in MQM are not obtained from an action principle, as they are in trace dynamics. The action principle and Hamiltonian-like equations for the matrices are valid only in the 'pure' case limit. The trace dynamics analogue of Liouville's theorem is valid in the same limit. The consideration of a much more general kind of equations allows us to study far-from-equilibrium phenomena.
It is again interesting that both approaches introduce their corresponding generalization of the Poisson brackets, although I do not know what the interpretation is in trace dynamics; the references I know merely introduce the generalized Poisson bracket using the trace formula; maybe you could explain this to me! In MQM the matrix generalization of the Poisson brackets has a clever interpretation in terms of Fréchet derivatives.
Congratulations on your high ranking among the finalists. I hope you will win one of the prizes.
[*] Their arguments about hypothetical superposition of spacetimes are traced to an incorrect interpretation of general relativity plus an inconsistent quantum gravity approach.
Juan Ramón González Álvarez wrote on Dec. 6, 2012 @ 19:35 GMT
Congrats on your fourth prize! Although, after seeing the complete list of winners, I believe that your work deserved a better prize.