CATEGORY:
Questioning the Foundations Essay Contest (2012)
TOPIC:
Unitarity, Locality and Spacetime Geometry: Foundations That Are Not Foundations by Lawrence B Crowell
Author Lawrence B Crowell wrote on Aug. 12, 2012 @ 17:16 GMT
Essay Abstract: This essay discusses a possible route towards the removal or deformation of existing physical postulates, looked at in the light of history. The foundations of interest are unitarity and locality. How these principles are changed or abandoned is first examined in light of previous changes in the understanding of physical foundations. These foundations are then examined to question their firmness and, if they give way, what they give way to as emergent properties. Suggested approaches are then proposed in light of the AdS/CFT correspondence, nonlocal BCFW amplitudes in QCD, cosmological quantum phase structure, and ultimately the replacement of unitarity by deeper principles of modularity.
Author Bio: My graduate work was at Purdue University. I have worked on problems of clock synchronization with general relativity, spacecraft navigation, quantum optics, and more recently with IT/programming. I spend much time thinking about issues concerning foundations.
Download Essay PDF File
Yuri Danoyan wrote on Aug. 12, 2012 @ 23:21 GMT
Hi, Lawrence,
What does it mean that unitarity is emergent?
More simply, please.
The correct quote from Ludwig Wittgenstein:
"Whereof one cannot speak, thereof one must be silent"
Major Works: Selected Philosophical Writings p.82
2009 by HarperCollins Publishers
Author Lawrence B Crowell replied on Aug. 13, 2012 @ 01:09 GMT
If you look at P. Gibbs' page you will see I break this out in a bit more detail. Unitarity is a limiting case where wave functions are analytic everywhere. Physics based on modularity and nonlocality has no reference to spacetime. Causality in physics is based on propagators or Green's functions that push a field from (x, t) to (x', t'). Without spacetime this simply does not exist. The removal of the pole or singularity occurs when there are no black holes, or in a region of spacetime that excludes big bang singularities.
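As a concrete illustration of what a propagator does (a minimal numerical sketch, not from the original post; the grid size, packet width and momentum are arbitrary choices, with hbar = m = 1): applying the free-particle propagator in momentum space pushes psi(x, 0) forward to psi(x, t), and it does so unitarily.

```python
import numpy as np

N, L, t = 2048, 80.0, 2.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)

# Gaussian packet centred at x = -10 with mean momentum k0 = 5 (hbar = m = 1)
psi0 = np.exp(-(x + 10.0)**2 / 2) * np.exp(1j * 5.0 * x)
psi0 /= np.sqrt(np.sum(np.abs(psi0)**2) * dx)

# the free propagator acts in momentum space: multiply each mode by exp(-i k^2 t / 2)
psi_t = np.fft.ifft(np.exp(-1j * k**2 * t / 2) * np.fft.fft(psi0))

print(np.sum(np.abs(psi_t)**2) * dx)   # ~1: the evolution is norm-preserving (unitary)
print(x[np.argmax(np.abs(psi_t))])     # packet centre has drifted to roughly x = 0
```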
One of the things I think comes from this is the universe contains only one of each particle. The universe has only one electron, one up quark, one muon, one photon, one Z, one Higgs, and so on. What we observe as individual particles are the same particle with different configuration variables, whether spacetime or momentum-energy. Spacetime is in effect a sort of emergent property, in many ways an illusion, where the particles we observe are mirror images of the same particles with different configurations. Baruch Spinoza wrote about something like this, which he called monads.
Cheers LC
Yuri Danoyan replied on Aug. 13, 2012 @ 01:51 GMT
1.The "Monads" belong to Leibniz, The "Modes" coined by Spinoza.
2.The Universe has:
Fermions 12(6 quarks+3 leptons+3 neutrino).
Bosons 12(8 gluons+3 vector(2W+1Z)+1photon).
Numerical supersymmetry not broken.
3.From other side the Universe has:
Fermions 3(proton,electron,neutrino),neutron non-stable
Boson only 1 photon.
See my essay http://www.fqxi.org/community/forum/topic/946
Metasymmetry is broken
In this case Lawrence B Crowell is right.
Author Lawrence B Crowell replied on Aug. 13, 2012 @ 13:40 GMT
I thought that Spinoza first advanced the idea of monads.
I write more on this below. If I am right, then any particle, say an electron in some wave function ψ(r, t) at some point in spacetime (r, t), is a projection of a single electron onto those configuration variables. The same holds if the particle is described in momentum-energy variables by a Fourier transform
φ(k, ω) = (2π)^{-2} ∫ d^3x dt ψ(r, t) e^{-i(k•r - ωt)}.
This projection occurs due to the nonlocality of fields, and their physics is described not by analytic functions or unitarity, but rather by modularity.
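A minimal illustrative sketch of the two-descriptions-of-one-state point (not from the original reply; the grid and packet parameters are arbitrary): the position-space and momentum-space wave functions are two projections of the same state, related by a unitary Fourier transform, so the norm is the same in either set of configuration variables.

```python
import numpy as np

N, L = 2048, 40.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]

# a normalized Gaussian wave packet in the position representation
psi_x = np.exp(-x**2 / 2) * np.exp(1j * 3.0 * x)
psi_x /= np.sqrt(np.sum(np.abs(psi_x)**2) * dx)

# the same state in the momentum representation (continuum FT convention)
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)
psi_k = np.fft.fft(psi_x) * dx / np.sqrt(2 * np.pi)

norm_x = np.sum(np.abs(psi_x)**2) * dx
norm_k = np.sum(np.abs(psi_k)**2) * (k[1] - k[0])
print(norm_x, norm_k)   # both ~1: one state, two sets of configuration variables
```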
I write more about this below.
Cheers LC
Yuri Danoyan replied on Sep. 4, 2012 @ 11:35 GMT
I would like to correct the above-mentioned summary of elementary particles:
Fermions:
12 (6 quarks + 3 leptons + 3 neutrinos). The generations are a manifestation of the next cosmological epoch.
Bosons:
4 (1 gluon + 3 vector bosons (2 W + 1 Z) + 1 photon). Gluons have no color because the Pauli exclusion principle is not valid in 2D space.
See http://fqxi.org/community/forum/topic/1444
At the present time the Universe has:
Fermions:
3 stable (proton, electron, neutrino), 1 unstable (neutron).
Bosons:
1 photon. No stable vector mesons, no free quarks, no free gluons.
See my old essay http://www.fqxi.org/community/forum/topic/946
Author Lawrence B Crowell replied on Sep. 4, 2012 @ 16:25 GMT
The gauge bosons of QCD, termed gluons, carry a chromo (color) charge that is one color plus an anticolor. The QCD charges or colors are labeled red, green, blue, which in pairs form the root space of SU(3). This is a set of combinations of these three colors, 8 in total. The root vectors are
v1 = (rb-bar + br-bar)/sqrt{2},
v2 = i(rb-bar - br-bar)/sqrt{2},
v3 = (rg-bar + gr-bar)/sqrt{2},
v4 = i(rg-bar - gr-bar)/sqrt{2},
v5 = (bg-bar + gb-bar)/sqrt{2},
v6 = i(bg-bar - gb-bar)/sqrt{2},
v7 = (rr-bar - gg-bar)/sqrt{2},
v8 = (rr-bar + gg-bar - 2bb-bar)/sqrt{6},
where r-bar means the complex conjugate of r times γ^0. Gluons then carry a pair of colors, which they exchange with the colors associated with quarks.
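A small numerical check of the octet listed above (an illustrative sketch, not from the original post): build the eight color-anticolor combinations as 3x3 matrices in the (r, g, b) basis and verify that they are traceless (no color singlet) and orthonormal under the trace inner product.

```python
import numpy as np

r, g, b = np.eye(3)                      # color basis vectors

def pair(c1, c2):                        # the state "c1 anti-c2" as a matrix |c1><c2|
    return np.outer(c1, c2)

s2, s6 = np.sqrt(2), np.sqrt(6)
octet = [
    (pair(r, b) + pair(b, r)) / s2,
    1j * (pair(r, b) - pair(b, r)) / s2,
    (pair(r, g) + pair(g, r)) / s2,
    1j * (pair(r, g) - pair(g, r)) / s2,
    (pair(b, g) + pair(g, b)) / s2,
    1j * (pair(b, g) - pair(g, b)) / s2,
    (pair(r, r) - pair(g, g)) / s2,
    (pair(r, r) + pair(g, g) - 2 * pair(b, b)) / s6,
]

print(all(abs(np.trace(v)) < 1e-12 for v in octet))                 # True: all traceless
gram = [[np.trace(u.conj().T @ v).real for v in octet] for u in octet]
print(np.allclose(gram, np.eye(8)))                                 # True: orthonormal set
```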
Cheers LC
The Spherical Jedi replied on Sep. 10, 2012 @ 23:43 GMT
Hello Lawrence and Mr. Danoyan,
You know Lawrence, when I am not paranoid, I see the convergences with strings and the 3D.
So I am discussing :) The light permits us to compose all the colors. The angles indeed are relevant. I saw this idea from Mr Dicarlo on the thread of Mr Barbour.
If the angles and the volumes are inserted with the correct quantum finite number, it becomes very relevant for our correct 3D architecture, the sphere and its spheres. The combinations are very numerous (rotations spinal, rotations orbital, volumes, finite series!!!, linear velocity, sense of rotation differentiating m and hv). It permits to unify the gravitation with the 3 other fundamental forces.
Lawrence, I am persuaded that we can create a 3D holographic Sphere and its spheres, cosmological and quantum. If we consider that the space and the mass and the light are the same at a kind of absolute zero, and if the quantum number is finite and precise, then it implies a real relevance when we insert the rotations and motions plus the volumes and the angles. The puzzle is simple and complex. It is relevant to consider that the cosmological number is the same. This series is so universal. The fractalization in a pure road of prime numbers seems very relevant, with the main central sphere, the most important volume, the 1.
The QCD can be optimized in fact, Lawrence. Perhaps the volumes are still very relevant considering the main light from the main central sphere.
I think that the oscillations can be correlated with rotations and the QM. I see the light turning at the maximum but in the opposite sense to this gravitation in evolution. If the space is also a quantum entanglement, then it is interesting to see its velocities of rotation, and the sense also. In the logic, the lattices between spheres disappear in the perfect contact. And if the main central sphere is the most important volume, then it is interesting to see how this space can be checked. In my line of reasoning, the space between spheres can imply a contraction of this space, like with a vacuum, but of course two points are necessary, an arrival and a departure of course. It is relevant because we can decrease the space between cosmological spheres. If the arrival point has another solution, it is relevant. The second relevance of this line of reasoning is that the mass can be changed into light, so we move at c. The third relevance is that we can decrease with my model the internal clocks, so the rotations of spheres, so the duration. Now if we check these 3 quantum systems: (1) we can decrease the space between two spheres, (2) we can go at c (we reencode the mass at the arrival point), and (3) we can decrease our internal duration, so we utilize less time during the travel. It is the principle of future teleportation. It is there that the volumes of spheres are essential for the stability of information during the reencoding.
It is very relevant in my humble opinion.
Best Regards
Yuri Danoyan replied on Sep. 11, 2012 @ 19:28 GMT
Lawrence
Your information about 8 gluons is known from theory.
From my point of view, the notion of "color" was proposed to save Pauli's principle.
But the Pauli principle is not valid in 2D space:
http://fqxi.org/community/forum/topic/1444
Color is not needed in 2D space.
Likewise, in 2D there is no gravitation, no G_N.
See my essay http://fqxi.org/community/forum/topic/1413
Author Lawrence B Crowell replied on Sep. 11, 2012 @ 20:07 GMT
I will try to get to your paper in the near future.
Cheers LC
Author Lawrence B Crowell replied on Sep. 11, 2012 @ 20:12 GMT
My quote from Wittgenstein is I think closer to the actual German.
The emergence of unitarity means that it is the simplest of the modular functions, occurring when there is no nonlocal physics associated with the singularity. In some of my comments on this blog I discuss this in greater detail.
Cheers LC
Yuri Danoyan replied on Sep. 11, 2012 @ 20:35 GMT
The Planck scale L_p is a wrong assumption, according to my essay.
James Putnam wrote on Aug. 12, 2012 @ 23:44 GMT
Dr. Crowell,
In a message I just read, posted on Dr. Gibbs' blog, your essay is listed. Glad to see you entered and are sharing your expert opinions in the discussions again.
James
Author Lawrence B Crowell replied on Aug. 13, 2012 @ 01:11 GMT
I hope you find this essay not too much to your disliking.
Cheers LC
Pentcho Valev wrote on Aug. 13, 2012 @ 05:00 GMT
Lawrence,
Do you still believe that Banesh Hoffmann and John Norton are wrong in their claim that the Michelson-Morley experiment CONFIRMED the variable speed of light predicted by Newton's emission theory of light?
http://www.pitt.edu/~jdnorton/papers/companion.doc
John Norton: "These efforts were long misled by an exaggeration of the importance of one experiment, the Michelson-Morley experiment, even though Einstein later had trouble recalling if he even knew of the experiment prior to his 1905 paper. This one experiment, in isolation, has little force. Its null result happened to be fully compatible with Newton's own emission theory of light. Located in the context of late 19th century electrodynamics when ether-based, wave theories of light predominated, however, it presented a serious problem that exercised the greatest theoretician of the day."
http://philsci-archive.pitt.edu/1743/2/Norton.pdf
John Norton: "In addition to his work as editor of the Einstein papers in finding source material, Stachel assembled the many small clues that reveal Einstein's serious consideration of an emission theory of light; and he gave us the crucial insight that Einstein regarded the Michelson-Morley experiment as evidence for the principle of relativity, whereas later writers almost universally use it as support for the light postulate of special relativity. Even today, this point needs emphasis. The Michelson-Morley experiment is fully compatible with an emission theory of light that CONTRADICTS THE LIGHT POSTULATE."
http://www.amazon.com/Relativity-Its-Roots-Banesh-Hoffmann/dp/0486406768
"Relativity and Its Roots" By Banesh Hoffmann: "Moreover, if light consists of particles, as Einstein had suggested in his paper submitted just thirteen weeks before this one, the second principle seems absurd: A stone thrown from a speeding train can do far more damage than one thrown from a train at rest; the speed of the particle is not independent of the motion of the object emitting it. And if we take light to consist of particles and assume that these particles obey Newton's laws, they will conform to Newtonian relativity and thus automatically account for the null result of the Michelson-Morley experiment without recourse to contracting lengths, local time, or Lorentz transformations. Yet, as we have seen, Einstein resisted the temptation to account for the null result in terms of particles of light and simple, familiar Newtonian ideas, and introduced as his second postulate something that was more or less obvious when thought of in terms of waves in an ether."
Pentcho Valev pvalev@yahoo.com
Author Lawrence B Crowell replied on Aug. 13, 2012 @ 13:24 GMT
Banesh Hoffmann is right that a particle, even a photon, emitted in a frame moving relative to a stationary frame has more energy than the same particle emitted in the stationary frame. In the case of light, a photon emitted from a frame moving in the same direction is blueshifted and carries more energy. These matters concerning the measurement of light speed are old and clearly settled.
To be honest I did not write this essay with the intention of debating century-old physics that is well established. I doubt I am going to seriously get around to reading these papers, for they are long and not likely very enlightening. I am not sure why people decide that some aspect of physics is all wrong and devote their lives to doing battle with it. This happens in biology with the ongoing reaction to Darwin, but at least those deniers are upholding some theology, which gives some sense of why they do this. There is no such motivating ideology for denying a physical theory that is well established.
The point of my essay is not that relativity is all wrong. It is more that in a quantum field setting at small scales it becomes incomplete. This pertains to black holes that are smaller than a nucleus, or to the first 10^{-30} seconds of the big bang, and so forth. This does not mean that relativity is overthrown, nor is what I advocate here something that shows up in a basic measurement such as the Michelson-Morley experiment.
Special relativity is not so much a subject of research these days as an application. It is so well established within its proper domain of experience that its validity is beyond reasonable doubt. General relativity is a subject of research, but it is well tested, with no empirical evidence that it fails.
Cheers LC
Pentcho Valev replied on Aug. 13, 2012 @ 14:13 GMT
In 1887 (FitzGerald and Lorentz had not yet advanced the ad hoc length contraction hypothesis) the Michelson-Morley experiment unequivocally confirmed the assumption that the speed of light varies with the speed of the light source (c'=c+v) and refuted the assumption that the speed of light is independent of the speed of the light source (c'=c). That is what John Norton and Banesh Hoffmann suggest. Do you agree, Lawrence?
Pentcho Valev pvalev@yahoo.com
Author Lawrence B Crowell replied on Aug. 13, 2012 @ 14:35 GMT
To be honest I would prefer that my essay page not be filled with posts over this imagined controversy.
LC
Author Lawrence B Crowell wrote on Aug. 13, 2012 @ 14:44 GMT
There have been some developments along these lines which may give support for my thesis here. The paper "Black Holes: Complementarity or Firewalls?" by Almheiri, Marolf, Polchinski, and Sully
raises an important point: it points out an inconsistency with the holographic principle. They focus on the suggestion that postulate #2, "Outside the stretched horizon of a massive black hole, physics can be described to good approximation by a set of semi-classical field equations," is to be "relaxed." I take it that this relaxation focuses on the issue of "massive" as the mass approaches around 10^3 to 10^4 Planck units of mass. This still makes the black hole massive compared to the masses of elementary particles.
In discussions with Stoica on singularities I suggested the following coordinates, with 1 - 2m/r = e^u, so that
ds^2 = e^u dt^2 - e^{-u} dr^2 + dΩ^2.
We now have to get dr from
dr = 2m e^u/(1 - e^u)^2 du.
Now the metric is
ds^2 = e^u dt^2 - 4m^2 [e^u/(1 - e^u)^4] du^2 + dΩ^2.
The singularity is at u = ∞, where the dt term blows up, and the horizon coordinate singularity at u = 0 is obvious in the du term. My rationale was that the singularity had been removed "to infinity" in these coordinates. This makes the black hole metric similar to the Rindler wedge coordinates, which do not contain a singularity. In the accelerated frame or Rindler wedge there is no singularity. The treatment of the Schwarzschild metric in the near-horizon approximation Susskind uses is one where the singularity is sufficiently removed so that fields in the Rindler wedge may be continued across the horizon without concern. In this metric of mine the singularity is at infinity, so the analytic functions for fields in the Rindler wedge are replaced with meremorphic functions with a pole at infinity.
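For what it is worth, a short sympy check of the algebra of this substitution (an illustrative sketch, not from the original exchange):

```python
import sympy as sp

u, m = sp.symbols('u m', positive=True)

r = 2*m / (1 - sp.exp(u))            # from 1 - 2m/r = e^u
dr_du = sp.diff(r, u)                # dr = (dr/du) du

# the radial term e^{-u} dr^2 expressed in du^2
radial_coeff = sp.simplify(sp.exp(-u) * dr_du**2)
print(radial_coeff)                  # equivalent to 4*m**2*exp(u)/(1 - exp(u))**4
```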
Stoica made the observation that this runs into trouble with Hawking radiation. The singularity at infinity causes trouble with the end point of the radiance process for it has to "move in" from infinity. The final quantum process of a black hole is a problem not well known in any coordinates. Your objection does have a certain classical logic to it. However, by the time the black hole is down to its last 10^4 or 10^3 Planck mass units the black hole itself is probably quantum mechanical. In my coordinates (assuming they are unique to me, which is not likely) the singularity at infinity may not have to “move” from infinity. There may be some nonlocal physics which causes its disappearance without having to move at all. This nonlocality is a correspondence between states interior to a black hole and those on the stretched horizon. The Susskind approach does not consider the interior, and he raises this as a question towards the end of his book "The Holographic Principle."
This nonlocality would be a relaxation of postulate #2. The issue of unitarity comes into play. If the theory is replaced with meremorphic functions, say analytic in a portion of the complex plane, then fundamentally quantum fields in curved spacetime, or quantum gravity, are not unitary but modular.
Unitarity is represented by a complex function e^{-iHt} and so forth, which is analytic. The loss of unitarity does not mean there is a complete loss of everything; in particular quantum information can still be conserved. A simple analytic function of this sort describes standard quantum physics. Gravity as we know is given by a hyperbolic group, such as SO(3, 1) ~ SL(2,C), where the latter has a map to SL(2,R)^2. The functions over these groups have posed difficulties for quantum gravity, for they are explicitly nonunitary. The trick of performing a Wick rotation on time or with τ = it is a way of recovering the compact groups we know in quantum physics.
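A small numerical illustration of this point (a sketch, not from the original post; the 2x2 matrix is an arbitrary stand-in Hamiltonian): for a Hermitian H, e^{-iHt} is unitary, while the Wick-rotated kernel e^{-Hτ} is not.

```python
import numpy as np
from scipy.linalg import expm

H = np.array([[1.0, 0.5],
              [0.5, -0.3]])          # an arbitrary Hermitian "Hamiltonian"
t = 1.7                               # real time
tau = 1.7                             # Euclidean (Wick-rotated) time

U = expm(-1j * H * t)                 # real-time evolution operator
K = expm(-H * tau)                    # Wick-rotated kernel

print(np.allclose(U.conj().T @ U, np.eye(2)))   # True: U is unitary
print(np.allclose(K.conj().T @ K, np.eye(2)))   # False: K is not unitary
```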
It does turn out, I think, that we can think directly about quantum gravity by realizing that SL(2,R) is related to a braid group with Z --> B --> PSL(2,Z), and that the braid group is contained in SL(2,R). Braid groups have a correspondence with Yang-Baxter relations and quantum groups. The group SL(2,Z) is the linear fractional group, which is the elementary modular group. An elementary modular function is
f(z) = Σ_{n=-∞}^{∞} c(n) e^{2πi nz},
which in this case is a Fourier series. In this case we are safely in the domain of standard QM and QFT. In general modular functions are meromorphic (analytic except at poles, here a pole at infinity), with the analyticity condition holding on the upper half of the complex plane.
Of particular interest to me are the Eisenstein series of modular functions or forms. These define an integer partition function, which is an acceptable partition function or path integral for a stringy black hole. I include a graphic here illustrating an Eisenstein function. This has a certain self-similar structure to it, or what might be called an elementary form of a fractal. In this picture unitarity is replaced with modularity. In this more general setting the transformations do not promote a field through time by some operator; rather, the operator simply counts the number of states or degrees of freedom in a way that is consistent. Unitarity is then a special case of this, which happens to fit into our standard ideas of causality.
The Eisenstein series describes a partition function or path integral for a black hole. The theory is not one of unitary evolution, but simply one of counting states or degrees of freedom on the horizon. In effect physics is more general than unitarity, where unitarity is a necessary condition to describe the semi-classical states in postulate #2.
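As a concrete taste of counting states rather than evolving them (an illustrative sketch, not from the original post): the integer partition numbers p(n), which arise as the coefficients of the generating function Π(1 - q^n)^{-1}, essentially the reciprocal of the Dedekind eta function, can be computed with Euler's pentagonal-number recurrence.

```python
def partitions(n_max):
    """Integer partition numbers p(0..n_max) via Euler's pentagonal recurrence."""
    p = [1] + [0] * n_max
    for n in range(1, n_max + 1):
        k, sign, total = 1, 1, 0
        while True:
            # generalized pentagonal numbers k(3k-1)/2 and k(3k+1)/2
            for g in (k * (3 * k - 1) // 2, k * (3 * k + 1) // 2):
                if g > n:
                    break
                total += sign * p[n - g]
            if k * (3 * k - 1) // 2 > n:
                break
            sign = -sign
            k += 1
        p[n] = total
    return p

print(partitions(10))  # [1, 1, 2, 3, 5, 7, 11, 15, 22, 30, 42]
```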
Cheers LC
James Putnam wrote on Aug. 13, 2012 @ 15:40 GMT
Dr. Crowell,
Nice work! Just the right amount of historical and introductory theoretical information to help the educated non-physicist have a reasonable opportunity to follow the logic of your essay. That extra information is clearly not filler material or added in an author's attempt to appear to be well informed. Not at all! Your professional viewpoint is made accessible while presenting advanced theoretical concepts. Thank you for the lift-up.
James
Steve Dufourny wrote on Aug. 13, 2012 @ 16:05 GMT
Hi Lawrence,
here are some ideas ...
Hello thinkers,
Very interesting, these extrapolations. But I am insisting on the finite groups. Furthermore, a photon in my line of reasoning possesses the series of uniqueness, so a series beginning from the main central sphere. After that, the series is a fractalization with specific spherical volumes; the series is between 1 and x. So a photon is so complex in fact because its quantum number is very important. See also that this number is the same as our cosmological number of spheres (without the quantum spheres of course). So it is very relevant about the broken symmetry, indeed due to information correlated with volumes and the rotations, spinal and orbital. The tori of stability take on all their meaning. See that the system is a fusioned system. The density is relevant, correlated with mass, and the polarity m/hv due to evolution. So the exchange is probably a fusioned system and not a binary system in its pure generality.
See also that the information is very relevant when we consider the VOLUMES OF SPHERES!!! The information can be bosonic or fermionic. Personally I believe that the volumes of fermions are more stable due to the encoding of evolution. The bosonic encoding is more subtle due to its quantum number and its fractal cited above. The sortings and synchros appear with a universal proportionality.
Regards
Author Lawrence B Crowell replied on Aug. 13, 2012 @ 22:11 GMT
Thanks guys for the response. I think the situation we face with quantum gravity may mirror something in the past. The solution might in part be under our noses.
LC
Steve Dufourny replied on Aug. 13, 2012 @ 23:00 GMT
Steve Dufourny replied on Aug. 16, 2012 @ 16:32 GMT
not guys !!!but Steve or Mr Dufourny.
It is a kind of respect above the strategies !
Steve Dufourny replied on Aug. 16, 2012 @ 16:33 GMT
still my paranoid comportment of course.
Ted Erikson wrote on Aug. 13, 2012 @ 19:16 GMT
1st timer submission, not yet submitted, while reviewing selected works for possible End Notes.
Good history and very interesting paper but got lost in the heavy math, wish I was abreast of all covered…but love your ideas "G implying 1/mass and QFT as unit less coupling, have different times" (if I interpret correctly)
First I see E/f = h and Power = E/t. Dividing, one gets t/f, so IF t = 1/f it implies either t squared or 1/f squared. Square roots generate plus and minus, a past and future with no present?
Second, mass and energy, respectively, as the inscribed sphere, tangent to the face of a regular tetrahedron where sphere and tetrahedron have equal surface-to-volume ratios at ANY size, e.g. equivalent "activity" as free energy.
Comment? (may use in end notes)
Author Lawrence B Crowell replied on Aug. 13, 2012 @ 22:09 GMT
The gravitational constant in naturalized units is "area," so it is sqrt{G} that is ~ length or reciprocal of mass.
Verlinde has a proposal that the work done by gravity, W = ∫F•dr, is equal to an entropic term NkT, with entropy S = Nk. There has been some controversy over this. However, if we look at an increment of work done through some increment of time δt (I will not worry about the relativistic issues with this definition of time for now), then the increment of work is
δW = F•(dr/dt)δt = Pδt.
Power has in natural units reciprocal length squared L^{-2}, or ~ 1/G. Consequently, this increment in work can be written as δW = δt/G ~ n/G. We interpret G as the fundamental Planck unit of area, and n = # of Planck units of area generated by this work. This would then correspond (roughly) to the Bekenstein bound or entropy S = kA/4L_p^2. This is why I think his entropy force of gravity pertains to moving the holographic screen.
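A quick numerical bookkeeping check of the natural-unit statement above (an illustrative sketch; SI constants rounded): with ħ = c = 1, G carries units of area (the Planck area) and power carries units of inverse area, so one Planck energy delivered per Planck time is the Planck power c^5/G ~ 1/G.

```python
import math

G = 6.674e-11        # m^3 kg^-1 s^-2
hbar = 1.055e-34     # J s
c = 2.998e8          # m/s

planck_area = hbar * G / c**3                 # m^2, the "unit of area" G stands for
planck_power = c**5 / G                       # W, the natural unit of power
planck_energy = math.sqrt(hbar * c**5 / G)    # J
planck_time = math.sqrt(hbar * G / c**5)      # s

# consistency: one Planck energy per Planck time equals one Planck power
print(planck_area)                                   # ~2.6e-70 m^2
print(planck_power, planck_energy / planck_time)     # both ~3.6e52 W
```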
Cheers LC
Ted Erikson replied on Aug. 15, 2012 @ 17:12 GMT
Thank you. My approach is perhaps simply too naive, but suggests work for confirming Dr. Tykodi's approach and a preliminary definition of consciousness, aka panpsychism..
Edwin Eugene Klingman wrote on Aug. 14, 2012 @ 00:29 GMT
Dear Lawrence Crowell,
You begin your essay with a well written summary of physics history, beginning with "the motion of particle executes little variations to find the proper path", then undergoing a "radical shift [from] variation of the least action in classical physics [to] the path integral in the quantum mechanics of fields." Like some others in this essay contest, I am more inclined to attempt to derive quantum theory from classical fields than vice versa, so I particularly liked your explanation that "constructing a propagator for a field on that very same field" leads to problems.
In analyzing the limits of space-time, you point out that we are limited by the fact that beyond a certain point, our probe creates black holes that hide the information from us. [That's one reason I treat non-relativistic quantum mechanics and weak field gravity, where we know, at least potentially, whereof we speak.] Thus you point out, "space-time itself is a barrier to the complete specification of an observable." You then say "information conservation demands...". If you'd care to comment on the grounds on which you base a belief in "information conservation" I would be interested. I know it is often assumed nowadays, but I'm not sure on what it is based. I assume you do not begin with quantum error correcting code to achieve this.
While I don't buy either quantum gravity or supergravity, nevertheless your observations about "the breakdown in the ability to measure everything about the universe" are quite interesting, as is your conjecture that this implies time, unitarity, locality, and causality to be emergent. You seem to agree with Philip Gibbs, so I suspect these are the waters the "math beyond physics" school swim in today. In my previous essays and in my dissertation, "The Automatic Theory of Physics", I presented logic and mathematics as emergent, so I tend to question any ultimate conclusions based on math that go beyond physical barriers to observation. Frank de Meglia may have as much claim to this territory as anyone.
Nevertheless, having chosen to play the game of 'math beyond physics', you do a bangup job of it, ending up with one electron, one quark, one photon in a universe based on underlying quantum error correction codes.
Best of luck in the contest,
Edwin Eugene Klingman
Author Lawrence B Crowell replied on Aug. 14, 2012 @ 13:54 GMT
It is not difficult to quantize weak gravity. This is usually written as a bimetric theory g_{ab} = η_{ab} + h_{ab}, where η_{ab} is the flat spacetime (Minkowski) metric and h_{ab} is a perturbation on top of flat spacetime. We may write a theory of the sort g_{ab} = (e^{ω})_a^c η_{cb}, where the bimetric theory is recovered to O(ω) in the series expansion
g_{ab} =~ (δ_a^c + ω_a^c) η_{cb}.
Gravitons enter in if you write the perturbing metric term as h_{ab} = φ_aφ_b, or ω_a^c = φ_aφ^c. The Ricci curvature in this weak field approximation is
R_{ab} - (1/2)R g_{ab} = □h^t_{ab},
with h^t_{ab} the traceless part of the metric perturbation and □ the d'Alembertian operator. In a sourceless region this yields plane waves. The two polarization directions of the graviton may then be interpreted as a form of diphoton, or two photons in an entanglement or a "bunching" as in Hanbury Brown-Twiss quantum optical physics.
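A minimal sympy check of the sourceless plane-wave statement (an illustrative sketch, not from the original post; a single mode traveling in the z direction, signature (-,+,+,+)):

```python
import sympy as sp

t, x, y, z, k, omega = sp.symbols('t x y z k omega', real=True)

# a single Fourier mode of the metric perturbation, traveling in the z direction
h = sp.exp(sp.I * (k * z - omega * t))

box_h = -sp.diff(h, t, 2) + sp.diff(h, x, 2) + sp.diff(h, y, 2) + sp.diff(h, z, 2)

# the d'Alembertian returns (omega^2 - k^2) h, which vanishes on the light cone omega = k
print(sp.simplify(box_h / h))              # -> omega**2 - k**2
print(sp.simplify(box_h.subs(omega, k)))   # -> 0: a plane-wave solution
```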
If we now think of extending this to a strong field limit there are squares of the connection terms Γ^a_{bc} in the Ricci curvature, written cryptically as R ~ ∂Γ + ΓΓ, where the nonlinear quadratic term in the connection appears. This nonlinear term indicates the group structure is nonabelian, so the photon interpretation breaks down. The graviton in this case is a form of di-gluon, or gluons in a state of entanglement or a chain that has no net QCD color charge. This connects with the AdS_n ~ CFT_{n-1} correspondence, where for n = 4 the conformal field theory is quark-gluon QCD physics. Further, D-branes have QCD correspondences, and this takes one into the general theory I lay out. One does need to look at the references to learn more of the specifics. The quantum phase transition to entanglement states is given in the paper I cite in ref 11, L. B. Crowell.
The simple fact is that as physics develops it will invoke new mathematics. I don't think I am overly mathematical in this essay, and I leave most of those details in the references. A theoretical physicist is, I think, wise to have a decent toolbox of mathematical knowledge and thinking. Physics invokes ideas of symmetries, remember Noether: symmetry corresponds to conservation law, and invariant quantities can also have connections with topology and number theory. I think the more one is familiar with advanced mathematics the more capable one is of thinking deeply about these matters.
It is true that my work is commensurate with P. Gibbs’. If field theoretic locality and spacetime are emergent structures then so is causality. This emergence is connected with a quantum phase transition, or a quantum critical point (tricritical point of Landau), and something occurring on a scale much larger than the string length.
Cheers LC
Edwin Eugene Klingman replied on Aug. 14, 2012 @ 17:11 GMT
"Physics invokes ideas of symmetries, remember Noether: symmetry corresponds to conservation law, and invariant quantities can also have connections with topology and number theory."
"Invokes" is an excellent choice of word. My impression is that many physicists today would go farther and claim that symmetry is the basis from which the universe 'emerges' -- a very questionable assumption.
I also agree that "the more one is familiar with advanced mathematics the more capable one is of thinking deeply about these matters." But that doesn't address the issue that "mathematics hangs on logic." And to assume that when space and time are abolished (coming "close to what we might call nothingness") somehow logic and math still exist, is to assume a lot. I believe it is a wrong assumption.
Edwin Eugene Klingman
Author Lawrence B Crowell replied on Aug. 14, 2012 @ 19:00 GMT
Large symmetries are clearly important. The more general the symmetry group, say a larger Lie group, the more general the vacuum that its transformations can maintain as a vacuum. In other words, symmetry preserves the ground state (vacuum), and broken symmetry does not, or maintains a more restricted ground state. There may of course be other elements to the foundations of physics than simply using ever larger Lie groups, such as removing certain postulates like locality of field data.
A lot of this about mathematics and logic relies upon the philosophy of mathematics, which I have read about and find somewhat interesting. However, I am not that steeped in the subject, nor does it concern me that deeply. Some mathematical subjects have no reference to geometry, such as most of number theory. Of course we humans have to exist with all our causal structure in spacetime to study it. However, a mathematical realist would say that number theoretic proofs are true whether we know them or not.
Cheers LC
Edwin Eugene Klingman replied on Aug. 14, 2012 @ 23:33 GMT
"...a mathematical realist would say that number theoretic proofs are true whether we know them or not."
That is exactly the type of religious 'belief in a Platonic realm of math' that I referred to on Phil Gibbs thread. Besides, the question is not "whether we know them or not", but whether, in a space-less and time-less state that comes "close to what we might call nothingness", it makes any sense to claim that logic and math "exist".
An 'appeal to authority'-based statement, unproved, and almost certainly unprovable, is exactly the type of assumption that this essay contest was designed to challenge.
Edwin Eugene Klingman
Author Lawrence B Crowell replied on Aug. 15, 2012 @ 00:22 GMT
I am not particularly interested in getting further into math foundations issues. My point is there are plenty of mathematical topics which are independent of geometry. The question of mathematical realism, which is related to Platonism, is something I am not interested in debating much. There are various schools of math foundations, intuitionism, constructivism and so forth, and I am not particularly a partisan of one over the other. The mathematics I work with is not terribly dependent on set-theoretic subtleties, and most mathematics used in physics is not directly dependent on these matters.
This extends to Blumschein as well. If I recall he has some big alt-math idea that “upends” the foundations of mathematics. I am not terribly interested in going there.
Cheers LC
Eckard Blumschein replied on Aug. 15, 2012 @ 21:31 GMT
"An 'appeal to authority'-based statement, unproved, and almost certainly unprovable, is exactly the type of assumption that this essay contest was designed to challenge." Because I did not design this contest I would like to be more cautious and replace the speculative word "designed" by "tempting".
Blumschein, that's me, also does not see the solution to what some physicists dare to call a crisis in intuitionism or constructivism. Nonetheless he became aware that Hilbert behaved rudely toward Brouwer. Hilbert's successor Hermann Weyl rejected considerable parts of Hilbert's set theory and called Hilbert a piper whom all followed like rats.
Blumschein just tries to upend back what was upended. He agrees that set theory (ST) is unnecessary, except for providing the elusive feeling of rigorous foundations.
In his last essay he pointed to some ST-related imperfections in mathematics. This time he tries to investigate how Cantor's naive ST not just resembles but even contributed to similar confusion between model and reality in physics.
He sees unitarity and the usual notion of time as abstract notions that must not be believed to fit one-to-one with reality. He sees causality (in the sense of contextual dependence on what already happened) as an indisputable property of reality.
Author Lawrence B Crowell replied on Aug. 15, 2012 @ 23:21 GMT
I did it again and pasted the response in as a fresh post.
LC
Christian Corda wrote on Aug. 14, 2012 @ 09:37 GMT
Fantastic Essay LC, I hope to send my entry within this week.
Cheers,
Ch.
Author Lawrence B Crowell replied on Aug. 14, 2012 @ 13:51 GMT
Thanks for the thumbs up. I seem to be falling in the community rankings, though my paper has only been up about 36 hours. I am not sure what is going on there, for I know the physics I present is better than in a whole lot of the papers ranking higher.
Cheers LC
Steve Dufourny replied on Aug. 14, 2012 @ 14:50 GMT
hello to both of you ,
Mr Corda,
Happy to see you again.
Regards
Eckard Blumschein wrote on Aug. 14, 2012 @ 22:04 GMT
Dear Lawrence Crowell,
While you now correctly spelled annus, I see you are wrong again: "The introduction of the Monad, as Leibniz conceived it, is a direct result of his disagreement with Descartes and Spinoza." (http://www2.sunysuffolk.edu/schievp/file22m.html)
You wrote on p. 3: "This would not quantum mechanics in any natural or realistic way." I do not understand this sentence.
According to the title of your essay, unitarity is a foundation that is not a foundation. I wonder why you did not anticipate readers like Yuri and me who do not feel forced to immediately understand such play with words just because you mentioned the nebulous word "emerging". Having searched for "unitarity" in the text of your essay, I did not get the due explanation but only two hits.
The abstract promised replacement of unitarity by modularity, a word that is not at all mentioned in the text.
We merely learn: unitarity "might be emerging". In "abandonment of locality and unity" you did perhaps also mean unitarity, not unity; because the next sentence speaks of the loss of unitarity.
I do not just criticize some imperfections but I am also ready to factually question it if you are willing to deal with my admittedly quite different view.
Sincerely,
Eckard
Author Lawrence B Crowell replied on Aug. 15, 2012 @ 00:17 GMT
My response is below. I didn't put it in a response.
LC
Author Lawrence B Crowell wrote on Aug. 15, 2012 @ 00:03 GMT
The matter over Leibniz is not in my essay. I appear to be wrong with respect to Spinoza in my blog post.
The sentence is missing the word "be," which is unfortunate. It should read, "This would not be quantum mechanics in any natural or realistic way."
I discuss modularity towards the end. I was planning on breaking this out further, and in fact did so more. However, I exceeded the word/page limits for the essay. So this got rather scant mention at the end.
The word unity is supposed to be unitarity.
The emergence of unitarity is complicated. The existence of singularities means that quantum wave functions are not analytic functions. They are meremorphic, which defines modular functions or forms. I wrote a bit more on this in a post above on Aug. 13, 2012 @ 14:44 GMT. This is a deep subject, which gets into Borel groups, Leech lattices and so forth. The length restriction on this essay prevents me from breaking this out. Besides, most of this that I have worked out is on notebook paper and not published. Yet analytic functions, or unitarity, occur in the special case where the singularity is removed or has only a minimal nonlocal connection to the region outside the event horizon, in the case of a black hole. This occurs for large black holes.
I am not sure what the main objection is you want to raise. I am not likely to respond much if your objection is about the foundations of mathematics or set theory.
Cheers LC
Anonymous replied on Aug. 17, 2012 @ 02:27 GMT
Lawrence,
Meromorphic (not meremorphic) means analytic with the exception of singularities. For instance, Joy Christian made the singularity of the Riemann sphere at infinity an issue in physics. Writing C U {infty}, mathematicians treat only the "north pole" of it as a singular point while they do not take care of the "south pole", i.e., the zero. For EEs like me it is common practice to operate with poles and zeros almost as naturally as with north and south.
If we assume that the singularity alias actual infinity is just a mathematical fiction, then this might also hold for zero and the ideal (Peirce) continuity; and those who ascribe physical correlates to singularities are simply victims of their inability to realize that even the most advanced mathematical tools are just tools that must not be misused in an intuitive pre-mathematical manner. Weren't you unable to refute Ernst Fischer's essay?
Eckard
Author Lawrence B Crowell replied on Aug. 17, 2012 @ 02:57 GMT
Fischer's essay is a case of how one gets out of something what one puts in. He uses an equation of state for static matter to show that there is no singularity. This is of course to be expected. The matter is composed of particles on nongeodesic paths in spacetime, which, if these are meant to model geodesic flow, corresponds to a violation of the equivalence principle. Otherwise this is just a model of a star or some bulk material object, which has no singularity by construction. I am a bit amused that his essay is in the top slot.
Check out Gibbs' blog entries; he spells very colorfully. I won't misspell meramorphic again, or is it merumorphic or … :)
Cheers LC
Eckard Blumschein replied on Aug. 17, 2012 @ 21:08 GMT
Lawrence,
Concerning singularity, I decided to add new arguments
here. Your objections against Ernst Fischer were already rejected by Fischer himself.
Anyway, I appreciate your insight that pre-mathematical assumptions are decisive.
Eckard
Author Lawrence B Crowell replied on Aug. 17, 2012 @ 21:36 GMT
Fischer never really countered my argument. He indeed said there was no motion, geodesic or otherwise (which is not quite right). He used a static equation of state to show that a static body does not produce a singularity. That is trivially true, but it is not the case of a collapsing body, so it does not show there can't be a singularity.
There is motion, even for a body at rest, for it is moving forwards in time. For a gravitating body in a static configuration, matter is in nongeodesic motion forwards in time.
When I get the time I will read your essay.
Cheers LC
Steve Dufourny replied on Aug. 18, 2012 @ 00:27 GMT
Hello to both of you,
Eckard, you know the singularity is not a mathematical infinity.
The series is a finite group of spheres!!!
So the continuity is specific when we want to quantize the mass. The singularities and the singularity are inside the physicality!!!
The infinity, so the light without movement, is above our physicality. It is totally different. If this universal axiom is not respected for the quantization, then it is not possible to understand the correct necessary series.
Furthermore the principle of equivalence is so important, and the finite groups are essential for this equivalence between mass and energy.
If not, we have problems in the calculations just because the tools have no limits and domains.
There is, that said, a paradox about the entropy and its maximum. My equation shows that we can add or multiply. But in fact this maximum is not possible to reach! Furthermore just a part is sufficient. When we fractalize this energy, it is incredibly important as energy. The mass polarises the light after all on this time line. There is so a limit of maximum. But it is paradoxical. But not for the infinite light ... so the maximum entropy, physical, is not a problem but a tool! We could nourish our planet with 1 water drop for eternity ... the singularity and the singularities possess this maximum entropy.
Regards
Author Lawrence B Crowell wrote on Aug. 15, 2012 @ 23:19 GMT
I think that unitarity is a special case of modular transformations when there is no singularity, or if in the case of a black hole the singularity is hidden by a very classical event horizon that causes decoherence of nonlocal fields across it. If the event horizon is quantized, say with a very small black hole, then this breaks down. Further, the meaning of spacetime and light cones becomes lost. They are so to speak blurred out by quantum fluctuations.
If the definition of spacetime breaks down on a very small scale then the definition of time is lost. If there is no effective definition of time there is then no unitary time development of quantum states or observable by an operator, such as the Hamiltonian. This is a loss of unitarity.
The Wheeler-DeWitt equation tells us this to begin with. The Hamiltonian in classical gravity is zero, or NH = 0, for N the lapse function. This is a standard result of ADM general relativity. The reason for this is Gauss’ law, where there is no boundary sphere around the universe by which one can integrate out the mass-energy contained within. This argument can be posed according to the nature of coordinate time in general relativity, where this is a frame dependent quantity and physics should not depend upon it. So the Schrodinger equation
i∂ψ/∂t = Hψ = 0
is seen to be zero on both the left and right hand sides in a consistent manner. This is the Wheeler-DeWitt equation Hψ = 0, which is the quantum form of the Hamiltonian constraint NH = 0. There is in this case no Hamiltonian which acts as a Hermitian operator defining a unitary time development operator. Unitarity is gone.
What takes the place of unitary transformations are modular transformations, in particular the Eisenstein series and Jacobi functions. The Jacobi functions are realizations of the sporadic groups based on the E_8 and Leech lattice Λ_{24}, in particular the Mathieu group of quantum error correction.
In that case my essay does propose the removal of or change in established physical postulates.
I am less concerned with mathematical foundations. The connection to matters such as Zermelo-Fraenkel (ZF) set theory is at best very subtle, and really could be nonexistent. ZF set theory has some strange features, such as the duplication of spheres with the Axiom of Choice. There are alternatives such as Polish sets. This goes double for philosophical issues over how or whether mathematics exists independent of physical reality or the mind of a mathematician. These questions simply go far beyond the scope of what I am concerned with.
Cheers LC
Steve Dufourny replied on Aug. 16, 2012 @ 16:30 GMT
You really think that you can do what you want with my spheres and Spherization Theory, or what??? :)
Well, a duplication of spheres with an axiom of choice, it is interesting.
And a ToE also, no? Second :)
Third :) Soon at New York, my friends, and we shall discuss my spheres live.
They turn so they are...
ps: eureka from Belgium, the real ToE. The real GUT, the real spherization! With humility of course.
Author Lawrence B Crowell replied on Aug. 16, 2012 @ 16:51 GMT
The duplication of a sphere is called the Banach-Tarski paradox. This is the result of "immeasurable measures" that can occur with the axiom of choice.
Cheers LC
Steve Dufourny replied on Aug. 17, 2012 @ 21:46 GMT
I am going to catalyze you live.
:) You know that I am not a fan of paradoxes.
The meiosis of a sphere, interesting. Or a mitosis. Interesting :)
Immeasurable measures??? Not really rational, that.
You know Lawrence, I find your knowledge very relevant, but you know the aim is not to enumerate the concepts but to apply them along a purely rational and deterministic road. In fact, when there are too many pseudo-convergences, it implies an ocean of confusions, implying an impossibility to have a true general theory.
In fact you know indeed your physics and maths. But is it sufficient for the generality? I am surprised to see how you interpret the boundaries. You know Lawrence, forget your chains ... and open your universal heart.
The mind of a mathematician is the same as for physics. They must always be rational and deterministic.
ps: you can do better :)
ps2: the unitarity, the singularity, it is these central spheres, Lawrence.
ps3: I have an idea for the series, the fractal from the main central sphere; in logic the series is universal at all scales for the uniqueness. I ask myself if the primes can help; I believe that yes, for the periodic oscillation between 1 and x. The number of planets becomes relevant for our universal sphere, and we take the number 1 for the central sphere. The volumes are under this logic. The primes can help for the correct series. It is essential to have finite groups and boundaries, you know Lawrence, for our quantization and our axiomatizations. If not, the thermodynamics are not OK, like the proportions of our universal mechanics.
It is evident you know.
Regards and good luck; your essay is very good.
Author Lawrence B Crowell replied on Aug. 17, 2012 @ 23:55 GMT
Steve,
I am not entirely sure what you are trying to say. Though I am sure you have some sense that it is deep and profound.
Cheers LC
Steve Dufourny replied on Aug. 18, 2012 @ 00:06 GMT
I become completely crazy you know Lawrence. My parano is important. I cannot stop you know.
ps think about the finite groups and the serie of uniqueness.
Regards
Joe Fisher wrote on Aug. 16, 2012 @ 14:32 GMT
Dear Doctor Crowell,
Because I am an uneducated non-physicist, I was unable to fully understand many of the exquisite rational arguments you touched upon in your exceptional, brilliantly written essay. I was immensely gratified that your posted contention that "One of the things I think comes from this is the universe contains only one of each particle" seems to agree with my understanding of the singular universe as posited in my essay Sequence Consequence. Could technological advancement be confusing all of us as to the true nature of the universe? Natural sunlight barely penetrates 10 fathoms into the ocean. Yet fabricated electrical flashlights are used thousands of fathoms deep. I do not understand how fabricated electrical light can overcome density while much more powerful sunlight cannot.
Anonymous replied on Aug. 16, 2012 @ 16:57 GMT
Light is attenuated in water by particles that absorb or scatter light. The extinction of light over a distance is the same for sunlight striking the water surface and for photons leaving a flashlight underwater.
I'll take a look at your essay. Good luck.
Cheers LC
Georgina Parry wrote on Aug. 17, 2012 @ 09:07 GMT
Hi Lawrence,
I get the feeling I understood more of this essay than last year's. So it is, I think, more accessible. (I made two separate attempts.) The history was interesting. I understood a chunk of the problems you pointed out and the reasons for then considering each in turn, though not the mathematical explanations themselves. My own failing, I know. It is probably straightforward to those with the necessary familiarity with mathematics.
As I rarely understand much of what you write I think we have both done rather well with this essay. I wish I was able to give more positive feedback. I hope you get lots of informed readers who will be able to properly understand and talk to you about the essays content, which is probably far more fascinating than I can appreciate. Good luck in the competition.
Author Lawrence B Crowell replied on Aug. 17, 2012 @ 17:01 GMT
There is a growing understanding of a correspondence between quantum chromodynamics (QCD) and gravitation. The AdS/CFT correspondence is one of them. More recent developments, such as the BCFW recursion relationship, indicate that calculation techniques for QCD and quantum gravity are related to each other. Gravitons are, I think, entanglements of gluons, or what we might call gluon chains. Certain complex self-interacting states of gluons can form effective mass states. Remember that gravity interacts with mass-energy, so gravitons can be self-interacting, similar to gluons. In classical gravity there are some solution types that are intermediate between the near-field solution, a black hole, and the far-field solution, gravity waves. A black hole can be thought of as a condensate of particles or gravitons in a state that is completely self-confined.
Quark-gluon plasmas produced by RHIC and the lead heavy-ion collisions at the LHC can produce very transient states corresponding to black holes, or with tiny quantum amplitudes corresponding to black holes. These amplitudes are not large enough to generate a full bona fide black hole, as in the earlier fears of the LHC producing an Earth-devouring black hole, but they should be sufficient to test these theories.
holographic graviton
Cheers LC
Avtar Singh wrote on Aug. 17, 2012 @ 16:29 GMT
Hi Lawrence:
I enjoyed reading your paper, especially the discussion related to the widely known “Foundations that are not Foundations.”
The fundamental question is how to determine the most fundamental reality or physical process that governs the Foundation of the universe. I demonstrate in my posted paper, "From Absurd to Elegant Universe," that the current crisis in physics and cosmology, as evidenced by the well-known paradoxes and singularities, is an artifact of the missing foundational physics of the spontaneous decay and birth of particles. Hence, many of the so-called foundational assumptions or phenomena are shown to be artifacts rather than foundations of the universe or of a universal theory. When this missing foundational physics is counted in, it not only successfully predicts the observed accelerated expansion of the universe and galactic star velocities but also resolves the paradoxes and singularities of the Cosmic Conundrum today. It also provides understanding of the inner-working foundations of quantum mechanics.
I would greatly appreciate it if you could please review my paper and provide your comments.
Thanking you in advance,
Best Regards
Avtar Singh
Author Lawrence B Crowell replied on Aug. 17, 2012 @ 17:58 GMT
Hi Avtar,
I loaded up your paper, which, time permitting, I will try to read today. Singularities in one sense do reflect a failure of an existing theoretical structure. In a more general theory they become something else, or are removed.
There is a growing understanding of a correspondence between quantum chromodynamics (QCD) and gravitation. The AdS/CFT correspondence is one of them. More recent developments, such as the BCFW recursion relationship, indicate that calculation techniques for QCD and quantum gravity are related to each other. Gravitons are, I think, entanglements of gluons, or what we might call gluon chains. Certain complex self-interacting states of gluons can form effective mass states. Remember that gravity interacts with mass-energy, so gravitons can be self-interacting, similar to gluons. In classical gravity there are some solution types that are intermediate between the near-field solution, a black hole, and the far-field solution, gravity waves. A black hole can be thought of as a condensate of particles or gravitons in a state that is completely self-confined.
Quark-gluon plasmas produced by RHIC and the lead heavy-ion collisions at the LHC can produce very transient states corresponding to black holes, or with tiny quantum amplitudes corresponding to black holes. These amplitudes are not large enough to generate a full bona fide black hole, as in the earlier fears of the LHC producing an Earth-devouring black hole, but they should be sufficient to test these theories.
holographic graviton
Cheers LC
Author Lawrence B Crowell replied on Aug. 19, 2012 @ 18:05 GMT
Your essay proposes a way in which special relativity can be extended to global spacetime. Your results are departures from standard cosmology. I suppose I am not sure how the cosmological constant depends upon the velocity of a particle. The potential you compute in equation 5, PE = ∫Gmm*dr/r (where I presume there should be a dr in there), appears to be similar to the calculation of a moment of inertia. The redshift factor z diverges as v --> c in a special-relativistic type of theory, but this runs into trouble with luminosities.
Cheers LC
Avtar Singh replied on Aug. 20, 2012 @ 15:39 GMT
Hi Lawrence:
Thanks for your replies and comments on my paper.
Yes, the results of my paper and book – The Hidden Factor show departure from the paradoxical and inconsistent results of the Standard Cosmology. My paper shows that when the missing physics of spontaneous decay are taken into account, it cures many ills of the standard cosmology and successfully predicts the observed expansion of galaxies and the universe.
You asked – “…… how the cosmological constant depends upon the velocity of a particle?” The cosmological constant represents the kinetic energy (velocity) of the particles residing and moving close to the speed of light within the so-called vacuum space. This kinetic energy is the mechanistic description of the mysterious dark energy still un-described by the standard model.
In response to your comment –“The potential you compute in equation 5 PE = ∫Gmm*dr/r, where I presume there should be a dr in there, appears to be similar to the calculation of a moment of inertia”, a complete derivation of the gravitational potential is provided in the attached pdf file.
Also, responding to your comment - "The redshift factor z diverges as v --- > c in a special relativistic type of theory, but this runs into trouble with luminosities", in the GNMUE model described in my paper and as shown in figure 3, V is never larger than C; hence the luminosity equation has no singularities or infinities.
I hope I answered all your questions satisfactorily. I would be glad to answer any other questions or comments.
Best Regards
Avtar Singh
attachments:
Gravitation_Potential_Derivation__Excerpts_from_my_book.pdf
report post as inappropriate
Lawrence B. Crowell replied on Aug. 20, 2012 @ 18:27 GMT
There is a question here concerning expansion of the universe, and a comparison with the Andromeda galaxy which is indeed moving towards our galaxy. So let us start with the basics. I will outline the understanding of cosmology as currently understood.
Let the distance to some galaxy far away be x. I find that this distance x is changing, so I assign a scale factor a. So the time evolution of a distance x is given by
x = x(t) = a(t)x(0)
In this way this motion of any distant galaxy can be compared to this scale factor which expands (or contracts if that were to be the case) with the dynamics of the universe.
Now consider the next ingredient. The energy E of a particle of mass m moving in a central gravity field by some mass M at a distance r is
E = (1/2)mv^2 – GMm/r
The total energy E is constant, and largely can be ignored. In particular if the universe expands so there is no recollapse we can set it to zero. We concentrate on the velocity
v = dx/dt = x(0)(da/dt) = x(0)a’, prime means time derivative,
so that (1/2)mv^2 = (1/2)(a’)^2(x(0))^2. Now concentrate on the gravity part. We set r = x, the distance to other galaxies, and we assign an average density so that the mass M is a sum of all these galactic masses M = ρVol. The volume out to some radial distance x is then Vol = (4π/3)x^3 = (4π/3)a^3(x(0))^3. We put all of this together and we get the equation
(a’/a)^2 = 8πGρ/3.
This equation is close to what one gets with general relativity, where here we have just used Newtonian mechanics and gravity. With general relativity there is an additional –k/a^2 term related to the constant energy E, which for a spatially flat universe has k = 0.
Now the Hubble constant is H = (a'/a), which is a constant in space, but not necessarily in time. The Einstein cosmological constant is Λ = 8πGρ for some constant vacuum energy density ρ, and so the Hubble parameter is then
H^2 = (a’/a)^2 = Λ/3
For some other mass-energy density, such as matter or radiation, the density is dependent on the scale factor a.
For those familiar with differential equations, the solution to a' = sqrt{Λ/3}a is an exponential function. This is the expansion driven universe we do observe. For small times this exponential is approximately linear, a(t) ~= (1 + sqrt{Λ/3}t)a(0), which gives the Hubble relation found in the 1920s, v = Hd. So for a galaxy at a distance d, the Hubble parameter multiplied by that distance gives the velocity. The Hubble parameter is approximately H = 74 km/sec/Mpc.
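(For anyone who wants to check the differential equation with a computer, here is a minimal sympy sketch; the symbol names are just for illustration.)

import sympy as sp

t, Lam = sp.symbols('t Lambda', positive=True)
a = sp.Function('a')

# a' = sqrt(Lambda/3) a has the exponential (de Sitter) solution quoted above
sol = sp.dsolve(sp.Eq(a(t).diff(t), sp.sqrt(Lam / 3) * a(t)), a(t))
print(sol)   # a(t) = C1*exp(sqrt(Lambda/3)*t)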
The redshift factor is z = v/c, which by the Hubble law is z = Hd/c. This is an approximation, where H should be thought of as the Hubble parameter that is constant on the spatial surface of the Hubble frame. The Hubble distance is d = c/H = (3x10^{5} km/sec)/(74 km/sec/Mpc) = 4054 Mpc, or about 1.3x10^{10} ly. The apparent magnitude of an object is m = M + 5(log_{10}d - 1), for M the absolute magnitude and d the distance in parsecs. For objects at z = 1 the Hubble distance matches the luminosity distance d = 10^{(m-M)/5+1}. In fact this works out for the most distant galaxies observed, out to z = 10.
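(And a quick numerical check of those figures in Python, assuming the H = 74 km/sec/Mpc value quoted above.)

import math

H = 74.0                # Hubble parameter, km/sec/Mpc
c = 3.0e5               # speed of light, km/sec
Mpc_in_ly = 3.262e6     # light years per megaparsec

d_hubble_Mpc = c / H
print(d_hubble_Mpc)                        # ~4054 Mpc
print(d_hubble_Mpc * Mpc_in_ly / 1e10)     # ~1.3, i.e. 1.3x10^10 light years

# Distance modulus m - M = 5(log10(d_pc) - 1), so d = 10^{(m-M)/5 + 1} parsecs
d_pc = d_hubble_Mpc * 1.0e6
mu = 5 * (math.log10(d_pc) - 1)
print(mu)                                  # ~43 magnitudes for the Hubble distance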
This does mean that objects are comoving with expanding space faster than light. It does turn out that we can still receive photons from them; explaining that is for another day. The CMB limit is out to z = 1100, and the luminosity matches a distance of 46 billion light years. Why this is larger than the naive conversion of 13.7 billion years into light years is due to the dynamics of expanding space.
Cheers LC
report post as inappropriate
Avtar Singh replied on Aug. 21, 2012 @ 17:48 GMT
Hi Lawrence:
Thanks for your reply.
You have provided an alternative explanation to the observed accelerated expansion that combines Hubble expansion with expanding space. But this explanation does not address the fundamental physics missing from current theories leading to the well-known singularities, paradoxes, and inconsistencies in QM and GR.
The critical question is why space is expanding. The so-called dark energy, which is the assumed cause, still remains elusive with regard to its fundamental mechanism. The relativistic expansion model GNMUE described in my paper explains the observed galactic as well as universal expansion with a physical model of the spontaneous decay of mass providing the expansion energy for space, thus solving the mystery of dark energy, or the cosmological constant. Another feature of my model is that V never exceeds C, hence relativity is never violated. Further, it resolves many paradoxes of standard cosmology and provides understanding of the inner workings of QM. The other alternative explanations of expansion, such as yours, may solve just one problem but do not address the many ills paralyzing physics today, because the root-cause missing physics is absent at the core.
Best Regards
Avtar Singh
report post as inappropriate
Jayakar Johnson Joseph wrote on Aug. 18, 2012 @ 05:45 GMT
Dear Lawrence B Crowell,
I think that without quantum physics we cannot describe the events of the universe coherently, in that unitarity is imperative for its completeness. Thus the unitarity of the universe and the locality of its events are expected to have their outcomes as nonzero, in that the
current scenario of dimensionality from a point source is contradictory. With best wishes,
Jayakar
report post as inappropriate
Author Lawrence B Crowell replied on Aug. 20, 2012 @ 18:29 GMT
I have to confess I am not having the easiest time figuring out what you have written here. Good luck on this. I will try reading again in the next couple of days.
Cheers LC
Author Frank Martin DiMeglio wrote on Aug. 19, 2012 @ 00:23 GMT
Hi Lawrence. Gravitational and inertial equivalency and balancing is fundamental to balanced and equivalent attraction and repulsion and to fundamentally stabilized and balanced distance in/of space as well. Importantly (and moreover), this fundamentally proves/demonstrates F=ma.
And, this is all consistent with instantaneity and the fact that gravity cannot be shielded. (Obviously, the fact that gravity cannot be shielded is connected with instantaneity.) Balance and completeness.
Your essay reflects your fine ability. I encourage you to broaden and embolden your thinking.
report post as inappropriate
Author Lawrence B Crowell replied on Aug. 20, 2012 @ 18:30 GMT
Thanks for the positive response.
Cheers LC
Author Lawrence B Crowell wrote on Aug. 23, 2012 @ 21:33 GMT
The Britto, Cachazo, Feng, Witten (BCFW) recursion relationship is a way in which a complex scattering process can be decomposed into tree level diagrams. The picture attached describes the process
A set of gluon momenta entering a region (we set those leaving as the negative of those entering, as done with the STU symmetries) may be written as the sum of products of two diagrams. To start, one chooses two gluons, here the k and n lines bolded. The sum is over all cyclically ordered distributions of gluons on each sub-amplitude (one with the k and the other with the n momenta), and one sums further over the helicities of the internal gluon.
To formulate this requires the use of bispinors, or what are in effect twistors. BCFW recursion is a development in Witten's "twistor revolution" in string theory. The momentum for a gluon, a null momentum as it is massless, is written as p_{aa'} = λ_aω_{a'}. This outer product is a form of twistor, and the two spinor inner products are (λ, λ') = ε_{ab}λ^aλ'^b, [ω, ω'] = ε_{a'b'}ω^{a'}ω'^{b'}. (I use parentheses and brackets because caret signs cause trouble with this blog.) There is a notation convention that one spinor type has ( ) as an inner product and the other a [ ] inner product; this is the convention that has emerged and is here to stay. If we have two momenta p_{aa'} = λ_aω_{a'} and q_{aa'} = λ'_aω'_{a'} then
p•q = λ_aω_{a'}λ'_bω'_{b'}δ^a_bδ^{a'}_{b'}
= λ_aλ'_bω_{a'}ω'_{b'}δ^a_bδ^{a'}_{b'}
= (1/2)ε^{ab}λ_aλ'_bε^{a'b'}ω_{a'}ω'_{b'} = ½(λ, λ')[ω, ω']
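(As a quick numerical sanity check of this identity, here is a minimal Python sketch. The spinors are random real test values; for real Lorentzian momenta λ and ω would be related by complex conjugation, but the algebraic identity is the same.)

import numpy as np

eps = np.array([[0.0, 1.0], [-1.0, 0.0]])   # 2x2 epsilon symbol used in ( , ) and [ , ]

def bracket(s1, s2):
    # (s1, s2) or [s1, s2] = eps_{ab} s1^a s2^b
    return s1 @ eps @ s2

def dot(P, Q):
    # For a bispinor P_{aa'}, det(P) = P.P; polarizing the determinant gives
    # 2 P.Q = det(P + Q) - det(P) - det(Q)
    return 0.5 * (np.linalg.det(P + Q) - np.linalg.det(P) - np.linalg.det(Q))

lam, om = np.random.randn(2), np.random.randn(2)
lamp, omp = np.random.randn(2), np.random.randn(2)

p = np.outer(lam, om)     # p_{aa'} = lambda_a omega_{a'}, automatically null (det = 0)
q = np.outer(lamp, omp)

print(np.isclose(dot(p, q), 0.5 * bracket(lam, lamp) * bracket(om, omp)))   # True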
Consider a tree level amplitude A(1,2,…,n-1,n) of n cyclically ordered gluons. Each gluon has momentum p_i^{aa'} = λ_i^aω_i^{a'} corresponding to the two spinors. We pick out our two gluons of interest and define the shifted momenta
p_k(z) = λ_k(ω_k - zω_n),
p_n(z) = (λ_n + zλ_k)ω_n
which are forms of the twistor equations. The momenta of the other gluons remain unchanged p_j(z) = p_j, for j =/= k or n. This theory involves then the transformations on the two elements of the bispinor as
ω_k --- > ω_k - zω_n
λ_n --- > λ_n + zλ_k.
Now examine the amplitude under this transformation
A(z) = A(p_1, p_2, …, p_{k-1}, p_k(z), … p_{n-1}, p_n(z)),
which is now a complex function of z. This amplitude is on shell for all z, and momentum is conserved for all z.
Breaking up the “blob” into these two parts is then equivalent to writing this amplitude as
A_k = sum_{ij}A_{j+1}(1/P_{ij}^2)A_{k – i+1}
The momentum flowing through the internal line of a tree diagram is equal to the sum of the external momenta on one side of it. The momentum in the propagator is the sum of momenta of adjacent external lines, P_{ij}(z) = p_i(z) + … + p_j(z); note that the z-dependence cancels between p_k(z) and p_n(z), so the total sum over all lines, sum_j p_j + p_k(z) + p_n(z), is by construction independent of z. In the summation we let k lie within the range i,…,j and n in the complementary range j+1,… .
The shifted momentum is P_{ij}(z) = P_{ij} - zλ_kω_n, so its square is P_{ij}^2(z) = P_{ij}^2 – z(λ_k|P_{ij}|ω_n], here evaluated on both pairs of spinors. Thus we have
1/P_{ij}^2(z) = 1/(P_{ij}^2 – z(λ_k|P_{ij}|ω_n]) = (1/P_{ij}^2) 1/(1 – z(λ_k|P_{ij}|ω_n]/P_{ij}^2),
so that the amplitude may be written as
A(z) = sum_{ij}ρ_{ij}/(z – z_{ij}), for z_{ij} = P_{ij}^2/(λ_k|P_{ij}|ω_n].
This then has simple poles at z = z_{ij} where the residues ρ_{ij} are evaluated with ∫A(z)dz/z. The residues correspond to internal lines which are placed on shell.
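(The role of these simple poles can be illustrated with a toy rational function. In this minimal sympy sketch the residues and pole positions are made-up numbers, not an actual amplitude; it only shows how the physical value at z = 0 is recovered from the residues.)

import sympy as sp

z = sp.symbols('z')

r1, r2, z1, z2 = 3, -5, 2, 7                 # made-up residues and pole positions
A = r1 / (z - z1) + r2 / (z - z2)            # toy amplitude, vanishing as z -> infinity

# Cauchy's theorem applied to A(z)/z: A(0) = - sum_ij Res_{z = z_ij} [A(z)/z]
lhs = A.subs(z, 0)
rhs = -(sp.residue(A / z, z, z1) + sp.residue(A / z, z, z2))
print(sp.simplify(lhs - rhs))                # 0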
This then in general corresponds to the recursion relationship, where we set
A_k = sum_{ij}sum_hA^h_{j+1}(1/P_{ij}^2)A^{-h}_{k – i+1},
where now I have included the sum over helicity states. The recursion relationship is evident, since the two sub-amplitudes in the numerator may be further decomposed. This procedure, with P_{ij}(z) = P_{ij} - zλ_kω_n evaluated at the pole, reduces all off-shell processes in the "blob" on the left hand side of the diagram to on-shell processes in the evaluation of residues.
Cheers LC
attachments:
BCFW_recursion_rule_2.GIF
Cristinel Stoica wrote on Aug. 29, 2012 @ 06:20 GMT
Dear Lawrence,
Congratulations for the essay. I like how you walked through the assumptions about space and time, showing how they changed in the history, and how you discussed the deformations of the foundations. I found the second part more difficult to me, so I had to reread it with more care. I really hope that unitarity and locality are not lost, but if they are, the implications you foresee are very interesting.
Good luck with the contest,
Cristi Stoica
report post as inappropriate
Author Lawrence B Crowell replied on Aug. 29, 2012 @ 17:48 GMT
I wrote the response in a fresh text box, so if you are getting those response alerts by email you will be apprised.
LC
Steve Dufourny replied on Aug. 29, 2012 @ 19:47 GMT
pay attention the dream team ahahahah wait wawwwww impressing your maths ahahah.
they have the latex in their head ahahah Chriti, Florin, Georgina,Jonathan, Joy, Ray, Lawrence, Edwin,Mickael,Don,James, JCN,goodband they say hahahah wawww imrpessing the strategy in some years, wawwww ahahah make surf band of comics ! I have seen your real heart . Dark and vanitious and without consciouness.Ahahah pay attebntion, I don't see their play, pay attention, they superimpose the algorythms, waww they are so intelligent.
And what after ahahaha band of comics.
I will fight with honor, faith, universality, universal love !!!
report post as inappropriate
Steve Dufourny replied on Aug. 29, 2012 @ 19:58 GMT
ahaha and Joe and Frank and alan and ted an,d friends who insists ahaha poor thinkers
Occupied with startegies instead of studying from real innovators.ahaha ironical no,
And what after? that is all you can make ???
You can make better perhaps become there it was easy to find the players and easy to play also. But it is just a suggestion of course.ahahah ironical.
Jonathan and lawrence,them make surf in california, Don, Florin and Jonathan,them are at New York, Edwin and Eckard them speak about consciousness wit James and Brendan and Johan them travel of course.And who pay for these things, still the people of course like always.Georgina prefers the prime quaternionic bridge and of course joy implies the connection. and what after , a course of maths.
You are ironical !
Vanity of vanities , all is vanity !
report post as inappropriate
Author Lawrence B Crowell wrote on Aug. 29, 2012 @ 17:47 GMT
If locality and unitarity are not fundamental it means there is a huge reduction in the number of fundamental degrees of freedom in the universe. In fact if you read my
paper referenced in my essay you see that the number of degrees of freedom on a brane is boost dependent, and thus not fundamental. The huge number of elementary particles we observe in the universe are just the same type of particle under multiple copies of emergent spacetime configuration variables. This means there is fundamentally only one electron, one down quark, one Z particle, one Higgs particle, one photon and so forth. We observe any of these single particles under a huge number of "projections," if you will, which are due to the emergence of configuration variables on a spatial manifold.
I think that quantum gravity is not unitary, but that it probably conserves quantum information. The issue I raised on your essay blog with coordinate change with the singularity removed to infinity connects with this. The quantum wave functions are not unitary, but with the appearance of a pole they are meromorphic. These functions are then more fundamentally modular functions, or modular forms, which operate on lattices. These lattices are E_8 or the Leech lattice Λ_{24}, which are quantum error correction codes.
I am not very happy with how this is turning out. First off, I am not garnering the type of attention I would prefer to see. Secondly, my essay is languishing at #46, where 10 to 15 of the essays ahead of mine are TOTLSHT. About an equal number I fail to see as better than mine. In fact the paper by Fischer that has been near the top is basically wrong; he uses a static matter solution (the TOV equation of state) for a dense star to prove that a collapsing body (not static, mind you) does not form a singularity. Thirdly, since I had to re-edit my essay because it ran a bit over onto page 10, it was hosted later and I was not given a voting code. My attempts to rectify this situation have failed.
In the near future I will try to rattle some people’s cages to see if I can get more attention, and maybe a few votes that buoy me upwards a bit. I have been rather busy and frankly a bit depressed about how this seems to be turning out.
Cheers LC
Cristinel Stoica replied on Aug. 30, 2012 @ 04:23 GMT
"If locality and unitarity are not fundamental it means there is a huge reduction in the number of fundamental degrees of freedom in the universe."
I see now what you mean, and I think you're right. These symmetries sound like a kind of "gauge freedom".
Best regards,
Cristi
report post as inappropriate
Eckard Blumschein replied on Aug. 30, 2012 @ 05:31 GMT
With his judgment TOTLSHT LC will perhaps win less sympathies than for instance Christi who even declared non-constant numbers "great work". This comment of mine is not meant to appreciate non-factual kindness.
How many degrees of freedom has an empty sheet of paper? Call me an anus, I think LC is not even wrong if he demands a huge number of fundamental degrees of freedom in the universe. I see his gauge freedom in company with Einstein's naive observer-bound perspective.
If my own essay did not just face more attention but at least one tangible critical comment, those who might tacitly agree with my admittedly unwelcome arguments will certainly be happy.
Eckard
report post as inappropriate
Author Lawrence B Crowell wrote on Aug. 30, 2012 @ 19:21 GMT
Chris,
There is more to this, which I could not break out due to length limitations. The gauge symmetries are Yangians, or enveloping algebras. These have a duality, where the gauge symmetry in one representation is dual to another without spacetime configuration variables.
Eckard,
I argue for a massive reduction in the number of degrees of freedom. In fact if the universe has quantum states given by E_8xE_8, it means the universe has only 496 fundamental degrees of freedom, or in its supersymmetric extension 512 = 2^9. In the Leech lattice Λ_{24} there are 4096 weights, due to the theta function representation over 3 E_8 groups, and the automorphism group of Λ_{24} is the Conway group Co_0, of order 8,315,553,613,086,720,000. The full automorphism group here, the Fischer-Griess monster, has order 808,017,424,794,512,875,886,459,904,961,710,757,005,754,368,000,000,000, which is huge. Yet in this total extended picture the number of real degrees of freedom is only 4096.
The actual number of elementary particles is then very small, but they have multiple representations in configuration variables. The configuration variables are a system of entanglements, or holographic projections, which give the appearance of a large number of particles.
I don't think the fundamental issues with physics lie with the foundations of mathematics. I might be wrong of course, but I really do not think mathematics has been on some fool's errand for the last 150 years or more.
Cheers LC
Torsten Asselmeyer-Maluga wrote on Aug. 31, 2012 @ 10:09 GMT
Lawrence,
a really interesting and enlightening essay. In most cases, only "boring" agreement between us. The BCJ duality is very interesting. Before reading your essay I had started to study this duality, but now I understand its relevance.
At one point we maybe disagree: "...that spacetime is not a complete concept". We found a contrary point of view (see
my essay), especially in order to express the "fuzziness". Interestingly, modularity is also important there, and locality is unimportant (by diffeomorphism invariance). In particular, the diffeomorphism group is not a Lie group (rather a pseudo-group), and the description of the local part (some substitute for a Lie algebra) uses enveloping algebras in an essential way. You see, "boring agreement" in wide parts.
Best
Torsten
report post as inappropriate
Author Lawrence B Crowell replied on Aug. 31, 2012 @ 18:24 GMT
I read your paper a week ago with the idea of reading it again with greater attention to detail and your references. I just reread your paper, but unfortunately not in great detail, so I have yet to dig into it at great length. I have to confess I have read only a pretty small minority of the papers on this essay website.
I went through the Atiyah, Donaldson, Freedman work on exotic four manifolds some years ago. I thought there were certain prospects for a quantum description from this. The difficulty I see with this is that manifolds which are homeomorphic but not diffeomorphic leave a big question on how one defines a Polyakov measure in a path integral
∫(D[g, ψ]/diff(g, ψ)) exp(iS)
where one “mods out” diffeomorphisms or gauge dependencies. The thought occurred to me that in 11-dimensions the dual to four dimensional spacetime is a 7-dimensional space. In that case there are these 28 distinct differentiable structures Milnor demonstrated to exist. I think by doing this the really tough problem with Donaldson’s theorem might be transformed to a much more tractable problem. The Cartan matrix for the E_8 is the same as the matrix associated with Donaldson’s theorem. The 28 differential structures of the 7-manifold I have pondered have some relationship to the complex G_2, the automorphism of E_8.
Physically spacetime will never be observed to have a foamy or grainy structure. The reason is simple. If I am right there is only one of each type of elementary particle. The multiplicity of elementary particles exists because they are holographic projections onto configuration variables. The configuration variables are simply a measure of how an electron here is entangled with another "there," whether there means an electron in a nearby transmission line, or in the degenerate gas in a white dwarf, or anywhere in the universe. The same holds for a photon, down quark and so forth. So while a UV particle, say a photon, may "feel" noncommutative geometry more than an IR photon, due to their entanglement this effect is cancelled out. In effect the extreme IR boson from Hawking-Gibbons radiation is equivalent to an extreme UV boson, and so the apparent fluctuations at the UV scale are removed.
The physical effect of the emergence I propose is with quantum information exterior and interior to a black hole. There exists a duality between the two data sets, and if we were to develop a Planck energy accelerator (which we will not do) then scattering amplitudes should reflect this fact. We do however have a possible window into this with gravity as the “square” of gauge theory. Gluon scattering amplitudes should carry this information as well. This may then be accessible to LHC types of experiments.
I will read your paper in greater detail in the near future, for it is one of the better ones I have seen submitted. It might take me a week or so to make more detailed comments.
Cheers LC
Torsten Asselmeyer-Maluga replied on Sep. 3, 2012 @ 13:12 GMT
Dear Lawrence,
thanks for your answer. Yes it is not an easy problem to consider exotic 4-manifolds. Actually from the differential topological point of view, two non-diffeomorphic 4-manifolds are distinct. Therefore you have to sum over these possibilities in the path integral and for each class the measure remains the same. (see arXiv:1112.4882 and arXiv:1003.5506)
Your idea about the splitting of the 11-manifold (I assume a compactification?) looks interesting. The meaning of the E_8 in the intersection forms of 4-manifolds and its relation to the corresponding Lie group is mysterious for me too. Currently I have no idea how to bridge this gap. A possible way is the theory of calibrated manifolds. Every oriented 4-manifold embeds into a 7-manifold, and one can choose a G_2 structure on the 7-manifold. Then the 4-manifold is an associated submanifold of this calibrated geometry. The deformation theory has a large overlap with Seiberg-Witten theory (the way to describe exotic 4-manifolds).
Another connection is via singularity theory (Arnold's approach) using the ADE singularities. The E_8 singularity is directly related to the E_8 4-manifold.
I agree that spacetime has no foam structure. In particular, I like your argument that there is only one of each type of elementary particle. It resolves a conundrum in this theory (so I have to go more deeply into your essay and the corresponding papers). When we started to describe matter by exotic smoothness we used the Casson handle and obtained similar results to those in our (now published) paper arXiv:1006.2230. But we always got one type of particle for each type. Maybe you understand the reason, and I will read it more carefully. In principle, in the current version of the paper we have the same problem: we obtain the fermions as knot complements and the bosons as torus bundles. There are three types of torus bundles related to the usual groups U(1), SU(2) and SU(3). Currently we conjecture that gravity is a sphere bundle (which would explain the universality). Interestingly, there are connections between the Anosov torus bundle (representing the SU(3) gluons) and the sphere bundle, which I omit here. Gravity as the "square" of a gauge theory is very interesting for me in this context.
I also have to understand your duality more fully. I remember back on lectures of Faddeev about Yangians (currently I am dusting off my notes from those lectures). The deformation theory of Lie algebras is also part of our description of exotic smoothness. We obtain the deformation in a natural way using codimension-1 foliations. Then we obtain a relation to skein spaces (used to define R matrices for quantum groups).
So our approaches converge in some sense; I will read your essay more carefully.
Best wishes
Torsten
report post as inappropriate
Author Lawrence B Crowell replied on Sep. 3, 2012 @ 23:44 GMT
Dear Torsten,
Thanks for the reply. I will try to read your papers on this in the near future. I also need to review matters of the Atiyah-Singer index, Seiberg-Witten theory, Freedman- Uhlenbeck work on moduli at singular points and the rest. Back in the late 1990s I was better spun up with these matters.
The one thing which I think needs to be considered is that spacetime is hyperbolic, and all of this algebraic geometry machinery is set up for elliptic complexes. We might of course Euclideanize spacetime by considering τ = it. We then have –dt^2 = dτ^2 and we patch over the problems. This in effect deforms the moduli space so that sequences of gauge equivalent connections converge, say as a Cauchy sequence. Without this trick the moduli space is not Hausdorff and we do not have universal convergence conditions.
An 11-dimensional spacetime, 10 space plus 1 time, decomposes into M^{3,1} plus M^7. Poincaré duality on the total space tells us that homological data on the 4-dim part is equivalent to the data on the 7-dim part. Of course this may not necessarily contain all the data, where homotopy tends to contain more. However, if we were to run with this, the exotic data for smoothness might be contained in the 7-dimensional part. Of course at lower energy these spaces become compactified. In the 10-dimensional supergravity theory the space of compactification is a Ricci flat 6-dimensional space. A canonical example is the 6-torus. A more potentially realistic theory is K3xK3. The 7-dimensional space in the 11-dimensional theory embeds the 6-manifold.
The first exceptional group G_2 fixes a basis in a 7-sphere, as vectors in J^2(O). This consists of the vector V and the two spinors S1 and S2. This fixes a vector in spin(7) on the 7-sphere, with spin(7)/G_2 ~ S^7. G_2 fixes a frame for the octonions or E_8 and acts as a gauge group. In addition
dim(G_2) = dim(spin(7)) – dim(S^7) = 21 – 7 = 14
The complexified version of G_2 (G_2xC) is seen from the double covering so(O) ~ so(8). The inclusion of the algebra g_2 into so(O) maps a 14-dim space into the 28 dimensions of so(8).
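(A trivial dimension count consistent with this, as a small Python bookkeeping sketch.)

def dim_so(n):
    # dimension of the rotation algebra so(n) is n(n-1)/2
    return n * (n - 1) // 2

print(dim_so(7))        # 21 = dim spin(7)
print(dim_so(7) - 7)    # 14 = dim(G_2) = dim(spin(7)) - dim(S^7)
print(dim_so(8))        # 28 = dim so(8), the target of the g_2 inclusion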
There then seems to be some possible relationship between this G_2 (the automorphism group of the octonions) and the cyclic group of order 28 formed by the distinct smooth structures on Milnor's exotic 7-spheres. I also think this G_2 as a gauge action plays a possible role in the holographic reduction of 10-dim supergravity. The physics boosted to the "infinite momentum frame," sometimes called the light cone condition or gauge, reduces the theory to so(9) ~ B_4. The G_2 plays a special role with the next complex group F_4, where F_4 = cent_{E8}(G_2), and the two groups mutually centralize each other. The F_4 group gives
F_4/B_4:1 --- > spin(9) --- >F_{52/16} --- > OP^2
which is the sequence from B_4 = spin(9) to the octonionic projective (Cayley) plane OP^2.
Enough of the mathematics for now. It is curious that in your work you found only one particle. What I argue from physical grounds in one of my references is that a D-brane that is highly boosted will exhibit finer grained dynamics, as seen with Feynman's wee partons. This means the number of degrees of freedom on a D-brane increases. The highly boosted D-brane then contains holographic information that is becoming redundantly represented. It does not make physical sense for the number of real degrees of freedom to increase. Instead there is only the appearance of an increase. So I argue by ansatz that a particle exists as only one fundamental state, but that holography induces multiple configuration-variable representations of that particle. It is then astounding that you have found a situation where there can only exist one of each type of particle.
Cheers LC
Torsten Asselmeyer-Maluga replied on Sep. 4, 2012 @ 19:21 GMT
Dear Lawrence,
interesting math, in particular the special things about E_8 and G_2. I have to study the octonionic projective plane, an interesting relation.
Yes, the appearance of a single particle was also a surprise for me. Maybe I have to understand more of your work.
Very interesting ideas, thanks a lot for your time.
Torsten
report post as inappropriate
Author Lawrence B Crowell replied on Sep. 5, 2012 @ 16:00 GMT
In a response to Jonathan Dickau I make greater mention of these matters. I also make a bit of a pitch for your essay.
Cheers LC
Member Giovanni Amelino-Camelia wrote on Sep. 1, 2012 @ 20:32 GMT
dear Lawrence
as you suggested in a post related to my essay, there are some connections between our essays, in spite of the differences of approach and goals
and now that I have studied your essay I can observe that there are closer connections between parts of your essay and some of my works, see e.g.
http://arxiv.org/abs/arXiv:1206.3805
http://arxiv.org/abs/arXiv:1107.1724
http://arxiv.org/abs/arXiv:1101.0931
best wishes for the competition
Giovanni
report post as inappropriate
Author Lawrence B Crowell replied on Sep. 2, 2012 @ 02:05 GMT
Dear Giovanni,
I just started reading "Relative locality in a quantum spacetime and the pregeometry of κ-Minkowski", http://arxiv.org/pdf/1206.3805v1.pdf. You seem to be pointing to a similar end. Noncommutative geometry and Hopf algebras are a main tool in the work with Yangians. I will write more when I complete reading your paper.
Equation 1 is interesting, for it proposes a noncommutative relationship between time and the spatial coordinates. This in my opinion harkens back to an old argument by Bohr. In 1930 there was a famous Solvay conference where Einstein and Bohr sparred over the reality of quantum mechanics. Einstein was convinced of reality and locality and argued staunchly for an incompleteness of quantum mechanics. Quantum theory could only be made complete if there are some hidden variables that underlie its probabilistic, nonlocal, quirky aspects. At the 1930 Solvay conference Einstein proposed an interesting thought experiment. Einstein considered a device which consisted of a box with a door in one of its walls controlled by a clock. The box contains radiation, similar to a high-Q cavity in laser optics. The door opens for some brief period of time t, which is known to the experimenter. The loss of one photon with energy E = ħω reduces the mass of the box-clock system by m = E/c^2, which is weighed. Einstein argued that knowledge of t and the change in weight provides an arbitrarily accurate measurement of both energy and time, which may violate the Heisenberg uncertainty principle ΔEΔt ~ ħ.
Bohr realized that the weighing of the device is performed by the displacement of a scale in spacetime. The clock's new position in the gravity field of the Earth, or any other mass, will change the clock rate by gravitational time dilation as measured from some distant point where the experimenter is located. The temporal metric term for a spherical gravity field is 1 - 2GM/rc^2, where a displacement by some δr means the change in the metric term is ~ (GM/c^2r^2)δr. Hence the clock's time interval T is measured to change by a factor
T --> T sqrt{1 - 2GMδr/(c^2r^2)} ~ T(1 - GMδr/(r^2c^2)),
so the clock appears to tick slower. This changes the time span the clock keeps the door on the box open to release a photon. Assume that the uncertainty in the momentum is given by Δp ~ ħ/δr < TgΔm, where g = GM/r^2. Similarly the uncertainty in time is found as ΔT = (Tg/c^2)δr. From this ΔT > ħ/(Δmc^2) is obtained, and hence the Heisenberg uncertainty relation ΔTΔE > ħ. This demands a Fourier transformation between position and momentum, as well as time and energy.
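(Bohr's bookkeeping can be run symbolically; here is a minimal sympy sketch that treats the bounds above as equalities.)

import sympy as sp

hbar, g, c, T, dr, dm = sp.symbols('hbar g c T Delta_r Delta_m', positive=True)

# Weighing to accuracy dm over time T must resolve the impulse T*g*dm, so the
# Heisenberg relation for the scale gives the minimal position uncertainty:
dr_min = hbar / (T * g * dm)

dT = (T * g / c**2) * dr     # gravitational time dilation over the height uncertainty
dE = dm * c**2               # energy resolution of the weighing

print(sp.simplify(dT.subs(dr, dr_min) * dE))   # hbar, i.e. Delta_T * Delta_E ~ hbar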
Consider an example with the Schwarzschild metric terms. The metric change is then ~ 1x10^{-12}m^{-1}δr, which for δr = 10^{-3}m is around 10^{-15}. Thus for an open door time interval of 10^{-2}sec, the time uncertainty is around Δt ~ 10^{-17}sec. The uncertainty in the energy is then ħΔω, where by Fourier reasoning Δω ~ 10^{17}sec^{-1}. Hence the Heisenberg uncertainty is ΔEΔt ~ ħ.
This argument by Bohr is one of those things which I find myself re-reading; in my opinion it is one of the spectacularly brilliant moments in physics.
This holds in some part at the quantum level with gravity, even if we do not fully understand quantum gravity. Consider the clock in Einstein's box as a black hole with mass m. The quantum periodicity of this black hole is given by some multiple of Planck masses. For a black hole of an integer number n of Planck masses, the time it takes a photon to travel across the event horizon is t ~ Gm/c^3 = nT_p, which we take as the time intervals of the clock. The uncertainty in the time the door to the box remains open is
ΔT ~ Tg/c(δr - GM/c^2),
as measured by a distant observer. Similarly the change in the energy is given by E_2/E_1 = sqrt{(1 - 2M/r_1)/(1 - 2M/r_2)}, which gives an energy uncertainty of
ΔE ~ (ħ/T_1)g/c^2(δr - GM/c^2)^{-1}.
Consequently the Heisenberg uncertainty principle still holds ΔEΔT ~ ħ. Thus general relativity beyond the Newtonian limit preserves the Heisenberg uncertainty principle. It is interesting to note in the Newtonian limit this leads to a spread of frequencies Δω ~ sqrt{c^5/Għ}, which is the Planck frequency.
The uncertainty ΔE ~ ħ/Δt does lead to a funny situation, where if the energy ΔE is larger than the Planck mass there is the occurrence of an event horizon. The horizon has a radius R ~ 2GΔE/c^4, which is the uncertainty in the radial position R = Δr associated with the energy fluctuation. Putting this together with the Planckian uncertainty in the Einstein box we then have
ΔrΔt ~ (2Għ)/c^4 = L^2_{Planck}/c.
So this argument can be pushed to understand the nature of noncommutative coordinates in quantum gravity.
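(Numerically, with standard SI values; the order-one factor is absorbed in the ~.)

import math

G, hbar, c = 6.674e-11, 1.055e-34, 2.998e8
L_planck = math.sqrt(G * hbar / c**3)

print(2 * G * hbar / c**4)    # ~1.7e-78 m*s, the uncertainty product above
print(L_planck**2 / c)        # ~8.7e-79 m*s, the same thing up to the factor of 2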
Cheers LC
Rick Lockyer wrote on Sep. 2, 2012 @ 17:29 GMT
Lawrence,
Do you really think the variation of parameters method implies a *particle* “tries” all neighboring paths and “chooses” the one that minimizes variation of the Lagrangian?
I have always believed the mathematician or physicist does the varying as a purely mathematical process to find the *actual* path the particle takes because it has no choice in the matter, nor capacity to make any decision between choices.
This position you appear to be taking seems like a canard to justify or legitimize non-deterministic concepts.
On my essay blog you asked about the fundamental nature of Octonion Algebra, and asked me to look at your essay and particularly the response threads. My essay clearly provides the fundamental connection between Octonions and physical reality, but perhaps not in the way you were looking for. In my response to you I mentioned those of a mind (you particularly) that believe it is important to unify QM with GR might be better off trying to unify QM with Octonion Relativity, especially if there is a link between QM and Octonion Algebra. Your essay responders might find illumination on the fundamental connection between Octonion Algebra and physical reality, and what I mean by “Octonion Relativity” by reading my essay
The Algebra of Everything.
Rick
report post as inappropriate
Rick Lockyer replied on Sep. 2, 2012 @ 18:34 GMT
Author Lawrence B Crowell replied on Sep. 2, 2012 @ 21:09 GMT
I have given your essay a read-through, which means I have not yet read it a second time for greater detail and content.
The quantum path integral is a measure over the distribution of a quantum field or particle. It assigns an amplitude to each path; in the classical (stationary phase) limit this converges to the classical variational method.
The connection between quantum mechanics and octonions is not completely clear. The nonvanishing associator (ab)c - a(bc) is not as well founded in terms of operators as noncommutative structures are. Further, the physical meaning is not as clear. I think octonions are really a system of quaternions (7 of them) which are related to each other by a general duality principle. This duality principle may then be expressed by the associator.
Cheers LC
Jonathan J. Dickau wrote on Sep. 5, 2012 @ 04:46 GMT
Hi Lawrence,
I read through your essay, but have not returned to it yet to read for detail. But I've noted some of your comments, and wanted to add one or two of my own. First off, I saw your EJTP paper on "Counting States in Spacetime" which you posted on Rick Lockyer's essay site, and I note several points of overlap with the following paper by Frank Potter,
Our Mathematical Universe: I. Second, as I understand it, octonions can indeed be represented as a system of 7 quaternions, but then the quaternion variables must be resolved in a definite order or sequence, or handled in a consistent way, as the effect of each term is cumulative (as with procedural steps or process stages). I think Rick uses the term ensemble multiplication.
But this is not quite the same as saying that the 'octonions are really a system of quaternions.' Maybe O is more fundamental than H, as Rick asserts. But perhaps saying octonions can be treated as an ordered or nested system of quaternions would work, though.
Regards,
Jonathan
report post as inappropriate
Author Lawrence B Crowell replied on Sep. 5, 2012 @ 16:02 GMT
Hi Jonathan,
Thanks for the paper. In looking at it I see many things which are in my notes and which I have in other papers and the book “Sphere Packing, Lattices and Codes” by Conway and Sloane.
The graininess of spacetime is something which I think only comes about with the measurement of black hole states. As I indicated on
Giovanni Amelino-Camelia’s essay blog site there is an uncertainty principle,
ΔrΔt ~ (2Għ)/c^4 = L^2_{Planck}/c.
which is commensurate with equation 1 on Giovanni’s
paper. Spacetime appears grainy depending upon the type of measurement one performs. In the case of a quantum black hole a measurement involves spatial and temporal coordinates in a null congruence called an event horizon. If one makes another type of measurement, spacetime is then as smooth as grease on an ice skating rink. The measurements of delay times for different wavelengths from very distant gamma ray bursts indicate that space is smooth down to a scale of 10^{-50}cm --- far smaller than the Planck scale. This then ties in with some interesting work by
Torsten Asselmeyer-Maluga on the role of exotic four dimensional spaces in quantum gravity. These are homeomorphic spaces that are not diffeomorphic. In 11 dimensions the 7-dimensional space is dual to the 4-dimensional space. The exotic 7-spheres found by Milnor are simpler, with only 28 distinct non-diffeomorphic forms, rather than an infinite number.
The octonions are a system of 7 quaternions. The exotic system in 7-dimensions I think might be connected to the automorphism G_2 in E_8 or SO(O). This would then connect with a physical meaning of octonions and nonassociativity in physics. The Polyakov path integral
Z[A] = ∫δD[ψ]/diff(ψ) Ae^{-iS[ψ]}
"mods out" diffeomorphisms, or equivalently gauge changes, on a moduli space. Yet with exotic spaces this definition becomes strange. However, if there are 7 quaternions which are related to each other by nonassociative products (ab)c – a(bc) =/= 0, then the measure can maybe be realized according to associators δD[ψ]/diff(ψ).
I discussed octonions a bit with Lockyer, but he seemed a bit put off. As I see it, and from some experience, presenting a gauge theory with nonassociative brackets and the like falls pretty flat. I am not necessarily saying this is wrong, but doing that sort of work has a way of getting people to present their backside to you. I think the role of associators is best advanced by other means, so that in the future they may simply be too convincing to ignore.
Cheers LC
Rick Lockyer replied on Sep. 5, 2012 @ 20:57 GMT
Lawrence,
Sorry you were offended by my calling you out for posting on my essay blog without the common courtesy of having read the essay first. I only meant to inform you that you might possibly find some perspective on your question about how Octonion Algebra relates to physical reality since it was the thesis of my essay. Thanks for reading it later. I am curious about your characterization that it is just a gauge theory using associators. The Lorentz gauge mention was simply to demonstrate a point of commonality between 4D and Octonion presentations of Electrodynamics, that’s it. Hardly a cornerstone of the presentation. I never once mentioned the associator, and frankly have never used non-associative brackets in any mathematical description. Octonion Algebra does indeed present a non-zero associator because it is a non-associative algebra. It MUST be so in order to be a normed composition algebra, hence a division algebra. Without this non-associativity and the remainder of O structure, it would be impossible for the algebraic invariances to match up the math to what we can measure or detect, and algebraic variances to give us clues on the math for what is hidden from us but none the less in play.
Rick
report post as inappropriate
Jonathan J. Dickau replied on Sep. 6, 2012 @ 03:49 GMT
Thanks Lawrence,
That nicely spells out where you are coming from. Glad you enjoyed the Potter paper, also. I've not looked at Giovanni's essay yet, but a quick read through of Torsten's paper has made it a 'must read' for the insights he shares. I am certainly not put off by your comments or Rick's and have found a lot of fascinating insights on the forum - even in the points of dispute.
I am glad the back and forth has kept everybody thinking. More fun lies ahead!
all the best,
Jonathan
report post as inappropriate
Author Lawrence B Crowell replied on Sep. 6, 2012 @ 22:59 GMT
Jonathan,
First off, I have not gotten around to reading your paper yet. It is taking me some time to get to them all.
Torsten's work is pretty hard stuff. The differential geometry of exotic spheres runs pretty deep. I studied this for my masters in mathematics, and it has been a while since I have thought much about it. It did occur to me that exotic spheres might have something to do with quantum gravity.
I try to get as many people with their theoretical ideas and results together as I can, because it is not likely that any of us by ourselves will come to the "big picture."
Cheers LC
Author Lawrence B Crowell wrote on Sep. 8, 2012 @ 02:00 GMT
I indicated to Giovanni Amelino-Camelia that there should be some connection between the theory of κ-Minkowski spacetimes with the boost system he advances, and twistor theory. The connection to twistor theory is I think not hard to see. The boost operator P_μ acts on [x_i, x_0] = ilx_i such that
P_μ > [x_i, x_0] = il P_μ > x_i
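(As a toy illustration of this commutator, here is a minimal Python sketch using an assumed 2x2 matrix representation of the underlying ax+b type Lie algebra; it exhibits only the commutator, not the full κ-Minkowski Hopf structure.)

import numpy as np

l = 1.0                                    # deformation length scale, set to 1
E12 = np.array([[0, 1], [0, 0]], dtype=complex)
E22 = np.array([[0, 0], [0, 1]], dtype=complex)

x1 = E12             # a spatial coordinate
x0 = 1j * l * E22    # the time coordinate

comm = x1 @ x0 - x0 @ x1
print(np.allclose(comm, 1j * l * x1))      # True: [x_1, x_0] = i l x_1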
The coordinates (x_j, x_0) we write in spinor form
x_j = σ_j^{aa’}ω_{aa’}
x_0 = σ_0^{aa’}ω_{aa’},
where ω_{aa’} = ξ_a ω_{a’} + ξ_{a’}ω_a. This commutator has the form
[x_i, x_0] = σ_j^{aa’}σ_0^{bb’}[ω_{aa’}, ω_{bb’}]
= iC^{cc’}_{aa’bb’} σ_j^{aa’} σ_0^{bb’} ω_{aa’}
= i|C| σ_j^{aa’}ω_{aa’}
where the magnitude of the structure matrix is |C| = l. In general this may be written for
x_j = σ_j^{aa’}ω_{aa’}
x_0 = σ_0^{aa’}ω_{aa’} + iq_{aa’}π^{aa’},
where the commutator [ω_{aa’}, π^{bb’}] = iδ_a^bδ_{a’}^{b’} and the general form of the commutator is then
[x_i, x_0] = i|C| σ_j^{aa'}ω_{aa'} + iσ_j^{aa'}q_{bb'}[ω_{aa'}, π^{bb'}]
[x_i, x_0] = ilσ_j^{aa’}ω_{aa’} - σ_j^{aa’}q_{aa’}.
The boost operation B = 1 + a^l_jP^j on the commutator [x_i, x_0] is then equivalent to the commutation between spinors [ω_a, ω’_b] for ω’_b = ω_b + iq_{bb’}π^{b’},
[ω_a, ω’_b] = [ω_a, ω_b] + iq_{bb’}[ω_a , π^{b’}]
= C^c_{ab} ω_c + iq_{ab}.
This could be explored more deeply. Ed Witten initiated the "twistor revolution" in string theory. If twistors are connected to κ-Minkowski spacetime there might then be a link between string theory and LQG and other "edge-link" types of quantum gravity theories. This would be potentially interesting, for it might serve to correct the difficulties with each of these.
Cheers LC
Pentcho Valev wrote on Sep. 10, 2012 @ 19:15 GMT
Lawrence,
You wrote: "Einstein changed Newton's laws by adjusting the first and third laws, motivated by the locality of electromagnetic fields predicted by Maxwell's equations."
Einstein did not adjust anything - he just introduced two postulates the second of which was false. In 1887 the Michelson-Morley experiment had refuted the light postulate and had confirmed the variable speed of light predicted by Newton's emission theory of light. It is time for you, Lawrence, to stop claiming that Banesh Hoffmann, John Norton and John Stachel are wrong:
http://www.amazon.com/Relativity-Its-Roots-Banesh-Hoffmann/dp/0486406768
"Relativity and Its Roots" By Banesh Hoffmann: "Moreover, if light consists of particles, as Einstein had suggested in his paper submitted just thirteen weeks before this one, the second principle seems absurd: A stone thrown from a speeding train can do far more damage than one thrown from a train at rest; the speed of the particle is not independent of the motion of the object emitting it. And if we take light to consist of particles and assume that these particles obey Newton's laws, they will conform to Newtonian relativity and thus automatically account for the null result of the Michelson-Morley experiment without recourse to contracting lengths, local time, or Lorentz transformations. Yet, as we have seen, Einstein resisted the temptation to account for the null result in terms of particles of light and simple, familiar Newtonian ideas, and introduced as his second postulate something that was more or less obvious when thought of in terms of waves in an ether."
http://www.aip.org/history/einstein/essay-einstein-relativity.htm
John Stachel: "An emission theory is perfectly compatible with the relativity principle. Thus, the M-M experiment presented no problem; nor is stellar abberration difficult to explain on this basis."
http://www.philoscience.unibe.ch/documents/kursarchiv/SS07/Norton.pdf
John Norton: "These efforts were long misled by an exaggeration of the importance of one experiment, the Michelson-Morley experiment, even though Einstein later had trouble recalling if he even knew of the experiment prior to his 1905 paper. This one experiment, in isolation, has little force. Its null result happened to be fully compatible with Newton's own emission theory of light. Located in the context of late 19th century electrodynamics when ether-based, wave theories of light predominated, however, it presented a serious problem that exercised the greatest theoretician of the day."
http://philsci-archive.pitt.edu/1743/2/Norton.pdf
John Norton: "In addition to his work as editor of the Einstein papers in finding source material, Stachel assembled the many small clues that reveal Einstein's serious consideration of an emission theory of light; and he gave us the crucial insight that Einstein regarded the Michelson-Morley experiment as evidence for the principle of relativity, whereas later writers almost universally use it as support for the light postulate of special relativity. Even today, this point needs emphasis. The Michelson-Morley experiment is fully compatible with an emission theory of light that CONTRADICTS THE LIGHT POSTULATE."
Pentcho Valev pvalev@yahoo.com
report post as inappropriate
Author Lawrence B Crowell wrote on Sep. 10, 2012 @ 21:39 GMT
I am not sure why you have decided to make it your life's work to discredit relativity. You keep posting the same thing over and over, with the same references.
The invariance of the interval, equivalently the constancy of the speed of light, means in addition to the three rotations of space there are three Lorentz boosts. The physics of this has been tested literally thousands of times in many different ways. The empirical support for relativity is simply overwhelming. You are not going to find many people here who are well grounded in physics who agree with you.
Cheers LC
Pentcho Valev replied on Sep. 11, 2012 @ 09:55 GMT
Lawrence,
Roger Schlafly wrote in his site:
"Pentcho, you are right that the emission theory was the only known explanation [of the null result of the Michelson-Morley experiment] in 1887..."
Is Roger right? Also, Lawrence, you used to claim that John Norton is wrong when he says that:
http://philsci-archive.pitt.edu/1743/2/Norton.pdf
John Norton: "The Michelson-Morley experiment is fully compatible with an emission theory of light that CONTRADICTS THE LIGHT POSTULATE."
Do you still believe Norton is wrong, Lawrence?
Pentcho Valev
report post as inappropriate
Author Lawrence B Crowell wrote on Sep. 11, 2012 @ 12:44 GMT
I have not yet read Schlafly's essay or the posts on his blog. I am not particularly interested in revisiting old stuff like this. Whether one can interpret the M-M experiment in different ways is of little interest to me. Lorentz interpreted the result as due to a length contraction that nullified the effect of the putative aether. Einstein was apparently not aware of the M-M experiment at all. Whichever is the case with interpreting the M-M experiment, it is not relevant. Special relativity has been tested by many dozens of other types of experiments repeated many thousands of times. I am not sure why anybody would want to take up the cause of trying to overturn relativity this way. There were people up to the early 19th century who wanted to overturn Newton as well.
Cheers LC
Pentcho Valev replied on Sep. 11, 2012 @ 15:10 GMT
The fact is that, in 1887, Newton's emission theory stating that the speed of light varies in accordance with the equation c'=c+v (v is the speed of the light source relative to the observer) was the ONLY existing theory capable of explaining the null result of the Michelson-Morley experiment.
You find this fact unimportant and accordingly occupy the top of the community rating list. I find this fact extremely important and am at the bottom. Simple isn't it?
Pentcho Valev
report post as inappropriate
Author Lawrence B Crowell replied on Sep. 11, 2012 @ 16:01 GMT
As for the rankings, there are two possible reasons for this. The first is that relativity is all wrong and has been propped up for over a century by an international scientific conspiracy. Those involved with the conspiracy or who believe its falsehood are wrongly voting your paper down. The other possibility is that you are simply wrong in your thesis that relativity is wrong based on an interpretation of an experiment performed over 130 years ago. You are not alone in such conspiracy claims. Some people who advance local hidden variables cry how the physics world has gone astray, and more recently a certain politically motivated "alt-science" community claims there is a big conspiracy to demolish the economy with global warming concerns by climatologists.
I tend to avoid these things, along with claims the 9/11 attack was an inside job, grassy knolls with Kennedy's assassination, Princess Diana's death was an inside job, and so forth. It is not possible to absolutely prove these things false, but seriously entertaining them is probably about as productive as masturbation is with impregnating your wife.
Cheers LC
Pentcho Valev replied on Sep. 11, 2012 @ 17:18 GMT
That "relativity is all wrong" is a fact often hinted at by high-ranking Einsteinians:
http://www.perimeterinstitute.ca/pdf/files/975547d7-2d00-433a-b7e3-4a09145525ca.pdf
Albert Einstein (1954): "I consider it entirely possible that physics cannot be based upon the field concept, that is on continuous structures. Then nothing will remain of my whole castle in the air, including the theory of gravitation, but also nothing of the rest of contemporary physics."
http://www.fqxi.org/community/articles/display/148
"Many physicists argue that time is an illusion. Lee Smolin begs to differ. (...) Smolin wishes to hold on to the reality of time. But to do so, he must overcome a major hurdle: General and special relativity seem to imply the opposite. In the classical Newtonian view, physics operated according to the ticking of an invisible universal clock. But Einstein threw out that master clock when, in his theory of special relativity, he argued that no two events are truly simultaneous unless they are causally related. If simultaneity - the notion of "now" - is relative, the universal clock must be a fiction, and time itself a proxy for the movement and change of objects in the universe. Time is literally written out of the equation. Although he has spent much of his career exploring the facets of a "timeless" universe, Smolin has become convinced that this is "deeply wrong," he says."
http://www.newscientist.com/article/mg20026831.500-what-makes-the-universe-tick.html
"Newton and Leibniz debated this very point. Newton portrayed space and time as existing independently, while Rovelli and Brown share Leibniz's view that time and space exist only as properties of things and the relationships between them. It is still not clear who is right, says John Norton, a philosopher based at the University of Pittsburgh, Pennsylvania. Norton is hesitant to express it, but his instinct - and the consensus in physics - seems to be that space and time exist on their own. The trouble with this idea, though, is that it doesn't sit well with relativity, which describes space-time as a malleable fabric whose geometry can be changed by the gravity of stars, planets and matter. If the central property of space-time is the result of the existence of matter, how can we be sure that space and time exist on their own and are not convenient illusions? "Hence my hesitation," Norton says. While Norton hesitates, Smolin is intent on rescuing time. He believes time has to be real and that it is a fundamental property of the universe."
http://www.amazon.com/Faster-Than-Speed-Light-Speculation/dp/0738205257
Joao Magueijo, Faster Than the Speed of Light: The Story of a Scientific Speculation, p. 250: "Lee [Smolin] and I discussed these paradoxes at great length for many months, starting in January 2001. We would meet in cafés in South Kensington or Holland Park to mull over the problem. THE ROOT OF ALL THE EVIL WAS CLEARLY SPECIAL RELATIVITY. All these paradoxes resulted from well known effects such as length contraction, time dilation, or E=mc^2, all basic predictions of special relativity. And all denied the possibility of establishing a well-defined border, common to all observers, capable of containing new quantum gravitational effects."
https://webspace.utexas.edu/aam829/1/m/Relativity.html
Alberto Martinez: "Does the speed of light depend on the speed of its source? Before formulating his theory of special relativity, Albert Einstein spent a few years trying to formulate a theory in which the speed of light depends on its source, just like all material projectiles. Likewise, Walter Ritz outlined such a theory, where none of the peculiar effects of Einstein's relativity would hold. By 1913 most physicists abandoned such efforts, accepting the postulate of the constancy of the speed of light. Yet five decades later all the evidence that had been said to prove that the speed of light is independent of its source had been found to be defective."
Pentcho Valev
report post as inappropriate
Author Lawrence B Crowell replied on Sep. 11, 2012 @ 18:58 GMT
Science and physics are not about certitude. We can never be certain that our understanding of the world is complete, any more than science can prove that Cthulhu will not arise from his sleep and destroy everything. We can say, though, that within its domain of applicability relativity, and more specifically special relativity, works well. The issues raised by Smolin pertain to questions about a possible quantum underpinning of general relativity.
Cheers LC
Yuri Danoyan wrote on Sep. 12, 2012 @ 02:40 GMT
Lawrence
Arnold was a great mathematician, not a metaphysician,
but his favorite observation was trinities
http://www.neverendingbooks.org/index.php/arnolds-trinities-version-20.html
report post as inappropriate
Member Benjamin F. Dribus wrote on Sep. 18, 2012 @ 06:07 GMT
Dear Lawrence,
I found your essay very intriguing and absolutely packed with interesting information, which will require some more thought to digest. One question: near the end of the paper you are discussing path integration involving paths in "what becomes the emergent spacetime." Now, of course in some theories that make use of path sums, the paths are in a configuration space of "universes" (geometries, triangulations, or whatever), rather than in a single lower-level structure. I am wondering if there are two different quantum notions occurring here, one involving the "spacetime" itself, and one involving paths in the spacetime? Take care,
Ben Dribus
report post as inappropriate
Author Lawrence B Crowell replied on Sep. 18, 2012 @ 17:59 GMT
Ben,
I have not yet gotten to reading your essay. There are lots of them here and it is not possible to read more than one or two in a day. I have it in mind to read yours, as it has been lofted for the most part towards the top of the ratings.
What you are asking is related to a discussion I had with Torsten Asselmeyer-Maluga. He argues that spacetime is completely continuous. This could represent one complementary aspect of spacetime; the quantization of spacetime may include a classical continuum in one representation and a noncommutative, non-classical structure in another. This may play some role in the two quantum concepts you are referring to.
Sabine Hossenfelder makes an argument similar to this.
The BCFW recursion formula is rooted in twistor theory, and has been used in the BlackHat algorithm for computing gluon amplitudes at the LHC. Gravitation is in one sense the square of QCD gauge theory. After reading Giovanni Amelino-Camelia's essay, which has regrettably and I think wrongly fallen down the community ranking, I suggested a connection between the boosts employed in κ-Minkowski spacetime and twistor theory. This might be a connection between string theory and the more loopy or triangulated theories like LQG. The Wheeler-DeWitt equation has no time variable. Physically this means there is no Gaussian surface one may arrange in spacetime to localize energy, so i∂Ψ[g]/∂t = HΨ[g] = 0. The time variable here is a coordinate time, used in QFT equations, which is not a proper variable in general relativity. Hence this equation is a constraint equation, classically NH = 0 and N^iH_i = 0. String theory on the other hand requires some external background field from which gravitons as closed strings are represented. This has been a problem the LQG folks like to point out --- never mind that LQG has failed to produce even a first-order renormalizable calculation. I speculate that somehow the two views of quantum gravity might connect, where LQG provides the background or constraint for string theory, and the field calculations of strings make LQG more tractable.
Cheers LC
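To spell out the constraint statement a little (a standard ADM-form sketch added here for reference, not anything specific to the construction above; N is the lapse, N^i the shift, H and H_i the Hamiltonian and momentum constraints):
H_T = ∫ d^3x ( N H + N^i H_i ),   H Ψ[g] = 0,   H_i Ψ[g] = 0,
so that i∂Ψ[g]/∂t = H_T Ψ[g] = 0 for any choice of lapse and shift; the t here is pure coordinate labeling with no dynamical content.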
Yuri Danoyan wrote on Sep. 18, 2012 @ 14:37 GMT
What is your opinion of Gerard 't Hooft's
"Discreteness and Determinism in Superstrings"?
arXiv:1207.3612
report post as inappropriate
Author Lawrence B Crowell replied on Sep. 18, 2012 @ 23:58 GMT
It looks interesting. It might take me a day or two to read it.
Cheers LC
Hoang cao Hai wrote on Sep. 19, 2012 @ 14:41 GMT
Dear
Very interesting to see your essay.
Perhaps each of us is convinced that his own choice is right! That, of course, is reasonable.
So maybe we should work together to consider clearly what defines the basic theoretical foundations, as the most intellectually challenging task for all of us.
Why do we not try to start with a real challenge that is very close and is the focus of interest of human science: the matter of mass and the Higgs boson of the Standard Model?
I would like your knowledge, belief and reasoning expressed as an opinion on this matter:
Do you think that mass is the expression of an impact force on matter - so that with no impact force we do not feel the Higgs boson - similar to the case of weightlessness outside the Earth's atmosphere?
Does there need to be a particle with mass for everything to have volume? If so, then why does the mass of everything change when moving from the Earth to the Moon? Is the Higgs boson lighter because the Moon's gravity is weaker than the Earth's?
The LHC particle accelerator is used to "smash" particles until a Higgs boson is "ejected", but why can we see it only when smashing, and not when the machine is off?
Can Higgs particles be "locked up"? And when one is "released", if we do not act on it with any force, how do we know whether it is "out" or not?
You should boldly give a definition of weight that you think is right for us to enjoy, or oppose my opinion.
Because in the process of research, "failure" and "success" have similar value for science. Must a correct theory be one without any wrong point at all?
Glad to see comments from you soon, because there are still too many of the same problems.
Regards!
Hải.Caohoàng of THE INCORRECT ASSUMPTIONS AND A CORRECT THEORY
August 23, 2012 - 11:51 GMT on this essay contest.
report post as inappropriate
Author Lawrence B Crowell replied on Sep. 19, 2012 @ 19:37 GMT
Hi,
I am a bit uncertain about what you are saying here. Mass and weight are different things. Weight is just mass under the acceleration of gravity, F = ma, where the acceleration a is the local gravity on a planet, such as on Earth a = g = 9.8 m/s^2.
The Higgs field is a complex doublet, with four real components in total, where three of them are absorbed by the Z^0 and W^{+/-} particles and the remainder is the Higgs particle recently detected. At very high energy all four components are free, and the three that are absorbed into the Z^0 and W^{+/-} at lower energy appear as free Goldstone bosons. At lower energy these Goldstone modes are absorbed; this absorption is the Higgs mechanism.
Good luck in the essay contest,
LC
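As a minimal numerical illustration of the mass/weight distinction (a sketch in Python; the surface gravities are standard approximate values):
# The same mass has different weights under different local gravity.
mass_kg = 70.0        # mass, an invariant property of the body
g_earth = 9.8         # m/s^2, approximate surface gravity of the Earth
g_moon = 1.62         # m/s^2, approximate surface gravity of the Moon
weight_earth = mass_kg * g_earth   # weight F = m*a
weight_moon = mass_kg * g_moon
print(f"mass = {mass_kg} kg everywhere")
print(f"weight = {weight_earth:.0f} N on Earth, {weight_moon:.0f} N on the Moon")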
Hoang cao Hai replied on Oct. 2, 2012 @ 04:01 GMT
Thank you Lawrence B Crowell
Based on my research, the separation of the concepts of "weight" and "mass" is a mistake that stems from the failure to identify "gravity" specifically.
But perhaps we should not discuss this further when we each use a different argument.
report post as inappropriate
Author Lawrence B Crowell replied on Oct. 2, 2012 @ 13:43 GMT
It is often the case that mass and weight are used interchangeably in ordinary language.
Cheers LC
Patrick Alan Hutchinson wrote on Sep. 20, 2012 @ 21:34 GMT
Hello Lawrence
Thank you for your essay. It gives the clearest presentation I have yet come across of what seems to be a fundamental muddle pervading the subject. It appears in the notion which you describe very clearly on your page 3:
"The light cone at any point is subject to quantum fluctuations. Consequently the point where all null rays pass through is indeterminate; null rays in the region are not connected to a unique point."
To start with, the term "light cone" is an unfortunate misnomer. Light consists of waves. It does not travel along lines in cones.
The quoted passage does not say what the Heisenberg uncertainty principle asserts. Heisenberg's idea implies that any attempt to observe the region where null rays converge will produce fluctuating answers. This is not the same as asserting that there is no point where these rays converge. The uncertainty principle just says that we can't see it clearly; in fact, any physical phenomenon can't be relied on to behave as if there were such a point because of the modern equivalent of Newton's third law. If an event occurred at that point and had an effect on some physical phenomenon, then the phenomenon would have an equal and opposite effect on whatever caused that event, and knock it off that point.
It would be foolish to speculate how theoretical physicists think. However, maybe one can outline a sequence of mental conceptions which lead in the direction of this muddle.
Conception 1: an event occurs when two or more particles bounce off each other.
Conception 2: we see a thing by detecting particles which have been bounced off that thing.
Conception 3: particles are wave-like, and fuzzy, in accordance with QT.
Conception 4: our faculties and intellects are not fuzzy. If it is impossible for us to see something then it isn't there.
The problem lies in Conception 4. We are fuzzy, and we can't escape this fact precisely because we are huge lumps of interacting wave-like particles. The muddle occurs because we are reluctant to admit our own limitations.
I think this is the root cause of the notion that space-time is granular, not a smooth continuum, and all the consequent hassle.
Best wishes
Alan H.
report post as inappropriate
Author Lawrence B Crowell replied on Sep. 22, 2012 @ 01:06 GMT
A light cone is a spacetime representation of the path a spherically expanding light pulse takes through spacetime. If spacetime is noncommutative on a small scale, this light-cone vertex is not a point but a region through which a set of null rays pass. The term ray is used here in a mathematical sense more than as the geometric-optics concept of a ray.
Cheers LC
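For concreteness, in ordinary Minkowski coordinates the classical (commutative) light cone of the origin is just the locus
c^2 t^2 - x^2 - y^2 - z^2 = 0,
with the future cone t > 0 and the past cone t < 0 meeting at a single vertex; the noncommutative statement above is that this sharp vertex gets smeared into a region.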
Patrick Alan Hutchinson replied on Sep. 22, 2012 @ 20:41 GMT
If one is discussing physics then a "spherically expanding light pulse" cannot start from a point but from a region. If one is discussing maths then what a light cone is depends on one's assumptions. I know nothing worth mentioning of noncommutative geometry, and take your word for it that a light cone in noncommutative geometry has some sort of a vertex which is, as you say, a region, but one can make other assumptions.
The first two pages of your essay are very nicely written. On page 3, you lose me. Do you regard "quantization of spacetime" as an assumption or a theorem or what? The essay seems to be based on some sort of premiss that spacetime must be quantized in any mathematical model of observed physics. Is that what you assume? If so, as seems to be the case from your page 4:
"Our world is on the boundary of an anti-de Sitter (AdS) spacetime, where the interior is quantum gravity ..."
there has to be justification. I think it is conceivable that physics has other models in which spacetime is not "quantized", whatever that means, but is just a straightforward manifold with a metric and connection. The metric and connection may not be as Einstein suggested, but not very different. It seems quite possible that all the noncommutative aspects of observed physics can be modelled by solutions of assumptions expressed using much more conventional functional methods.
This is not to say that noncommutative geometry and quantization of spacetime and e.g. Asselmeyer's ideas about exotic smooth structures are "wrong". It is also conceivable that all these different approaches yield equivalent models which all reflect observed physics, much as the Schrodinger and Heisenberg approaches to QT match each other. (If they are, that would suggest some fascinating theorems.) As yet, I think it is just too soon to commit to one set of assumptions and reject all others. At the very least, assumptions should be stated.
bw. Alan H.
report post as inappropriate
Author Lawrence B Crowell replied on Sep. 22, 2012 @ 23:25 GMT
I attach a picture of a light cone in spacetime. There is a hypersurface of space that is a frame of simultaneity, and the future and past cones meet at the origin of this coordinate system.
The work of Asselmeyer is complementary to the noncommutative description. There are probably deeper principles at play here. The FERMI spacecraft measured the time of arrival of photons from distant gamma ray burstars. Photons of different wavelengths arrived at the same time. If there were Planck scale grainy properties or so called spacetime foam then shorter wavelengths would couple to these more strongly. The result is there would be a dispersion of light. None was observed. This measurement is different from what an extremely high energy experiment might observe where the probe scale is near the Planck scale. In the case of the FERMI experiment the probe scale was cosmological, billions of light years to a burstar, so this reflects a different type of experiment. This may suggest a type of quantum complementarity at work here. Asselmeyer works with exotic spaces which are absolutely smooth, but this exotic structure may have some duality or categorical equivalency with noncommutative geometry.
The AdS spacetime comes in with the AdS/CFT correspondence of Maldacena. You can look this up on Wikipedia. It is a rather deep and involved topic in connection to string theory and D-branes.
Cheers LC
attachments:
light_cone.JPG
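To get a feel for why the FERMI timing test has teeth, here is a rough order-of-magnitude sketch in Python (assuming, purely for illustration, a linear Planck-scale dispersion delta_t ~ (E/E_Planck)(D/c), a ~GeV photon and a burst a few billion light years away; this is not the actual FERMI analysis):
# Naive linear Planck-scale dispersion delay: delta_t ~ (E_photon / E_Planck) * (D / c)
E_photon_GeV = 1.0        # photon energy in GeV (assumed for illustration)
E_planck_GeV = 1.22e19    # Planck energy in GeV
D_light_years = 3.0e9     # distance to the burst in light years (assumed)
seconds_per_year = 3.156e7
travel_time_s = D_light_years * seconds_per_year     # D/c in seconds
delta_t_s = (E_photon_GeV / E_planck_GeV) * travel_time_s
print(f"naive dispersion delay ~ {delta_t_s:.1e} s")
# Of order milliseconds, comparable to observed burst time structure, so the
# absence of any energy-dependent lag is a meaningful constraint on such models.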
Patrick Alan Hutchinson replied on Sep. 24, 2012 @ 08:03 GMT
Thanks for the portrait of a light cone. I first saw pictures like that in the 1960s. It is pretty.
This still doesn't clarify what it means. Is it meant to depict a singular 3-surface in 4-space? It looks like the boundary of the set of points influenced by its vertex under some linear hyperbolic PDE (see e.g. Peter Lax: Hyperbolic PDEs, AMS Courant lecture notes 14, 2006, chapters 1,2). It appears so, but if it is then it does not correspond to any post-1864 theory of a "light pulse" because light pulses are diffuse.
Thanks for your account of the FERMI experiment. It is news to me, and sounds significant. Are you assuming AdS spacetime, and string theory, D-branes etc? Are they compatible with the results of the FERMI experiment?
bw
Alan H.
report post as inappropriate
Patrick Alan Hutchinson replied on Sep. 24, 2012 @ 08:49 GMT
Hello again Lawrence
Please forgive me. I have been nit picking. I hope it has helped to clarify things somewhat. It is such fussiness which distinguishes maths from theoretical physics, and ultimately maths is the more reliable subject.
Best wishes
Alan H.
report post as inappropriate
Anonymous wrote on Sep. 27, 2012 @ 01:27 GMT
Dear Lawrence,
I noted the sketch you made concerning shape/causal duality on my thread, and made some remarks in response. In the future, please feel free to post at the bottom of my thread... that way I will see your comments immediately. Take care,
Ben
report post as inappropriate
Steven Dinowitz wrote on Oct. 3, 2012 @ 06:10 GMT
Hi Lawrence,
I think I made an interesting discovery. Check out my post dated 9/19/12. Let me know what you think.
Regards,
Steve
report post as inappropriate
Author Lawrence B Crowell replied on Oct. 4, 2012 @ 01:36 GMT
I will take a look at this paper. The matter of CP violation is of course interesting, and it is important to understand how this discrete symmetry is violated, presumably at lower energy. I do think that solving the problem of CP violation by breaking the equivalence between inertial mass and gravitational mass is at best converting the problem from one form to another. Think of it from a Gauss law perspective. Consider a large mass M made of matter and a smaller mass m made of antimatter. If I were to put a Gaussian surface around the two of them, the gravitation at the surface would be that of a mass M - m. Now force the small mass m into M, and BOOM: you are left with a mass M - m in the center and a shell of photons of energy 2m approaching the Gaussian surface. The observer on the Gaussian surface would detect this huge pulse of radiation E = 2m and from gravity would now detect a gravitating mass M - m. Now suppose this Gaussian surface is a perfect mirror that reflects the light back towards the mass M - m. The Gaussian surface measure of gravity would then be that of a mass M + m. The interaction between matter and antimatter would increase the amount of gravitational mass.
Solving the CP violation issue this way seems to be a rather odd solution. Of course nature could turn out to be strange. Performing this experiment would be of interest, and I suspect, or at least hope, that nature does not turn out to be this crazy.
Cheers LC
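A minimal bookkeeping sketch of the thought experiment above, in Python (units with c = 1; the premise that antimatter gravitates with the opposite sign is the hypothesis under discussion, not established physics):
# Gaussian-surface bookkeeping, units with c = 1.
# Scenario assumption: an antimatter lump of inertial mass m gravitates as -m.
M = 10.0   # matter lump
m = 1.0    # antimatter lump
enclosed_before = M - m                      # surface reads M - m before annihilation
enclosed_after_escape = M - m                # the 2m of photon energy has left the surface
enclosed_after_reflection = (M - m) + 2 * m  # mirror traps the radiation: surface reads M + m
print(enclosed_before, enclosed_after_escape, enclosed_after_reflection)
# The enclosed gravitational mass jumps from M - m to M + m, the oddity noted above.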
Peter Jackson wrote on Oct. 3, 2012 @ 14:16 GMT
Lawrence
I was very pleased to be able to understand your essay, until page 6, and the problem of frame boundaries expressed in the conflict of Maxwell's equations/CSL and classical mechanics. I found your resume clear and logical, up to that point. I congratulate you for that and am sure the problem after then was mine.
I wonder if you might use your obvious deep understanding of QFT to comment on my slightly different suggestion, where the wave equation itself is conserved but the geometry does not commute due to delta lambda. Localisation giving quantization is almost instantaneous, but the process of charge takes non-zero time, so lambda evolves, deriving delta f (Doppler shift). (The light cone IS distorted, as in my essay last year.) In terms of your text, my essay gives an underlying mechanism for how Maxwell/CSL and CM can "...fit into a single theory..."; also, "The underlying structure" does NOT require "...the abandonment of locality and unity."
Though well supported, and consistent with foundations discussed in a number of other essays, I'm not sure the ontological construction had the rigorous additional falsification I was hoping for. It is of course only a glimpse of the whole ontology, which I've found has precisely the same structural framework as truth propositional logic; a hierarchical, mutually exclusive, nested sequence of local compound propositions, none of which has any relevance to any but their local neighbour. Maxwell's near/far field term transition zone forms the turbulent magnetohydrodynamic boundary, working as a fluid dynamic coupling (at ALL scales) via re-scattering at c. A cross section through one such boundary found by the Cluster probes (ion bow shock as non-rotating ECI frame to rotating ECRF) is shown in the Kingsley-Nixey essay Fig 2 with the same logical re-interpretation.
I was very pleased Tom assimilated the set of assumptions and effects of a more logical interpretation, which encourages me to believe you may now also do so (beneath the theatrical metaphors). I very much hope you are able. Well done for your own good work, which honestly clarified the limitations of other current approaches. I'm sorry to discuss mainly mine here, but have no criticism of yours.
Many thanks, and best wishes
Peter
report post as inappropriate
Author Lawrence B Crowell replied on Oct. 3, 2012 @ 17:43 GMT
Peter,
The loss of unitarity requires a somewhat different way of thinking. I think it is a great thing that our educations give us rigor in our thinking and abilities to solve problems. There is an unfortunate flip side where with regards to foundations this can lead to a sort of “rigor mortis.” Trying to think of physics that is outside of what is accepted is very difficult. It is very easy to come up with something that is completely wrong, where one has to abandon that ship as soon as possible. I have spent years doing that. It is very difficult to come up with something which is maybe correct and hard to get it accepted.
I would need to think about what you are asking. This seems in some ways similar to causal set theory.
In my essay and reference #13 I illustrate this with noncommutative geometry. The small-scale structure of spacetime is then wildly strange near the Planck scale. The best and most interesting thing about these essay contests is communicating these results with other people. I have found that Amelino-Camelia's work has interesting connections to twistor space, which my paper is in part based on via the BCFW recursion. The work on shape dynamics and causal sets and so forth of Gryb, Dribus and Barbour is also interesting. What is in some ways the most interesting is the work by Asselmeyer-Maluga and Krol on exotic manifolds, a topic I was taught near the end of a course on differential or algebraic geometry. This turns out to imply a physically smooth space or spacetime. This is beginning to look like another door into this area, one where a wild chaotic microstructure to spacetime is dual to a smooth structure.
Donaldson, Drinfeld and Freedman showed that this results in an infinite class of manifolds which are homeomorphic but not diffeomorphic, called exotic manifolds. The substructures one may look at are things like the Casson handle R^2xD^2 (D^2 = two-disk) or T^2xD^2 (T^2 = two-torus). These may be removed from an R^4 and replaced with CP^2#CP^2, and the resulting space is homeomorphic to the original space but not diffeomorphic to it. If we then consider R^4 = R^3xΠ_n{t_n} (an infinite product of time intervals), the R^4 is a huge stack of four-dimensional "slices." Each slice may then be exotic. The infinite product is a form of "time operator."
The stack of exotic spaces is a foliation of 4-spaces mapped to each other by knot operations, or more generally by the Yang-Baxter equations. These are general braid operations, where each line is a Wilson line integral with a gauge field content. Further, the braid operations generate quantum groups, or Hopf C* algebras. So in this setting we may consider quantum gravity as a process of quantum homeomorphisms on both the AdS and its boundary dS, which have opposite orientations. The gravity content is contained within the AdS, but conformal symmetry breaking of the CFT on the boundary means there is some gravity content on the boundary.
I was stuck on this problem of conformal completion of AdS, but recent communications with Asselmeyer-Maluga and Krol have illuminated how this process can be considered with exotic 4-spaces. There are some parallels with C* algebras, and it now turns out that quantum gravity is a Yangian system of conformal dual gauge symmetries --- in part giving the conformal completion of AdS.
It is strange to think spacetime might have a wild and chaotic structure on a small scale, and then at the same time a structure that is perfectly smooth. The Yangian system constructs a dual geometry, and this is reflected here. Experimentally it has been found that gamma ray burstars that are billions of light years away have no dispersion of light. If spacetime is grainy on a small scale it is expected that short wavelength light would interact more strongly with this “foam” and would be slower. However, the observational data does not bear that out. This is an experiment which involves a huge distance, and momentum p = ħk is such that if we consider this distance the “probe momentum” is near zero. Another experiment which might involve enormous energy and large probe momentum might then measure the noncommutative or foamy structure to spacetime.
Cheers LC
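For reference, the braid and Yang-Baxter relations invoked above are, in their standard textbook form (quoted here only as a reminder of the general relations, not as a statement about the particular Wilson-line construction):
σ_i σ_{i+1} σ_i = σ_{i+1} σ_i σ_{i+1},   σ_i σ_j = σ_j σ_i for |i - j| ≥ 2,
and in R-matrix form R_12 R_13 R_23 = R_23 R_13 R_12, which is the defining relation behind the quantum-group (Hopf algebra) structure mentioned above.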
Peter Jackson replied on Oct. 3, 2012 @ 20:13 GMT
Lawrence
Thanks, Congratulations, and you're very welcome. Hold on tight, grit your teeth and I'll see if I can stick a pit prop up your nether end to prove Andy Warhol wrong!
Do come back when you've had a think about it. It really does take some 'dynamic' thought and research, and the dropping of deeply rooted assumptions to test the axioms. It helps a lot if you're familiar with the structures of logic, both TFL, and Propositional Modal Dynamic or PDL. It IS rather a different language.
There are still a couple of bits missing, plus lots of dressing, but you've only seen the tip of an ice cube off a glacier that won't stop pouring out physics!
Very best Wishes
Peter
report post as inappropriate
Author Lawrence B Crowell replied on Oct. 4, 2012 @ 13:05 GMT
I would need to reread your essay, which I read a while back. I was reminded that Oct 5 11:59 EDT is the end of voting. I was thinking for some reason that it went on until near the end of October. So I have to read and vote on a fair number of these.
Cheers LC
Peter Jackson replied on Oct. 6, 2012 @ 20:43 GMT
Lawrence
I hope that pit prop didn't hurt when your 15 minutes ran out! You must feel a bit gutted, but there were still some very good essays down below you so there's no shame.
I have to say I missed some of the brilliance of your 'rigor mortice', which I plan to unashamedly steal from you with one letter change to 'rigour mortice'. I promise to give you a credit wherever there's space. Frankly I think theoretical physics has been suffering badly from it, really is beyond rejuvenation, and we need to look to the new generation. I'm heartened such a majority of essays here agree. Rigour does not have to be sacrificed, just changed to improve logical rigour.
I really hope you will read mine again, and very slowly. Perhaps read the response I've just made on my blog to Jonathen first. I'm confident you can achieve what many cannot - or if you falsify the ontology, all the better!
Best wishes
Peter
report post as inappropriate
Author Lawrence B Crowell replied on Oct. 7, 2012 @ 00:04 GMT
This morning I was at #41 and I am now down at #83. Clearly there is a lot of moving around of the “chess pieces” still going on. Seriously, if the voting stopped at midnight today this should not be happening.
Usually the start of a contest comes about a year after the end of the previous one. If I do enter a paper next time it will be a general review of my research, and not at all any explicit description. This paper I wrote was a bit of both, and the core of the work is in papers I have either published or have in review. One paper I used as a reference in my FQXi paper won an "honorable mention" in the GRF essay contest, one of 25 papers recognized below the five winners. It has also been accepted for publication. The GRF contest involves thousands of entries. So I feel this is a more reasonable representation of the work than the FQXi contest. I get the sense that FQXi contests are almost more of a popularity contest, where to be honest I sense that the whole thing is being “readjusted” according to various dictates. So next time I intend to write a completely informal description of this work. I am not going to bust my butt.
There are a fair number of good papers below mine, and there are a great number of nonsense papers ahead of me. The whole thing this year ended in some sort of dissonant collapse.
Cheers LC
Author Lawrence B Crowell wrote on Oct. 3, 2012 @ 16:57 GMT
My reference [11], L. B. Crowell, ”Tricritical quantum point and inflationary cosmology,” http://arxiv.org/abs/1205.4710, used in my essay, has been accepted for publication. It won an honorable mention in the GRF essay contest earlier this year. The acceptance letter is below.
I see that for some reason I am top of the community rating list. That will probably prove Andy Warhol's 15 minutes of fame, but it is curious that it happened.
Cheers LC
CC: dharamvir.ahluwalia@canterbury.ac.nz
Ref.: IJMPD1055
"Tricritical quantum point and inflationary cosmology"
Dear Dr. Crowell,
I am pleased to tell you that your essay has now been accepted for publication in the International Journal of Modern Physics D.
The proof of manuscript will be e-mailed to you within 2 weeks. For any questions related to publication of your essay, please e-mail to: ijmpd@wspc.com
Thank you for submitting your work to this journal.
Sincerely,
Chee-Hok Lim
for D V Ahluwalia (Editor)
Pentcho Valev wrote on Oct. 3, 2012 @ 18:16 GMT
Lawrence,
I remember criticizing the following statement of yours but you (or some guardian angel) deleted my comment:
You write in the essay: "Einstein changed Newton's laws by adjusting the first and third laws, motivated by the locality of electromagnetic fields predicted by Maxwell's equations."
Could you please elaborate? When and how did Einstein do that?
Pentcho Valev
report post as inappropriate
Author Lawrence B Crowell replied on Oct. 3, 2012 @ 20:37 GMT
Newton’s first law tells you that to observe physics you must do so from an inertial frame. Einstein generalized this with the equivalence principle. Newton’s third law indicates the laws of physics are invariant with respect to displacement and orientation. Einstein supplanted that with Lorentz boosts. In doing so the Maxwell equations are invariant in all frames.
LC
Pentcho Valev replied on Oct. 3, 2012 @ 21:33 GMT
Strange (euphemism) formulations of Newton's first and third laws. "Anything goes" would say Paul Feyerabend.
Pentcho Valev
report post as inappropriate
Vladimir Rogozhin wrote on Oct. 3, 2012 @ 20:02 GMT
Dear Lawrence,
Unfortunately, I did not see in your essay an ontological justification of your basic assumptions. Physics today requires a fundamental revolution. But can it be done without ontology? Mathematics is also not an ontologically grounded science. Sincerely, Vladimir
report post as inappropriate
Author Lawrence B Crowell replied on Oct. 3, 2012 @ 20:42 GMT
I did not spend much time on philosophical issues. I tend to proceed in a more operational way. When it comes to ontology and quantum wave functions, the basic results of Bell and Kochen-Specker hold, which indicate the wave function is either not ontological or is in some way minimally ontological.
Cheers LC
Vladimir Rogozhin replied on Oct. 4, 2012 @ 09:33 GMT
Dear Lawrence!
Fundamental physics has always gone together with philosophy and ontology, especially when it comes to the geometry of Space-Time. And this is not just a problem of physics and mathematics, but of human culture, a problem of the knowledge base. The "trouble in physics" has its source precisely in the absence of ontological foundations for the geometry of Space-Time. Sincerely, Vladimir
report post as inappropriate
Sergey G Fedosin wrote on Oct. 4, 2012 @ 07:04 GMT
If you do not understand why your rating dropped down: as I found, ratings in the contest are calculated in the following way. Suppose your rating is R1 and N1 is the quantity of people who gave you ratings. Then you have N1*R1 points. After that, someone gives you R points, so you have N1*R1 + R points, and N2 = N1 + 1 is the total quantity of people who gave you ratings. Your rating is then R2 = (N1*R1 + R)/(N1 + 1). From here, if you want R2 > R1 there must be (N1*R1 + R)/(N1 + 1) > R1, or N1*R1 + R > (N1 + 1)*R1, or R > R1. In other words, if you want to increase anyone's rating you must give him more points R than the participant's rating R1 was at the moment you rated him. From here it is seen that the contest has special rules for ratings, and from here comes the misunderstanding of some participants about what happened with their ratings. Moreover, since community ratings are hidden, some participants are not sure how to increase the ratings of others and give them the maximum 10 points. But in that case the scale from 1 to 10 points does not work, and some essays are overestimated while others drop down. In my opinion this is a bad problem with the Contest rating process. I hope the FQXI community will change the rating process.
Sergey Fedosin
report post as inappropriate
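A minimal numerical sketch of the running-average rule described above (Python; the variable names are illustrative, and the only assumption is that the community score is an arithmetic mean of the votes):
# Running-average rating: R2 = (N1*R1 + R) / (N1 + 1)
def new_rating(R1, N1, R):
    """Current average R1 from N1 votes, updated by one new vote R."""
    return (N1 * R1 + R) / (N1 + 1)
R1, N1 = 6.0, 20
print(new_rating(R1, N1, 10))   # ~6.19 : a 10 raises the average
print(new_rating(R1, N1, 6))    # 6.0   : a vote equal to R1 leaves it unchanged
print(new_rating(R1, N1, 5))    # ~5.95 : any vote below R1 drags it down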
Author Lawrence B Crowell replied on Oct. 4, 2012 @ 12:55 GMT
That is the trick, for one does not know R_1. The only way to ensure you increase a person's community ranking, or do not drop it, is to give them a 10.
LC
Peter Jackson wrote on Oct. 5, 2012 @ 09:46 GMT
Lawrence
You too seem to have been targeted by Trolls. Do you think Brendan can identify the conspirators giving out mass 1's? - giving ridiculous drops of
report post as inappropriate
Author Lawrence B Crowell replied on Oct. 5, 2012 @ 12:32 GMT
I wrote about this to Brendan Foster yesterday. I dropped suddenly 106 points and everything was jumbled up. He says there is a computer glitch that is messing things up. Things are still a mess, and I was averaging around 20 or so for a while and am still down the list. It appears possible that any objective aspect to this contest has ended. I don't know if they have records of where authors were ranked before this happened, but if not it appears this contest might be humpty dumpty.
LC
AndyM replied on Oct. 5, 2012 @ 18:29 GMT
After the contest, in the interests of eliminating any possibility of contest score-fixing, I think that the community scoring of each reviewer should be made public. Nowhere in the rules does it state that the scoring is or should be anonymous.
Note that I am not a participant in the contest, but feel that there is a good chance that there is manipulation.
report post as inappropriate
Author Lawrence B Crowell wrote on Oct. 5, 2012 @ 19:16 GMT
A number of things could be done. I really think there needs to be a better initial screening of these essays. By at least skim-reading them, essays that have stuff about the speed of light being dependent on the source, or nonlocal communication in quantum mechanics, or some puerile idea about how some physics 101 notion “explains all,” or an action principle that is clearly contrary to known physics (there is an essay with this), and so forth could be eliminated. The pure crackpot crank stuff should be weeded out right away.
The problem is that you can’t just write to these authors that their work is nonsense. They rate your essay as much as you rate theirs. So you have to do this ridiculous condescending nonsense to them.
The ranking needs to be based on a total score, not on an average score. An essay with 5 ratings of 8 should be ahead of an essay with 1 rating of 10. At least this should be the case until essays reach some “critical number” of total scores, say 10 of them.
Clearly the system was compromised in some way, and I suspect it was hacked into. The person doing the hacking then had the liberty to either give certain contestants multiple voting powers or to cast multiple votes themselves. The whole contest is supposedly set back “right,” but now I see lots of nonsense essays towards the top, and mine is still way down. Alves wrote a fine essay that I gave a 9 or 10 and he is way down. I doubt the damage has really been fixed.
The whole thing has been hopelessly corrupted and is frankly a wash. Those who remained at the top will doubtless make the cut. Those who were in the top-35 range but are now down the list are out in the cold. There are a number of essays in this top region that are not worth the paper they are printed on.
Cheers LC
Author Lawrence B Crowell replied on Oct. 5, 2012 @ 19:28 GMT
PS, On the screening of essays, I think essays that are just some loose or post-modernist screed should be prevented from being hosted here as well.
LC
Pentcho Valev replied on Oct. 5, 2012 @ 21:37 GMT
"essays that have stuff about the speed of light being dependent on the source (...) can be eliminated. The pure crackpot crank stuff should be weeded out right away."
Be more careful, Lawrence. Crackpots, cranks, trolls etc. are easy to eliminate but VIP people might look badly at you:
Joao Magueijo: "Lee [Smolin] and I discussed these paradoxes at great length for many months, starting in January 2001. We would meet in cafés in South Kensington or Holland Park to mull over the problem. THE ROOT OF ALL THE EVIL WAS CLEARLY SPECIAL RELATIVITY. All these paradoxes resulted from well known effects such as length contraction, time dilation, or E=mc^2, all basic predictions of special relativity. And all denied the possibility of establishing a well-defined border, common to all observers, capable of containing new quantum gravitational effects."
Pentcho Valev
report post as inappropriate
Author Lawrence B Crowell wrote on Oct. 5, 2012 @ 22:42 GMT
You clearly do not understand what you are talking about. The ideas of Joao Magueijo and Lee Smolin have to do with relativity on a small scale near the Planck length. In no way could I imagine they would agree with your assessment that relativity in its proper domain of application is completely false. I am sorry, but that is all there is to this. You keep pestering people with your stuff on this, and we have to condescend to your nonsense to keep from getting one-ratings from you.
This is a major flaw with the whole contest format.
LC
Pentcho Valev replied on Oct. 5, 2012 @ 22:54 GMT
Don't worry about one-rating. I have only given 10's, once 8. One-rating contradicts my ethical principles.
Pentcho Valev
report post as inappropriate
Author Lawrence B Crowell replied on Oct. 5, 2012 @ 23:58 GMT
I have given up on the contest. After this October surprise there is no point in it, and if I get a string of 25 ones at this point I frankly could not give a damn.
LC
Pentcho Valev replied on Oct. 6, 2012 @ 06:50 GMT
"I have given up on the contest."
Obviously you haven't. Such a spectacular jump after voting was stopped and the "final" results were posted!
Pentcho Valev
report post as inappropriate
Author Lawrence B Crowell replied on Oct. 6, 2012 @ 12:28 GMT
My apparent jump at the end had nothing to do with any effort on my part. Much of the ratings seemed to go into chaotic dynamics in the final days. How the dice rolled at the end was probably not in the control of most in the contest.
Cheers LC
John Merryman wrote on Oct. 7, 2012 @ 03:33 GMT
Lawrence,
Going through comments this afternoon, I came across this exchange you had with Peter and couldn't help responding:
"It is strange to think spacetime might have a wild and chaotic structure on a small scale, and then at the same time a structure that is perfectly smooth. The Yangian system constructs a dual geometry, and this is reflected here. Experimentally it has been found that gamma ray burstars that are billions of light years away have no dispersion of light. If spacetime is grainy on a small scale it is expected that short wavelength light would interact more strongly with this “foam” and would be slower. However, the observational data does not bear that out."
Math may only be concerned with the message of measure and quantity, not whether the medium is apples, oranges, or inches, but physics is concerned with the medium. Just because space and time are measured as units doesn't mean they are equivalent. When we measure space, be it distance, area, or volume, we are measuring space. When we measure time, we measure change caused by action.
Action occurs in space.
Space and action might be inseparable, but are they really indistinguishable?
Centrifugal force is the interaction of spin and inertia. Spin is action, but what is inertia?
Isn't it at least possible space is smooth because it is distinct from physical activity, yet at any scale of space, there will be some level of dynamic activity? That way, there would be both smoothness and foam.
report post as inappropriate
Anonymous replied on Oct. 7, 2012 @ 23:13 GMT
John and Lawrence, inertia is resistance to acceleration, and it is only equivalent with gravity/acceleration given instantaneity and balanced attraction and repulsion, as gravity cannot be shielded.
report post as inappropriate
John Merryman replied on Oct. 7, 2012 @ 23:51 GMT
Anon,
The point is that if you have a spinning object, with no outside reference, such that this spin cannot be defined by some other physical frame, but is only spinning in effectively empty space, what is the basis of the inertia, other than space as an absolute frame?
report post as inappropriate
Frank Martin DiMeglio replied on Oct. 8, 2012 @ 08:04 GMT
John and Lawrence, the point is that there is necessarily a balance between invisible and visible space that involves balanced inertia/resistance to acceleration and gravity/acceleration. This sits at the heart of physics.
report post as inappropriate
Frank Martin DiMeglio replied on Oct. 8, 2012 @ 08:09 GMT
Gravity, inertia, and electromagnetism enjoin and balance visible and invisible space in conjunction with balanced attraction and repulsion.
report post as inappropriate
Frank Martin DiMeglio replied on Oct. 8, 2012 @ 08:24 GMT
What is a larger AND smaller space reduces gravity and (on balance) increases inertia. This balances attraction and repulsion.
report post as inappropriate
Frank Martin DiMeglio replied on Oct. 8, 2012 @ 08:29 GMT
John, Lawrence, Since dreams are fundamental and general to physics, I know all of this. This contest was very unfair.
report post as inappropriate
Author Lawrence B Crowell wrote on Oct. 7, 2012 @ 21:58 GMT
General relativity pretty clearly shows that spacetime is an aspect of dynamics. The smoothness vs noncommutative fluctuation or graininess is a dual aspect of quantum principles or nonlocality between different regions.
Cheers LC
Anonymous replied on Oct. 7, 2012 @ 22:28 GMT
No wonder you could not make the final 35.
report post as inappropriate
John Merryman replied on Oct. 8, 2012 @ 00:35 GMT
Lawrence,
Couldn't we use ideal gas laws to show "volume temperature" "is an aspect of dynamics?" Just as acceleration or gravity slows clock rates, increasing or decreasing the volume of a particular quantity of gas will cause a fairly precise change in its temperature, so it would be quite easy to formulate a mathematical model where temperature is another parameter of volume, much as time is modeled as a fourth dimensional vector.
In terms of waves, time is frequency and temperature is amplitude. They are quite obviously aspects of dynamics. Treating space as an aspect of dynamics is another matter. We are subjective, so we need a dynamic process to measure space, but that is obviously due to the subjectivity of perspective. When we measure space (distance, area, or volume), we are measuring space. When we measure time, we are measuring a dynamic process, because duration, the quantity between the points of reference, is not external to the present moment but is the state of the present between the dynamic occurrence of the reference events, so there is no actual vector of time. It emerges from the dynamic.
On the other hand, space is not emergent. C, the speed of light in a vacuum, is how fast light crosses empty space. How could it be a constant, without the assumption of a stable metric of space?
report post as inappropriate
Author Lawrence B Crowell replied on Oct. 8, 2012 @ 00:55 GMT
There are a lot of papers that failed to make that level. However, I would consider the scholarship in papers by Parikh, Gambini & Pullin, Fields, Anderberg, Rowlands, and Nieuwenhuizen, which fell in the 4.2 to 3.8 range where I was at 4.0. Now compare that to some other papers further up, such as Tamari, Blumschein, Klingman, Leshan, Merryman, which either have factually wrong physics, advance silly propositions, or in some cases clearly show a lack of basic understanding of physics. There are a range of essays that are post-modernist word salads up the line. The outcome here was unfortunate in my opinion, for there are complete nonsense papers all the way to near the top and some very good papers below, such as an interesting paper by A. Rej which is down in deep mud.
To be honest, the rankings of these papers are in about half of the cases a dim reflection of their worth.
Cheers LC
Author Lawrence B Crowell replied on Oct. 8, 2012 @ 01:08 GMT
John,
As much as you seem to try to understand physics, the problem is that no physicist is going to say “temperature is amplitude,” or that time = frequency. In fact frequency is the reciprocal of a time period. It is pretty clear that you have a weak understanding of basic physics, and certainly have no depth of knowledge gained in graduate school. I just gave a list of folks who wrote nonsense stuff in the post above. In fact I could make quite an extensive list. About half the essays ahead of mine are papers that would receive a low grade in an undergraduate course.
If I do this again next year it will be only to write an informal discussion of my research. The paper I wrote this year was meant to be less formal; I cited some published papers of mine, and one of the papers I cited was just accepted for publication. Next time I am going to make it purely informal, using references to my papers and those of other researchers. I may also co-author the paper, rather than making it a solo effort.
Cheers LC
John Merryman replied on Oct. 8, 2012 @ 02:53 GMT
Lawrence,
I'll leave it at this for the moment. There are lots of sore feelings here and you are not inclined to loosen up in the best of times. Unfortunately the discipline of physics is falling into its own deconstructionist black hole of self absorption. You need visionaries, not craftsmen. You can polish this model till it glows, but you can't see past it.
report post as inappropriate
Author Lawrence B Crowell replied on Oct. 8, 2012 @ 02:56 GMT
The fact that some of the papers I referenced are published (others in review), and that one was recognized by the GRF and just accepted for publication, might suggest that I am not so wrong.
LC
Anonymous replied on Oct. 8, 2012 @ 03:06 GMT
It suggests that you know how to play the publication game.
report post as inappropriate
John Merryman replied on Oct. 8, 2012 @ 10:31 GMT
Lawrence,
Doesn't it occur to you that this is the same dynamic of consolidation defining any institution: that it promotes supporters and demotes skeptics? So the same conceptual structure which has devised such ideas as multiworlds, wormholes, blocktime, multiverses, etc., finds you to be a member of the club.
Yes, you do shy away from a lot of the more extreme concepts, but consider the validity of a few of history's more derided ideas: angels dancing on the head of a pin presaged the understanding of microbial life forms. Epicycles and the clock mechanisms evolved to model them are foundational to geometry and technology. All Galileo really did was to make the motion of the earth one more such cycle.
We have previously discussed how Wall St. hired physicists to create their rehypothecated derivatives and flash trading models. Does the fact that something can be mathematically devised mean it must be physically real, or is it simply a sign the wave is cresting, when it's more foam than substance?
Has physics fallen into an age old trap of self-referencing faith? The signs and markers would seem to be pointing in that direction.
Not that many will reconsider what they have devoted their lives to, but as the saying goes; Change happens one funeral at a time.
report post as inappropriate
Author Lawrence B Crowell replied on Oct. 8, 2012 @ 12:53 GMT
Physics is not about Charlie Parker's "anything goes." This is why all sorts of proposed ideas that people might have will simply fail. This will happen with ideas advanced by those with a deep knowledge of physics, and is far more likely to happen with ideas advanced by people who have little firm education in physics.
Michael Faraday was remarkable for his contributions to electromagnetic field theory, and further for his lack of formal education. He was highly exceptional. It is also the case that Einstein was somewhat off the standard track, as were some other major contributors. However, I think it is not the case that somebody who says temperature = amplitude with waves has come up with the great unification of physics required at the time. That is just the way it is. Maybe the great unifier of physics will be some sort of Mozart-like prodigy whose mind cuts through these issues without need of formal education. If that is the case, such a person will not get there by puttering around making cranky "theories" until they hit upon the right thing; such a person will be some young child who is solving nonlinear differential equations at the age of 10 and reworking M-theory in the teen years. Take my work for it.
Cheers LC
Anonymous replied on Oct. 8, 2012 @ 13:52 GMT
"Take my work for it."
Quite a Freudian slip.
report post as inappropriate
John Merryman replied on Oct. 8, 2012 @ 16:17 GMT
Lawrence,
Not to pop your bubble, but everyone, from a kid kicking a soccer ball, to an astronaut, has some knowledge of physics, not just those who spend their lives hitting things with other things and composing descriptions thereof. The problem is there is no clear line between facts and analysis, between analysis and speculation, or between speculation and delusion. It only becomes evident in hindsight, when we cross those lines.
As I observed to Tom a bit ago, those who command armies are called generals, while specialist is a rank somewhere between private and corporal. Unless that brilliant prodigy has some way to scale up his/her insights into some broader, multi-spectrum understanding of reality, they will remain an obscure specialist who might well advance the larger spectrum of knowledge, but not dramatically change it.
M(embrane) theory grew out of string theory. Maybe they can come up with B(lock) theory and we get back to three dimensions. Though didn't string theory grow out of blocktime? Maybe they could come up with P(ot) theory. Either stirring it, or smoking it. Both are spatially dynamic and the effect is non-linear.
Specialists don't like generalization because they translate it as fuzzy detail, but to actual generalists, it means a wide angle, larger view, where all the cross referencing is more evident and patterns become more obvious.
You don't like my relating time and temperature to frequency and amplitude, but for those of us who are too simple-minded to have our brains overflowing with detail, it shouldn't be much trouble relating the two. Frequency is a sequence of waves/events and the intervals between them, while amplitude is the energy of those waves. So what is time, other than the measure of the frequency of events? Temperature is the level of energy striking your detector/thermometer. Of course, it could be sound being louder, or light being brighter, but I am generalizing scales as temperature, even if you don't see the cross reference.
report post as inappropriate
Edwin Eugene Klingman replied on Oct. 8, 2012 @ 16:24 GMT
Someone anonymous is trying to stir up a fight by posting, to everyone whom Lawrence Crowell insults, a link to his insults. I had already noticed this but decided to ignore it, because Lawrence always complains during each contest that inferior essays are ranked above his. I tend to ignore it. He also says he can't criticize them during the contest for fear of being graded. Now that the contest is over, he has found the courage to insult directly (perhaps in a bottle, who knows?). Anyway, this is a perennial event: Lawrence loses, criticizes the rules, criticizes other authors, and complains that his genius has once again been unappreciated. There's nothing anyone can do about it. He thinks that if others didn't "advance silly propositions" but instead had serious propositions like his own: "the foundations of physics may lead underlying principles based on quantum error correction codes" and "there is only one electron, one up quark, one photon and so forth" in the entire universe, then they would deserve to be taken seriously.
I also noted that the anonymous poster posted to everyone whom Lawrence insults to try to get something going. When I was a teenager my girlfriend's grandmother used to call party A and say that party B had said such and such. Then she would call party B and say that party A had said such and such, then she would be in the middle of a fight. Anonymous reminds me of this sad old woman.
Edwin Eugene Klingman
John Merryman replied on Oct. 8, 2012 @ 16:43 GMT
Lawrence,
I'm still wondering what you meant by "frequency is the reciprocal of a time period."
Did you mean the opposite of, or equivalent to?
If the first, it would seem you are channeling my paper, that time isn't the present going past to future, but the future becoming past, ie. the fading of one note/wave, as the next occurs.
If the second, then you seem to be agreeing with my observation that they are equivalent concepts.
Since I know you mean to refute whatever I say, it doesn't seem clear.
Author Lawrence B Crowell replied on Oct. 8, 2012 @ 21:13 GMT
John,
Frequency is a very elementary thing. Frequency is the number of repeated or regular occurrences per unit time. The time T between two such cycles therefore defines the frequency as ν = 1/T. This is something learned in physics 101.
This of course gets "changed" by Blumshein, who thinks you can have negative frequency. This would mean a negative number of occurrences of regular events per period of time. It does not take much to see that this makes little physical sense.
Cheers LC
Vladimir F. Tamari replied on Oct. 9, 2012 @ 01:50 GMT
Lawrence, it gives me no joy that the rating system is flawed and that serious, competent work by you is rated lower than the sort of papers, mine included, that you criticize.
Having said that, I agree with Edwin's responses about your attitude. In a remark above you say that "The solution might in part be under our noses." But as long as mainstream physicists turn up their noses at anything new, however simplistic or amateurishly presented, and stick to ossified concepts enshrined in century-old textbooks, quibbling only over details and footnotes, physics cannot possibly advance. There are many journals, conferences, textbooks and universities open to highly qualified physicists like Lawrence.
It would be nice if he left us this FQXi forum as a place to express our hopes, dreams and half-cooked ideas for a more coherent, less disjointed physics. Ideally the professionals might one day sniff out a good idea or two here that they can develop to their heart's content. Respectfully and with best wishes,
Vladimir
John Merryman replied on Oct. 9, 2012 @ 02:26 GMT
Lawrence,
From
wikipedia;
"An atomic clock is a clock device that uses an electronic transition frequency in the microwave, optical, or ultraviolet region[2] of the electromagnetic spectrum of atoms as a frequency standard for its timekeeping element."
I really didn't expect you to question the idea of frequency as a measure of time. I thought you'd focus on my using amplitude as an analogy of temperature scale.
Physics obsesses over details and pursues them to the extremes of the very small, very large and very obscure. It needs to consider a broader perspective. George Ellis does consider aspects of this, in his entry on top down causation. Sometimes if you step back and look at the bigger picture, the details might fit together more effectively and efficiently.
Author Lawrence B Crowell replied on Oct. 9, 2012 @ 13:41 GMT
John,
This gets to the heart of my complaint here. You appear to be confused over the definition of frequency; this is a very elementary concept. Yet you ended up scoring considerably ahead of me and a number of other reasonably scholarly papers.
In music you have whole, half, quarter notes and so forth. In 4/4 time you have 4 quarter notes per measure, 2 half notes per measure and 1 whole note per measure. The frequency of a quarter note is then 4 in units of a measure. Similarly the frequency of a half note is 2 in units of a measure, and so forth. Think of a measure as defining a unit of time and the number of notes as defining the frequency:
frequency = number of events / time those events transpired
If you have some repeated set of events, say a metronome or some cyclic process, frequency is measured as the number of those events per time they occurred.
The unit of frequency is the hertz (Hz), equivalently inverse seconds (s^{-1}), and so forth. If you have a physics text or can look this up on the web you will find this as well.
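To make the arithmetic concrete, here is a minimal Python sketch (my own illustration, not from any referenced text) of exactly this definition, counting repeated events over an elapsed time and returning the frequency in hertz along with the corresponding period:

```python
def frequency_hz(n_events, elapsed_seconds):
    """Frequency = number of events / time over which those events occurred."""
    return n_events / elapsed_seconds

# Example: a metronome that ticks 120 times in 60 seconds.
nu = frequency_hz(120, 60.0)   # 2.0 Hz
T = 1.0 / nu                   # 0.5 s between ticks, so nu = 1/T as above
print(nu, T)
```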
Cheers LC
John Merryman replied on Oct. 9, 2012 @ 17:25 GMT
Lawrence,
And your reply goes to the heart of my complaint about the personality type currently dictating the study of physics, where very fine details are obsessed over, but whatever context might be half an inch on either side is completely missed.
In his winning entry in the nature of time contest, Julian Barbour made the argument that the only measure of time "worthy of the name" uses the principle of least action between different configuration states of the universe. Setting aside all the practical and conceptual complications, it begs the question: what would we call change that isn't regular? It still affects the consequences of change, and thus time.
Frequencies vary. So does time. It's called time dilation. That's why when we put an atomic clock on an airplane, it speeds up the frequency. When scientists build an accurate clock, it isn't the length of the measure that matters, but its regularity.
This regularity provides an accurate standard against which to measure actions that are not so regular, because if time was only the most regular standard, with no irregularities to mark it, there would be no "direction" of time. It would only be that metronome.
Think through this statement:
"If you have some repeated set of events, say a metronome or some cyclic process, frequency is measured as the number of those events per time they occurred."
What is "per time?" As Galileo observed, when feeling his own pulse, while watching a pendulum; With time, we are only comparing one action to another. Do I assume you are simply assuming a Newtonian absolute time, when you say "per time?"
With your musical example, your measure is the whole note. Is that some form of absolute standard, akin to Barbour's least action between different configurations of the universe?
My point in saying frequency is time, is that time is a sequential series of events, against which we measure the larger, non-linear dynamic process of change. Frequency is also a series of events, being produced by a distinct process.
"The unit of frequency is Hertz or equivalently inverse seconds ( sec^{-1} ) and so forth."
To rephrase the question; What is a second? Isn't it based on some regular action?
Author Lawrence B Crowell replied on Oct. 9, 2012 @ 19:14 GMT
John,
You miss the point. Forget Barbour, forget the principle of least action, forget relativity, forget issues of time existing or not existing and so forth. This is very elementary stuff, in which you clearly demonstrate a lack of knowledge. What you need to do is to study basic physics, Halliday & Resnick or Sears & Zemansky level physics that college freshmen take. If you had this in the past it is apparent that you have forgotten most of it, and if you have not taken it, your confusion over the definition of frequency is evidence of this.
A system with a periodicity, such as regular events in certain time intervals, defines a frequency as the number of those events in a measured time interval: # events per time. Don't get caught up in various issues of time measuring time and so forth. Just take it as a given and learn the basics. To equate frequency with time, when in fact frequency is the reciprocal of a time period, clearly illustrates a very basic problem.
Before you can play Rachmaninoff’s second piano concerto you must go through a whole lot, which usually starts out with playing twinkle twinkle little star. That’s how it is.
Cheers LC
John Merryman replied on Oct. 10, 2012 @ 01:52 GMT
Lawrence,
"A system with a periodicity, such as regular events in certain time intervals"
Would it be so difficult for you to tell me the difference between "periodicity" and "time interval?"
As the standard, what makes "time interval" absolute? A second? An hour? A day? Your standard is just another period.
I would suspect if you had an effective answer, you would not simply revert to insults.
Author Lawrence B Crowell replied on Oct. 10, 2012 @ 02:10 GMT
A periodicity corresponds to regular time intervals between events. There are no absolutes. The interval is set by whatever system you have, or however you want to define it.
Cheers LC
John Merryman replied on Oct. 10, 2012 @ 02:59 GMT
Lawrence,
So the frequency of a cesium atom is both a periodicity and a time interval?
Would it be possible for you to consider, just on this very minor point, that I'm not a complete idiot?
Jason Wolfe replied on Oct. 10, 2012 @ 03:10 GMT
John,
Your powers of mathematics are no match for Lawrence's. You better run while you still can. You see, mathematicians can make crazy claims too. But when they say something off the wall, and back it up with mathematics, everyone goes ooh and ahh. But when you and I make crazy claims, we're called crackpots. Sorry, that's just life.
John Merryman replied on Oct. 10, 2012 @ 04:04 GMT
Jason,
A long time ago, I realized I am living on borrowed time. I have nothing to defend, I'm only interested in what is true. If Lawrence can show me I'm wrong, then I would be enlightened. Life is details. Death is nothing. Details don't scare me, but I'm intrigued by nothing.
Yes, I'm a cracked pot.
Author Lawrence B Crowell replied on Oct. 10, 2012 @ 11:18 GMT
Frequency is the reciprocal of periodicity.
Cheers LC
John Merryman replied on Oct. 10, 2012 @ 16:05 GMT
Lawrence,
By reciprocal, do you mean equivalent to, or opposite of?
Jason Wolfe replied on Oct. 11, 2012 @ 05:04 GMT
Your heart is a good example of frequency. It beats about 55 times a minute, faster if you're stressed. So the period of a heartbeat is about 1.09 seconds. So frequency has units of cycles per second. The reciprocal of that is periodicity, which has units of seconds per cycle. I hope that helps.
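A quick numerical check of those figures (a sketch assuming the 55 beats-per-minute value quoted above):

```python
bpm = 55                  # beats per minute, the figure assumed above
nu = bpm / 60.0           # frequency in beats (cycles) per second, ~0.92 Hz
T = 1.0 / nu              # periodicity in seconds per beat, ~1.09 s
print(round(nu, 3), round(T, 2))
```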
Yuri Danoyan wrote on Oct. 8, 2012 @ 01:40 GMT
1. The competition involved both professionals and amateurs.
If the number of amateurs far exceeds the number of professionals, that leads (I mean just in the first round) to the question:
Is fair voting possible without throwing the baby out with the bath water?
2. Is voting by FQXi members mandatory or voluntary in the first round?
If it is voluntary, how can the balance between the numbers of professionals and amateurs be preserved?
3. Many essays do not meet the criterion of relevance.
Among the contest leaders I see philosophical essays that have nothing in common with the topic of the contest.
4. The level of technical support is not high enough.
5. Contest participants who avoid discussion should be disqualified for passivity. I see such persons among the group of leaders.
Jonathan J. Dickau wrote on Oct. 9, 2012 @ 01:56 GMT
Hello Lawrence,
You have my sympathies. I found it hard to believe when I saw my essay in 29th place at midnight on Friday, after spending most of the past week down in the 60s and 70s. I expected some large fluctuations at the end, with my final rank lower, and this is what happened. It was curious though that you ended up with a lower score, after I had seen your essay in the top tier and given you an 8, earlier on Friday. To calibrate that statement; I gave out no 10s, but used every other number at least once - and was still rating essays in the final hour.
My guess is that someone did attempt some technological jiggery pokery, and that the FQXi admins made an attempt to purge the erroneous entries, but may have voided some legitimate votes in the process. I am sorry that whatever happened affected you adversely, though I am elated at making the cut myself. You mentioned a possible joint effort in the future. I would be honored to co-author a paper with you, and if that works out, to submit a joint essay with you for some future contest.
I do not approach your knowledge level, but I have a knack for explaining highly technical subjects in a way laymen can understand, and apparently that is part of the requirement here. You mentioned the essay of Gambini and Pullin above, and I think they did an excellent job of reducing a subject I'd seen Jorge Pullin present in a highly technical fashion - a few years ago - to language any knowledgeable person could understand, without being a physicist.
That's tougher than it sounds. And perhaps it was my having learned about their work before, that made their paper easy to understand. But it's a shame some excellent essays, like theirs (and yours), did not end with a higher score.
All the Best,
Jonathan
Author Lawrence B Crowell replied on Oct. 9, 2012 @ 14:23 GMT
Jonathan,
The whole kerfuffle this week started when I drew a comparison between the work of reasonable authors and those who wrote rubbish, on Oct. 8, 2012 @ 00:55 GMT just above, and somebody "anonymous" decided to link this to the authors of essays that I cited as sub-par. Klingman then rose to my challenge, where I indicated very clearly on his web site the problems with the mathematics he presents in his essay. His math is inconsistent, and if you go to his website and read my objections it should be easy to see the flaws I point out. As you might see above, I put Merryman in that list of sub-par essays, where it is now clear he fails to understand the definition of frequency, as I wrote above in a discourse with him. He finished way ahead of me, and yet he clearly does not understand freshman or even high school level physics. There is a whole constellation of rather high ranking essays with various problems of this nature.
The “October surprise” that happened before the close did shake things up a lot. I am not sure what happened there. Some of the better essays remained at the top or near there. Others, such as mine got shoved all over the map for a day or two. I think this year it is most egregiously apparent that lots of cranky papers passed up decent ones. The results are such that I think the comprehensive rankings are not accurate, but I am sure the final judgment will select the proper papers out of the top ranks --- 35 I think is the stated cut off for top papers. If the panel selects bogus papers as a winner then this whole contest is crap.
If I do this again next cycle, where the start of one contest seems to happen a year after the close of the last one, I will just write an informal review of my current research. I don't see the point in writing a detailed paper that ends up being surpassed by an essay written by somebody who does not understand what frequency is. I just got a paper accepted for publication, which I used as a reference in my essay here, and so I think in a year I will have more publications that I can use as references in an informal essay. I think there will be a number of authors who will judge it down because of this recent flapdoodle. I doubt this FQXi contest is anything to take seriously.
Cheers LC
Jonathan J. Dickau replied on Oct. 10, 2012 @ 00:27 GMT
Thanks Lawrence,
Your thoughtful reply is appreciated. You will probably have more chances for papers in top journals than any of the less learned authors who finished above you in this contest. But I understand that this contest's ratings were gamed by some less than scrupulous authors, which made many scores including yours do flip flops.
They found out that some individuals figured out how to vote multiple times. It was pointed out that there were also one or two 'fake' entries and other brief papers that may have been written under a pseudonym, which would give an author of a legitimate essay of lesser quality the power to give top marks to their own essay, and two votes each to friends. This certainly yields a stacked deck.
I think perhaps a stricter initial acceptance policy is in order, to assure that nobody can submit multiple essays. And I have already been in touch with the webmaster with insights about how the tamperers could have done what they did, and how to stop them more effectively. I think they do have a handle on it, and I don't think the technical problems we saw this year will be a factor next time.
All the Best,
Jonathan
Author Lawrence B Crowell replied on Oct. 10, 2012 @ 01:09 GMT
Because somebody wanted me to read their essay from the last contest I ended up looking at the rankings. I finished 26 out of 163, which is not bad. There were of course nonsense essays in the upper ranks. It did not appear that things were quite as skewed then as now.
If I do this again I will only write a very general or informal review. I am not going to write anything which is highly technical. Of course writing this will not be entirely easy, but at least I will not have to fuss with LaTeX and the rest.
Since authors rank each other's essays it does mean that cranky authors are judging your paper. I also suspect that cranks up-vote cranks. In part this contest is a bit of a popularity contest. One does have to schmooze a bit to get attention, and this can easily turn into what we used to call brown nosing.
Cheers LC
Georgina Parry replied on Oct. 12, 2012 @ 02:41 GMT
Dear Lawrence,
Julian Barbour's essay "Bit from It" came fourth in last years contest and he left no comments or replies on his essay discussion thread. Which I think shows a really well written and accessible essay can be popular anyway. It is possible get through to the final round on its merit alone. It has been nice for the community that he has participated more this year.
I think your statement that next year you will write something less formal is showing that you have understood why your essay was not more widely popular. It may be brilliant. However if it can't be understood by a lot of the participants the brilliance of it will not be fully appreciated and it will not score as highly as you would like.
Having different categories for the competition, such as one for technical papers, one for formal but less technical writing and one for informal writing would overcome the problem of comparison of very different products. The different kinds of writing have their own merits but are difficult to compare against each other to give a ranking that everyone will consider fair. I think Brendan suggested the possibility of having different categories and participants would decide in which one their essay should belonging. That sounds like a good idea to me. Perhaps participants would then also vote only on the essays within the category they feel best suited to evaluate as well. I would feel a bit happier about that.
Author Lawrence B Crowell replied on Oct. 12, 2012 @ 17:14 GMT
That might work, though I suspect there would be a considerable amount of crossing over.
It has been a bit of a disappointment. Last time I got ranked in the final cut, but did not win anything. I did get a paper published which covered the topic of my last FQXi essay. This year I got a couple of papers published that I used as references, but this year I ranked pretty far down.
I think I need to re-adjust my expectations from this.
Cheers LC
John Merryman wrote on Oct. 10, 2012 @ 02:16 GMT
Lawrence,
It is not that I don't understand frequency, but that I'm comparing it, as a vector of intervals, with time, as I am comparing amplitude to temperature. The frequencies of light from another star might cover the same spectrum as the sun, but the amplitude of those waves is less, thus cooler.
Author Lawrence B Crowell replied on Oct. 10, 2012 @ 11:52 GMT
Temperature has no meaning for a single wave. Since you compare stars with the sun this refers to a statistical distribution of photons with a range of wavelengths. For a thermal distribution of photons there is a blackbody distribution rule due to Planck
I(ν, T) = (2hν^3/c^2) 1/(exp(hν/kT) - 1)
for h the Planck constant h = 2πħ and ν the frequency of the photons. If you take the derivative of this function with respect to frequency, the maximum dI/dν = 0 occurs at the frequency
ν = σcT
for σ a constant fixed by Wien's displacement law. As a result the frequency of light at the peak of the black body curve scales with temperature. Usually this is expressed in terms of the wavelength λ = c/ν. This is different from the amplitude.
A G-class star has the same blackbody temperature whether that star is the sun or a near copy of the sun 100 light years away. To compute the power per unit area (watts/m^2, i.e. irradiance) incident on a surface from the sun or a distant star, one must integrate this with respect to frequency (remember frequency = 1/periodicity and power = d energy/d time) and integrate over the solid angle Ω of the view to the sun or the star.
P = ∫I(ν,T)dν∫dΩ
The sun has a total solid angle of perspective Ω ~ 2π × (π/1000), while a star has an exceedingly small solid angle of perspective.
This is standard physics going back to the start of the 20th century. Temperature has nothing to do with wave amplitude, but is a statistical effect from a thermal distribution of photons.
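For anyone who wants to see the numbers, here is a short Python sketch (my own, using standard approximate constants and a nominal 5800 K photosphere, none of which come from this thread) that integrates the Planck curve over frequency, checks the result against the Stefan-Boltzmann law, and shows that two stars of identical temperature deliver wildly different irradiance simply because the solid angle falls off as the inverse square of distance:

```python
import numpy as np

h, c, k = 6.626e-34, 2.998e8, 1.381e-23        # approximate SI values

def planck_nu(nu, T):
    """Spectral radiance I(nu, T) = (2 h nu^3 / c^2) / (exp(h nu / k T) - 1)."""
    return (2.0 * h * nu**3 / c**2) / np.expm1(h * nu / (k * T))

T = 5800.0                                      # roughly a G-class photosphere
nu = np.linspace(1e11, 5e15, 500_000)           # frequency grid spanning the curve
vals = planck_nu(nu, T)
radiance = np.sum(0.5 * (vals[1:] + vals[:-1]) * np.diff(nu))   # trapezoid rule

# Integral of I over frequency should equal sigma T^4 / pi (Stefan-Boltzmann).
sigma_SB = 5.670e-8
print(radiance, sigma_SB * T**4 / np.pi)

# Same temperature, very different received power per unit area: the irradiance
# scales with the solid angle, i.e. as 1/distance^2 for an identical star.
d_sun, d_star = 1.496e11, 100.0 * 9.461e15      # 1 AU and 100 light years, in meters
print((d_star / d_sun) ** 2)                    # ~4e13 times less power from the star
```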
Cheers LC
John Merryman replied on Oct. 10, 2012 @ 16:44 GMT
Lawrence,
Thank you for the informative reply, even though I'm not able to fully appreciate it. I suppose I'm applying sound waves to light, that increased amplitude=louder. Rather than test your patience with the various questions that come to mind, I'll stick to one:
You make the points; "Temperature has no meaning for a single wave." and "Temperature ... is a statistical effect from a thermal distribution of photons."
Would time have any meaning for a single event, or regular periodicity from a single interval? If it doesn't, then isn't time also an effect of change resulting from such thermodynamic activity?
One additional thought; If photons become entangled in transit, wouldn't "the frequency of light at the peak of the black body curve scales with temperature" mean this mass of small waves is really one big wave?
Thanks for the engagement. Promise I won't tell the guild.
Author Lawrence B Crowell replied on Oct. 10, 2012 @ 17:23 GMT
In classical physics of particles moving in space one can talk about "a point in time." When you have wave mechanics there is an uncertainty relationship ΔωΔt ≈ 1, where for a spread in the angular frequency Δω ~ ω this gives the uncertainty spread in time.
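To illustrate that spread numerically (my own sketch, not from the thread): take a Gaussian pulse of temporal width set by sigma_t, Fourier transform it, and check that the product of the temporal and angular-frequency RMS widths is of order one (1/2 for a Gaussian, the minimum-uncertainty case):

```python
import numpy as np

N, dt = 16384, 0.01
t = (np.arange(N) - N // 2) * dt
sigma_t = 1.0
pulse = np.exp(-t**2 / (2.0 * sigma_t**2))       # Gaussian pulse of width sigma_t

spec = np.fft.fft(pulse)                         # discrete Fourier transform
omega = 2.0 * np.pi * np.fft.fftfreq(N, d=dt)    # angular-frequency axis

def rms_width(x, weight):
    """Intensity-weighted RMS width about the weighted mean."""
    w = weight / weight.sum()
    mean = np.sum(w * x)
    return np.sqrt(np.sum(w * (x - mean) ** 2))

dT = rms_width(t, np.abs(pulse) ** 2)
dW = rms_width(omega, np.abs(spec) ** 2)
print(dT * dW)    # ~0.5: the time-frequency spread product is of order unity
```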
With blackbody radiation the photons are in a state of complete decoherence, so there are no entanglements between them. This is probably a bit of an approximation, for entanglements may still exist but are so scrambled up that they are not discernible.
Cheers LC
John Merryman replied on Oct. 10, 2012 @ 22:22 GMT
Lawrence
The problem with a classical point is that anything multiplied by zero is zero, yet adding some minute dimensionality gives it volume, which is theoretically fuzzy. Which realistically makes the point in time as long as the event in question occurs, i.e. the wave occurring. So time is an effect of activity, like temperature.
Molecules of water are still distinct, yet as a medium they transmit energy as waves. With light, it is the energy itself, but in quantity it acts as a medium, just as many molecules of water are a medium. So the "peak of the black body curve" would be a form of wave generated in this medium. Like temperature, it is something of a top down effect of the forest acting as one, even if the trees are still distinct units.
Writing this on my phone, which is having a top down effect on my thought process.
Author Lawrence B Crowell replied on Oct. 10, 2012 @ 22:54 GMT
A point particle is a sort of approximation, as is classical mechanics. Try to look this up.
The black body curve is not a wave.
Cheers LC
John Merryman replied on Oct. 11, 2012 @ 03:05 GMT
Author Lawrence B Crowell replied on Oct. 11, 2012 @ 13:23 GMT
The BB curve is a function of wavelength, but it is not itself a wave.
LC
John Merryman replied on Oct. 11, 2012 @ 15:51 GMT
Lawrence,
Yes, a curve is a graph, not an actual wave, but the question is whether its amplitude is an expression of temperature.
Author Lawrence B Crowell replied on Oct. 11, 2012 @ 18:09 GMT
The peak of the curve can be found by evaluating where the derivative of the intensity function with respect to wavelength is zero. One can then compute the temperature corresponding to the black body curve. The temperature is proportional to the frequency of radiation at the peak. This is an elementary calculus operation. The peak of this function is not an "amplitude."
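For concreteness, here is a small Python sketch of that calculus operation (my own, with standard approximate constants, not taken from the post): setting dI/dλ = 0 for the Planck intensity in wavelength form reduces to the transcendental equation x = 5(1 − e^(−x)) with x = hc/(λkT); solving it numerically gives the peak wavelength, and the frequency at that peak comes out proportional to T, as stated above:

```python
import numpy as np
from scipy.optimize import brentq

h, c, k = 6.626e-34, 2.998e8, 1.381e-23          # approximate SI values

# dI/dlambda = 0 for I(lambda, T) = (2 h c^2 / lambda^5) / (exp(hc/(lambda k T)) - 1)
# reduces to x = 5 * (1 - exp(-x)), where x = hc / (lambda k T).
x = brentq(lambda x: x - 5.0 * (1.0 - np.exp(-x)), 1.0, 10.0)   # x ~ 4.965

for T in (3000.0, 6000.0):
    lam_peak = h * c / (x * k * T)      # Wien displacement: lambda_peak = b / T
    nu_at_peak = c / lam_peak           # frequency at the peak wavelength
    print(T, lam_peak, nu_at_peak / T)  # nu_at_peak / T is the same constant for both T
```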
LC
John Merryman replied on Oct. 12, 2012 @ 00:58 GMT
Lawrence,
Yet what you have is a lot of individual waves creating the effect of one large wave. Yes, you are correct that the forest is not a tree, but a mass can act as a singular entity. It does take a somewhat generalized perspective to see the ways a group of people can function as a larger whole, which is not apparent to someone focused only on the immediate physical manifestation of bodies. The problem with basic mathematical modeling is that, as reductionism, math dispenses with excess functions, so we forget that when we add, we are adding the sets and getting one larger set, not adding the contents. So two sets of apples is one larger set, not applesauce. Just as all the parts and contextual relationships of our bodies and lives add up to one person. So a curve is not an actual wave, but it presents the characteristics of one.
Author Lawrence B Crowell replied on Oct. 12, 2012 @ 01:46 GMT
John,
Please take the time to learn the real stuff if you want to really know about this. It is getting a bit frustrating to keep going over this. You have these ideas about things that you think are true, but they have no bearing on these matters. These issues with black body radiation, statistical distributions of frequencies and temperature are old stuff. Instead of sawing away on your own ideas about this, please learn the real stuff.
Cheers LC
John Merryman replied on Oct. 12, 2012 @ 03:24 GMT
Author Lawrence B Crowell replied on Oct. 12, 2012 @ 16:54 GMT
I read Ellis' paper and wrote a comment or two. I am somewhat agnostic about this. These ideas tend to be suggestions of something rather than an explicit theory that provides calculations. On the one hand it seems plausible that "new rules" or emergent structures determine processes on larger scales. Physics in some sense has this in its theories, such as the hydrodynamics of fluids treated as continuous media, when we know they are ultimately made of atoms. However, it seems to beg the question of how such large scale rules can really determine physics on the smaller scale, or the "top down" picture.
Cheers LC
John Merryman replied on Oct. 12, 2012 @ 20:41 GMT
Lawrence,
Not to push the boundaries of the discussion too far, but isn't there an inherent dichotomy between bottom up and top down? They are opposite sides of the same coin. There seem to be bottom up processes and a top down ordering of them. Energy is bottom up, while information/structure is top down. This physically manifests as energy expanding outward, while matter/structure contracts inward.
The energy is analog, but our comprehension is digital/ordered, so we only have this top down comprehension.
I'm on my phone and can't refer back to Ellis' paper, so I'm just putting this out for you to ridicule and hopefully postscript a few thoughts.
Regards,
J
Author Lawrence B Crowell replied on Oct. 13, 2012 @ 00:39 GMT
It is not hard to think that lots of small scale physics with lots of particles or atoms can give rise to a collective behavior. One case would be the flow of a gas or fluid, where on a large scale there is this apparent motion of a medium, but it is all built up from below by the motions of atoms. We have lots of physics that does just this, such as statistical mechanics that is built up from the statistical distribution of atomic motions. It is far more difficult to construct the top-down situation in a rigorous way. It is hard to construct a consistent theory where the rules of dynamics on a large scale turn out to determine the laws of physics on the nuts and bolts level.
This is not to say that this might not be the case, but as yet I think a clear, convincing case is still needed. It can be seen with Conway's Game of Life. Elementary rules for how various shapes on a grid interact can clearly result in large scale structures. However, the large scale structures do not act back down to alter the rules for how the basic shapes interact with each other.
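For concreteness, here is a minimal Python sketch of Conway's Game of Life (my own illustration): the update rule is strictly local, yet large-scale structures such as a drifting glider emerge from it, and nothing about those emergent structures ever reaches back down to change the rule itself:

```python
import numpy as np

def step(grid):
    # Count the eight neighbors of every cell, with periodic wrap-around.
    nbrs = sum(np.roll(np.roll(grid, dy, 0), dx, 1)
               for dy in (-1, 0, 1) for dx in (-1, 0, 1)
               if (dy, dx) != (0, 0))
    # Conway's local rule: a cell is alive next step if it has exactly 3 live
    # neighbors, or if it is alive now and has exactly 2 live neighbors.
    return ((nbrs == 3) | ((grid == 1) & (nbrs == 2))).astype(int)

# A glider: five live cells whose steady diagonal drift is an emergent,
# large-scale behavior produced entirely by the local rule above.
grid = np.zeros((20, 20), dtype=int)
grid[1:4, 1:4] = [[0, 1, 0], [0, 0, 1], [1, 1, 1]]
for _ in range(8):
    grid = step(grid)
print(grid.sum())   # still 5 live cells, now shifted two steps along the diagonal
```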
Cheers LC
John Merryman replied on Oct. 13, 2012 @ 04:21 GMT
Lawrence,
They may not alter the rules, but they may be part of the rules in the first place. For one thing, do we really understand bottom up from the perspective of bottom up, when we are the most complex top down point of reference we know?
Say that quantum behavior is due to loading, rather than statistics. That the energy builds up in the detector before it trips the atomic structure to a higher level. The result is a wave of energy, with the crest set by the capacity of the detector. So what you have is a bottom up process of accumulating energy, defined by its maximum level of energy. So while the reality is the interaction of the absorbed energy and the structure of the detector, the resulting information is what defines this relationship and what we have to describe the action is the information.
Julian Barbour, in a blog post called From Time to Shape, makes the argument that we cannot perceive distance, only the points of reference defining that distance. Which is a bit like saying we cannot perceive energy, only its effects. Yet wouldn't all points of reference be one, if distance didn't exist to distinguish them? Or effects wouldn't exist without the energy to manifest them?
So the bottom up manifests the top down and the top down defines the bottom up. So it is a dichotomy; If you have bottom up, then top down also exists.
Now think how the feedback loops create ever more complexity. Put in more energy, the wave going up, and the amount of potential information increases, yet you don't know the amplitude of the wave before it crests. So it is only when the energy has reached its maximum that that bit of information is determined. Then how quickly the energy leaves that wave and starts another decides the frequency of the wave. Say we are talking about something as complicated as a human life. As it is born and grows up, it is absorbing energy and there is no way to really understand what is in store. The energy it absorbs might contain beneficial or detrimental aspects. Beneficial means it will continue to grow and sustain itself, i.e. be a long time interval. As the energy peaks, some estimation can be made of what that person has accomplished, then when all energy is gone and the lifespan is over, some final tally/record/information can be stated. Of course there is a multitude of influences, from influences on offspring onward, but that is an endless network. It is amplitude and frequency again. A day is created by the spinning earth absorbing energy from the sun. In any particular location, the amplitude can vary, while the frequency is quite regular. These parameters of time and energy could be used to define just about any object or event.
There is no distinct top down; it is the effect of how bottom up interacts with other bottom up. Which means it does have effects, since it is all energy in action. Large scale structure exists because it is the manifestation of concentrated bottom up energy. How that energy is expended can have lots of effects.
Author Lawrence B Crowell replied on Oct. 13, 2012 @ 22:59 GMT
I am not sure what you mean by quantum mechanics being due to loading. A lot of what you say is hard to follow or make much sense of. My best suggestion is that you try to study the real physics. I don't know what else to suggest.
Cheers LC
John Merryman replied on Oct. 14, 2012 @ 02:06 GMT
Author Lawrence B Crowell replied on Oct. 14, 2012 @ 02:23 GMT
I strongly advise again that you endeavour to study the real physics. I don't know what else to say. The problem is that without some reading or study of the real science you will continue to thrash around like this. Ragazas' stuff is flawed; he derives a form of action or Lagrangian that is just plain wrong. I can't recall about the other two authors. However, you need to have these concepts clearly delineated, otherwise the situation is hopeless.
Cheers LC
John Merryman replied on Oct. 14, 2012 @ 03:31 GMT
Lawrence,
I'm not quite sure what falls in the category of "real physics," when many of the more public members of the community are currently obsessing over multiverses. Does fantasy physics fall in your definition of what's real?
Eric Reiter came in 14th in the competition. It doesn't say much for your peripheral vision, literally or conceptually.
Carver Mead isn't a contestant. He is a pioneer in the computer industry.
I find in reading about the various versions and perspectives in physics, a significant split between academic theorists and those engaged in actual application of physics.
Is there any room in your view for drawing connections, or is it all delineation? Isn't it the goal of physics to tease out the basic patterns in nature? My sense is that distinction/delineation is only part of what makes nature what it is.
Author Lawrence B Crowell replied on Oct. 14, 2012 @ 21:32 GMT
If you know some physics I would advise studying the Feynman Lectures on Physics. That would be a good basis for being able to understand real physics. At least you would be able to write something about physics that employs the appropriate terminology.
LC
Author Lawrence B Crowell replied on Oct. 14, 2012 @ 22:03 GMT
I have heard of Carver Mead; I did not recognize the name right away. The only thing I can say is that while his work on integrated VLSI chips was first rate, his ideas about electrodynamics are not accepted by many in physics. Honestly I have not read any of his papers, but I am aware that the physics community gives his theory a thumbs down.
As for Reiter, I started reading the first page. It would take me a long time to go through the details of the experiment. I then looked at the end, and he concludes things that are problematic. He claims to detect two photons in a double slit experiment, where only one is placed in the input of the splitter. He is using gamma rays, which are notoriously difficult to control in the manner needed to do quantum physics experiments of this sort. If his claimed results were real then they should hold for longer wavelength photons, such as optical light. Optical light is managed quite well, and this physics should have been detected in this domain of QED. No such physics has been detected. I strongly suspect that something is terribly wrong with the experiment.
Cheers LC
John Merryman replied on Oct. 14, 2012 @ 22:44 GMT
Lawrence,
The experiment, as I understood it, is that he "pre-loaded" the array of detectors prior to releasing a quantum, and it resulted in tripping more than one detector. His argument being that while a quantum is the smallest measurable quantity of energy, it is not an indivisible particle.
I certainly agree Mead's theories are not widely accepted, but I don't hold it against him, given the degree of tolerance for error in some of the theories that are. Specifically, dark energy being given a pass rather than reviewing the assumptions built into Big Bang theory, but we have been through that before, so I'm not trying to open that can of worms. My point being that any theory will raise questions, but there is a definite herd mentality that decides what is to be pursued. Witness string theory and supersymmetric particles. If we were to go back to the beginning of the wave/particle duality and take the other road, as Mead and Eric suggest, i.e. waves with particle-like characteristics, rather than the one taken, particles with wave-like characteristics, would we have models that describe more of the reality we experience, and not one that is currently using the anthropic principle expressed in multiverses?
I wish I had the time to read more, but I don't. In fact I'm taking a few free minutes to write this on my phone.
John Merryman replied on Oct. 17, 2012 @ 10:42 GMT
Lawrence,
One final note;
You think I'm being over-broad in relating different manifestations of wave-like patterns, yet complexity theorists relate synchronized wave patterns in everything from financial crashes to epileptic seizures and find it a useful predictive tool.
So what if we were to see reality more as waves than particles? Wouldn't it be just standing waves, rather than tiny strings vibrating in eleven dimensions? The difference is just that one insists there is something solid under the action and the other sees it as action in the context of other action. So what are the eleven dimensions, other than folded up waves? Waves don't so much collapse as contract. Still nothing solid.
Same reality, different biases.
S Halayka replied on Nov. 11, 2012 @ 12:26 GMT
Hem, maybe you're right... It does kind of totally seem like that guy who wrote the essay about focusing on Lagrangian dynamics kind of totally ripped off Jos Stam's groundbreaking work "Stable Fluids". Of course, it wouldn't be a question of whether or not that's the case if academia weren't run by troglodytes.
Constantin Leshan wrote on Oct. 11, 2012 @ 09:14 GMT
Dear Lawrence B. Crowell,
In the past contest I found flaws and errors in about 20 essays, including leading essays. However, in this 2012 contest I decided not to judge any essays, because Brendan recommended avoiding a judgmental atmosphere. Nevertheless, since Dr. Crowell rates my essay, I also have the right to rate his essay.
Some contestants have invented a fast method that allows them to produce many senseless essays quickly in order to make money from FQXi. Meanwhile, I have observed that many professionals use this technology. In fact, this fast "technology" allows the creation of many high-level essays that look "very scientific". In order to produce such a story-essay, you simply retell accepted physics from textbooks and the internet, but using your own words. Therefore it is not a copy/paste operation, because the author merely changes the words, but he repeats generally known information from accepted physics.
Dear Crowell, what does this proposition from your essay mean: "The acceleration is directly proportional to a force applied to it. The momentum of a body is its mass times its velocity p = mv as determined by an inertial observer"? These are statements copied from a textbook; the author only changed the words. Or another proposition: "D-branes are composed of strings in a way similar to a Fermi-electron surface in a crystal" - it is a statement copied from brane theory. In the same way, I can show that most of Crowell's essay repeats generally known information. You see, it is the retelling of generally known information in other words. In this way, you can prepare hundreds of different essays simply by retelling GR, quantum mechanics and other accepted theories. For example: "The coupling constant G of general relativity (GR) with units of area, or G^{1/2} with units of inverse mass, while quantum field theories (QFT's) are unitless coupling constants in naturalized units". You see, it is simply a story that repeats GR in other words.
Does such an essay-story that repeats generally known information deserve any prize? I could produce 10 such essay-stories like Crowell's essay in a week.
Besides, it is a collection of propositions without any logical connection between them. What logical connection is there between Newton, QCD and string (brane) theory?
Since experimentation at Planck scales is not possible, many scientists, including Crowell, publish their free fantasies about Planck scales, knowing that these fantasies can be neither proved nor disproved.
Thus, Crowell tells us about Newton and Einstein, then about GR and QFT, quantization of spacetime, QCD, Feynman's rules, then string theory… In my view, Crowell simply tries to fill his essay with generally known information that "looks scientific". I quote: "Cardenas proposed inflation preserves the holographic principle. In [10] it is demonstrated how a closed FLRW spacetime cosmology with k = 1 upon turn around to recollapse exceeds the entropy bound of Bekenstein and the holographic principle. Crowell [11] proposed from this quantum cosmologies".
You see, it is a collection of random fragments and propositions. Yes, it looks "scientific", but it is simply the retelling of accepted physics.
However, the question of the essay contest is "Which of Our Basic Physical Assumptions are Wrong?". I do not see any wrong physical assumptions found by Crowell. "The foundations are not foundations" is not a wrong assumption; I could just as well write that physics is not physics. Why was this essay accepted? It is simply the retelling of known information. The contribution of Crowell is simply that he RETELLS (copies) the accepted physics (even if he copies the SENSE rather than copy/pasting).
Does this collection of random propositions and fragments of text that repeats generally known information deserve any prize? I repeat that I could produce 10 such "high level" essays in a week.
For example, first I'll copy fragments of text from academic papers about Lorentz symmetry, the Heisenberg uncertainty principle, GR and Feynman diagrams. Then I change the words and formulae, and the new essay is ready for publication! Pay attention that it is not copy/paste, because I changed the words! Therefore the essay appears to be "original" whereas it actually repeats accepted physics. For example, Crowell wrote: "The coupling constant G of general relativity (GR) with units of area, or G^{1/2} with units of inverse mass, while quantum field theories (QFT's) are unitless coupling constants in naturalized units". You see, it is simply a story that repeats GR and other theories in other words. The contribution of Crowell is simply that he RETELLS the accepted physics rather than copy/pasting it.
If Dr. Crowell receives a prize for his collection of copied random stories, then this newly produced essay (above) also deserves a prize.
It is very difficult to create original research with unique information, but it is very easy to create stories about generally known information, like Crowell's essay. I propose to eliminate all such stories from the contest; it is simply fraud. In this way professionals make money.
Sincerely,
Constantin
Author Lawrence B Crowell wrote on Oct. 11, 2012 @ 13:50 GMT
Constantin,
I am not sure why you find it objectionable that I reference work by other physicists. For instance, you take umbrage at my referencing Cardenas' work, where he demonstrates that the k = 1 FLRW model, which is closed and recollapses, violates the entropy bound of Bekenstein and Bousso. I take these results and further work out a consistent theory involving quantum forms of the Landau tri-critical point to demonstrate the onset of inflationary cosmology. This paper, referenced in my FQXi essay, received an accolade from GRF and was accepted for publication in Int. J. Theor. Phys. You similarly find it objectionable that I would use other references as well. Of course this is common practice.
In your paper you reject a range of physics, such as declaring the quark model or QCD wrong. The big bang is judged wrong by you, in spite of a growing preponderance of evidence in its favor. You also have funny things, such as your figure 1 with photons moving at v = c + 75km/sec, which nobody who is well grounded in physics is going to take seriously.
Cheers LC
Constantin Leshan replied on Oct. 13, 2012 @ 16:03 GMT
Dr. Crowell,
I have found that your essay copies generally known information; therefore it is a fraud! Imagine that I sent such a story to the contest. In this essay I tell first about Newton, then about Einstein and GR, string theory, Planck scale physics. It is simply a story about physics that repeats generally known information. Does such an essay deserve any prize? I could produce hundreds of such stories about physics! For example, first I would tell about quantum mechanics, then about string theory, brane theory, cosmology, the quark model, particle physics, GR and so on. Everyone can produce hundreds of such stories about physics like your essay; it does not deserve any prize, and such an essay doesn't have any scientific value.
The question in our essay contest was "Which of Our Basic Physical Assumptions are Wrong?". I don't see any wrong physical assumptions in your essay. Your statement that "Foundations are not Foundations" is only a joke. In the same way I could say that "Physics is not Physics"; it is senseless.
Thus, your essay has nothing to do with our essay contest; it is a simple story about physics. I could produce hundreds of such stories similar to Crowell's essay; it does not deserve any prize. If FQXi rewards such simple stories about physics, then next time we'll send only stories to the contest.
You wrote, 'why you find it objectionable that I reference work by other physicists'. I do NOT find it objectionable that you reference work by other physicists; you are free to reference whatever you want. I gave the example 'Cardenas proposed' in order to show that your essay is a collection of random statements without any logical connection between them. For example, I do not see any logical connection between Newton's laws, brane theory and Planck physics. It is proof that your essay is a collection of random (and copied) statements without any logical connection between them. Therefore this work cannot even be called an 'essay', because an essay must have a central idea; it is a collection of random and copied stories.
You wrote: 'Big bang is judged wrong by you'. Where do you see Big Bang theory in my essay? My essay does not contain any words about the Big Bang.
You wrote: 'your figure 1 with photons moving at v = c + 75km/sec, which nobody who is well grounded in physics is going to take seriously'.
I remind you that the goal of this contest is precisely to find flaws in accepted physics (wrong physical assumptions), and therefore my example with entangled photons should be accepted and welcomed by this contest. Also, it is natural that mainstream scientists will criticize all attempts to destroy their pet theories.
My example is absolutely correct physically and may be proven experimentally. It seems that only an experiment can prove this phenomenon to mainstream scientists. Nevertheless, I'll try to explain it again theoretically.
The definition of motion is this: motion is when one object changes its position with respect to other objects. Since the photons change their position with respect to the source, it is motion by definition!
Then I can show that the position of the photons changes in such a way that this motion is faster than light. In order to protect Relativity, scientists invented the trick that it is not motion at all; but this trick violates the definition of motion! Besides, Relativity does not need any protection but needs improvement; the authors of the dogmas simply do not understand Einstein's Relativity correctly. I can show again that the position of the photon relative to the source changes faster than light, and it is Motion because it is in agreement with the definition of motion. (Meanwhile, I call such motion Apparent motion in my essay, so it is in agreement with the mainstream dogmas.)
Thus, my example with entangled photons is physically correct and should be welcomed in this contest, because the goal of this contest is precisely to find flaws in accepted physics.
Sincerely,
Constantin
Author Lawrence B Crowell replied on Oct. 13, 2012 @ 22:46 GMT
The connection of Newton and quantum mechanics with quantum gravity is that a structure exists "under our noses." With variational mechanics there is an extension to the Feynman path integral. With BCFW recursion for QCD gluon calculations there is a correspondence with gravitons and M-theory. That was why I started out discussing Newton.
By declaring that photon motion depends on the source you have really cut your own throat. This is simply wrong.
LC
Constantin Leshan replied on Oct. 15, 2012 @ 10:35 GMT
Dr. Crowell wrote: "By declaring that photon motion depends on the source you have really cut your own throat. This is simply wrong."
Your statement shows that you haven't read and don't understand my essay, so it is senseless to discuss it with you. Where have you found my declaration that "photon motion depends on the source"? Every scholar knows that photon speed does NOT depend on the source. In my example the photons move locally with the speed v = c. Now consider the contribution of cosmological expansion, which increases the distance between these photons and the source. For this reason, the distance between the photons and the source grows in such a way as if the photons move faster than light with respect to the source. My example is absolutely correct physically; it is very strange that you don't understand such simple things.
Constantin
Author Lawrence B Crowell replied on Oct. 15, 2012 @ 11:51 GMT
The way to properly deal with photons in the FLRW metric is to work with general relativity and to compute the null geodesics. Nobody goes around saying stuff like v = c + 75km/s or that it is "as if it were traveling faster" and so forth. To use terms like "as if" in this case appears to be an application of nonsense meant to cover over nonsense.
LC