CATEGORY:
FQXi Essay Contest - Spring, 2017
TOPIC:
Fundamental is Non-Random by Ken Wharton
Author Ken Wharton wrote on Feb. 2, 2018 @ 18:59 GMT
Essay Abstract
Although we use randomness when we don't know any better, a principle of indifference cannot be used to explain anything interesting or fundamental. For example, in thermodynamics it can be shown that the real explanatory work is being done by the Second Law, not the equal a priori probability postulate. But to explain the interesting Second Law, many physicists try to retreat to a "random explanation," which fails. Looking at this problem from a different perspective reveals a natural solution: boundary-based explanations that arguably should be viewed as no less fundamental than other physical laws.
Author Bio
Ken Wharton is a professor in the Department of Physics and Astronomy at San Jose State University. His primary research field is Quantum Foundations.
Edwin Eugene Klingman wrote on Feb. 2, 2018 @ 20:56 GMT
Dear Ken Wharton,
Congratulations on an excellent essay! Both a pleasure to read and insightful. In some ways it resembles Roger Schlafly's essay in that it uses common sense to develop conclusions based on numerous instances of physics.
You begin by analyzing "randomness" and point out that entropy is associated with a state of knowledge of the 'macrostate', not the unknown 'microstate' of the system. Since "randomness is at its best when your knowledge is at its worst", this implies that statistics of microstates are appropriate. Such 'averaged' approaches are not fundamentally explanatory.
Your discussion of 'boundary conditions' as fundamental is unique. No other essay I have seen focuses on this. Your example of the boundary of a metallic conductor, with half of the parameters constrained, is very powerful. You point out that in classical physics, our dynamical equations are often viewed as less fundamental than the boundary-constrained Lagrangian density that generates them. I hope you will read my essay and note my equation (11) derived from Noether's theorem and related to Maxwell-Hertz dynamical equations. When you note that, in the Lagrangian case, one puts a boundary around the whole of space-time, not just the past, I hear echoes of your block time approach. Yet you mention (page 6) "the biggest 3-D boundary of all – the cosmological boundary of the universe." I would recommend Eckhard Blumschein's essay for treatment of the infinite time extension.
While I agree with you that fundamental boundary explanations need to be taken seriously and literally, I would focus on 3D and very carefully consider how time fits into the 'boundary'.
In your Klein-Gordon paper you treat the
time-energy uncertainty relation ("never been put on an even footing with the position-momentum uncertainty principle"). In my current essay I treat
time-energy conjugation as an alternative interpretation to
space-time symmetry. I hope you will find time to read my essay and comment on it.
Thanks for an excellent essay and good luck in the contest.
Edwin Eugene Klingman
Author Ken Wharton replied on Feb. 5, 2018 @ 00:24 GMT
Dear Edwin -- very perceptive! Yes, lots of side connections to my main research interests, although I tried to keep that to a minimum... :-) As you can probably tell, though, I'm quite skeptical of treating time independently from space, and tend to think about things in 4D as much as possible. I'll try to get to your own essay next week. Best, Ken
a l wrote on Feb. 3, 2018 @ 10:01 GMT
Dear Ken Wharton,
your essay is remarkable, well argued and nicely written. It is also original as it proceeds in a negative way by emphasizing what is not fundamental: randomness. Actually many popular writings (and some serious ones) advertise randomness as the ultimate explanation even if in most cases it is just a convenient boundary to keep away problems we are not interested in; people who have never heard about Bertrand paradoxes are liable to offer a probabilistic treatment of anything. Just like Penrose's cosmological arguments which go against the mainstream, your essay deserves to be widely read and appreciated.
Best
a.l.
Author Ken Wharton replied on Feb. 5, 2018 @ 00:25 GMT
Thanks for the kind words! Cheers, Ken
Joe Fisher wrote on Feb. 3, 2018 @ 15:32 GMT
Dear Professor Ken Wharton,
FQXi.org is clearly seeking to confirm whether Nature is fundamental.
Reliable evidence exists that proves that the surface of the earth was formed millions of years before man and his utterly complex finite informational systems ever appeared on that surface. It logically follows that Nature must have permanently devised the only single physical construct of earth allowable.
All objects, be they solid, liquid, or vaporous, have always had a visible surface. This is because the real Universe must consist only of one single unified VISIBLE infinite surface occurring eternally in one single infinite dimension that is always illuminated mostly by finite non-surface light.
Only the truth can set you free.
Joe Fisher, Realist
Marcel-Marie LeBel wrote on Feb. 3, 2018 @ 19:09 GMT
Ken,
Excellent essay, a keeper! You have proven my definition of a truth. I say that a truth is an absence of choice, a fact. The strongest absence of choice is an impossibility. So strong that such impossibilities are a type of truth we call "Postulates", truths so strong they can never be proven right. Impossibilities are the boundaries that define a truth, as in DE-FINE, or make it finite, real. In other words, a truth is the result of choiceless boundary conditions.
In my essay, I tackle existence with the rule of non-contradiction as boundary condition as a starting point. Have a look.
Best of luck,
Marcel,
Flavio Del Santo wrote on Feb. 3, 2018 @ 21:00 GMT
Dear Prof. Wharton,
I found your essay interesting indeed, and well written. However, the impression I had is that your main point in the treatment of physical randomness could have been made more than a century ago, in the debate over statistical mechanics versus deterministic microphysics. The point is that before quantum mechanics it was totally legitimate to think of randomness as a collective statistical description, but after all the struggles in quantum theory over the last nine decades, it seems very difficult to avoid fundamental randomness (except for the de Broglie-Bohm model). I would have liked a more thorough and explicit consideration of quantum physics, which is at the core of the debate over fundamental randomness.
Best regards,
Flavio Del Santo
Author Ken Wharton replied on Feb. 5, 2018 @ 00:50 GMT
Dear Flavio,
Good point! Indeed, I certainly could have written a much longer essay about my take on the use of probability in quantum theory. (That's my primary research interest, after all.) But I think it's important to distinguish probability from randomness -- at least randomness of the 'equal a priori probability postulate' variety that I talk about here. That sort of randomness shows up in quantum statistical mechanics, but not really in quantum theory per se. Probabilities in quantum theory are notoriously *not* random in this manner -- some events are more probable than others -- which makes the quantum issue a somewhat different question than the main point I'm making here.
On that issue, I'm firmly of the opinion that all probabilities -- even quantum probabilities -- are due to things that we don't know. But I'm not at all sure whether there's still room for *non*-fundamental randomness (of the "all-at-once" variety I describe here: http://www.mdpi.com/2078-2489/5/1/190/htm), or whether the boundary constraints on the universe really fix the whole history of the universe down to the last detail. Both options still seem to be in play, as I see it.
Thanks for your comment and interest! -Ken
Marcel-Marie LeBel wrote on Feb. 3, 2018 @ 22:13 GMT
Ken Wharton, in his excellent essay, shows that the boundary conditions are what is fundamental. In this, he supports my definition of what a truth is. “A truth is an absence of choice for everyone”. The strongest absence of choice is an impossibility and this, in the most universal sense, is failing the rule of non-contradiction (RNC). All truths are bound by the rule of non-contradiction. In other words, respecting the rule of non-contradiction de-fines, or makes finite and real a truth. The RNC is the basis of maths, logic, and pretty much everything else. The biggest gap in all this was, I believe, not having a clear definition of a truth...
Neither Ken nor I may lay claim to this. This was Aristotle's claim all along. "The rule of non-contradiction is the most important rule in the universe..."
Right now Aristotle is spinning in his grave shouting..
“ I told you SOOOOOOooooooooooooooo........!!!!!
Thanks Ken,
Marcel,
Heinrich Luediger replied on Feb. 5, 2018 @ 11:23 GMT
Dear Marcel-Marie,
basically everyone agrees on RNC, the question is what it means: is it logical (A; not-A), complementary (A; everything A is not) or categorical (orthogonal) (∫AB=0)?
Heinrich
Marcel-Marie LeBel replied on Feb. 5, 2018 @ 22:03 GMT
Heinrich,
This is an excellent question! In the system described in my essay, it works much like the “logical” RNC you define. It is A is not A*. The asterisk only indicates a different form of A (while minus (-) A suggests a mathematical context.)
For the system to work logically, its elements must be comparable, i.e. A, A*, As, etc., and distinguishable (different) so that the RNC may work. On the other hand, A and B are not comparable, and "A not equal B" is simply the definition of them being two different elements (substances A and B in my essay).
So, for example, the following are “not A”:
A* (or any variation of A) : --- comparable and distinguishable, therefore, logically operational within the logical system based on A.
B : (or the set of all except A) not comparable ------ distinguishable by definition.
Nothingness: both comparable and distinguishable, it supports logically the existence of A, A*, As, etc. (all forms of A)... and (if not in the same system) B and anything else.
So, to answer your question, the RNC, INSIDE a substantial system (real stuff), operates in a logical sense (comparable and distinguishable) between all the A and A* and other A variations.
Outside a substantial system, the RNC works in a complementary way (A; all A is not) and would differentiate different logical systems, say, one based on A and its variations and another based on B and its variations, or any other.
Heinrich, could you describe the categorical (orthogonal) (∫AB=0)? RNC?
Thanks,
Marcel,
Dizhechko Boris Semyonovich wrote on Feb. 4, 2018 @ 15:23 GMT
Dear Ken Wharton, You have invested a lot of energy in writing this good essay. You write: "What's needed is some 'starting point'". This starting point can be Descartes' principle of the identity of space and matter. According to Descartes, space is matter, and matter is space that moves. Thus, space is the foundation for constructing fundamental theories. Space has information, which is then realized in the structure of the world. Look at my essay,
FQXi Fundamental in New Cartesian Physics by Dizhechko Boris Semyonovich, where I showed how radically physics can change if it follows this principle. Evaluate and leave your comment there. Then I'll give you a rating as the bearer of Descartes' idea. Do not allow New Cartesian Physics, which wants to be the theory of everything, to go away into nothingness.
Sincerely, Dizhechko Boris Semyonovich.
Steven Andresen wrote on Feb. 6, 2018 @ 05:13 GMT
Dear Ken Wharton
Just letting you know that I am making a start on reading your essay, and hope that you might also take a glance over mine. I look forward to sharing thoughtful opinions. Congratulations on your essay rating as it stands, and best of luck for the contest conclusion.
My essay is titled
"Darwinian Universal Fundamental Origin". It stands as a novel test of whether a natural organisational principle can serve as a rationale for the emergence of complex systems of physics and cosmology. I will be interested to have my effort judged on the basis of both prospect and novelty.
Thank you & kind regards
Steven Andresen
Francesco D'Isa wrote on Feb. 6, 2018 @ 20:16 GMT
Dear Ken,
thank you for your essay, which I found very interesting and pleasurable. It reminded me of a famous poem by the Italian poet Montale, who said, "Codesto solo oggi possiamo dirti, ciò che non siamo, ciò che non vogliamo." [we can tell you just what we are not and what we don't want].
You write that
> Explaining the relationship between two things does not really explain either of them. What’s needed is some ‘starting point’.
This is very interesting for me, since I proposed in my essay about absolute relativism that everything is relational. But I agree that we can fully know something just within boundaries.
All the best,
Francesco D'isa
Juan Ramón González Álvarez wrote on Feb. 6, 2018 @ 22:38 GMT
I do not see why an appeal to randomness would be considered less fundamental. Of course, many phenomena in our universe are not random and can be explained in terms of "this" or "that", but there is no objective reason to expect that everything in the universe has a cause. I do not find any reason to believe that the Universe is deterministic.
Does energy conservation follow from Noether's theorem? Or is it just that the Lagrangian formalism is only valid for non-dissipative systems and thus has the conservation law hidden in its symmetries? Moreover, all treatises on Noether's theorem I know confound conservation of energy (d_iE/dt = 0) with invariance of energy (dE/dt = 0).
The equal a priori probabilities postulate is not associated with the second law. The postulate is needed in equilibrium statistical mechanics to get thermodynamic properties for systems at equilibrium, and it is routinely used for the description of reversible processes. In fact the postulate is not valid outside equilibrium and, thus, not valid for studying the irreversible evolution of a system towards equilibrium.
The "past hypothesis" not only does not explain the second law, but shows a basic misunderstanding about the second law. The second law is not about features of the initial state.
Indeed the time-asymmetry encoded in the second Law cannot come about from time-symmetric dynamical laws. We need time-asymmetric dynamical laws.
"The Second Law tells us that entropy always increases". Not true. That is only a superficial and misguided formulation of the law. The second law says that the production of entropy is non-zero. The secondf law is
not dS>0. The second law is d
iS >=0. And this is the classic formulation, where thermal fluctuations are ignored.
All the subsequent attempts to explain that superficial and misguided formulation of the law are invalid as well. Assigning a low entropy to the initial instant of the Universe does not explain anything, and the incompatibility between the second law of thermodynamics and mechanics (time-reversible) remains. Effectively, we solve the Liouville equation (or its von Neumann quantum analog) and set an initial state of very low entropy, and the evolutions predicted by the equations continue contradicting the second law and observations.
"Boundary explanations" do not explain anything. Noticing that the system evolved from A to B because it was first in A and later was found in B is vacuous of content. Moreover this kind of boundary 'explanation' often hides another serious misunderstanding of the second law; if all that was needed to explain that the system evolved irreversibly as A --> B was that it was initially in A, then the second law would not be needed. The first law would be enough.
Initial states and boundaries are already used in the laws of mechanics and electrodynamics, but those laws cannot describe irreversibility. And that is the reason why thermodynamics and the second law were born as a separate field of physics.
The arrow of time, the irreversibility of the second law, has a dynamical origin: resonances. There is a broad literature on the topic.
Author Ken Wharton replied on Feb. 7, 2018 @ 17:19 GMT
Thanks, Juan, for a careful reading and interesting points. Lots to parse here.
>I do not see why an appeal to randomness would be considered less fundamental. Of course, many phenomena in our universe are not random and can be explained in terms of "this" or "that", but there is no objective reason to expect that everything in the universe has a cause.
Agreed -- see my response to Flavio above. Some things don't need an explanation. But the Second Law does need an explanation, for several reasons. 1) It supplies many other subsidiary explanations, so it's not devoid of content. 2) It can't be fundamental in its own right, because it only applies to macrostates, not microstates. 3) It can't be explained from our time-symmetric dynamical laws, or randomness.
> I do not find any reason to believe that Universe is deterministic.
I probably agree with you here, but the real question is whether it's time-symmetric. Even our indeterministic theories predict time-symmetric micro-phenomena.
>Does energy conservation follow from Noether's theorem? Or is it just that the Lagrangian formalism is only valid for non-dissipative systems and thus has the conservation law hidden in its symmetries? Moreover, all treatises on Noether's theorem I know confound conservation of energy (d_iE/dt = 0) with invariance of energy (dE/dt = 0).
Well, when you do it correctly in classical field theory, you get constraints on the Stress Energy tensor, which is really the right way to go. "E" is a bit of a fiction, certainly in GR.
>The equal a priori probabilities postulate is not associated to the second law.
I'm skeptical. Could you point me to a stat mech argument for the 2nd law that doesn't implicitly assume it at some stage?
>The postulate is needed in equilibrium statistical mechanics to get thermodynamic properties for systems at equilibrium and routinely used for the description of reversible processes. In fact the postulate is not valid outside equilibrium and, thus, not valid to study the irreversible evolution of a system towards equilibrium.
Applying the word "valid" to the EAPPP seems like a category error. Of course, it's never *truly* valid: at any given time the actual system is in 1 microstate, with 100% certainty, and all others with 0%. The "a priori" means that you use it as a Bayesian prior, when you have no other information. And as you note, if you did this, you would predict it would be in an equilibrium macrostate. (Which it might not be, certainly, but that would be your best bet given no other knowledge.) And if you *knew* it wasn't in equilibrium, or knew anything else at all, you'd update your priors. But usually that just means applying the EAPPP to all possible states that were compatible with your updated knowledge. That's how you get to the 2nd Law from the EAPPP. Knowledge always trumps randomness.
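To make that updating logic concrete, here is a toy sketch (a hypothetical four-coin system, nothing from the essay itself): the EAPPP assigns a uniform Bayesian prior over microstates, and learning the macrostate simply re-applies that uniform prior to the microstates still compatible with what you know.

```python
from itertools import product
from fractions import Fraction

# Microstates: all 2^N coin configurations; the EAPPP assigns each the
# same prior probability, because we know nothing else.
N = 4
microstates = list(product("HT", repeat=N))
prior = {m: Fraction(1, len(microstates)) for m in microstates}

# Now we learn the macrostate "exactly 2 heads": condition the uniform
# prior on that knowledge, which is uniform again over what remains.
compatible = [m for m in microstates if m.count("H") == 2]
posterior = {m: Fraction(1, len(compatible)) for m in compatible}

print(len(microstates))                 # 16 microstates in total
print(len(compatible))                  # 6 compatible with the macrostate
print(posterior[("H", "H", "T", "T")])  # 1/6 -- EAPPP re-applied
```

Knowledge trumping randomness is just this conditioning step: every update shrinks the set of compatible microstates, and the indifference principle only fills in whatever your knowledge leaves open.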
>The "past hypothesis" not only do not explain the second law, but shows a basic missunderstanding about the second law. The second law is not about features of the initial state. Indeed the time-asymmetry encoded in the second Law cannot come about from time-symmetric dynamical laws. We need time-asymmetric dynamical laws.
That was certainly what Eddington thought -- but that challenge has been open for a century with no answer in sight. (If there are time-asymmetric dynamical laws, what are they?) By now, the question has been settled by computer simulations that show entropy increasing (from low initial boundary constraints!) using explicitly time-symmetric dynamics. In computer simulations, there is no possibility of hidden dynamics we don't know about.
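The Kac ring is a standard toy model of exactly this point; here is a minimal sketch (ring size, marked fraction, and step count are arbitrary choices for illustration). The dynamics is deterministic and exactly time-reversible, yet the coarse-grained entropy climbs from a special low-entropy start toward its maximum.

```python
import math
import random

def step(colors, marked):
    # Every ball moves one site clockwise; it flips color if the edge it
    # crosses is marked. Deterministic, and exactly reversible by moving
    # counterclockwise and flipping at the same edges.
    n = len(colors)
    return [colors[i - 1] * (-1 if marked[i - 1] else 1) for i in range(n)]

def coarse_entropy(colors):
    # Entropy of the macrostate (fraction of +1 balls), per ball, in nats.
    p = colors.count(1) / len(colors)
    return -sum(x * math.log(x) for x in (p, 1 - p) if x > 0)

random.seed(0)
N = 2000
marked = [random.random() < 0.1 for _ in range(N)]  # 10% of edges flip colors
colors = [1] * N                                    # special low-entropy start

entropies = []
for t in range(60):
    entropies.append(coarse_entropy(colors))
    colors = step(colors, marked)

print(entropies[0], entropies[-1])  # starts at 0, climbs toward ln 2
```

No time-asymmetric ingredient is hidden anywhere in `step`; the entire arrow comes from the all-one-color boundary condition at t = 0.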
>"The Second Law tells us that entropy always increases". Not true. That is only a superficial and misguided formulation of the law. The second law says that the production of entropy is non-zero. The secondf law is not dS>0. The second law is diS >=0. And this is the classic formulation, where thermal fluctuations are ignored.
Agreed! (But in our universe, at any reasonable coarse graining, it does increase.) Also agreed about the fluctuation issue; I talk about this in the 'anthropic' section.
> All the subsequent attempts to explain that superficial and misguided formulation of the law are invalid as well. Assigning a low entropy to the initial instant of the Universe does not explain anything, and the incompatibility between the second law of thermodynamics and mechanics (time-reversible) remains.
See the computer example above. Some of the best work on this topic has been done by Larry Schulman. He puts a low entropy *final* condition on systems and shows that entropy *decreases* in computer simulations. He also used initial and final boundaries and showed that entropy went up and then down again. There's no incompatibility whatsoever; all the asymmetries come from the boundaries.
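Schulman's point can be mimicked in miniature with the Kac ring toy model (a sketch with arbitrary parameters, not Schulman's actual systems): because the dynamics is exactly invertible, imposing a low-entropy *final* condition and running the inverse map shows entropy decreasing under the very same laws.

```python
import math
import random

def forward(colors, marked):
    n = len(colors)
    return [colors[i - 1] * (-1 if marked[i - 1] else 1) for i in range(n)]

def backward(colors, marked):
    # Exact inverse of `forward`: move counterclockwise, un-flipping at the
    # same marked edges, so backward(forward(c)) == c identically.
    n = len(colors)
    return [colors[(i + 1) % n] * (-1 if marked[i] else 1) for i in range(n)]

def coarse_entropy(colors):
    p = colors.count(1) / len(colors)
    return -sum(x * math.log(x) for x in (p, 1 - p) if x > 0)

random.seed(1)
N = 2000
marked = [random.random() < 0.1 for _ in range(N)]

colors = [1] * N
for _ in range(40):                   # leg 1: entropy rises away from the
    colors = forward(colors, marked)  # low-entropy initial boundary

rising = coarse_entropy(colors)
for _ in range(40):                    # leg 2: same laws run in reverse;
    colors = backward(colors, marked)  # entropy falls toward the low-entropy
                                       # *final* condition

print(rising, coarse_entropy(colors))  # high, then exactly back to 0
```

The asymmetry in each leg lives entirely in which boundary is special; the dynamical rule never changes.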
> Effectively, we solve the Liouville equation (or its von Neuman quantum analog) and set initial state of very low entropy and the evolutions predicted by the equations continue contradicting the second law and observations.
I don't understand what your point is here.
>"Boundary explanations" do not explain anything.
Obviously, I disagree.
>Noticing that the system evolved from A to B because it was first in A and later was found in B is vacuous of content.
True... But those boundaries can still explain what happens in between. And if you don't impose anything at B, the boundary at A can also be used to explain asymmetries, if A is "special" or essentially different from how it ends up at B. Furthermore, (the case I'm most interested in) consider *partial* boundary constraints (say, half the Cauchy parameters), constrained on both ends. Once the system is solved, these partial boundaries then explain the un-constrained parameters, at the beginning and the end, and all the parameters in the middle, too. So boundaries can absolutely be used to explain things, when combined with some way to solve the system.
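A minimal numerical sketch of such a partial two-time constraint (a harmonic oscillator chosen purely for illustration, nothing from the essay): fixing the position, half the Cauchy data, at each end determines the velocities at both ends and the whole trajectory in between.

```python
import math

def solve_tridiagonal(sub, diag, sup, rhs):
    # Thomas algorithm for a tridiagonal linear system.
    n = len(diag)
    c, d = [0.0] * n, [0.0] * n
    c[0] = sup[0] / diag[0]
    d[0] = rhs[0] / diag[0]
    for i in range(1, n):
        m = diag[i] - sub[i] * c[i - 1]
        c[i] = sup[i] / m if i < n - 1 else 0.0
        d[i] = (rhs[i] - sub[i] * d[i - 1]) / m
    x = [0.0] * n
    x[-1] = d[-1]
    for i in range(n - 2, -1, -1):
        x[i] = d[i] - c[i] * x[i + 1]
    return x

# Harmonic oscillator x'' = -x with *half* the Cauchy data at each end:
# x(0) = 0 and x(T) = 1, rather than x and x' both given at t = 0.
# Finite differences: x[i-1] + (-2 + h^2) x[i] + x[i+1] = 0 in the interior.
T, n = 1.0, 201
h = T / (n - 1)
sub = [0.0] + [1.0] * (n - 2) + [0.0]
diag = [1.0] + [-2.0 + h * h] * (n - 2) + [1.0]
sup = [0.0] + [1.0] * (n - 2) + [0.0]
rhs = [0.0] * n
rhs[-1] = 1.0

x = solve_tridiagonal(sub, diag, sup, rhs)

# The two boundary numbers fix everything in between: the exact solution
# with these boundaries is x(t) = sin(t)/sin(T).
err = max(abs(x[i] - math.sin(i * h) / math.sin(T)) for i in range(n))
print(err)  # small discretization error
```

The solver never needs an initial velocity; once the two partial boundaries are imposed, the unconstrained data (the slopes at both ends) come out as *outputs* of the solution rather than inputs.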
> Moreover this kind of boundary 'explanations' often hide another serious missunderstanding of the second law; if all what was needed to explain that the system evolved irreversibly as A --> B because it was initially on A, then the second law would not be needed. The first law would be enough.
I think you're perhaps mixing up macro- and micro- concepts here. There are no irreversible events at a micro scale. (Or so most physicists believe; maybe we're all wrong.)
>Initial states and boundaries are already used in the laws of mechanics and electrodynamics, but those laws cannot describe irreversibility. And that is the reason why thermodynamics and the second law were born as a separate field of physics.
Yes, it was born separate, but Boltzmann (and others) figured out how to reunite them. The key difference is that when you zoom out to the macrostate, tossing away some of the data as unknown, then apparently irreversible (macro-) processes enter the story (assuming you have a special low-entropy boundary condition, so that the Second Law is in play). But if you know everything, even that apparent irreversibility goes away.
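A quick combinatorial illustration of why zooming out to macrostates does the work (a hypothetical 100-coin system, not from the essay): the Boltzmann entropy S = ln Ω just counts how many microstates a macrostate lumps together, and near-equilibrium macrostates utterly dominate that count.

```python
from math import comb, log

# Macrostate "k heads out of N coins": Omega is the number of microstates
# the macrostate lumps together; S = ln(Omega) is its Boltzmann entropy.
N = 100
omega = {k: comb(N, k) for k in range(N + 1)}
S = {k: log(omega[k]) for k in omega}

print(S[0])                             # 0.0 -- a unique microstate
print(max(S, key=S.get))                # 50 -- the equilibrium macrostate
print(omega[50] / sum(omega.values()))  # fraction held by equilibrium alone
```

Microstate-level evolution stays reversible throughout; "irreversibility" only appears once we track k instead of the full configuration, because almost every microstate a trajectory wanders into belongs to a high-Ω macrostate.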
>The arrow of time, the irreversibility of the second law, has a dynamical origin: resonances. There are a broad literature in the topic.
There is indeed a very broad literature, and the vast bulk of physicists are perfectly happy with the boundary-based account, even if they're not willing to treat boundaries as fundamental in their own right. If special time-asymmetric resonances were needed, then why would entropy increase in computer simulations that lacked them? Don't you think you're already using Second-Law-style logic when you try to infer a time-asymmetry from a resonance? (Classical chaos is time-symmetric, too, but you can get an arrow from it if you impose a low entropy initial boundary.)
In general, I'd note that we have lots of time-asymmetric intuitions, and they're all too easy to slip into our analysis without properly seeing where they come in (as happened to Boltzmann himself). The discovery of fundamental time symmetry has been a big surprise to those intuitions, but a surprise that we should take very seriously.
All the Best,
Ken
Juan Ramón González Álvarez replied on Feb. 28, 2018 @ 01:26 GMT
The classical second law cannot be fundamental, but we have microscopic analogs of the second law: for instance the Austin-Brussels condition [equation]
Everywhere around me I see time-asymmetric phenomena. Time-symmetric laws were created in early physics, because physics was born from the observation of simple planetary motions; however biology and chemistry were born from the observation of complex processes, and time-symmetry was never an option for the theorist. This is why the mass action law in chemical kinetics has been irreversible since its very beginning.
The ordinary derivation of the second law in the kinetic theory of diluted gases doesn't assume the equal a priori probabilities postulate; on the contrary, the probabilities are all different and given by the Maxwell-Boltzmann distribution.
The equal a priori probabilities postulate is *truly* valid at equilibrium. The argument that "at any given time the actual system is in 1 microstate, with 100% certainty, and all others with 0%" doesn't invalidate the postulate, because it is about the ensemble, not about individual systems in the ensemble. Each individual system has properties that differ from the ensemble as Q_i = Q_ensemble + dQ_i.
There are several proposals for time-asymmetric dynamical laws; for instance modifications of the Schrödinger equation to make it irreversible.
Time-symmetric dynamics cannot explain irreversible phenomena. Some people run computer simulations in such a way that they reproduce the expected result, such as a positive diffusion coefficient. But inverting the simulation gives results at odds with real-life experiment. Computer simulations with time-reversible dynamics can show entropy increasing or decreasing independently of the initial state. Those people have been in a safe situation up to now because they have run simulations in cases where previous knowledge existed, so they could just get the expected numerical answer. The problem starts when one wants to make a simulation in regimes outside experience. Time-symmetric simulations cannot differentiate real phenomena from invalid phenomena. That is the reason for the recent emphasis on introducing time-asymmetry directly into the equations of motion that will be solved by the computer. For instance in DPD the simulation uses the following decomposition for the total force
F = F_C + F_D + F_R
F_D are dissipative forces which break time-reversibility and ensure the systems evolve in agreement with the second law, independently of the initial state (low or high entropy).
The problem with fixing the final boundary is that evolution is not deterministic, and so it is not defined at early times. Why would one fix the initial entropy to be lower and the final entropy to be higher? This would work for certain kinds of processes, such as evolution from an initial equilibrium state to a final equilibrium state. But this two-time boundary will fail in other regimes, for instance in non-Markovian regimes where the production of entropy can be *negative* without violating the classical thermodynamic laws.
The correct approach consists of developing time-asymmetric equations and letting the system evolve from an initial state at t=0; not telling the system what its state has to be at some t>0, based on prejudices like that the final entropy at t>0 always has to be higher than the initial entropy at t=0.
My comment about the Liouville equation (or its von Neumann quantum analog) meant the following. Integrating the mechanical equation of motion, by the Liouville theorem this equation predicts that entropy S is constant. This is obvious because the equation is time-reversible. Then some people define a coarse-grained entropy S_cg, which is then allowed to vary. The problem is that the equation continues being time-reversible, and for each dS_cg > 0 there exists a dS_cg
Juan Ramón González Álvarez replied on Feb. 28, 2018 @ 01:30 GMT
...is negative.
What people do in practice is to select the evolution compatible with the classical second law and ignore the other. The problem is not only that the basic equation gives a set of incorrect solutions. The problem is when we want to study dynamical regimes where the classical second law doesn't apply. In those regimes the second law cannot be used as a consistency check to select the valid solutions (the solutions compatible with Nature) and discard the others.
"There are no irreversible events at a micro scale (Or so most physicists believe; maybe we're all wrong.)". Research made on last 50 years show irreversibility has microphysical roots
http://onlinelibrary.wiley.com/doi/10.1002/0471619574.c
h17/summary
https://books.google.es/books?id=Px7Wnx0K_EQC&pg=
PA301&lpg=PA301&dq=MICROPHYSICAL+IRREVERSIBILITYAND+TIME+ASY
MMETRIC+QUANTUM+MECHANICS
Boltzmann was wrong, and when pressed by critics he vacillated and gave inconsistent answers. Unfortunately some physicists continue repeating his mistakes forever. A macro description of time-reversible microscopic physics doesn't introduce irreversibility, apparent or otherwise. Rigorous tracing or coarse-graining procedures generate macroscopic equations incompatible with experience; what Boltzmannians do is produce mathematically invalid derivations where time-symmetry is explicitly broken by introducing some extra-dynamical ad hoc concept that selects the correct description compatible with the second law. The resulting macro equations are no longer compatible with the initial micro equations. Irreversibility hasn't been derived from reversibility. Irreversibility has been imposed by breaking microscopic reversibility. Moreover those ad hoc procedures are only valid for simple systems such as diluted gases, Markovian dynamics in heat baths, and so on. That is why Boltzmannians have never produced an equation of motion valid for arbitrary regimes.
Heinrich Päs wrote on Feb. 7, 2018 @ 18:37 GMT
Dear Ken,
Very nice essay. Actually I noticed that you have written fantastic essays in the previous essay contests as well, so I will check them out one after the other. One idea I missed in your discussion, though, is that time itself could be emergent - in the sense that entropy increase defines time and that the very notion of "initial" boils down to small entropy. For example, Claus Kiefer and Dieter Zeh have shown that it is quite reasonable that such an emergent arrow of time can be retrieved by tracing out uninteresting degrees of freedom, and that this arrow of time always points in the direction of an increasing scale factor of the Universe. In practice, I believe, this is equivalent to assuming an initial boundary condition, though it applies to the macrostate while the microstate itself has entropy zero and is timeless (this is what I'm arguing for in my own essay). Best regards, Heinrich
Author Ken Wharton replied on Feb. 8, 2018 @ 03:05 GMT
Thanks, Heinrich! The previous essay that got the most attention was "The Universe is not a Computer", which you might enjoy. There's an extended version on the arXiv.
I am in agreement that the *arrows* of time could certainly be emergent -- indeed, they *must* be if all the laws are time-symmetric. But not time *itself* -- at least not as you describe. For one thing, you can't talk about *anything* increasing without having a concept of time already on the table. For another, entropy isn't fundamental; it applies to macrostates of partial knowledge, not microstates.
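This macrostate/microstate distinction can be sketched with Boltzmann's S = ln Ω (taking k_B = 1), using coin tosses as stand-in microstates; the example and numbers below are purely illustrative, not from the essay:

```python
from math import comb, log

N = 100  # number of coins; a microstate is one specific heads/tails sequence

def S_macro(n):
    """Entropy of the macrostate 'n heads out of N':
    S = ln(number of microstates compatible with that partial knowledge)."""
    return log(comb(N, n))

# A fully known microstate is compatible with exactly one arrangement: S = ln 1 = 0
S_exact = log(1)

print(S_exact)                 # 0.0: no entropy once the microstate is known
print(round(S_macro(50), 1))   # maximal ignorance within the macro description
print(round(S_macro(0), 1))    # 0.0: this macrostate pins down the microstate
```

The same microstate carries different entropies depending on how much the macro description leaves unspecified, which is why entropy attaches to states of partial knowledge rather than to the microstate itself.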
As far as Kiefer+Zeh's idea, it doesn't sound like a boundary explanation at all -- sounds like they're linking it to the dynamics. And I would expect that even if large-scale arrows of time emerged due to an increasing scale factor on large scales, that one would be hard pressed to make sure the same arrow emerged at much smaller scales, in all instances. In that respect, I don't see much of a distinction between it and Barbour's "Janus Point" idea that I critique in the essay. But perhaps I'm missing some nuance there.
I took a peek at your own essay, and agree that if entanglement is a "real thing", that certainly pushes one in the direction you take. I happen to be a contrarian on the topic, though, and I strongly align myself with the 'psi-epistemic' viewpoint. Now I just have to figure out what the ontic state really is... Details, details. :-)
Thanks again! --Ken
Marcel-Marie LeBel wrote on Feb. 8, 2018 @ 04:37 GMT
Ken,
Time runs slower toward the ground. So, an object falling toward the ground is moving spontaneously toward “slower time”. Slower time means “longer seconds”. In order for c (m/s) to remain constant, space must increase just as much as the seconds get longer. In other words, the object is falling spontaneously into larger space, which is dispersion, a classic example of an entropic process.
Both gravity and entropy are spontaneous processes that show a higher probability of existence in one direction. They have the same underlying logical cause.
My essay shows (?) that the universe as a logical system admits only one type of stuff or substance and only one type of logical cause. This logical system also appears to operate using a single logical operation, the logical substitution. Both types of motion, in gravity and in entropic dispersion, are the resolution in progress of an illogical state of affairs, a non-uniform state of existence due to a non-uniform logical substitution.
Finally, a clock is a spontaneous device (energy in spring is ok). As such, it represents, for comparison, the local rate of evolution of other (co-located) spontaneous processes, including time. In order for the clock to respond to the local rate of evolution of time “via a logical operation”, they both have to be of the same nature, same type of stuff. The clock is just a more complex form of time.
All mumbo jumbo...Right?
All the bests,
Marcel,
Narendra Nath wrote on Feb. 9, 2018 @ 08:51 GMT
As I see physics develop, we find the individual processes happening in Nature to be random or probability-governed. But if we go into the details of a process we find logic or order in it. Thus, to me, random and order appear as two sides of the same coin that Nature throws as dice to us! The spontaneity of a process is random or probabilistic, while the logic behind the process has an order to it. All logical aspects of any process are constrained by conservation laws. But then two canonically conjugate quantities are governed by the Uncertainty Principle according to quantum theory. Energy relates to time, while space relates to momentum/motion. It therefore seems that any non-uniformity in space gives rise to motion, while any non-uniformity in energy leads to phase change in time. Classically we cannot understand the reasons behind this, and that is where quantum theory comes in to explain the process. It mostly governs microscopic phenomena, while classical theory explains the gross picture of the same process. To make the predictions of QM visualizable, a teacher has to invoke the classical analogue, as reality becomes difficult to visualize quantum mechanically! Such a dichotomy has become the rule in how we proceed in physics today!
Avtar Singh wrote on Feb. 12, 2018 @ 17:38 GMT
Hi Ken:
Completely agree with your conclusion - "Although we use randomness when we don't know any better, a principle of indifference cannot be used to explain anything interesting or fundamental."
The above is vindicated in my paper, "What is Fundamental – Is C the Speed of Light?", which describes the fundamental physics of antigravity missing from the widely-accepted mainstream physics and cosmology theories, resolving their current inconsistencies and paradoxes. The missing physics depicts a spontaneous relativistic mass creation/dilation photon model that explains the yet unknown dark energy and the inner workings of quantum mechanics, and bridges the gaps between relativity and Maxwell's theories. The model also provides field equations governing the spontaneous wave-particle complementarity or mass-energy equivalence. The key significance or contribution of the proposed work is to enhance fundamental understanding of C, commonly known as the speed of light, and the Cosmological Constant, commonly known as dark energy.
The paper not only provides comparisons against existing empirical observations but also forwards testable predictions for future falsification of the proposed model.
I would like to invite you to read my paper, and would appreciate any feedback comments.
Best Regards
Avtar Singh
Satyavarapu Naga Parameswara Gupta wrote on Feb. 14, 2018 @ 01:21 GMT
Respected Prof Ken Wharton
Wonderful arguments.... " Looking at this problem from a different perspective reveals a natural solution: boundary-based explanations that arguably should be viewed as no less fundamental than other physical laws." Best wishes for your essay sir...
I hope you will not mind that I am not following main stream physics...
By the way, here in my essay energy-to-mass conversion is proposed. Yours is a very nice essay; best wishes, and I highly appreciate it. You may please spend some of your valuable time on the Dynamic Universe Model also and give some of your valuable & esteemed guidance.
Some of the main foundational points of the Dynamic Universe Model:
-No Isotropy
-No Homogeneity
-No Space-time continuum
-Non-uniform density of matter, universe is lumpy
-No singularities
-No collisions between bodies
-No blackholes
-No wormholes
-No Bigbang
-No repulsion between distant Galaxies
-Non-empty Universe
-No imaginary or negative time axis
-No imaginary X, Y, Z axes
-No differential and Integral Equations mathematically
-No General Relativity and Model does not reduce to GR on any condition
-No Creation of matter like Bigbang or steady-state models
-No many mini Bigbangs
-No Missing Mass / Dark matter
-No Dark energy
-No Bigbang generated CMB detected
-No Multi-verses
Here:
-Accelerating Expanding universe with 33% Blue shifted Galaxies
-Newton’s Gravitation law works everywhere in the same way
-All bodies dynamically moving
-All bodies move in dynamic Equilibrium
-Closed universe model no light or bodies will go away from universe
-Single Universe no baby universes
-Time is linear as observed on earth, moving forward only
-Independent x, y, z coordinate axes and Time axis; no interdependencies between axes
-UGF (Universal Gravitational Force) calculated on every point-mass
-Tensors (Linear) used for giving UNIQUE solutions for each time step
-Uses everyday physics as achievable by engineering
-21000 linear equations are used in an Excel sheet
-Computerized calculations use 16-decimal-digit accuracy
-Data mining and data warehousing techniques are used for data extraction from large amounts of data.
- Many predictions of Dynamic Universe Model came true….Have a look at
http://vaksdynamicuniversemodel.blogspot.in/p/blog-page_15.html
I request you to please have a look at my essay also, and give some of your esteemed criticism for your information……..
The Dynamic Universe Model says that energy in the form of electromagnetic radiation passing grazingly near any gravitating mass changes its frequency and finally will convert into neutrinos (mass). We all know that there is no experiment or quest in this direction. Energy conversion happens from mass to energy via the famous E=mc^2; the other side of this conversion was not thought of. This is a new fundamental prediction by the Dynamic Universe Model, a foundational quest in the area of astrophysics and cosmology.
In accordance with the Dynamic Universe Model, frequency shift happens on both sides of the spectrum when any electromagnetic radiation passes grazingly near a gravitating mass. With this new verification, we will open a new frontier that will unlock a way to form the basis for continual nucleosynthesis (continuous formation of elements) in our Universe. The amount of frequency shift will depend on the relative velocity difference. All the papers of the author can be downloaded from "http://vaksdynamicuniversemodel.blogspot.in/"
I request you to please post your reply in my essay also, so that I can get an intimation that you replied. Best
=snp
Thomas Howard Ray wrote on Feb. 15, 2018 @ 22:34 GMT
Ken,
Yours is the first essay I have been able to comprehend, from first word to last, and on first reading. Thank you!
I find that it's fully consistent with Einstein's wish to have boundary conditions that would eliminate the need to specify boundary conditions--and therefore lead to a singularity free general relativity. You write:
" ... the initial state of the universe is often referred to as an 'initial boundary condition'. The only problem is that many physicists want to then explain this boundary condition, via dynamics or randomness."
If it's true, however, that the 3-dimensional boundary is identical to the 4-dimensional horizon ("(3D spatial volumes have 2D boundaries; 4D spacetime volumes such as our universe have 3D boundaries.)"), then the 3-d boundary has one negative element (+ + + -), i.e. (-1), and the 4-d spacetime (- - - +) one positive element (+1). Though we always measure changes in relations between center mass points, the positive mass theorem must apply here, for a non-arbitrary initial condition.
Reduce to a 1-dimensional model, and you have my essay.
All best,
Tom
Marcel-Marie LeBel wrote on Feb. 16, 2018 @ 04:34 GMT
TH,
You say “....such as our universe have 3D boundaries”. The 3D belongs to our reality, not to the universe; there's a difference. The 3D is just the definition of a point like relational observation, the observer. A lot of what we think we learn about the universe is in fact about ourselves.
Bests,
Marcel,
Jonathan Kerr wrote on Feb. 16, 2018 @ 18:55 GMT
Dear Ken Wharton,
I like your essay, though I don’t agree with all of it. You actually get to grips with the concepts, rather than jumping through them, as some do. And I enjoyed your way of writing. I agree that randomness can’t be used to explain things, but with one exception - unless one suspects that it’s at the very deepest level.
The symmetries and patterns the laws contain, which you mention as perhaps suggesting they didn’t arise randomly, might be caused by something underneath that happens to generate a lot of symmetry in the levels further up. If so, how that underlying layer was selected would be a very open question, and in general, the question of how the laws arose is unanswered, and to me separate from these questions.
But looking at randomness within existing physics, at each level of description, there are things that behave with a mixture of randomness and predictability, and these mixtures make patterns. But at the next level of description down, the randomness disappears, and what was random gets predictable. So when we find something very deep that appears partly random, as in QM, we wonder if there’s some even deeper level where it goes away. And we’ve found we can limit the possibilities for that, and that only non-local theories have the option, if they can find a way to do it.
But without knowing what the underlying picture is, if there is one, we don’t know if the randomness is fundamental or superficial. I’d say it could be either, and unless one happens to believe one of the existing interpretations for QM (as I don’t), one can choose to say it’s an open question. I agree with what you say about boundary explanations, and that taking boundaries as fundamental is a possibility.
I’d appreciate it if you’d rate my essay - I’ve only had four ratings so far, and (although that included high ratings and nice comments from Fabio and Edwin), I’ve found one needs ten ratings for the average to be taken seriously.
The essay deals with what relates the levels of description in physics, and argues that explanation does, alongside emergence. It also looks at questions to do with time, and makes a new point near the top of p2, which I’d say removes emergent time as a possibility.
Anyway, best regards,
Jonathan
Author Ken Wharton replied on Feb. 19, 2018 @ 16:44 GMT
Thanks, Jonathan! With the caveat that I see an important distinction between generic probability and randomness (the latter being when all possibilities are equally probable), I would also share your hope that we can find a deeper level under QM that would better explain what we see. I took a peek at your essay; it looks like we have opposite perspectives on the "flow of time". I'll try to get back to it later this week. Best, Ken
Anonymous replied on Feb. 19, 2018 @ 22:35 GMT
Thanks Ken, I know you see time differently. I'd like your opinion on a new point about emergent time, which no-one has refuted so far. It's near the top of page 2 of my essay, and boils down to the need to explain a coincidence: if a real or apparent 'flow of time' emerged, then why was it so appropriate that it allowed laws of physics (such as laws of motion), which were already pre-implied in the sequence of the time slices in the block, to function? What were the laws doing, sitting there in the block in this 'just add water' sort of way?
Btw, I'd be grateful if you'd rate my essay - I've only had four ratings so far, although it was at number three a week ago. It seems that without 10 ratings, the average is not taken seriously. Anyway, thanks. Best regards, Jonathan
Jonathan Kerr replied on Feb. 19, 2018 @ 22:38 GMT
Sorry, wasn't logged in - that was me. JK
Member Dean Rickles wrote on Feb. 17, 2018 @ 00:54 GMT
Hi Ken,
Great essay as always.
On the anthropic explanation coming from Boltzmann's account, you write: "eventually something like our universe would randomly happen, and we find ourselves here because we're not anywhere else". Of course, this would happen over and over, so depending on how you define "we," we could be somewhere else (as you say, given infinite time anything that can happen will happen, but it will do so again and again). The anthropic part is really that we find ourselves here because conditions permit, not because we aren't anywhere else. Nitpicky I know! And you are right about the problems with this approach in any case.
The boundaries response is a nice alternative (and I like your conceptual motivation of it), and in line with the top-down approaches I mentioned. Of course, we will want to know "why this boundary?" Especially if there are other apparently possible boundaries.
Good luck!
Best,
Dean
Author Ken Wharton replied on Feb. 19, 2018 @ 16:49 GMT
Thanks, Dean! Yes, I certainly could have phrased the anthropic point better... As far as figuring out "why this boundary?", it's important that we *first* figure out what the boundary actually *is*. And I think we'll have a much better chance of answering that question if we come at it with the attitude that once we knew the boundary, it would be *obvious* that this was the only real possibility. If we come at it from the usual assumption that there will even be other possible boundaries, I think it will be much harder to induce in the first place. Just a hunch... Cheers! -Ken
Peter Jackson wrote on Feb. 18, 2018 @ 20:10 GMT
Dear Ken,
A few years ago I derived & published a well-fitting cyclic evolution model for galaxies, with a mechanism also proving an excellent match to the complex CMBR anisotropies at the larger scale. That suggested a cyclic cosmology without the issues of the Penrose and other models. I've been focused on SR/QM, but you've reminded me that its specific non-random re-ionisation mechanism also suggests 'cyclic entropy'. I'll pass you a link or pdf if you wish.
Rob Phillips' essay identifies the ability to find a Gaussian (or Bayesian) distribution wherever we wish to look. This may seem the inverse of your view. I agree both your arguments are correct, but see yours as more productively fundamental. What's your view of his?
Most importantly, I ask for your help and advice on QM foundations. Say we endow each particle pair with 4 Maxwell state momenta, which I show exist in inverse-cos distributions, and an anti-parallel polar axis (each pair random, but each member opposite), so we have A (N/S), B (S/N). We also endow A & B with polariser field electrons and a dial to rotate them, N/S, so the fields find either 'SAME' or 'OPPOSITE', switchable by A, B. I then found the EPR paradox resolves in the way John Bell anticipated. That is classic QM! The rest, including non-integer spin and the 'squared modulus', is (astonishingly!) in a full ontological mechanism in my essay.
Can you identify any errors? & Offer any help?
Declan Traill's short essay gives the corresponding code and CHSH >2 plot, plus a 'steering violation' closing the detection loophole and representing a 'pattern underlying the apparent randomness' (so you prevail on Phillips).
I know it "requires.. ..radical conceptual renewal," and that "the new way of seeing things will involve an imaginative leap that will astonish us. In any case it seems that the quantum mechanical description will be superseded." (J. Bell, pp. 172 & 27).
Your excellent essay analysis seems consistent with the model but please confirm that. The model, of 'discrete fields' emerged from 'boundary condition' interpretation of SR which allows unification, also consistent with your model as well as with Minkowski and Einstein's 1908 and 1952 conceptions of infinitely many 'spaces within spaces' (as my earlier high rated essays 2011>>).
Thank you and well done for yours.
Peter
view post as summary
Author Ken Wharton replied on Feb. 19, 2018 @ 17:00 GMT
Hi Peter,
I'm not against normal distributions, of course -- I'm just against looking for answers to fundamental questions by choosing a random sample out of them.
On the quantum entanglement front, I share your desire to figure out what is "going on" along the worldlines of the two particles, and would very much like to be able to describe all entanglement experiments in terms of those localized parameters. But thanks to Bell, we know that any such model has to either have 1) faster than light influence, 2) direct influence at a distance, 3) retrocausality, or 4) superdeterministic conspiracies. I can't tell from your description if you're in camp 1) or 2) -- hopefully you're not a Bell-denier! -- but I'm firmly in camp 3). If you're interested in (3), you might start with some of the pieces I've written with Huw Price.
Best, Ken
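The four-way classification above rests on the CHSH form of Bell's theorem, which can be checked numerically. The sketch below is a minimal illustrative local-hidden-variable model of my own choosing, not any specific model discussed in this thread: each outcome depends only on the local detector setting and a shared hidden variable, and the CHSH quantity then cannot exceed 2, while quantum mechanics predicts up to 2√2.

```python
import math
import random

random.seed(1)

def outcome(setting, lam):
    # Deterministic local response: the result depends only on this detector's
    # setting and the shared hidden variable lam (no communication between wings)
    return 1 if math.cos(setting - lam) >= 0 else -1

def E(a, b, n=100000):
    # Correlation of the two outcomes, Monte Carlo averaged over the hidden variable
    total = 0
    for _ in range(n):
        lam = random.uniform(0.0, 2.0 * math.pi)
        total += outcome(a, lam) * outcome(b, lam)
    return total / n

# Standard CHSH settings (radians): a, a' on one wing; b, b' on the other
a, a2, b, b2 = 0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(round(S, 2))                 # ≈ 2.0: this local model saturates the bound
print(round(2 * math.sqrt(2), 2))  # 2.83: the quantum-mechanical maximum
```

Any model in Ken's camps 1-4 is an attempt to reproduce the quantum value despite this classical ceiling; this particular response function saturates the bound exactly but cannot pass it.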
Peter Jackson replied on Feb. 21, 2018 @ 14:14 GMT
Ken, thanks. Some questions:
Is it best to open mindedly assess theories (SM) or be wedded to a particular one?
I learned the Sci.Method is more important than any past papers. Do you disagree?
Do you think we should consider Bell's own analysis of his proof, or just others?
I'm no Bell denier, but unlike most I also agree with his views! Thing is, your 4 options omit them? Is that by design? I suspect not, in which case you'd need an option 5)
"Some starting assumption used for QM is incorrect." Is that fair? He said he 'freely used' (so was testing) QM's assumptions. To further quote him;
"..in my opinion the founding fathers were in fact wrong.. ..quantum phenomena do not exclude a uniform description of micro and macro worlds" p171. and
"..quantum mechanics is at the best, incomplete.” p.26.
So it may only be me who's NOT a 'Bell denier'! I've tested all assumptions and found a hidden one to be flawed: 'singlet states'. The experiment in my essay shows 4 REAL orthogonal states in OAM, and Ulla Mattfolk has just sent me links where I found the 'Poincaré sphere' which, as Maxwell had, already found them! Bohr made 'NO' initial assumption on states but then made one ('superposed/singlet states') without checking for others!!
Now if you follow my ('our') mechanism carefully you'll find the predictions of QM faithfully reproduced in full. No need for anything weird.
If you can't show it wrong, it'd be great if you collaborated and joined in with an early paper 'developing' your view to be consistent - so being more scientific & less 'religious'. All 'camps' will be lost in the flood! unless you can show the crack in the dam I found isn't real! (CHSH >2 with closed detection loophole can't be denied!)
Interestingly, the model followed my previous rationalisation of SR, so entirely unifies the two (in QM's 'absolute' time but with Doppler-shifted 'signals' from metronome/"clock" emitters on co-moving medium transitions). (Bell did say the solution would 'astonish'!!) But one thing at a time!
Looking forward to questions.
Very best
Peter
Gordon Watson replied on Feb. 21, 2018 @ 20:51 GMT
Dear Ken, Peter, etc:
re Bell's theorem (BT):
Ken's position is "firmly in the camp of retrocausality", with Huw Price.
Peter's position is: "Some starting assumption used for QM is incorrect."
My position: BT (1964) is developed in the context of EPRB. BT is false in such settings; see Aspect's experiments, etc. A starting assumption in BT is incorrect.
I provide a concise [half-page] refutation of BT on page 8 of my essay.
I look forward to critical comments, etc, on my refutation and/or my position (above).
I'll happily expand on the refutation if there are steps that are not clear, etc.
Best, Gordon Watson
More realistic fundamentals: quantum theory from one premiss.
Gordon Watson wrote on Feb. 19, 2018 @ 09:56 GMT
Dear Ken,
Thanks for the brilliant essay
* from a like-minded
** researcher in Quantum Foundations.
* Me hoping there follows something like this: “If the fundamental is non-random then (after existence), the fundamental is determined and law-like!”
** Me relating to this: "The very concept of a “random explanation” is as meaningless as the above suggestions concerning random laws of physics."
My work on a foundational "wholistic mechanics" (WM) is intended to advance a classical/deterministic reformulation of physics in spacetime, the ultimate goal being WM = {CM, SR, QM, GR, QFT, QG, EFT, ...|TLR},
*** my essay being an introduction.
*** My starting premiss (my classical boundary condition) is true local realism (TLR): the union of true locality (no influence propagates superluminally, after Einstein), and true realism (some existents may change interactively, after Bohr). [I'm surprised that naive-realism remains ubiquitous in physics.]
Rejecting the weird claims associated with Bell's theorem (BT), I studied EPRB, the experiment analysed in the famous Bell (1964). I did not accept that the assumptions behind BT were valid in that setting; I rejected nonlocality, and (as an aside) time-reversal would not hold.
Then, revising EPR's naive definition of "elements of physical reality", I find determinism in play, refute BT, and (from first principles, in spacetime) find the Laws of Malus, Bayes and Born validated in our quantum world. Born's law (an effective field theory, in my terms; in the space of probability amplitudes and without mystery) can then be tested by confirming the correct result for the EPRB expectation; then the correct DSE results; then onward to the stars.
In thus eliminating "wavefunction collapse" and nonlocality from QM, it follows that such weirdness need no longer trouble the foundations of QFT; etc. And since my calculations are conducted in spacetime (not Hilbert space), I'm thinking QG is covered automatically.
Ken, such is my long way of saying that I will welcome your comments at any time.
With my thanks again for your stimulating essay,
Gordon Watson (determined and free-willed)
Gordon Watson replied on Feb. 20, 2018 @ 00:56 GMT
Ken, if/when you reply to my post, please copy it to my essay-thread so that I'm alerted to it. I'm having trouble keeping abreast of many good discussions this year.
Many thanks; Gordon
More realistic fundamentals: quantum theory from one premiss.
Conrad Dale Johnson wrote on Feb. 19, 2018 @ 17:39 GMT
Ken,
The clarity of your argument is impressive, and it’s given me a lot to think about, relating to the key issue – how do we explain the smooth (but not perfectly smooth) distribution of matter in the early universe? You argue that since it can’t be explained by randomness, or by the operation of dynamic laws, any possible explanation has to relate to higher-level constraints – your boundary conditions on the universe as a whole. Then the question is, how are these constraints to be explained?
The point of my essay is to interpret the physics of our universe as constrained by the need to define itself – that is, to provide contexts in which all the various kinds of information it contains are measurable, in terms of each other. I argue that it takes a very special kind of system to support any quantitative measurement, and suggest that the diverse modes of interaction in our universe, along with the fine-tuning of its parameters, will eventually be explained by the stringent requirements on any such system.
From this perspective, the low-entropy initial state is one basic condition for a self-defining and self-measuring system to emerge. (I take that as having occurred during the era memorialized in the Cosmic Microwave Background.) I suspect this is not the kind of boundary condition you have in mind, but I don't think your objections to "random explanations" apply. The problem with my proposal is not that the conditions are random or arbitrary, but that they're hard to define from an a priori standpoint. It's fairly easy to see that without atomic structure, for example, no kind of measurement would be physically possible. But it's not at all easy to list the requirements for any system that can measure its own constituent elements - since we have only one example of such a system to consider, and that quite a subtle and complicated one.
Thanks for a very prize-worthy contribution, and for taking me a level or two deeper into the issue of the meaning of entropy.
Conrad
Marcel-Marie LeBel wrote on Feb. 20, 2018 @ 23:45 GMT
Ken,
Your essay exposes in fact the working of epistemology. In that sense, it is fundamental not only to science and physics, but to all our truth systems.
An impossibility is the boundary that defines, or makes finite, a truth system - all truth systems. The impossibility of measuring faster than light (SR), the impossibility of distinguishing acceleration from gravity (GR), the impossibility of measuring both position and momentum (QM), etc., are examples of this. In that respect, SR, GR and QM are separate truth systems because they are derived from different original boundaries or impossibilities.
Boundaries are fundamental to all our truth systems, geometry, maths, logic etc. But the universe happens by itself, spontaneously. It requires a deeper and ontological explanation or “starting point”. This starting point is the law of non-contradiction which the universe’s substance follows from its creation to its evolution.
All the bests,
Marcel,
Steven Andresen wrote on Feb. 22, 2018 @ 06:34 GMT
Dear Ken
If you are looking for another essay to read and rate in the final days of the contest, will you please consider mine? I read all essays from those who comment on my page, and if I can't rate an essay highly, then I don't rate it at all. In fact I haven't issued a rating lower than ten. So you have nothing to lose by having me read your essay, and everything to gain.
Beyond my essay’s introduction, I place a microscope on the subjects of universal complexity and natural forces. I do so within the context that clock operation is driven by quantum-mechanical forces (atomic and photonic), while clocks also serve as a measure of General Relativity’s effects (spacetime, time dilation). In this respect clocks can be said to possess a split personality, giving them the distinction that they are simultaneously a study in QM, while GR is a study of clocks. The situation stands whereby we have two fundamental theories of the world, but just one world. And we have a singular device which serves the study of both those fundamental theories. Two fundamental theories, but one device? Please join me and my essay in questioning this circumstance.
My essay goes on to identify natural forces in their universal roles: how they motivate the building and maintaining of complex universal structures and processes. When we look at how stellar fusion processes sit within a “narrow range of sensitivity”, such that stars are neither led to explode nor to collapse under gravity, we think how lucky we are that the universe is just so. We can also count our lucky stars that the fusion process that marks the birth of a star also leads to an eruption of photons from its surface. And again, how lucky we are! For if it didn’t, then gas accumulation wouldn’t be halted and the star would again be led to collapse.
Could a natural organisation principle have been responsible for fine tuning universal systems? Faced with how lucky we appear to have been, shouldn’t we consider this possibility?
For our luck surely didn’t run out there, for these photons stream down on Earth, liquefying oceans which drive the geochemical processes that we “life” are reliant upon. The Earth is made up of elements that possess the chemical potentials that life is entirely dependent upon. Those chemical potentials are not expressed in the absence of water solvency. So again, how amazingly fortunate we are that these chemical potentials exist in the first instance, and additionally within an environment of abundant water solvency such as Earth, able to express these potentials.
My essay attempts something audacious. It questions the fundamental nature of the interaction between space and matter, Guv = Tuv, and hypothesizes that the equality between space curvature and atomic forces is due to a common process: space gives up a potential in exchange for atomic forces in a conversion process, which drives atomic activity. Furthermore, baryons only exist because this energy potential of space exists and is available for exploitation. Baryon characteristics and behaviours, and their complexity of structure and process, might then be explained in terms of being evolved and optimised for this purpose and existence, removing the need for so many layers of extraordinary luck to eventuate our own existence. The essay attempts an interpretation of the above-mentioned stellar processes within these terms, but also extends much further. It shines a light on the molecular structure that binds matter together as potentially being an evolved agency that enhances rigidity, and therefore the persistence of universal systems. We then turn a questioning mind towards Earth’s unlikely geochemical processes (to which we living things owe so much) and look at their central theme and propensity for molecular rock-forming processes. The existence of chemical potentials and their diverse range of molecular bond-formation activities? The abundance of water solvent on Earth, without which many geochemical rock-forming processes could not be expressed? A watery Earth is then implicated as part of an evolved system that arose for purpose and reason, alongside the same reason and purpose for which molecular bonds and chemical processes arose.
By identifying atomic forces as having their origin in space, we have identified how they perpetually act and deliver work products. Forces drive clocks, and clock activity is shown by GR to dilate. My essay details a principle of force dilation and applies it to a universal mystery. It raises the possibility that nature, in possession of a natural energy potential, will spontaneously generate a circumstance of Darwinian emergence. It did so on Earth, and perhaps it did so within a wider scope. We learnt how biology generates intricate structure and complexity, and now we learn how it might explain the intricate structure and complexity within universal physical systems.
To steal a phrase from my essay “A world product of evolved optimization”.
Best of luck for the conclusion of the contest
Kind regards
Steven Andresen
Darwinian Universal Fundamental Origin
Cristinel Stoica wrote on Feb. 22, 2018 @ 07:56 GMT
Dear Ken,
Very enjoyable essay, well explained and insightful. You did a great job dispelling some superstitions even physicists have about the origin of the arrow of time. I fully agree that, no matter what one tries, the least problematic option remains the past hypothesis. What amazes me is that I see once in a while people trying to explain the arrow of time by introducing some time asymmetry (with or without care that it not be compensated by P and T symmetries) into the evolution equations themselves! In fact, even Penrose's Weyl Curvature Hypothesis is in my opinion not an explanation of the Second Law; rather, it may be a prediction of it, in the sense that at the Big Bang the sources of gravity exist but the field hasn't yet spread, somewhat like the retarded solution of Maxwell's equations. Imposing the WCH is equivalent to imposing low entropy for gravity, and it is just a partial restatement of the problem, not an explanation.
And I think you are right to propose that the boundary conditions may have the same status as the dynamical laws themselves. Now maybe this would imply, in the case of a universe which big-crunches itself (excluded by measurements of the cosmological constant), that the arrow of time reverses when the universe reaches its mid-life crisis. But since it expands forever, maybe the initial and final boundary conditions manifest differently, even if they are subject to the same principle. Maybe this can be connected to Penrose's Conformal Cyclic Cosmology (CCC), although there may be some complications here.
Your essay motivated me to organize some random thoughts I have about this subject.
So here is an idea I have about boundary conditions. I call it "boundary law without boundary law", but it is not, as one may expect, just another way to obtain boundary conditions out of the dynamics, as in fluctuations or the Janus point. It is based on conformal symmetry. You know Maxwell's equations on Minkowski spacetime (backreaction ignored) are invariant under conformal transformations, which extend the Poincaré group. The conformal boundary of Minkowski spacetime is mapped by conformal inversion to the lightcone at the origin, and vice versa. So on the conformal boundary the (conformal transformations of the) solutions have the same kind of regularity as on the lightcone at the origin. What's beautiful is that this conformal invariance applies to the entire Standard Model as long as all masses are 0, so this is equivalent to the absence/vanishing of the Higgs field (I don't know yet if this holds for the neutrino too). Now turn on gravity, and the global properties of spacetime change. Maybe it is asymptotically flat, in which case the boundary remains similar. The conformal invariance is broken. But we know that there is a richer conformal invariance, the local one, consisting of rescaling the metric tensor independently at each point of spacetime. So I suspect that there is also a generalization of translations and, in fact, of the full conformal group. The reason I suspect this is that we still have conservation of momentum, because the stress-energy tensor is locally conserved, but it is local now. So I am in particular interested to see how I can make the full conformal group local. Now going back to the boundary, I suspect that some regularity persists even after we turn on gravity and the Higgs field, even if the conformal boundary maybe changes because of the cosmological constant. So I think it worth seeing what the effects on this regularity are, and what remains of it, after the breaking of the conformal symmetry.
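[Editor's note: the conformal invariance of source-free electrodynamics invoked above is a standard textbook identity, not something derived in this thread; a minimal sketch of why it is special to four dimensions:]

```latex
% Weyl rescaling of the metric, field strength left untouched:
\tilde{g}_{\mu\nu} = \Omega^2(x)\, g_{\mu\nu}, \qquad \tilde{F}_{\mu\nu} = F_{\mu\nu}
% In d = 4, \sqrt{-\tilde{g}} = \Omega^4 \sqrt{-g} and \tilde{g}^{\mu\alpha} = \Omega^{-2} g^{\mu\alpha}, so
\sqrt{-\tilde{g}}\, \tilde{F}^{\mu\nu}
  = \Omega^{4}\sqrt{-g}\;\Omega^{-2} g^{\mu\alpha}\,\Omega^{-2} g^{\nu\beta} F_{\alpha\beta}
  = \sqrt{-g}\, F^{\mu\nu}
% Hence the source-free Maxwell equation keeps its form under the rescaling:
\partial_\mu\!\left(\sqrt{-\tilde{g}}\, \tilde{F}^{\mu\nu}\right) = 0
  \iff
\partial_\mu\!\left(\sqrt{-g}\, F^{\mu\nu}\right) = 0
```

The powers of Ω cancel only because √(−g) scales as Ω^d with d = 4, which is why the massless Standard Model enjoys this symmetry while generic dimensions (and mass terms) break it.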
Note that maybe this can be connected to Penrose's CCC, but not necessarily, because maybe a scale inversion happens, maybe not, when crossing the final boundary to go back to the initial boundary. This needs to be investigated; maybe the asymmetry between the initial and final boundaries allows it, maybe not. If the asymmetry is too strong to admit scale inversions as in Penrose's CCC, I think this will instead be more like Penrose's WCH. Anyway, it seems to me that conformal symmetry may hold a key to obtaining some boundary law from the dynamical law in a fundamental way, as opposed to assuming fluctuations and anthropic principles.
About Boltzmann brains: they indeed seem to be much more probable than any sort of stable brain which is the product of evolution. But there are two factors which I don't think were taken into account when making such calculations, at least not to my knowledge. Brains belonging to evolving species crowd together, both in time and in space. In our world at least, life is finite: brains die and new brains are born, and they are born from one another. This implies a huge number of brains crowded together. One can speculate that it is possible to have species which are immortal, but it seems plausible that they are much rarer than species whose individuals have a reasonably short life span (but not too short), because this allows selection and evolution. So if we take this into account, could it be possible that Boltzmann brains are in fact much less likely to exist than brains enrolled in a species? Moreover, a brain which is part of a surviving and evolving crowd, being subject to natural selection, is much better adapted to observe the environment. So a crowd of n brains has a much larger probability than that of n independent Boltzmann brains. In the index counting of such brains we should also take into account not only the number of brains but also their life span: an ephemeral Boltzmann brain will be a very brief fluctuation very little connected to the environment, while for an observer it may be more likely to be part of a crowd of brains having longer lives (of course, the life span affects this calculation in two opposite ways). I think this argument relies on more parameters, and they are very difficult, if not impossible, to estimate; in addition we encounter problems similar to those in the doomsday argument.
Similarly, for a single planet orbiting a single star in a high-entropy background, we should take stability into account: Boltzmann brains may seem more likely, but it may be more likely for a mind to be in a stabler brain, and stars tend to crowd too, although life on one planet seems not to be significantly correlated with life on other planets in the same galaxy. So although I think a boundary law or special initial conditions are a better explanation, it is not so easy to reject the fluctuation-based explanation by appeal to Boltzmann brains.
After this long comment which turned into a mini-essay, how can I now invite you to also read my actual essay? :)))
Again, excellent essay! Good luck in the contest!
Best wishes,
Cristi Stoica, Indra's net
Wayne R Lundberg wrote on Feb. 25, 2018 @ 15:18 GMT
Dear Ken,
Dear Ken,
We agree pretty well about the current situation, in which our more fundamental physical theories involve far too many adjustable parameters. For one, I do not believe, as the essay implies a standard theorist must, that these parameters were somehow (chosen at) random.
You seem to agree, but react in an altogether different way. I (as in my essay) seek to explain the respective systems of equations as the result of a more fundamental formula, EVALUATED AT "|" the respective space-time. In the traditional sense, that expands the meaning of |H> to include other scales besides the traditional weak scale. It is clear from your consideration of boundary conditions that you understand the fundamentality of the Hamiltonian: H in classical mechanics, and |H> in QM.
I emphasize that this is a re-interpretation of Dirac's notation, but it aligns somewhat with your thinking on the subject. After all, a space-time average is simply taken over designated space-time 'boundaries'. So I derive GR and QC/ED from a generic basis by choosing the 'boundary conditions' used in "|", i.e. the traditional |weak, and in particular |astrophysical, in which GR (at time = present) is derived.
The resulting |H> at the strong scale is quite revealing, and can avoid the "appearance of 'randomly selected' adjustable parameters".
However, I must note that this work is based on and constructed in 100% agreement with Hartle, Hertog and Hawking's work on the cosmological "no-boundary condition" basis. I would suggest that you regard their work quite highly, as it also helps me eliminate both the pesky "initial state of cosmology" problem AND the "cosmological coincidence" problem.
J.B. Hartle, S.W. Hawking and T. Hertog, “The Classical Universes of the No-Boundary Quantum State”, arXiv:0803.1663v1 [hep-th], March 2008.
I'd be willing to come out to San Jose to explain better via a seminar... but please feel free to read my work here and online arxiv.
Wayne
https://fqxi.org/community/forum/topic/3092
Member Alyssa Ney wrote on Feb. 25, 2018 @ 17:05 GMT
Hi Ken,
Great, provocative read. You say an appeal to randomness should not appear in fundamental explanations. But what if the universe really has some fundamental randomness built into it? Isn't this one way the world could be? And if so, then shouldn't this be part of our fundamental picture?
I also was unsure about your conductor analogy. As you concede, in that case, we do think a more fundamental dynamical explanation is possible. So I am not sure that this gives precedent for boundary conditions being explanatory *on their own* as you want them to be. It gives precedent for boundary conditions being explanatory in contexts where we know there are more fundamental explanations that are being black-boxed. But that is not where we are in the cosmological case.
Best,
Alyssa
Vladimir Nikolaevich Fedorov wrote on Feb. 27, 2018 @ 03:47 GMT
Dear Ken,
I highly appreciate your well-written essay and its effort to understand.
Your essay allowed me to consider us like-minded people.
I hope that my modest achievements can be information for reflection for you.
Vladimir Fedorov
https://fqxi.org/community/forum/topic/3080
Vladimir Nikolaevich Fedorov wrote on Feb. 27, 2018 @ 04:01 GMT
I just increased your rating from 6.6 to 6.8, but someone again put you down.
Gordon Watson wrote on Mar. 15, 2018 @ 07:07 GMT
Dear Ken; further to my earlier comments: since we cannot both be right, would you mind commenting on my half-page refutation of Bell's theorem?
See ¶13 in
More realistic fundamentals: quantum theory from one premiss.
NB: I clarify Bell's 1964-(1) functions by allowing that, pairwise, the HV (λ) heading toward Alice need not be the same as that (μ) heading toward Bob; i.e., it is sufficient that they are highly correlated via the pairwise conservation of total angular momentum. Thus, consistent with Bell's 1964-(12) normalization condition:
Further, in my analysis: after leaving the source, each pristine particle remains pristine until its interaction with a polarizer. Then, in that I allow for perturbative interactions, my use of delta-functions represents the perturbative impact of each such interaction.
My equation (26) then represents the distribution of perturbed particles proceeding to Alice's analyzer. Thus (with b and μ similarly for Bob):
PS: Bridging the continuous and the discrete -- and thus Bell's related indifference -- I use integrals here for generality. Then, since the arguments of Bell's 1964-(1) functions include a continuous variable λ, ρ(λ) in Bell's 1964-(2) must include delta-functions. Thus, on Bell's terms, my refutation is both mathematically and physically significant.
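[Editor's note: for readers without the paper at hand, the Bell (1964) equations referenced by number above are, as standardly quoted (this is background only, not part of Watson's argument; the equations Watson displays were images lost in extraction):]

```latex
% Bell (1964), "On the Einstein Podolsky Rosen Paradox".
% Eq. (1): dichotomic outcome functions of settings and hidden variable \lambda:
A(\vec{a}, \lambda) = \pm 1, \qquad B(\vec{b}, \lambda) = \pm 1
% Eq. (2): the locally causal expectation value, with \rho(\lambda) a
% normalized probability density over the hidden variables:
P(\vec{a}, \vec{b}) = \int d\lambda\, \rho(\lambda)\, A(\vec{a}, \lambda)\, B(\vec{b}, \lambda),
\qquad \int d\lambda\, \rho(\lambda) = 1
```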
PLEASE: When you reply -- or if you will not -- please drop a note on my essay-thread so that I receive an alert. Many thanks; Gordon
Current FQXi essay