CATEGORY:
FQXi Essay Contest - Spring, 2017
TOPIC:
Fundamental as Fewer Bits by Terry Bollinger
Author Terry Bollinger wrote on Jan. 30, 2018 @ 22:35 GMT
Essay Abstract: In this essay I propose that a subtle but universal principle of binary conciseness lies behind the most powerful and insightful equations in physics and other sciences. The principle is that if two or more precise descriptive models (theories) address the same experimental data, the theory that is more concise in terms of Kolmogorov complexity will also be more fundamental in the sense of having the deepest insights.
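As a rough illustration of how the principle can be probed in practice (a toy sketch only: Kolmogorov complexity itself is uncomputable, so an off-the-shelf compressor serves merely as a crude upper bound, and the two "theories" below are hypothetical stand-ins for serialized models):

    import zlib

    def size_bound(description: str) -> int:
        # Crude upper bound on Kolmogorov complexity: byte length
        # of the zlib-compressed description.
        return len(zlib.compress(description.encode("utf-8"), 9))

    # Two stand-in descriptions of the same 1000 data points:
    # one states the generating rule, the other tabulates every case.
    rule_based = "predict n*n for n in range(1000)"
    tabulated = ",".join(str(n * n) for n in range(1000))

    print(size_bound(rule_based))  # small: the rule is concise
    print(size_bound(tabulated))   # much larger: the table is redundant

On this measure the rule-stating description wins, which is the sense of "more concise, hence more fundamental" argued in the essay.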
Author Bio: Terry Bollinger retired in 2017 from The MITRE Corporation, where on behalf of the US Office of Naval Research he helped define, acquire funding for, and oversee research by major universities and small businesses in robotics and the cognitive sciences. Prior to that he was a chief technologist helping the US Department of Defense find high-value emerging technologies relevant to US federal needs. Since retiring he has hired himself to do full-time research in particle physics, with one of his goals being to assess the merits of applying machine learning and cognition methods to intransigent hard-science issues.
Download Essay PDF File
Alan M. Kadin wrote on Jan. 31, 2018 @ 02:44 GMT
Dear Dr. Bollinger,
I enjoyed reading your essay, and your emphasis on simplicity and coherence.
My contention, as described in my essay,
“Fundamental Waves and the Reunification of Physics”, is that both GR and QM have been fundamentally misunderstood, and that something close to classical physics should be restored, reunifying physics that was split in the early 20th century. QM should not be a general theory of nature, but rather a mechanism for creating discrete soliton-like wavepackets from otherwise classical continuous fields. These same quantum wavepackets have a characteristic frequency and wavelength that define local time and space, enabling GR without invoking an abstract curved spacetime.
This neoclassical picture has no quantum entanglement, which has important technological implications. In the past few years, quantum computing has become a fashionable field for R&D by governments and corporations. But the predicted power of quantum computing comes directly from entanglement. I predict that the entire quantum computing enterprise will fail within about 5 years. Only then will the mainstream start to question the foundations of quantum mechanics.
Alan Kadin
Author Terry Bollinger replied on Jan. 31, 2018 @ 03:30 GMT
Dear Dr. Kadin,
Thank you for your kind remarks, and I am glad you enjoyed reading my essay. The concept of binary conciseness is definitely compatible with the idea that the most fundamental particle solutions are wave packets rather than point particles. I am in fact working on a separate paper, under the acronym PAVIS, that explores certain implications of treating point particles as asymptotic limits rather than as pre-existing mathematical entities.
I will comment in more detail after reading your essay carefully and making sure I really understand your framework.
I do have one immediate question, however. While I agree that quantum computing has yet to prove it can meet its claims, companies such as ID Quantique sell off-the-shelf entanglement-based quantum encryption devices to achieve high levels of transmission security. If you have not already, it might be interesting to examine such devices in terms of whether your ideas could provide a different interpretation of how they achieve high transmission security.
Thank you, and I will be sure to add comments on your essay page after studying your essay carefully.
Sincerely,
Terry Bollinger
Dizhechko Boris Semyonovich wrote on Jan. 31, 2018 @ 14:48 GMT
Dear Terry Bollinger, I admire your method of testing theories for fundamentality. I believe that New Cartesian Physics, which is based on Descartes' identity of space and matter, comes closer than other theories to Kolmogorov's minimum. You may be interested in my essay, in which I, among other things, showed the relationship between the Lorentz factor and the probability density of quantum states, and most importantly showed that the mass-energy equivalence formula is due to the pressure of the universe. I will be grateful for the evaluation you leave.
Sincerely, Dizhechko Boris Semyonovich.
Author Terry Bollinger wrote on Jan. 31, 2018 @ 18:33 GMT
Dear Dizhechko Boris Semyonovich,
Thank you for your kind words. I have downloaded and read your essay. Alas, and as is also the case for me with string theory, I was unable to make any unambiguous conceptual connections between your framework and the best-documented results of experimental physics. I promise to read your essay one more time to see if I can arrive at meaningful positive suggestions or comments.
Sincerely,
Terry Bollinger
Joe Fisher replied on Feb. 1, 2018 @ 16:41 GMT
Dear Terry Bollinger,
FQXi.org is clearly seeking to confirm whether Nature is fundamental.
Reliable evidence exists that proves that the surface of the earth was formed millions of years before man and his utterly complex finite informational systems ever appeared on that surface. It logically follows that Nature must have permanently devised the only single physical construct of earth allowable.
All objects, be they solid, liquid, or vaporous, have always had a visible surface. This is because the real Universe must consist only of one single unified VISIBLE infinite surface occurring eternally in one single infinite dimension that is always illuminated mostly by finite non-surface light.
Only the truth can set you free.
Joe Fisher, Realist
Dizhechko Boris Semyonovich replied on Feb. 17, 2018 @ 10:28 GMT
Thanks, Terry Bollinger, for your criticism of my essay. I understand that it was written poorly. Its main aim is to attract researchers to continue Descartes' theory of everything, taking into account modern achievements in physics. The principle of the identity of physical space and matter of Descartes allows us to remodel Heisenberg's uncertainty principle into a principle of definiteness of points of physical space, according to which an infinitely large momentum is required to get to a point of it. Look at my essay,
"Fundamental in New Cartesian Physics", where I showed how radically physics can change if it follows this principle. Evaluate it and leave your comment there. Do not allow New Cartesian Physics to go away into nothingness.
Sincerely, Dizhechko Boris Semyonovich.
Joe Fisher wrote on Feb. 2, 2018 @ 16:07 GMT
Dear Fellow Essayists
This will be my final plea for fair treatment.
FQXI is clearly seeking to find out if there is a fundamental REALITY.
Reliable evidence exists that proves that the surface of the earth was formed millions of years before man and his utterly complex finite informational systems ever appeared on that surface. It logically follows that Nature must have permanently devised the only single physical construct of earth allowable.
All objects, be they solid, liquid, or vaporous, have always had a visible surface. This is because the real Universe must consist only of one single unified VISIBLE infinite surface occurring eternally in one single infinite dimension that is always illuminated mostly by finite non-surface light.
Only the truth can set you free.
Joe Fisher, Realist
Flavio Del Santo wrote on Feb. 4, 2018 @ 23:01 GMT
Dear Terry,
your essay is indeed interesting, and very readable.
I rated it high, and I hope you will get the visibility you deserve in the contest.
As for our views, as you pointed out, they are not so distant, but I shall ponder more on how to express the differences. I will soon answer the specific comments you added on the page dedicated to my essay.
All the best,
Flavio
Author Terry Bollinger wrote on Feb. 5, 2018 @ 00:35 GMT
Flavio,
Thank you, that was very kind of you to read my essay. I... did not expect that?
I've sort of given up on this contest, to be honest? So now I'm just trying to go back to the roots of my day job before retiring, which was literally assessing the hidden potential of leading-edge hard-science and information research ideas and technologies, and before that being an associate editor-in-chief for a technical magazine. My research support jobs were some of the more fun jobs in this world to be honest, but I've not missed them one bit since retiring to work for myself!
Again, I think you and your co-author have an idea there that is important. It also fits very nicely into the "what's fundamental" theme. So good luck, and I'll look back at your page soon.
Thanks also just for submitting that essay. Due to a bad random choice in the first half dozen essays I sampled, I was very bummed out about even having submitted an essay to this contest at all. So it was important to me to see your essay with its precise use of very solid and interesting experimental work. (Sorry, any FQXi staff listening in, but I ain't gonna sugar coat it. You really, really need to fix certain aspects of your review process.)
Cheers,
Terry
Satyavarapu Naga Parameswara Gupta wrote on Feb. 10, 2018 @ 01:21 GMT
Hi Terry Bollinger
Very nice idea about "binary conciseness lies behind the most powerful and insightful equations in physics and other sciences. The principle is that if two or more precise descriptive models (theories) address the same experimental data...." -- very progressive for the understanding of consciousness, very good... By the way....
Here in my essay an energy-to-mass conversion is proposed. Yours is a very nice essay; best wishes. I appreciate your essay and hope for reciprocity. You may please spend some of your valuable time on the Dynamic Universe Model also and give some of your valuable & esteemed guidance.
Some of the main foundational points of the Dynamic Universe Model:
-No Isotropy
-No Homogeneity
-No Space-time continuum
-Non-uniform density of matter, universe is lumpy
-No singularities
-No collisions between bodies
-No blackholes
-No wormholes
-No Bigbang
-No repulsion between distant Galaxies
-Non-empty Universe
-No imaginary or negative time axis
-No imaginary X, Y, Z axes
-No differential and Integral Equations mathematically
-No General Relativity and Model does not reduce to GR on any condition
-No Creation of matter like Bigbang or steady-state models
-No many mini Bigbangs
-No Missing Mass / Dark matter
-No Dark energy
-No Bigbang generated CMB detected
-No Multi-verses
Here:
-Accelerating Expanding universe with 33% Blue shifted Galaxies
-Newton’s Gravitation law works everywhere in the same way
-All bodies dynamically moving
-All bodies move in dynamic Equilibrium
-Closed universe model no light or bodies will go away from universe
-Single Universe no baby universes
-Time is linear as observed on earth, moving forward only
-Independent x, y, z coordinate axes and Time axis; no interdependencies between axes.
-UGF (Universal Gravitational Force) calculated on every point-mass
-Tensors (Linear) used for giving UNIQUE solutions for each time step
-Uses everyday physics as achievable by engineering
-21000 linear equations are used in an Excel sheet
-Computerized calculations use 16-decimal-digit accuracy
-Data mining and data warehousing techniques are used for data extraction from large amounts of data.
- Many predictions of Dynamic Universe Model came true….Have a look at
http://vaksdynamicuniversemodel.blogspot.in/p/blog-page_15.html
I request you to please have a look at my essay also, and give some of your esteemed criticism for your information……..
Dynamic Universe Model says that the energy in the form of electromagnetic radiation passing grazingly near any gravitating mass changes its frequency and finally will convert into neutrinos (mass). We all know that there is no experiment or quest in this direction. Energy conversion happens from mass to energy with the famous E=mc²; the other side of this conversion was not thought of. This is a new fundamental prediction by the Dynamic Universe Model, a foundational quest in the area of Astrophysics and Cosmology.
In accordance with the Dynamic Universe Model, frequency shift happens on both sides of the spectrum when any electromagnetic radiation passes grazingly near a gravitating mass. With this new verification, we will open a new frontier that will unlock a way to form the basis for continual nucleosynthesis (continuous formation of elements) in our Universe. The amount of frequency shift will depend on the relative velocity difference. All the papers of the author can be downloaded from “http://vaksdynamicuniversemodel.blogspot.in/”
I request you to please post your reply in my essay also, so that I can get an intimation that you replied. Best,
=snp
Author Terry Bollinger wrote on Feb. 10, 2018 @ 04:35 GMT
Hi Satyavarapu Naga Parameswara Gupta,
First I must make you aware that because I was a magazine editor for many years, I took the following pledge just to allow myself to participate in this phase of the FQXi evaluation:
goo.gl/KCCujt
This pledge explicitly requires that I not engage in any form of reciprocity when evaluating essays, since reciprocity unavoidably would mean I am not giving my honest opinion based only on what I see in the essay. I simply cannot work any other way. For that reason, I will not yet promise to point-score your essay even if I comment on it, because I cannot keep the promise you just requested.
That is a lot of material you cover! I do promise to take a look. My warning in advance is that I look for deep continuity in every essay, and have yet to see one that introduced that many concepts in which I saw that kind of continuity. But I will try hard to make any positive comments that I think might help.
Alas, I already see indications that my honest, no-inflation-allowed point score could come out low... which is why I might choose not to assign a point score to it. You are being very up front, and I appreciate that very much. I would regret it if you gave me a strong rating that you do not truly feel is justified.
Cheers,
Terry
Domenico Oricchio wrote on Feb. 10, 2018 @ 14:42 GMT
It is an interesting essay.
I am thinking that there are infinite sequences of transcendental numbers that admit a convergent series as an approximation; so the optimal approximation using the number pi is a compact way of saying that a generic sequence is an extraction of a string from a convergent series with a simple description of its terms. If the terms have a simple, compact description, then the Kolmogorov complexity is low; if the convergent series has a complex description, then the Kolmogorov complexity must include the complexity of describing the terms.
The number pi could contain each sequence, of each length, so a single transcendental number could contain the minimum description, provided the starting point is not too high (for example, no larger than the description of the string itself).
I am thinking that the minimum Kolmogorov message for a trajectory and the principle of least action could have a connection, if the Kolmogorov complexity of the trajectory and the measure of the Lagrangian were proportional.
A good essay makes one think.
Regards
Domenico
Author Terry Bollinger replied on Feb. 10, 2018 @ 19:14 GMT
Domenico,
Thank you for your kind words, and I am glad my essay gave you some interesting ideas to pursue.
I just downloaded your essay, which is almost surely the shortest essay submitted! I did not realize that a one-page essay would be allowed, but in retrospect the FQXi rules only prescribe maximum size limits, not minimums.
Please be aware that I have taken the following pledge:
goo.gl/KCCujt
If you do not wish me to review your essay, please let me know quickly and I will gladly just skip over it. If I do comment on any essay, I always try to add some positive or constructive strategy remarks, even if I do not see the essay as strong. For point scoring, alas, I do not do inflation, so I can be pretty tough.
Again, thanks for your comment.
Cheers,
Terry
Domenico Oricchio replied on Feb. 10, 2018 @ 23:10 GMT
I read your essay out of interest in new ideas: everyone here in the contest is curious; my interest in my own score is close to zero.
Every opinion, critical or benevolent, on my essay is welcome.
Regards
Domenico
corciovei silviu wrote on Feb. 11, 2018 @ 12:51 GMT
Mr. Bollinger
In your essay you cherish simplicity (and for good reasons) and I admire the elegance of your style. Let me ask something though: how do you recognize that simplicity in this contest? Should that simplicity be interpreted in just one way?
I am asking this because I noticed a confusing comment of yours regarding an essay that, more or less, is pointing in the same direction (in terms of that simplicity recalled by you).
Thank you again for the elegant and personal approach to simplicity emergent from your essay (please don't tell me it is not emergent) :)
Joyfully and respectfully
Silviu
Edwin Eugene Klingman wrote on Feb. 12, 2018 @ 05:53 GMT
Dear Terry Bollinger,
Thanks for a most enjoyable essay. I first fell in love with information theory in 1967 when I encountered Amnon Katz's "Statistical Mechanics: an Information Theory Approach". Later I realized that ET Jaynes had done this circa 1951, and had noted that equating thermodynamic entropy to information entropy led to a lot of nonsense theorems being proved.
Your Kolmogorov approach reminds me somewhat of 'Software Physics', circa 1977, relating the complexity of software to the number of operations and data types. I discovered the literal truth of this when I designed and built a prototype for Intel and Olivetti while they kept adding functionality (more operations) to the original spec.
Anyway, I found your essay easy to read and understand and significant for the question "What is fundamental?" Congratulations.
You say "
the defining feature of the message is that it changes the state of the recipient." I have in a number of comments remarked that energy flows through space. If it crosses a threshold and changes the structure (or state) of a physical system, it 'in'-forms that system and the information, or record, or number of bits, comes into being. It has meaning only if a codebook or context is present: "
One if by land, two if by sea." The proof of information is in the pudding, in my opinion it is energy that flows across space, there is nothing "on top of" that energy that is 'physical information'. If the frequency or modulation or what have you causes the change of state, that change is
information (if it can be interpreted). While messages carrying "information" provide a very useful way to formulate the problem, I believe many physicists project this formulation on to reality and then come to believe in some physical instantiation of information aside and apart from the energy constituting the message. In that sense I enjoyed your section "Physics As Information Theory" in terms of "foundation messages".
I believe your second and third challenges have essentially the same answer, but current theories based on false assumptions get in the way – one reason I am currently working to uncover the false assumptions.
I like that you mention Pauli's treatment of spin. His projection of the 'qubit' formulation onto Stern-Gerlach data was one of the first (after Einstein's projection of a new time dimension onto every moving object) instances of physicists accepting the utility of projecting mathematical structure onto physics, and then coming to believe in the corresponding physical structure. My belief is that until the current belief in the structure of reality imposed by mathematical projections is overcome, your second and third challenges will not be satisfied. For brief comments on related topics, review three or four comments prior to yours on my essay page.
You conclude by suggesting new insights are most likely to come from data. My belief is that the only solution to our current problems (your three challenges for instance) is to unearth the false assumptions that are built into our current theories and have been for generations. That is not a popular proposition. Almost all working physicists would prefer new ideas to add to their toolbox, not ideas that contradict things they have taught and published and that got them where they are -- another reason to value FQXi. I hope your experience is such that you will come back year after year. And I hope you enjoy your newly gained retirement. It's a good time in life.
Best regards,
Edwin Eugene Klingman
Author Terry Bollinger replied on Feb. 13, 2018 @ 02:33 GMT
Dear Edwin Eugene Klingman,
Thank you also for your thoughtful comments and generous spirit. I am pleased to see that I was reasonably on target in understanding several of your key points, since we seem to share a number of views that are definitely not “standard” according to prevalent physics perspectives.
I’m going to cut to the chase on one point:
May I suggest that what you seem to be proposing is that our universe is a state-machine computer simulation?
The “now” (my term) of the post-GR Einstein ether would be the current state of that simulation. But more critically, your essay concept of a single universal time would no longer be time as we measure it within our universe. Instead, it would be the external time driving this universe simulator. That is why it is perfect time, time that is never affected by the local clock variations seen within our universe. It would be a form of time whose source is not even accessible from within our universe! Within our universe, you are instead forced (as Einstein was) to use the physical oscillatory behaviors of matter and energy -- of clocks -- as your only available definition of time. And as physical objects, they are of course fully subject to the rules of special relativity.
Given your impressive computer background, I suspect that you may already be thinking along these lines, and are just being cautious in how you present such an idea to a physics audience. But even if that is true, there is a very lively subset of physics that likes this idea already, so you would not be alone.
Also, if you define your universal time as external to our universe, folks who like the beauty and incredibly good experimental validation of every aspect of SR would breathe a lot easier when they read what you are saying. The SR concept of time remains just as Einstein defined it, using only physical clocks, so none of that is impacted. Instead, you would be introducing a new concept: an external, perfect, and truly universal time -- the clock of the simulator in which all these other clocks are running as simulations.
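A minimal sketch of what I mean, with toy numbers of my own choosing (my gloss, to be clear, not anything you wrote):

    # External "simulator time" is just the loop counter; internal clocks
    # are state variables that advance at state-dependent rates.
    internal_clocks = {"lab_clock": 0.0, "moving_clock": 0.0}
    rates = {"lab_clock": 1.0, "moving_clock": 0.1}  # toy time dilation

    for external_tick in range(1000):  # perfect time, invisible from inside
        for name in internal_clocks:
            internal_clocks[name] += rates[name]

    # Internal physics can only compare clock readings to each other; the
    # value of external_tick never enters any internal measurement.
    print(internal_clocks["moving_clock"] / internal_clocks["lab_clock"])  # 0.1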
So, just a thought and an idea for presenting your ideas in a way that might (?) help you get a bit more traction.
I'm getting to your comments on my own essay thread, BTW, though likely not this evening. You do bring up lots of interesting points!
Cheers,
Terry
Edwin Eugene Klingman replied on Feb. 13, 2018 @ 03:44 GMT
Dear Terry,
Thanks for your response. I think you're beginning to see how valuable FQXi comments are. For example, I learned from your response to Noson Yanofsky, in the comment that follows mine. The essays have a nine page limit, but there is no limit to how much information we can exchange in the comments!
My dissertation, "
The Automatic Theory of Physics" was based on the fact that any axiomatic formulation of physics (including special relativity) can be reformulated as an automaton. So I appreciate your suggestion that "universal time" is the 'external' trigger to the state sequencer that yields the (simulated) universe. However the theme of "
Universe as a simulation" shows up every so often in these contests, and I always argue against it. My point, repeated in my comment above, is that "physics", our models of reality, are based on
projection of mathematical structure onto physical reality – the more economical the better, whether Kolmogorov or Occam's razor. But I do not believe these structures are actually replicated in reality. Rather, I believe that the root of our current problems is based on structures imposed early, and now accepted as gospel. Clearly GR and QM are correct, so far as they go, but I believe they can be physically reinterpreted (retaining almost all mathematical structure, since it works) and a better theory would result.
Your major point, if I'm reading you correctly, is that we don't measure time per se. We measure duration, based on imperfect clocks. Einstein imagined perfect clocks, and distributed them profusely, but they don't exist, and his space-time symmetry leads to nonsense that is not supported by reality. You mention experimental validation of "every aspect of SR", but reference 10 in my essay argues that length contraction has never been measured. And the space-time symmetry of SR is asymmetric in the Global Positioning System. So I question this "every aspect".
Nevertheless, your informative comment gives me more to think about and can only improve my approach.
Best regards,
Edwin Eugene Klingman
Author Terry Bollinger replied on Feb. 15, 2018 @ 21:09 GMT
Edwin Eugene Klingman,
I am delighted and more than a little amused at how badly I misunderstood your intent! I would have bet that your answer was going to be “yes, I was just being subtle about simulation”… and I was so wrong!
I’ll look more closely to figure out why I got that so wrong. I may even look up your thesis, but no guarantee on that -- theses tend to be long in most cases!
I downloaded your ref [10] and definitely look forward to looking at that one! I would say immediately that a lot of particle and especially atomic-nuclei folks would vehemently disagree, since e.g. things like flattening have to be taken into account when trying to merge nuclei to create new elements. But that's not the same as me having a specific reference at hand, as you do here.
So: More later on that point. Thanks for an intriguing reference in any case!
Cheers,
Terry
Author Terry Bollinger replied on Feb. 16, 2018 @ 14:13 GMT
Hi Edwin Eugene Klingman,
Let’s get to the main point: You surely realize that the Curt Renshaw paper (your ref 10) contains no data whatsoever disproving special relativity? I assume you do, since you worded your description of the paper as “arguing” that SR length contraction does not exist, versus saying that the paper actually provides data contradicting SR.
The Renshaw paper instead only asserts that when the NASA Space Interferometry Mission (SIM) satellite is launched in 2005 (it is an old paper), it will disprove SR, because the author says it will:
“The author has demonstrated in several previous papers that the Lorentz length contraction likely does not exist, and, therefore, will not be found by SIM.”
SIM was supposed to be launched in 2005 but was cancelled. Its nominal successor, SIM Lite, was also cancelled. Thus no such data exists, either for or against SR. The title of the paper, “A Direct Test of the Lorentz Length Contraction”, is at best misleading, although it could perhaps be generously interpreted as a paper about a proposed test that never happened.
In sharp contrast to this absence of data, all of the effects of SR, including time dilation, relativistic mass increases, and squashing of nuclei, are unbelievably well documented by hundreds or thousands of people who use particle accelerators around the world. Particle accelerators easily explore velocities very close to the speed of light, and so can produce extremely strong signals regarding the effects of SR. Shoot, even ordinary vacuum tubes prove SR if you crunch the numbers, since the electrons must accumulate energy under the rules of SR. Denying the existence of this gigantic body of work, engineering, and detailed SR-dependent published papers is possible only by saying “I don’t like that enormous body of data, so I will just ignore it.”
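To put one quick number on the vacuum tube point (a back-of-envelope sketch of my own, using textbook constants and a CRT-scale voltage):

    import math

    c = 2.998e8      # speed of light, m/s
    me = 9.109e-31   # electron mass, kg
    q = 1.602e-19    # elementary charge, C

    V = 25e3         # accelerating voltage, volts (old TV-tube scale)
    KE = q * V       # kinetic energy gained by the electron, joules

    v_classical = math.sqrt(2 * KE / me)              # Newtonian prediction
    gamma = 1 + KE / (me * c**2)                      # Lorentz factor from KE
    v_relativistic = c * math.sqrt(1 - 1 / gamma**2)  # SR prediction

    print(v_classical / c)     # ~0.313 c
    print(v_relativistic / c)  # ~0.302 c: a ~3% correction at mere TV voltages

Engineers had to get that correction right for every electron gun they built, long before anyone debated SR on philosophical grounds.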
Bottom line: You gave me a data-free reference with a title that fooled me into thinking it had actual data in it. I like you and your willingness to explore alternatives, but I sincerely wish you had not done that. My advice: Go look at the thousands of papers from the particle accelerator community, and stop focusing on a single deceptively titled non-paper (or author, since Renshaw has other papers).
Sincerely,
Terry Bollinger
Member Noson S. Yanofsky wrote on Feb. 12, 2018 @ 18:44 GMT
Dear Terry Bollinger,
Thank you for an interesting essay.
I was wondering about the relationship between Kolmogorov complexity and Occam's razor. Do simpler things really have lower KC? Also, what about Bennett's logical depth? Why is KC better than logical depth?
Please take a look at my essay.
Thank you again for a great read.
All the best,
Noson
Author Terry Bollinger replied on Feb. 13, 2018 @ 01:46 GMT
Dear Noson,
Thank you for your excellent and insightful question! The answer is yes. If you translate a solution that has survived Occam’s Razor into binary form (that is, into software), then the binary form of that solution will exhibit both the brevity and high information of a (near) Kolmogorov minimum.
One can think of the “side trips” of a non-compact message as the information equivalents of the various components of a Rube Goldberg contraption. Simplifying the message thus becomes the equivalent of redesigning an information-domain Rube Goldberg contraption to get rid of unnecessary steps. The phrase “Occam’s Razor” even suggests this kind of redesign, since for both messages and physical Rube Goldberg contraptions the goal is to cut away that which is not really necessary.
One point that I think can be a bit non-intuitive is that solutions near their Kolmogorov minima are information dense -- that is, they look like long strings of completely random information. The intuitive glitch comes in here: If the goal of Occam’s Razor is to find the simplest possible solution, how can a Kolmogorov minimum that is packed to the gills with information be called “simple”?
The explanation is that to be effective, messages -- strings of bits that change the state of the recipient -- must be at least as complex as the tasks they perform. That means that even an Occam’s Razor solution must still encode non-trivial information, and depending on the situation, that in turn can translate into long messages (or lengthy software, or large apps).
If the desired state change in the recipient is simple in terms of how the recipient has been “pre-programmed” to respond (which is a very interesting issue in its own right), then the Kolmogorov minimum message will also be very short, perhaps as short as just one bit. But even though a single bit “looks” simple, it still qualifies as having maximum information density if the two options (0 or 1) have equal probability.
The other Occam’s Razor extreme occurs when the state of the recipient requires a major restructuring or conversion, one that is completely novel to the recipient. That can be a lot of bits, so in that case Occam’s Razor will result in a rather lengthy “simplest possible” solution. Notice however that once this new information has been sent, the message recipient becomes smarter and will in the future no longer need the full message to be sent. A new protocol has been created, and a new Kolmogorov minimum established. It’s worth pointing out that downloading a new app for your smart phone is very much an example of this scenario!
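(If it helps, the single-bit claim above is just Shannon's entropy formula at work; a quick sketch of my own, with arbitrary probabilities:

    import math

    def bit_entropy(p: float) -> float:
        # Shannon entropy, in bits, of a binary choice with P(1) = p.
        if p in (0.0, 1.0):
            return 0.0
        return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

    for p in (0.5, 0.9, 0.99):
        print(p, round(bit_entropy(p), 3))
    # 0.5 -> 1.0 bit (maximum density); 0.9 -> 0.469; 0.99 -> 0.081

Only the equiprobable bit carries its full bit of information; a predictable bit is mostly wasted bandwidth, which is also why near-Kolmogorov-minimal messages look random.)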
We see this effect all the time in our modern web-linked world. As globally linked machines individually become more “aware” of the transformations they are likely to need in the future -- as they receive updates that provide new, more powerful software capabilities -- then the complexity of the messages one needs to send after that first large update also shrinks dramatically.
This idea that Kolmogorov messaging builds on itself in a way that over time increases the “meaning” or semantic content of each bit sent is a fascinating and surprisingly deep concept. It is also deep in a specific physics sense, which is this: The sharing-based emergence of increasingly higher levels of “meaning” in messages began with the emergence of our specific spacetime and particle physics, and then progressed upwards over time across a spectrum of inorganic, living, sentient, and (particularly in the last century) information-machine based message protocols. After all, how could we know some of the elements in a distant quasar if the very electrons of that quasar did not share the design and signatures of the electrons within our detection devices? We assume that to be so, but there is no rule that says it must be so. It is for example certainly conceivable that some distant quasar might be made of a completely different particle set from matter in our part of the universe. But if the universe did not provide these literally universally shared examples of “previously distributed” (by the big bang e.g.) information baselines, then such transfers of information would not even be possible.
So here’s an important insight into the future of at least our little part of the universe: Meaning, as measured quantitatively in terms of observable impacts on physical reality per bit of Kolmogorov minimum messages sent, increases over time.
This idea of constantly expanding meaning is, as best I can tell, the core message of this year’s deeply fascinating essay (topic 3088) by Nobel Laureate Brian Josephson, of Josephson junction fame. His essay is written in a very different language, one that is neither physics nor computer science, so it is taking me some time to learn and interpret it properly. But reading his essay has already prompted me to reexamine my thoughts of a year or so ago (on David Brin’s blog, I think?) regarding the emergence over the history of the universe of information persistence and combinatorics. Specifically, I think focusing on “meaning,” which I would define roughly as impact on the physical world per bit of message sent, may provide a better, cleaner way to interpret such expanding combinatoric impacts. When I reach the point where I think I understand Professor Josephson’s novel language adequately, I will post comments on it. (I should note that I am already deeply troubled by one of his major reference sources, though Professor Josephson does a good job of filtering and interpreting that extremely unusual source.)
Please pardon my overly long answer! You brought up a very interesting topic. I’ll download your essay shortly and take a look. Thanks again for your comments and question!
Cheers,
Terry
Stephen James Anastasi wrote on Feb. 13, 2018 @ 11:42 GMT
Dear Terry
Thank you for your feedback on my essay. I will use a similar marking system to the one you used.
What I liked:
Easy to read. Well set out. Your core idea came through well.
What I thought about as I read it:
The initial idea seemed quite a lot like Occam’s razor, shifted from a philosophical stance to a mathematical stance.
Einstein’s derivation of E=mc^2 was moderately complex at certain points, and each step restated the last using more letters, yet each says the same thing, so each must express the same level of fundamentality. The difference is only in the one who reads it. Fundamentality would then be in the eye of the beholder?
There is an implicit assumption in your example of replacing a gigabyte file. Say I write this program, but before it ends, I pull the plug! The program is time dependent while the gigabyte file is more space dependent. This was a cool idea, but I don’t think the file and the system are equivalent, only potentially equivalent. Also, reproducing the original file by a computer, even a perfect computer, would require an expenditure of energy under Landauer’s ‘information is physical’ principle, making the file more costly to regenerate than to simply keep. I mention this because it hints at hidden aspects that make some things seem more fundamental than they really are.
I thought the reference to pi was quite fun. It occurred to me that given pi is irrational, if one had some way of referencing a point in its decimal expansion, then every finite number would be expressed within it and the larger the number, the more efficient it would be to specify the start and end point? I see this would likely have some limit due to a trade-off, but it is fun to think about. I suppose the same would apply to any irrational number.
You say, ‘the role of science is first to identify such pre-existing structures and behaviors, and then to document them in sufficient detail to understand and predict how they work.’ For philosophical reasons, there is a well-known (Hume, Kant, Popper) strong epistemic schism between science and reality. We can never identify, from an empirical standpoint, pre-existing structures. We can only guess at them from hints provided by experiment. But you probably know this already. My essays were written exactly to span the gap from a rationalist perspective.
On the Spekkens Principle. I haven’t read that essay, but the terminology suggests his work echoes that of Raphael Sorkin and his Causal Set Theory, only interpreted in an informational universe context (I think). It is exactly the dynamics, the ultimate cause, which my work addresses.
On your ‘Challenge 3’. I am a bit surprised that you didn’t mention in your comments on my essay its argument for an increasing baryon mass (whether due to intrinsic change of the baryon or extrinsic change due to the ‘shape of space’, for which my model is not yet sufficiently advanced), which would answer at least one part of this challenge, namely why the ratio of electron to proton mass is what it is.
‘The Standard Model qualifies overall as a remarkably compact…framework.’ Oi! Are we reading the same model? That said, I see your point that compared to SUSY and string theory and so forth it is relatively compact, but with all those parameters, might they be hiding a very large set of other theories?
I hope you take the time to read my previous essay in ‘It from Bit’. In it there may be a gemstone of simplicity. My whole model comes from a single principle, and that principle is no more than an expression of our idea of equivalence.
I am providing a short response to your comments on my paper.
Best wishes,
Stephen
Author Terry Bollinger replied on Feb. 13, 2018 @ 13:50 GMT
Stephen,
Thank you for your thoughtful and intriguing comments! I will look at your comments on your essay thread. A few quick responses:
-- Ironically, I’m not a big fan of E=mc², since I immensely prefer the energy-complete, Pythagorean-triangle-compatible form:
E² = (mc²)² + (pc)²
E=mc² addresses only the very special case of mass at rest, and so is not very useful in any actual physics problem. I also find it deeply amusing that Einstein's original paper in which he announced his mass-equals-energy insight, "Does the Inertia of a Body Depend on Its Energy Content?", uses L in place of E, and for some odd reason never actually gives the equation! Instead, Einstein uses a sentence only to describe an equivalent equation with c on the other side:
"If a body gives off the energy L in the form of radiation, its mass decreases by L/c²."
-- Your points about the subtleties of the gigabyte example are correct, very apt, and interesting! In some of my unpublished work I am strong on distinguishing very carefully between “potentiality” (your word) versus “actuality”, and not just in physics, but especially in mathematics. The very ease with which our brains take potentials to their limits and then treat them as existing realities is both an interesting efficiency aspect of how capacity-limited biological cognition works, and a warning of the dangers of being sloppy about the difference. Computer science flatly forces one to face some of these realities in ways that bio-brain mathematical abstractions do not. This is why I think it is very healthy for mathematicians to contemplate what happens if they convert their abstract equations into software.
In any case, issues such as the ones you mention are where the real fun begins! While a short essay is great for introducing a new landscape to a broader audience, there is a lot more going on underneath the hood, as you have just pointed out with your comments on both Einstein’s mass equation and my gigabyte file example. An essay is at its best more like a billboard enticing viewers to visit new land, including at most a broad, glossy, overly simplified view of what that new land looks like. The real fun does not begin until you start chugging your vehicle of choice over all the &%$^# potholes and crevices that didn’t show up in that enticing broad overview!
-- I would note that while in principle it may be possible to find any random sequence of numbers somewhere in pi (wow, that would be an interesting proof or disproof...), there is a representation cost for the indices themselves that must also be taken into account. A full analysis of the potential of pi or other irrational numbers as data compression mechanisms would be mathematically fun, and who knows, might even end up pointing to some subset of methods with actual compression value. The latter seems unlikely to me, though, since the computation cost is likely to get huge pretty quickly.
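Here is a toy version of that analysis (my own sketch; it uses a random digit stream as a stand-in for pi's digits, which are conjectured but not proven to behave this way statistically):

    import random

    random.seed(42)
    stream = "".join(random.choice("0123456789") for _ in range(2_000_000))

    for target in ("7", "42", "867", "5309", "31415"):
        idx = stream.find(target)
        print(target, "->", len(target), "digits; first found at index", idx,
              "which itself needs", len(str(idx)), "digits")
    # A d-digit target typically first appears near position 10^d, so the
    # index costs about as many digits as the target itself: no net savings.

That expected-position argument is the representation cost in a nutshell.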
-- As an everyday passionate Hume in being, I Kant speak all that knowledgeably about the biological structures behind how our minds perceive reality. But I try hard to avoid the subconscious assumptions of truth that Popper up so often in physics and even math, when all one can really do is prove that at least some of these assumptions are false. Thus when writing for a broad audience, I try hard to make it easier for the reader to follow an argument and stay focused on it by simply explaining any necessary philosophical point “in line” and as succinctly as possible. The less satisfying alternative would be to give them a hyperlink to some huge remote body of literature that would then require a lot of reading on their part before they could even get back to the original argument... which by that point they have likely forgotten... :)
-- I am also surprised that I did not mention your baryon mass idea in my comment, since I certainly was thinking about it when I wrote the comment! I guess I was just more focused on the experimental constraints on the idea?
-- On the Standard Model being “relatively compact”: that is, say, in comparison to string theory. But oi indeed, my emphasis was very definitely on the word “relatively”! The Standard Model as it stands is a huge, juicy, glaringly obvious target for some serious Occam outtakes and Kolmogorov compressions.
By the way, as someone who is deeply convinced that space, a very complex entity if you think about it, is emergent from simpler principles, I found your constructive approach to it interesting.
I will try to read It from Bit soon.
Cheers,
Terry
Vladimir Nikolaevich Fedorov wrote on Feb. 16, 2018 @ 05:42 GMT
Dear Terry,
With great interest I read your essay, which of course is worthy of the highest praise.
I found in the forum thread of Kadin your questions, which are much more interesting and relevant than the questions of FQXi.
My opinion on these issues:
(1) Entanglement - is the only remote mechanism in the Universe for forming the force of interaction between the elements of matter, which is realized as a result of the interaction of the de Broglie toroidal gravitational waves at the common frequencies of the parametric resonance.
This quantum mechanism of gravity is shown in a photo of phenomena observed in outer space (essay 2017) "The reason of self-organization systems of matter is quantum parametric resonance and the formation of solitons" (https://fqxi.org/community/forum/topic/2806).
For example, a molecule is a state of entanglement (interaction) of atoms at common resonant frequencies of the de Broglie toroidal gravitational wave complex (including tachyon waves) belonging to different levels of matter.
(2) Full fundamental fermion zoo - is described by simple similarity relations of the fractal structure of matter, on the basis of the parameters of the electron and the laws of conservation of angular momentum and energy.
Fermions of different levels of matter are neutrinos for each other. All this is given in the essay 2018 "Fundamental" means the underlying principles, laws, essence, structure, constants and properties of matter (https://fqxi.org/community/forum/topic/3080).
Also given are the ratios for the deterministic grids of all the main resonance frequencies of the zoo of toroidal gravitational waves (fundamental fermions), and comparisons are made with known observed resonant frequencies.
(3) Recreating GR predictive power - is possible only after understanding the fact of the existence of potential stability pits in all fundamental interactions, including the strong interaction.
After such an understanding, the paradox of electrodynamics, in which the orbital electron does not radiate, is easily and logically solved.
Potential stability pits (de Broglie toroidal gravitational waves, orbital solitons) are formed due to quantum parametric resonance in the medium of a physical vacuum.
With understanding of potential pits comes an understanding of inertia and mass.
(4) Clarifying waves vs superposed states - This is the result of the interaction of the toroidal gravitational waves of de Broglie (fundamental fermions), it can be determined by solving classical quantum parametric resonance problems, for example, using the Mathieu equations (as in radio engineering).
The solutions of these equations can be represented as a Fourier series, which is actually a set of real toroidal gravitational waves interacting (entangled) in a system on deterministic grids of a set of resonance frequencies.
I'm sorry to say that everything is otherwise with «how very much like space curvature could create such observed effects».
Instead of curvature of space-time, there is a derivative of spatial coordinates in time. The equivalent of "curvature of space" is the speed of propagation of the gravitational interaction.
I hope that my modest achievements can be information for reflection for you.
Vladimir Fedorov
https://fqxi.org/community/forum/topic/3080
Author Terry Bollinger replied on Feb. 16, 2018 @ 12:01 GMT
Vladimir,
Thank you for your kind remarks and comments. I will take a look at your essay sometime today, Friday Feb 16.
Cheers,
Terry
James Lee Hoover wrote on Feb. 16, 2018 @ 18:33 GMT
Terry,
There is a lot to like in your essay. It gives good guidance for simplicity-driven discovery, including your 3 challenges, which add to a relatively out-of-the-box perception of simple processes of investigation. We all marvel over Einstein's equation, its simple epiphany of the duality of energy and mass. Euler's identity is intriguing to all, and fermion-boson spin baffling. And if we programmed in our careers, we staggered over the mind-numbing immensity of the mishmash of recursive equations that years of coding piled on. I speak of new approaches and discovery as well in my essay. Fundamental does involve fewer bits, but also new discovery in following a simpler thread, as you mention. I rate your essay high on several points. Hope you get a chance to look at mine.
Jim Hoover
Author Terry Bollinger replied on Feb. 16, 2018 @ 19:40 GMT
Jim,
Thank you for your positive and thoughtful remarks! I look forward to seeing your essay, and will download a copy of it shortly.
Cheers,
Terry
Gary Valentine Hansen wrote on Feb. 16, 2018 @ 18:49 GMT
Thanks Terry,
There are questions that arise concerning our interests in simplification that are not commonly admitted. For example:
1. Is simplification ‘simply’ a means of reducing complexity to a level of understanding that is acceptable (i.e. comfortable) and thereby communicable to others?
2. Is the search for simplification acknowledgement that the subject under consideration is beyond the capacity of a person to comprehend in its totality?
3. Is simplification a means by which one can get connected to people operating at a higher (or lower) level of consciousness?
4. If simplification is assumed to promote a common cause, the purpose of which is to unite one's interests with those of others, at what point does the process of simplification become too simple and thereby confuse rather than clarify issues?
5. Is the FQXi question so simple that it stimulates multiple lines of enquiry rather than serving to unite people in a common understanding?
At issue is how many people are reasonably expected to benefit from any process of simplification. If that family is limited to professional physicists, mathematicians, or people that happen to speak a particular ‘foreign’ language, then is the quest for simplification really justified?
Does being ‘more fundamental in the sense of having the deepest insights’ really contribute to understanding, or was Einstein the only person that truly understood what he was saying at the time?
Thank you Terry for inviting us along your chosen path. You carry my best wishes.
Gary.
Author Terry Bollinger replied on Feb. 16, 2018 @ 20:58 GMT
Gary,
Thank you for your positive remarks! And wow, that is an intriguing set of questions you just asked!
I like in particular that you are addressing the human and social-interaction aspects of communications simplification. These are critical aspects of what I call collaborative or collective intelligence, that is, the “IQ” of an entire group of people, systems, and environments. The idea of a collective IQ addresses, for example, why free market economies, in comparison to authoritarian economies, tend to be hugely more clever, efficient, and adaptable in their use of available resources. The intelligences that emerge from free market economies are examples of intelligences that are beyond detailed human comprehension; that is precisely why the human-in-charge authoritarian structures are so ineffective.
Intelligence is never fully spatially localized, and that is the source of many deep misunderstandings about its nature. Even when you do something as simple as read a book, you have extended your intelligence beyond the bounds of your own body, since you are now relying on an external memory. I would suggest that the main reason human intelligence can be oddly difficult to distinguish from animal intelligence is that it is not the innate cleverness of any one human that defines human intelligence, but rather the extraordinarily high level of networking in both time (writing) and space (language) that makes us unique. For example, a very clever bonobo can, I think, be individually not that different from a human in terms of innate problem solving and cleverness. But that same bonobo lacks the scaffolding of language, both internally (e.g. for postulating complex imaginary worlds) and externally (for sharing with other bonobos), and so is unable to “build on the shoulders of others,” as we like to say.
(A bit of a physics tangent: I would also suggest that intelligence is deeply intertwined with the emergence of information within our universe, in ways we do not yet fully comprehend. At the very origin of our universe the emergence of "I need my own space!" fermions in flat space enabled the emergence of what we call information, via persistent configurations of fermions within that accommodating flat space. But only obstinately persistent and isolationist fermions can readily create the kinds of unique configurations that we call history. Once the universe made history (information) possible, higher levels of complexity also became possible, including only very recently networked human intelligence.)
Your particular questions can be answered specifically only by first grappling with the curiously probabilistic issues that underlie all forms of distributed intelligence, but which are particularly conspicuous in human interactions. Pretty much by definition, an intelligent system must deal with issues that cannot be fully anticipated in advance, but which also can be at least partially anticipated. These complex underlying probabilities in turn affect the nature of the “simplifications” needed in any one messaging event. Three major simplification options include subsetting (sending only a small but specific subset), generalizing (capturing an overall message, but leaving the recipient to synthesize the details), and complete transfer (e.g., loading a new app onto a smartphone).
The nature of and state of the recipient is of course also critical, and just to confuse everything a bit more, often highly variable over time. The general trend is that due to accumulation of earlier messages and their implications, meaning-per-message increases over time. That also complicates the idea of summarization, since what previously was an incomplete message may over time become entirely adequate. You can watch that effect in slow-but-real time as your Alexa or Hey Google or whatever grows a little smarter each week about how to interpret exactly the same human sentence.
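A minimal sketch of that buildup, in the spirit of the "One if by land, two if by sea" example from earlier in this thread (the codebook and messages are toy inventions of mine):

    # First exchange: ship the whole codebook (the costly "app download").
    codebook = {0: "land invasion tonight", 1: "sea invasion tonight"}

    def first_transfer(book: dict) -> bytes:
        return repr(book).encode()  # large, one-time message

    def later_message(signal: int) -> bytes:
        return bytes([signal])      # one byte; the meaning lives in the codebook

    print(len(first_transfer(codebook)))  # tens of bytes
    print(len(later_message(1)))          # 1 byte: "two if by sea"

Every byte of the later messages carries more meaning precisely because of the information that was already distributed.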
I will address your specific questions after I’ve read your essay. Again, thank you for such excellent questions!
Cheers,
Terry
Scott S Gordon wrote on Feb. 17, 2018 @ 01:25 GMT
Hi Terry,
I read your essay and I loved the last paragraph...
If you see such a thread and find it intriguing, your first step should be to find and immerse yourself in the details of any high-quality experimental data relevant to that thread. Some obscure detail from that data could become the unexpected clue that helps you break a major conceptual barrier. With hard work and insight, you might just become the person who finds a hidden gemstone of simplicity by unravelling the threads of misunderstanding that for decades have kept it hidden.
Now even though I am going to say this - I still loved your essay... Your conclusion is completely wrong and this is the reason why...
I can assure you with utmost confidence that no high-quality experiment with its high-quality data will help in revealing what is hidden from us, which is required to figure out the theory of everything. Yes, I know I am making a very bold statement, but I just wanted you to hear this for future reference when physicists start looking into Gordon's Theory of Everything.
The law of conservation of energy is what is preventing us from realizing what dark energy is... Yes, it would actually break the laws of physics to solve the theory of everything the way you are proposing. :)
Anyway - if you have any interest - a very limited exposure to my theory is presented in my essay, "The Day After the Nightmare Scenario"
All the best to you
Scott S Gordon, MD/Engr.
Author Terry Bollinger wrote on Feb. 17, 2018 @ 03:41 GMT
Hi Scott,
I love it!!
Yep, you are right: Details of past data are unlikely to do squiddly for such incredibly important issues as "dark matter" and "dark energy". You nailed me royally on that point! I was thinking in particular about overlooked issues in the Standard Model, but hey, even there the whole dark-dark issue has to come in somehow.
I've added you to my reading list, which is getting a bit long, but I hope to get to it soon.
Thanks again! Since I am a Missourian by upbringing, it is the well-stated critiques that make my day. I've found by hard experience that if I start getting way too confident in my own ideas, I start looking and acting like the rear end of one of those Missouri mules. :)
Cheers,
Terry
Anonymous wrote on Feb. 17, 2018 @ 05:13 GMT
Hi Terry,
I liked that you provided a simple model of what is fundamental. And your essay followed its own premise: "Fundamental as Fewer Bits". I really enjoyed reading it.
In particular I liked:
"Because gravity is so weak, principles of quantum mechanics drove the scale of such models into both extremely small length scales and extraordinarily high energies. This in turn helped unleash so many new options for “exploration” that the original Standard Model simply got lost in an almost unimaginably large sea of possibilities.[9]"
In my essay "The Thing that is Space-Time" I attempt to pull gravity out of the Standard Model.
I postulate that a graviton is not a boson, and that in general it has very low energy and a very large wavelength that spans all the matter in the universe. Thus it is a very low energy particle. I use three basic equations to produce this theory: 1. The Planck-Einstein equation. 2. E = mc². 3. The equation for the Planck mass. The general overview is that the graviton is much like a guitar string that is anchored on opposing Planck masses. This quantum mechanical guitar string (the graviton) has a mass, and instead of supporting musical notes it supports the different frequencies of light (photons).
Question: Would you take a look at my entry and let me know if this version of gravity has any merit in terms of meeting your criteria of having fewer bits? Any response appreciated!
Thanks,
Don Limuti
Author Terry Bollinger replied on Feb. 17, 2018 @ 18:00 GMT
Don,
Thank you both for your supportive remarks, and for your intriguing comments on a non-boson approach to gravity! I will definitely take a look, though I should warn you that my reading queue is getting a bit long.
I'd say that your proposing a "non-boson" approach sounds pretty radical... except that after about 40 years of trying, the boson approaches still haven't really worked, have they? Also, general relativity, which does succeed very well experimentally (well, there is that dark energy thing), is anything but "boson" based. I think folks underestimate just how utterly incompatible the boson approach of quantum gravity and the geometric approach of general relativity are! The very languages are so utterly different that it's hard even to say what either one means in the language of the other.
So, thanks again, and I'll get to your essay as soon as I can.
Cheers,
Terry
Author Terry Bollinger replied on Feb. 24, 2018 @ 18:27 GMT
Don,
My apologies, I completely forgot this one.
I have now created a response folder for your essay and my responses. (Yes, I create an entire folder for each essayist with whom I interact.)
Most likely I got distracted (left my laptop) right after responding to you. With so many essays and so many posts (and other distractions), I tend to forget my promises if I do not immediately create the corresponding folder.
Please note in advance that due to my own pledge (see link at bottom) I can be a pretty tough reviewer. So, when folks request reviews I reserve the right just to make comments and
not to score the essay in cases where I know I would give a low score. I don't mind giving blunt feedback — sometimes we all need that — but I just don't feel good giving low scores in response to a polite request for a review.
It’s best to mention all of this
before I look at your essay, since I have no idea in advance what I’ll be seeing or how I may react.
Cheers,
Terry
Fundamental as Fewer Bits by Terry Bollinger (Essay 3099) | Essayist's Rating Pledge by Terry Bollinger
Conrad Dale Johnson wrote on Feb. 17, 2018 @ 17:12 GMT
Terry,
This is a fine essay with many interesting points, eminently clear and sensible. On your main theme of simplicity, you should check out Inés Samengo’s
excellent essay. She has a similar take, but also considers the scope of a theory as a second key factor in determining what’s fundamental. And she makes the point that these two criteria are not necessarily in synch. FYI, though essay ratings have to be done by 2/26, I believe we can continue reading and posting comments afterwards. So no rush!
As you know from looking at
my essay, I agree that “a better way to think of physics is not as some form of axiomatic mathematics, but as a type of information theory.” And I like the way you characterize the difficulties we face when we have a theory that seems close to being fundamental -- your description of “the trampoline effect” was especially vivid and on point, with the Standard Model. Most of all, though, I like your general attitude – you can get seriously involved in specific issues (your “challenges”), but also really broad ones – like the “lumpiness” you mention in your comments to
Karen Crowther’s essay: “Our universe is, at many levels, “lumpy enough” that many objects (and processes) within it can be approximated when viewed from a distance.”
You were writing about renormalization, and making an interesting shift in perspective. Physicists have tried to understand this by delving into the mathematics, which by now is apparently well-understood. You suggest that a different viewpoint might also help, comparing this with many other cases in which the “approximate” (or “effective”) properties of a complex system define it more usefully at a higher level. I agree that this is a deep and important characteristic of our universe, where lower-level complexity supports new and simpler kinds of relationships, where new kinds of complexity can become important. I hope this perspective can eventually elucidate the amazing complications of our current physics.
Your summary
credo is excellent: “the belief that simplicity is just as important now as it was in the early 1900s heydays of relativity and quantum theory.” The wonder of our situation is that we’re still trying to grasp exactly what kind of simplicity those two foundational theories are showing us.
By the way, I’m much in sympathy with your remarks to Flavio, above. The earliest-submitted essays in these contests can be discouraging, and it’s a marvelous relief when a really good one shows up – in my case it was
Emily Adlam’s that rescued me from despair. So thanks for joining in!
Conrad
Author Terry Bollinger replied on Feb. 17, 2018 @ 19:57 GMT
Conrad,
[Argh, I almost became Anonymous! Why in the world does FQXi automatically sign people out after a few hours, without even giving a warning like everyone else in the world? And similarly, why do they keep expiring the reCAPTCHA? That's not security, that's just annoying, argh²! Keeping folks signed in is the norm these days!]
First, I should probably mention that I've posted a follow-up to my contemplation of the perturbative issue (post 144023) you just mentioned. That is the one in which I took a deeper look at the issues underlying Criterion 4 from Karen Crowther's superb essay.
Sleeping on that issue precipitated a rather unusual early-morning chain of analysis that I
documented in real-time in post_144220. Here is my final, fully generalized hypothesis from the end of that analysis chain:
All formal solutions in both physics and mathematics are just the highest, most abstract stages of perturbative solutions that are made possible by the pre-existing "lumpy" structure of our universe.
If that assertion makes your eyes pop a bit, please take a look at my analysis chain at the above link. Once I went to this (for me at least) new place... well, it became very hard to go back. That's because even equations like E = mc² have a scale-dependent, perturbative component if you look at them across the full scale spectrum of the universe, since at the quantum scale mass fuzzes out due to virtual pairs, just as in QED for electrons. Including math in that assertion was the final part of the sequence. Again, take a look at why at the above link if you are interested.
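(A toy numeric illustration of that scale-dependent flavor — my own example, nothing from Karen's essay: even the "exact" relativistic energy unpacks into a perturbative series whose first correction term is just Newtonian kinetic energy:)

```python
import math

# The "exact" relativistic energy as the limit of a perturbative series:
# E = m*c^2 / sqrt(1 - v^2/c^2)
#   = m*c^2 + (1/2)*m*v^2 + (3/8)*m*v^4/c^2 + ...
# Illustrative sketch only.

c = 299_792_458.0      # speed of light, m/s
m, v = 1.0, 0.1 * c    # 1 kg moving at one tenth of c

exact = m * c**2 / math.sqrt(1 - (v / c) ** 2)
orders = [
    ("order 0 (rest energy)", m * c**2),
    ("order 1 (+ Newtonian KE)", m * c**2 + 0.5 * m * v**2),
    ("order 2 (+ next term)", m * c**2 + 0.5 * m * v**2 + 0.375 * m * v**4 / c**2),
]
for label, e in orders:
    print(f"{label}: relative error = {abs(e - exact) / exact:.2e}")
```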
Since I don't yet know whether I'm using links correctly, I'll keep this reply separate and create another one to address the main content of your thoughtful and generous post.
Cheers,
Terry
Conrad Dale Johnson replied on Feb. 23, 2018 @ 19:33 GMT
Hi Terry,
I saw the amazing brainstorm in your 2/17 “quick addendum” comment on
Karen Crowther’s essay, but thought it was more appropriate to respond here. (Unfortunately since your comment is hidden in that thread, I can’t link to it directly.)
The basic idea is that perturbative theory is more fundamental than non-perturbative theory, even where the latter is actually available -- which isn’t the case, so far, with the Standard Model. I’d never thought of it like that, but it does make sense to me. It has always seemed to me significant that a beautifully simple formula like Newton’s gravitation law has to be computed perturbatively as soon as you have more than two interacting bodies. I think what this points to is just that the actual physics of our world is being done in the vast numbers of one-on-one relationships between things (and parts of things)… while the seemingly simple “formal solutions” represent the summary result of this, not its underlying cause.
What’s really daring is your insistence that not only physics but sacrosanct Math itself – at least to the extent it’s computable – also works this way. I was indeed shocked, not so much by the anti-Platonic idea as by the intellectual energy behind it. But as I’ve come across your responses to many another essay in this contest, I’ve come to realize you’re one of those who can “think five impossible things before breakfast.” I’m in awe.
Now for another of your dicta, which I also agree with: not only entropy but also meaning increases with time. Referring here to your reply to Noson Yanofsky above, and to your comment on
Josephson’s essay: “Meaning itself appears to be inherent and pre-programmed into the very fabric of our cosmos, both at the level of the Standard Model and deeper. I do not think we are remotely close to understanding how that works, or even how to frame the question properly.”
Framing this question is a central concern of mine. My last FQXi essay proposed a recursive definition of meaning, expanding on Bateson's "difference that makes a difference." As argued also in my current essay, for any kind of information to be meaningfully definable, there has to be an appropriate context of other kinds of information, that have meaning in other contexts. In relation to your quantifying the "impact on reality per bit," the point is that "impact" depends on the "pre-programming" of some part of reality to receive this particular kind of information and translate it into some other kind, that has an impact somewhere else. In the earlier essay I tried to outline the three great recursive technologies that accomplish this in the physical, biological and human worlds, each working by a form of natural selection. In a still
earlier essay I pointed out the “semantic” dimension in the mathematical language of physics – another way of getting at the issue of how different types of physical information help define each other.
My sense is that Josephson’s “semiotics” doesn’t get to this key issue. If we think of “meaning” in terms of signifying, we have a relation between sign and signified where the context goes unnoticed. This is also the problem with much discussion of measurement in quantum theory, where the many-faceted complexity of any actual measurement arrangement gets abstracted into a relation between object and observer.
It’s quite understandable that even though measurement-contexts clearly play a key role in quantum physics, it only seems possible to describe them theoretically in a highly abstract form. But I think this is why I’m so impressed your notion that “perturbative” theory has to be fundamental – it shows a willingness to get involved in the many-leveled nuanced “lumpiness” of the world. So more power to you! (But do stop for breakfast once in a while.)
Conrad
Author Terry Bollinger replied on Feb. 24, 2018 @ 15:36 GMT
Conrad,
Out of sheer luck I managed to find this posting just now! Speaking of how difficult it is to find reply postings and such, it would be sooooo nice if FQXi did things like:
-- When people sign up to get alerts for new or reply postings to an essay, send them emails with
real, exact links to the new posts or replies, as opposed to mindlessly repeating only the generic link to the top-level essay;
-- Make linking to sub-posts trivial and intuitive;
-- Fix the "invisible sub-post" problem;
-- Add more meaningful titles to links, instead of labeling absolutely everything as "FQXi Community";
-- Stop taking people to some new or wrong location after they do something like logging out to log back in again (which should keep you on the same login page, not send you off to the FQXi home page!);
-- And worst of all, stop logging people out invisibly and for no reason!
Other than that, I'm good... :)
Conrad, it will be my great pleasure to respond in more detail to you today. I'll also try to fix some of the invisibility issues. I am for example considering consolidating those two very radical on-the-fly postings into one top-level mini-essay of some sort.
More later!
Cheers,
Terry
Peter Jackson wrote on Feb. 17, 2018 @ 17:30 GMT
Dear Terry,
I was most impressed, even inspired. Your ability to find the right questions is leagues above most who can't even recognize correct answers! Lucid, direct, one of the best here.
I entirely agree on simplicity as the title of my own essay suggests, but isn't a reason we haven't advanced that our brains can't quite yet decode the complex puzzle (information)?
But now more importantly: I'd like you to read my essay, as two of your sought answers are implicit in an apparent classical mechanism reproducing all QM's predictions and CHSH > 2. Most academics (& editors) fear to read, comment or falsify due to cognitive dissonance, but I'm sure you're more curious and honest. It simply follows Bell, tries a new starting assumption about pair QAM using Maxwell's orthogonal states, and analyses momentum transfers.
Spin 1/2 & 2 etc. emerged early on and are in my last essay (scored 8th but no chocs). Past essays (inc. those scored 1st & 2nd) described a better logic for SR which led to 'test by QM'. Another implication was cosmic redshift without accelerating expansion, closely replicating Euler at a 3D Schrödinger sphere surface and Susskind's seed for strings.
By design I'm quite incompetent to express most things mathematically. My research uses geometry, trig, observation & logic (though my red/green socks topped the 2015 Wigner essay). But I do need far more qualified help (consortium forming).
On underlying truths & SM, gravity etc., have you seen how closed, multiple & opposite helical charge paths give a toroid... but let's take things 2 at a time!
As motion is key I have a 100 sec video giving spin half (+, QM etc.) which you may need to watch 3 times, then a long one touching on Euler but mainly Redshift, SR, etc. But maybe read the essay first.
Sorry that was a preamble to mine but you did ask! I loved it, and thank you for those excellent questions and encouragement on our intellectual evolution.
Of course I may be a crackpot. Do be honest, but I may also crack a bottle of champers tonight!
Very best
Peter
Peter Jackson replied on Feb. 18, 2018 @ 20:16 GMT
Terry,
I omitted the link to the Ridiculously Simple;
100 second video glimpse.
Peter
Peter Jackson replied on Feb. 18, 2018 @ 20:19 GMT
..this time with the first 'h'(ttp);
https://youtu.be/WKTXNvbkhhI (100 sec, Classic QM)
Peter
Peter Jackson wrote on Feb. 21, 2018 @ 12:35 GMT
Terry,
Did you see my 17.2.18 post above & the 100 sec video deriving non-integer spins from my essay's mechanism resolving the EPR paradox? (I've just found the 'duplet state' confirmation in the Poincaré sphere.)
That all emerged from a 2010 SR model (http://fqxi.org/community/forum/topic/1330) finally able to resolve the ecliptic plane & stellar aberration issues and a tranche of others (expanded on in subsequent finalist essays).
i.e. you'll be aware of George Kaplan's USNO circular (p6) following IAU discussions.
(Of course all, including editors, dismiss such progress as impossible, so it's still not in a leading journal!)
Hope you can look & comment
Peter
Author Terry Bollinger replied on Feb. 22, 2018 @ 02:33 GMT
Hi Peter,
Wow, what generous comments! I am very pleased in particular that you said I may have inspired you a bit. That makes me feel better than anything else you could have said, because in the end that was the hidden intent of the essay: To encourage folks to
look at themselves as capable of more than they ever imagined. Sometimes nothing more than writing up a new idea in a way that people can understand is the best way to help them realize their own potential. There are just too many distractions sometimes, and that in turn keeps us from realizing that we
can focus our minds and efforts to develop powerful new ideas. Take that to a community level and wow, the threads and bundles of possible positive futures open up in ways no one could have anticipated.
On to other issues! The first is that you accidentally and very innocently stepped on what recently has become a hot button of mine, which is this:
OMG, how can you even kiddingly call yourself a 'crackpot' for believing and advocating an extremely common-sense-compatible position that Einstein, Bell, and any number of very smart people felt must be correct??
Entanglement is always an interesting debate, but I don’t think even kiddingly using that particular term for self-deprecation is a good idea. It is one of the most overused
ad hominem phrases in all of science, and for that reason it is also one of the most damaging mental toxins that limit the overall ability of such communities to increase their collaborative intelligence. Intelligence at the delicate community level simply cannot function well if at the individual level its members can with impunity inject such mental toxins to kill off any cross-community communication that they don’t happen to like.
And that is not even getting into the ethics of using such mental toxins specifically to harm other human beings!
That said, sigh, I've used that term myself, more than once, although usually with accompanying definitions of the behaviors for which I was using it. Usually it was more out of frustration at the lack of a better word for describing a certain set of strongly self-defeating behaviors and analytical approaches. Physician, heal thyself indeed!
But let’s get back to the issue of entanglement.
How the heck can being in the company of no less than Einstein on that point merit anyone calling you names? Forget spin entanglement; Einstein alone had the brilliance back in the early 1900s to see that
no quantum probability wave function can be reconciled with classical physics. His very first shot across the bow was a pointed thought experiment that he posed to his audience of fellow quantum physicists: If you have a large wave function, say one a light year across, how do you keep multiple people from finding the
same electron as they individually search that wave function?
The only resolutions Einstein could see were either (a) there was never more than one point-like electron in the wave function to begin with (de Broglie's pilot wave model), or (b) the quantum wave function had to collapse "instantly" across its entire light-year diameter, removing itself at faster-than-light speeds so that no one else could find the same electron. To Einstein, for whom the principle of locality was absolute, that was enough to prove that wave functions as defined then (and now) could not possibly be complete descriptions of physical reality. It was and is an amazingly perceptive argument.
Having said all that, allow me now to shock a few folks with another disclosure: My position on quantum entanglement is precisely the opposite of those who believe that locality is the primary reality. That is, not only do I accept the reality of quantum entanglement for both experimental and theoretical reasons, I consider space and time as we know them to be
secondary to the world of quantum entanglement.
Our universe emerged from a fully quantum place, and we continue to “mine” what remains of that initially infinite range of undefined futures through the process we call entropy. The two are opposite sides of the same coin: a past that is closed to any further change via accumulation of classical information (“history”), and a future that remains partially open through a sort of mining of the many shreds and fragments of indefinitely broad, undefined futures that existed before the Great Break, and which have not yet been consumed by entropy. We call that two-faced coin space, and it is a place where the original quantum symmetries from before the Great Break now can be seen only within the nooks and crannies of smallness or coldness or indifference (transparency) in which entropy can be held at bay for a while longer. Everywhere else the coin of space displays itself as the eternally changing Hamiltonian of “now”. This universe-encompassing Hamiltonian grasps in one hand the statistically irreversible givens of the past, and in the other hand the freedoms of the yet-to-be-defined quantum future, and from them both forges still more pages to add to the ever-expanding annals of entropy.
And life is there, snatching its opportunity to persist and expand by arranging the givens of the past to secure its own continuity into the future.
Back to entanglement, again.
Given all I said earlier about Einstein’s amazingly perceptive arguments against entanglement, how can I possibly also believe that entanglement is real?
Peter, on page 8 of your 2017 FQXi essay you say: "The entanglement experiments of Aspect [34] and Weihs et al. [35] reported unexplained 'rotational inconsistencies' but results followed predictions when ignored, so they were."
Fact-based turnabout is fair play, I think. So here is my own pro-entanglement anomaly for you and others either to accept or to discount as you see fit:
ID230 Infrared Single-Photon Detector: Hybrid Gated and Free-Running InGaAs/InP Photon Counter with Extremely Low Dark Count.
My main point is that things have, um,
moved along quite a bit since the now-ancient days of Aspect. A lot of hard-nosed business folks figured out years ago that arguments against the very existence of such phenomena do not matter much if you can simply
build devices that violate Bell’s inequality, use them to encrypt critical data transmissions, and last but not least make a lot of bucks by selling them.
I’ll make two additional remarks on your many interesting comments:
First, spin.
I like to think of spin as a bit like gearing. The outside gear is the observer turning the entire system around a few times, like a pot on a pottery wheel. The inner gear is the "spin state", or degree of resulting rotation in response to the external manipulations of the observer.
Spin 1/2 has a half-speed gear inside that doesn't finish one full circle until the outside one spins twice, so the observer would see it lagging noticeably in comparison to her potter's wheel. For spin 1 the inner gear (observed object rotation) and outside gear (rotation of the potter's wheel) are locked together. For spin 2 the object is geared for high speed, circling around twice per observer-induced wheel rotation.
All of this is very easy to visualize, since we’ve all seen gears e.g. on bicycles that go faster or slower than the driving gear. However, for spin 1/2 I assure you that this simple visualization is
not the one that usually gets presented, which is an odd sort of thing that doubles the outer gear instead. I would suggest that this much more continuous view is a better way to understand half spin, and that this continuity could even help provide some insights into why 1/2 is so different. Even in this simple model, for example, it is the
only spin that is slower than the driving spin. Spin 1/2 (and also the higher fermion spins of n+1/2, n=1,2,…) also has the very interesting property of causing the object to half-turn in response to one normal turn.
Think about that in terms of phases. If the opposite sides of the object represent a plus and a minus phase of some sort, then half-spins have the potential to match positive and negative phases of adjacent identical fermions in ways that integer spins do not. Could this be related to the zero-probability surfaces that form in xyz space between adjacent antisymmetric fermion wave solutions? I honestly do not know, since one has to be careful how one interprets such models. But it certainly smells interesting…
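(For the code-minded, a minimal numeric sketch of that gearing picture — purely my own toy model, not the standard spinor formalism: treat the inner angle as spin times the observer's angle, and count how many observer turns it takes for each spin to come back to its start:)

```python
import math

# Toy "gearing" picture of spin: inner rotation = spin * observer rotation.
# Purely illustrative -- not the standard spinor formalism.

def observer_turns_to_return(spin: float, max_turns: int = 8) -> int:
    """Smallest whole number of observer turns after which the inner
    gear has completed a whole number of circles."""
    for turns in range(1, max_turns + 1):
        inner_circles = spin * turns
        if math.isclose(inner_circles, round(inner_circles)):
            return turns
    return -1  # not found within max_turns

for spin in (0.5, 1.0, 1.5, 2.0):
    turns = observer_turns_to_return(spin)
    print(f"spin {spin}: inner gear returns after {turns} observer turn(s)")
```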
The second topic is your video. I'm sorry, but I watched it several times and never saw even a hint of anything other than classical Bertlmann's-socks propagation of correlated spin, which does not violate Bell's inequality, and so doesn't explain why customers do not sue the bejeebers out of the makers of the ID230 for false advertising. Oddly, their market instead is expanding.
Cheers,
Terry
Fundamental as Fewer Bits by Terry Bollinger (Essay 3099) | Essayist's Rating Pledge by Terry Bollinger
Peter Jackson replied on Feb. 22, 2018 @ 10:33 GMT
Terry
Thanks for looking. You clearly have your own well established ideas, something we're all guilty of to some degree. I studied the ID230 and essentially it seems to be based on random number generation and not any 'action at a distance', so I found its descriptions a little misleading and I struggle to see its direct relevance. Can you explain?
I was also surprised you didn't see from the video how adding different rates of z-axis rotation to a 180° y-axis rotation transformed it to either spin 1/2 or spin 2, or indeed other non-integer rates. (The North pole returned in either 90° or 360° y-axis rotations.) Its beauty is in its simplicity.
Did you a) not see it do so? or b) not think it replicates the data?
I was also disappointed you couldn't follow the mechanistic sequence reproducing Dirac's formulation. It is indeed multi-faceted, so it's clear (we) have to do a better job breaking it down into brain-manageable steps.
Re Dr Bertlmann's socks: did you read my (top scored) 2015 essay,
The Red/Green Sock Trick, which clarifies how my red-lined green socks (& vice versa) avoid Bell's theorem as he anticipated, the solution "..will be found by going round the back"? Most understood in 2015, so I hope at least you may also!
That essay also shows how the QM solution emerged from an SR solution free of paradox (as in 3 previous finalist essays), so unified with the fundamental probability distribution of my 'Law of the Reducing Middle' (which the ID230 uses), consistent with Prof. Phillips' excellent essay here.
But all have their own embedded ideas, whether mainstream or not. I know it can be hard to suspend them to explore others, as I try to do systemically, but I still often struggle. That's human nature, and we all have limited time. If yours is too short, thank you anyway for the time you spent.
Very best
Peter
PS. Are you aware of the IAU/USNO's big 'ecliptic plane/stellar aberration' problem? (astral navigation etc.) I have the clear solution but entrenched thinking won't allow it to emerge. What on Earth can I do?
Peter Jackson replied on Feb. 22, 2018 @ 10:46 GMT
Terry,
I forgot - of course I'm NOT suggesting "there is no such thing as entanglement" at all. I show an 'entangled' relationship of antiparallel polar axes reproduces QM predictions, and only then say "there is no such thing as 'spooky action at a distance' required!"
Eckard Blumschein wrote on Feb. 17, 2018 @ 18:13 GMT
Dear Terry Bollinger,
My challenge #0:
Accept that the border between past and future is a non-arbitrary point of reference; hence cosine transformation is more concise than complex-valued Fourier transformation. Just the redundant information of a chosen point t=0 is missing.
Thank you for encouragement,
Eckard
Author Terry Bollinger replied on Feb. 22, 2018 @ 04:53 GMT
Eckard,
Your essay caught me completely off guard. For a number of reasons I pretty much accept the "now is real" interpretation as the only one that is logically self-consistent, since all block models of time require a sort of magical preconstruction of the block that on closer examination cannot be made self-consistent without some kind of causality-enforcing "growth" from past to future. Any such "growth" process looks a whole lot like… well, time, and time with a very definite sense of "now" at the future face of growth.
I am very much aware of the SR and quantum arguments for the block universe, but am also unimpressed by them. Since there exist computational models by which an infinite number of inertial frames can co-exist and show the exquisite symmetry of SR, but with only one of the frames being "real" and all of the others "virtual", I do not see any logical path for justifying the need to create a block universe. That would be sort of the most ham-fisted attempt at a solution, and as I noted above, it doesn't really work anyway due to self-consistency issues. Computer science tends to ingrain the value of the virtual into your world view, and of how real these virtual worlds can become, with each one potentially being the site of the single "real" frame.
I’ve also been a fan of Fourier transforms for decades, particularly the complex variety. I once “invented” a fractional integration/differentiation spectrum based on Fourier transform phase shifts. A chemist friend who actually used fractional calculus in his work was quite excited by it, but I just laughed and said I was very confident that all I had done was come up with an idea that someone else likely had done at least a century ago… which turned out to be exactly the case!
I very much like and agree with your idea that Fourier transforms are even more relevant to particle physics than we give them credit for, though I suspect we differ in some of the details of what part of which variety of transform gets applied where. If you get a chance sometime and have not done so already, you should look up the chirality issue for fermions in the Standard Model. The left-handed and right-handed versions of the electron (and other fermions, except neutrinos) present some interesting opportunities to link Fourier components to both particles and how particles obtain mass.
My only disappointment in your essay was that I was hoping that in the last couple of pages you would take your model a bit further into particle physics to show how it might connect there. I realize though that the length limits of these essays are tough, but for me this one ended too quickly.
So again, thanks. Yours is one of a very small number of essays that I will stash away for a closer look after the FQXi commentary period.
I will put a short posting under your essay thread to point to this one as my assessment.
Cheers,
Terry
Fundamental as Fewer Bits by Terry Bollinger (Essay 3099) | Essayist's Rating Pledge by Terry Bollinger
Eckard Blumschein replied on Feb. 26, 2018 @ 12:11 GMT
A reply of mine to this comment was unfortunately edited in a mutilating manner. I merely recall that you mentioned fractional calculus. Maybe I should have a look at this, because for instance half differentiation implies boundaries.
Eckard
Wayne R Lundberg wrote on Feb. 17, 2018 @ 20:28 GMT
Dear Terry,
It has always been the case that the very high-end computing requirements of theoretical physics produce machines and codes specialized to the theoretic structure. So of course the IT community is always a key player. Lattice Gauge Theory, among others, is very compute-intensive stuff! But let's get to the fundamental physics of the subject...
Have you, in your 'broad' research on the subject, run across the Rishon model? It requires only two types of quanta (T & V) to create the algebraic group of quarks and leptons (QC/ED actually).
H. Harari and N. Seiberg, “The Rishon Model”, Nucl. Phys. B, Vol 204 #1, p 141-167, September 1982.
So the reductionist approach to the 'minimal quantum basis' problem does reveal a somewhat 'binary' solution.
More to the IT-ish point, though, your software skills and devotion to the algebra of the quantum subjects could well be of GREAT use. Do you by chance write javascript? There is a nice java code for displaying the QC/ED group theory for some academic research applications as well as public explanations.
Another interesting point you raise earlier: in your essay, you offer Challenge #1 - "What is the full physics meaning of Euler's identity, e^(iπ) + 1 = 0?". That is actually an elegantly simple fundamental question/criterion, but just a little off the mark. Of course mathematically we know that for physics to have a unique solution it must have a cyclic variable. At least, all the best formal Proofs of Uniqueness reduce a conformal mathematics problem to a cyclic variable, removing all true singularities (including the point-like particle approximation).
So how does e^(iθ) fit in? Well, it seems that the universe is cyclic in mass and time... NOT radius and time as the astronomic observables would hope/make easy. So the general θ is actually θ_mass-time! For a more complete answer why this works, read my essay, if you please.
Further you discuss: "If someone can succeed in uncovering a smaller, simpler, more factored version of the Standard Model, who is to say that the resulting model might not enable new insights into the nature of gravity?" so please see
C.W. Misner, K.S. Thorne and J.A. Wheeler, Gravitation, W.H. Freeman and Co., p. 536, 1973, in which the Nobel-winning author (Thorne) notes that mass is area-like at small (Planck) scale.
Here the discussion can go into the finite representation geometries, which are area-like, and their respective quantum state algebras. Or it could look at the influence of ralpha'/R on BH theory, as I've long advocated with Prof. Mathur (see his essay), in which the strong (conical) lensing effects observed are due to "PRESERVED" matter in black holes. Interesting inquest; again, read further into the literature.
Best regards,
Dr Wayne Lundberg
https://fqxi.org/community/forum/topic/3092
p.s.
I too have a 30-yr civil service career but started publishing on physics topics in 1992. More DoD stories...
Author Terry Bollinger replied on Feb. 22, 2018 @ 23:44 GMT
Wayne R Lundberg replied on Feb. 26, 2018 @ 02:00 GMT
...btw, the offer to work on a javascript part of the problem still stands.
Author Terry Bollinger wrote on Feb. 18, 2018 @ 18:40 GMT
All,
I'm having some difficulty getting in the amount of FQXi time I wanted to this weekend, but I still hope to get to all of your excellent comments and questions this evening, Sun 18 Feb.
Cheers,
Terry
James Lee Hoover wrote on Feb. 18, 2018 @ 19:30 GMT
Terry,
Seems to be some subterfuge on scoring. My score for you on 2/16 was an 8, reflecting a high opinion of your piece. Hope you can check out my essay.
Jim
Member Marc Séguin wrote on Feb. 18, 2018 @ 21:10 GMT
Dear Terry,
Congratulations on the essay contestant pledge that you introduced (goo.gl/KCCujt) --- I think we should all follow it, and I will certainly attempt to from now on. Congratulations also on the truly constructive comments that you have left so far on the threads of many of the participants in this contest. I thought I would use a similar format and comment on your essay!
What I liked:
- Your essay is well written and interesting to read: at the end, I wanted more of it!
- You introduce vivid/memorable expressions to describe your main points: the principle of binary conciseness, the trampoline effect, foundation messages. I especially like the trampoline effect, defined as the bouncing-off of the near-minimum region of Kolmogorov simplicity by adding new ideas that seem relevant, yet in the end just add more complexity without doing much to solve the original simplification goal. I think you will agree that, when you read some of the essays submitted to your typical FQXi contest, you can observe spectacular examples of the trampoline effect. It seems easy to diagnose a trampoline effect in accepted theories that we find lacking, or in alternative theories that we find even more flawed. True wisdom, of course, would be to be able to become aware of the trampoline effect in our own thinking… which is so hard to do!
- You directly address the specific essay contest question, “What is fundamental?” (at least, in the first half of your essay)
- Nicely worded and accessible introduction to the famous equation E = mc²
- Pedagogical presentation of Kolmogorov complexity for the reader not already familiar with the concept
- Interesting parallel between the increased difficulty in reducing Kolmogorov complexity in an already well-compressed description and the increased difficulty in improving an already well-developed theory
- It was interesting to end with challenges to the physics community, although it fits only tangentially with the essay topic (it would make a great essay topic for a future contest!)
- Your challenges #2 and #3 are profound questions: WHY the spin-statistics theorem? WHY the three generations in the Standard Model? There is certainly much to be learned if we can make progress with these fundamental “Why?” questions --- although the particular physics of our particular universe might just be arbitrary at the most fundamental level, forever frustrating our hopes of ultimate unification and simplicity.
What I liked less / constructive criticism:
- You say that the content of foundation messages (data sets expressing structures and behaviors of the universe that exist independently of human knowledge and actions) must reflect only content from the as-is universe, despite the extensive work that humans must perform to obtain them. But this presupposes that we can have reasonable access to the "as-is" universe, which many historians and philosophers of science would deny, saying that observations are always more-or-less theory-dependent (no such thing as a pure observation, independent of the previous knowledge of the observer): see for instance the articles "Theory-ladenness" and "Duhem-Quine Thesis" in Wikipedia.
- You say that in physics, the sole criterion for whether a theory is correct is whether it accurately reproduces the data in foundation messages. It is true that reproducing data is an important criterion, but is it the sole one? For example, a modern, evolved, computer assisted epicycles-based Ptolemaic model (with lots and lots of epicycles) could probably reproduce incredibly well the planetary positions data, but we could use other criteria (simplicity, meshing with theories explaining other phenomena) to strongly criticize it and ultimately reject it.
- I am not sure that the map analogy and the associated figure help clarify the concept of a Kolmogorov minimum. Maybe it's because I was distracted by the labels: Why pi-r-squared in one of the ovals? Why Euler's identity? Why the zeros and ones along the path? Why is the equation E = mc² written along a path that goes from Newton to Einstein, since it is purely an Einsteinian equation?
- Your short section on the “Spekkens Principle” is very compact and will probably remain obscure to many readers (it was for me). It might have been beneficial to expand it (I understand there was a length limit to the essay…) or to drop it altogether.
- Concerning your challenge no. 1… Like many mathematicians and physicists, I am in awe with Euler’s identity, but I am not sure that there is explicit undiscovered physics insight hiding within it. Once you understand that the exponential function is its own derivative, that the exponent i in e to the i*t comes in front when you derive with respect to time, that multiplication by i rotates a vector by 90° in the complex plane and that the velocity vector in uniform circular motion is perpendicular to the position vector, it becomes “evident” that you can model circular uniform motion (hence, the trigonometric circle) with an exponential function with an imaginary argument: Euler’s identity then follows from the fact that pi radians corresponds to half a turn, which is the same as multiplying by -1! If there is anything truly remarkable in all this basic math, it is perhaps that the ratio pi (or, more often, 2 times pi) appears so often in the fundamental equations of physics, even in phenomena that do not seem related in any way to circles or rotations.
And finally, a question:
In your expression "principle of binary conciseness", what does the "binary" stand for exactly? The fact that it deals with TWO (or more…) theories that address the same data, or the fact that Kolmogorov complexity is often applied to strings of BINARY digits?
Congratulations once again, and welcome to the FQXi community! I hope you have the time to take a look at my essay and leave comments --- especially constructive criticism, which is unfortunately so hard to get in these contests, because of the fear of rating reprisal.
Marc
Author Terry Bollinger replied on Feb. 23, 2018 @ 00:39 GMT
Marc,
Thank you for such generous, detailed, and insightful comments! That is the best critique section I’ve seen yet for my essay, and your comments definitely made me consider how to do a number of items better.
I put the diagram together with a focus on capturing the idea of “excursions” away from the main message, and freely confess that some of my choices for labels were more humorous winks and nods at my own essay than realistic examples from actual data compression. It’s an old technique to see if folks are paying close attention, and in this case, you are the only one who said they noticed.
Labeling the main path that way, though… yes, that is pretty much an error, and I definitely would do it differently in retrospect. Since I’m pretty sure I’ll be doing new versions of that same figure in the future, that insight of yours will be helpful. Thanks!
So again: Thanks, that is really good feedback!
On your additional comments: Euler's identity is in many ways easy to understand, and certainly it is exceptionally elegant. The point about a physics interpretation is a bit more subtle, though. Sometimes things that seem very familiar can have additional implications that our familiarity actually makes harder to see. For that challenge, it really is a shot in the dark in the sense that I'm not assuming any specific link to physics, just that its very brevity may imply a more specific physical implication that so far has been overlooked.
By binary I am showing my computer science roots: We reflexively translate everything numeric into zeros and ones. Other options did not work out all that well in the early years of computers, and binary is a pretty decent choice for a "universal" format for representing numbers.
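(A quick toy sketch of that mindset, using off-the-shelf zlib compression as a crude stand-in for Kolmogorov complexity — it only gives an upper bound, and the example is my own invention:)

```python
import random
import zlib

# Crude upper-bound proxy for Kolmogorov complexity: the length of a
# compressed binary encoding. A highly patterned message compresses far
# better than a random-looking one of the same size. Toy example only.

patterned = ("01" * 4096).encode()  # very regular: a tiny description suffices
random.seed(42)
noisy = bytes(random.getrandbits(8) for _ in range(8192))  # no usable pattern

for name, msg in (("patterned", patterned), ("noisy", noisy)):
    compressed = len(zlib.compress(msg, level=9))
    print(f"{name}: {len(msg)} bytes -> {compressed} bytes compressed")
```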
I’ll now go take a look at your essay. You have me curious!
Cheers,
Terry
Fundamental as Fewer Bits by Terry Bollinger (Essay 3099) | Essayist's Rating Pledge by Terry Bollinger
Jeffrey Michael Schmitz wrote on Feb. 18, 2018 @ 23:37 GMT
Dear Terry,
This is a well-written essay for a general science reader (by far the hardest type of essay to write). Looks like you have a good shot at winning. The word "tree" is simple, but a tree is complex. Does a simple equation mean a simple thing? Perhaps a simple equation just fits with how we communicate or think.
A side note: I thought spin 1/2 is the way it is because of interaction with photons, a spin-1 particle.
All the best,
Jeff Schmitz
Author Terry Bollinger replied on Feb. 21, 2018 @ 20:28 GMT
Jeff,
I did a boo-boo and replied to you at the essay posting level instead of directly to your above post. So in case you or anyone interested has not seen my reply, you can either mosey down a couple of posts to the next Author posting, or
try this direct link. I also posted an essay assessment under your essay, which I assume you have seen. For anyone else interested, my assessment of Jeff’s essay
is located here.
Cheers, Terry
Declan Andrew Traill wrote on Feb. 19, 2018 @ 02:12 GMT
A nice essay. I think you would be interested in my 2012 FQXi essay titled "A Classical Reconstruction of Relativity" located here:
https://fqxi.org/community/forum/topic/1363
And my work on modelling the electron/positron wavefunctions as 3D standing waves, located here: http://vixra.org/pdf/1507.0054v6.pdf
I also have an essay in this year's contest titled "A Fundamental Misunderstanding" about a Classical explanation for QM entanglement (EPR experiment).
Regards,
Declan Traill
Author Terry Bollinger replied on Feb. 19, 2018 @ 04:23 GMT
Declan,
Argh! Dang it! I was all ready to dismiss your 2012 essay out-of-hand as “obviously and immediately geometrically self-contradictory”… and then realized you’ve created a genuinely clever and self-consistent world with this idea, even if I’m still not convinced of it being the
same world we live in.
If I’m reading your idea rightly, what you have created is a rigid, isotropic 3D universe in which gravity becomes something very much like optical density in a gigantic cube of optical glass. In fact, for photons I’m not seeing much difference at all between the variable-index glass cube model and your model. Light would curve near a star because the optical density of the glass would increase near the star, and so forth for all other gravity fields. That’s about as close of a match between a model and what is being modeled that you can get.
But your truly innovative addition to such a model is the idea that since matter has a quantum wavelength, it is also subject to the same velocity and wavelength shifts in higher-optical-density space as are photons. Photon wavelengths shorten as the photons slow in denser glass, and similarly, so do your mass waves. But mass and total energy depend on these wavelengths, so you are using these changes to implement relativistic masses.
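(To make the glass-cube analogy concrete, here is a minimal numeric sketch — my own toy numbers, not your model: using the standard weak-field effective refractive index n(r) ≈ 1 + 2GM/(rc²) and integrating its transverse gradient along a grazing ray recovers the classic 1.75 arcsecond solar deflection:)

```python
import numpy as np

# Weak-field "optical glass" picture of gravity: an effective index
# n(r) = 1 + 2*G*M/(r*c^2). Integrating the transverse index gradient
# along a straight grazing ray approximates the light deflection angle.
# Toy numbers for the Sun; illustrative sketch only.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 299_792_458.0    # speed of light, m/s
M = 1.989e30         # solar mass, kg
b = 6.957e8          # impact parameter = solar radius, m

x = np.linspace(-1e13, 1e13, 2_000_001)   # coordinate along the ray, m
r = np.hypot(x, b)
grad_n = 2 * G * M * b / (c**2 * r**3)    # |dn/db| along the ray

alpha = float(np.sum(grad_n) * (x[1] - x[0]))  # radians (rectangle rule)
print(f"deflection ~ {np.degrees(alpha) * 3600:.2f} arcsec (GR predicts ~1.75)")
```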
Once again, that
sounds like it should be an immediate contradiction with the extremely well-proven results of SR… except that it is not. You have to compare any two frames relative to each other, not to your “primary” frame of the giant optical glass cube, and that should still give you self-consistent and SR-consistent results.
To make matters worse, even though you have clearly designated one inertial frame as being in some way “special”, that does
not necessarily and absolutely mean that your model necessarily contradicts the enormous body of experimental observations that on the exact equivalence of physics across all inertial frames.
Alas, the problem is not that simple, since it is most definitely possible to create asymmetric frame models that fully preserve SR. You just have to take more of a computer modeling perspective to understand how it works.
I think I’ve already noted elsewhere in these 2017 postings that from a computer modeling perspective it’s not even all that difficult to create a model in which one inertial frame becomes the “primary” or “physical” inertial frame in which all causality is determined. All other inertial frames then become
virtual frames that move within that primary frame. Causality self-consistency is maintained within such virtual frames via asymmetric early (“it already happened”) and late (“the event has not yet occurred”)
binding of causality along their axes of motion relative to the primary frame. Speed of light constraints prevent anyone within such a frame from being aware of any causal asymmetry: by the time information about both early (past) and late (future) binding events reaches an observer, both events are guaranteed to have already occurred.
Incidentally, one of the most delightful implications of asymmetric causality binding in virtual frames is the answer it produces for the ancient question of whether our futures are predetermined or subject to “free will”. The exceedingly unexpected answer is both, depending on what direction you are facing! For us, if one plausibly assumes that the CMB frame is the primary frame, the axis of predestination versus free will is determined by whether the philosopher is facing toward or away from a particular star in the constellation Pisces, though I don’t recall offhand which is which. Direction-dependent philosophy for one of the most profound questions of the universe, I love it!
Even better is the fact that
no one in any of the frames, primary or virtual, can tell by any known test whether they are or are not in the primary frame. Special relativity is thus beautifully maintained, yet at the same time having a single physical frame hugely simplifies causality self-consistency.
Bottom line: I can’t even fault your idea for its use of what is clearly just such a singular frame, because I know that having such a singular frame can very beautifully support every detail of SR. Ouch!
So, ARGH! Your 2012 model is a
lot harder to disprove than I was expecting… and please recall the goal in science is always to destroy your own models to prove that they really, truly can pass muster.
Well. Wow. I can’t rate your 2012 contest entry, which I think makes me happy, because it would take a much closer examination of your model before I could comment on it with confidence. You have a lot of equations and equation specificity there.
But it’s late so I’m calling this a wrap. I won’t forget your model. And the key defense you might want to keep in mind, since I’m sure your earlier attempt got tossed out for violating SR, is simply this: Having a primary frame in a physics model
is not a sufficient reason to dismiss it, because there exist single-frame models that can be made fully consistent with all known results of special relativity. Given that such models are possible, any attempt to eliminate a model solely on that criterion is a bogus dismissal. You have to find a true contradiction with SR, one that flatly contradicts known results, rather than just offending people philosophically by making SR more like a computer model and less like an absolutely pristine mathematical symmetry. It’s not the beauty of the symmetry that counts in the end, it’s whether your model matches and perfectly predicts observed reality, that is, whether it is Kolmogorov-concise in nature (see my essay again).
Thank you for helping me tear my hair out in frustration!… :)
(Actually, seriously: Good work! But still… argh!)
Cheers,
Terry Bollinger,
Fundamental as Fewer Bits by Terry Bollinger
Author Terry Bollinger wrote on Feb. 19, 2018 @ 02:27 GMT
Dear Jeff,
Thank you for your kind comments! I looked up your
interesting short essay and added a posting on it.
Regarding spin, it’s definitely the interaction
between identical fermions, e.g. a bunch of tightly packed electrons, that makes them unique. What happens is that the
antisymmetric nature of the fermion wave functions causes surfaces of zero probability of finding an electron to form between them. This compresses the electrons, which do not like that at all and fight back by trying to expand the space within these zero-probability cells that form around them. The result is a kind of probability foam that we so casually call “volume” in classical physics. Without this effect, Earth would be just a centimeters-ish black hole.
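That “centimeters-ish” figure is easy to check: it is just the Schwarzschild radius r_s = 2GM/c² evaluated for Earth’s mass, a quick sketch of which is:

```python
# Quick check of the "centimeters-ish" figure: the Schwarzschild radius
# r_s = 2GM/c^2 for Earth's mass.
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8         # speed of light, m/s
M_earth = 5.972e24  # kg

r_s = 2 * G * M_earth / c**2
print(f"Schwarzschild radius of Earth: {r_s * 1000:.1f} mm")  # about 8.9 mm
```

which comes out just under a centimeter.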
This
Pauli exclusion occurs for any cluster of identical fermions, regardless of electromagnetic or any other kind of charge, and so is completely unrelated to electromagnetism and the spin 1 photons that make electromagnetism possible.
By far the best short explanations of antisymmetric (spin ½) and symmetric (spin 1) wave functions that I’ve encountered on the web are these two teaching notes by Simon Connell, a physics professor in South Africa:
Symmetric / antisymmetric wave functions
Pauli's exclusion principle
Cheers,
Terry Bollinger,
Fundamental as Fewer Bits
Heinrich Luediger wrote on Feb. 19, 2018 @ 10:37 GMT
Dear Terry
“The universe indisputably possesses a wide range of well-defined structures and behaviors that exist independently of human knowledge and actions.” is what you say. I’m not asking for a proof of that naturalistic dogma (it does not exist), only a minimum level of critical attitude. Hilbert eventually understood that what a point or a line is doesn’t fall into the realm of logic/mathematics. And the literature dealing with what a bit is, information-theoretically, runs to multi-gigabytes…
Heinrich
Armin Nikkhah Shirazi wrote on Feb. 19, 2018 @ 14:12 GMT
Dear Terry,
You presented your essay from a viewpoint with which I have little familiarity, and as a result I truly enjoyed having familiar ideas examined from a perspective that was novel to me.
A few comments:
1. Your example involving the sequence which can be found in the decimal expansion of pi reminded me of the fact that most irrational numbers are still unknown to us. But with the irrational numbers we do know, your example gave me the idea that one might try the following cookie-cutter approach, which requires little creativity, to help more efficiently compress a sequence: take a set of irrationals, take their representations in bases 2, 3, etc. up to 10 (if one wants to incorporate the compression of sequences which contain letters, then go higher) and create a table which contains the numbers and their expansions up to some number of digits, say, 50 million (the larger, the better). It seems that one then has a ready-made 3D "coordinate system" in which the three coordinates are: a symbolic representation of the irrational number, its n-ary expansion, and the position of the first digit of the sequence in that expansion. The sequence could then be compressed by just giving its coordinates. Due to my ignorance in these matters, I am not sure if this is too naive, elementary, or unworkable an idea, but I believe one cannot learn if one does not take the risk of occasionally embarrassing oneself.
2. Your reconceptualization of theory development in physics as data compression strikes me as an abstraction that could be useful for comparing the historical development of different theories. Perhaps it has some unexpected use and application in the history and philosophy of science. Unfortunately, I know too little about data compression to be able to assess the merits of this possibility, but it seems that you might? Another idea you discussed for which I see connections with the philosophy of physics is the trampoline effect applied to the standard model, which reminds me a bit of Kuhn's crisis phase.
3. Your discussion of the Kolmogorov minimum at times reminded me of the variational principles. Do you know whether such connections exist?
4. With regard to your first challenge, I am glad that you, as what appears (to me, at least) a hard-nosed physicist, ask the meaning of the mathematics we use to model the phase of the quantum state. All too often I find that people are not even aware of how little we know about its physical origin. Saying that it is the time-dependent solution to Schrödinger's equation is to me little more than a fig leaf for our ignorance. I admit that my perspective is influenced by the fact that I have thought about this question quite a bit.
5. With regard to your second challenge, I think that there will be a convergence with respect to what from the Kolmogorov approach would be considered a simple answer and one that might in more qualitative terms be considered philosophically satisfying. I am glad that you called out the all too-convenient method of "solution by denial that a problem exists".
6. With regard to your third challenge, I believe that a refactoring of the Standard Model will not happen before a paradigm change occurs. In my view, what is missing to discover a simpler understanding of the Standard Model is a conceptual framework which redefines its conceptual building blocks, analogous to how what was missing for the ancient Maya for a simpler understanding of astronomy was the concept of planets orbiting the sun. I am receptive to your call to lay off gravity when trying to simplify our understanding of the Standard Model, but that is only because I already hold the view (or bias) that if nature wanted gravity to be quantum, it would have given us more (actually, any) experimental evidence that it is quantum.
7. Your principle of Kolmogorov simplicity reminds me a little of Zeilinger's principle that the most elementary system carries only one bit of information. Any thoughts on the relationship between these principles?
Overall, I do agree that the way to advance our understanding of fundamental physics is to find simpler reconceptualizations. My background knowledge of Kolmogorov simplicity is too incomplete to be able to tell whether it is the definitive criterion for simplicity, but it certainly seems promising.
Author Terry Bollinger replied on Feb. 20, 2018 @ 03:00 GMT
Dear Armin,
Thank you for such a thoughtful and detailed set of comments! I’ll take them out of order so I can address #3 first:
----------
#3. Wow, good catch! Not only are variational principles relevant to straightening out the Kolmogorov path, I originally had a section on exactly that, which I had to cut due to length constraints!
The variation of variational :) that I was originally planning to use began with an explanation of functionals (paths or trajectories) from Feynman’s Quantum ElectroDynamics (QED). I then talked about how the way to tell you were close to the optimal path was that nearby paths would have very similar phases, causing the overall bundle of paths to reinforce each other. Finally, that had to be translated into the idea that similar data sets or messages are also mutually reinforcing.
That last point is where it got too complicated, too diverse, and frankly too new. Data sets can sometimes match up in a fairly direct way, e.g. when comparing two genes by seeing how well their halves combine in solution. But in other cases you would need first to find just the right “space” in which to compare the data sets, an idea that is closely related to data visualization. Finally, in the case of messages in the more conventional sense of known programs, you get into the complicated and historically rather unsatisfying field of evolutionary programming, albeit with an interesting twist that might well be worth exploring. The idea would be to create a set of transformation operators that all guarantee the program will still provide the same outputs (data set), use the operators to create as huge and dense a cloud of such equivalent programs as possible, then look for regions in the
cloud of programs in which subsets end up all being very similar. Those regions would nominally represent the “least action” regions, and thus the core of the real message.
The biggest problem I see with the cloud idea is that unless the variational program generator is designed carefully it could easily create artifacts—e.g. areas of varying “program density”—that could mess up the search. For efficiency you would probably want to start with some kind of sparse Monte Carlo generation to look for “interesting regions”, then start increasing the program (message) densities within those regions to see if the trend holds, and to find more details.
The overall process would not be terribly different from some other forms of evolutionary programming that also create equivalent or slightly different variations. However, here the focus would be on creating functionally identical programs, not variations, and then finding new ways to shorten or optimize them. The quality criterion would also be unusual and more automated, looking for message subsets that are common across messages and thus more likely to represent the key parts of the message.
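For concreteness, here is a toy sketch of such an equivalent-program cloud. The tiny expression "language", the rewrite rules, and the shortest-form criterion are all stand-ins I made up for illustration; a real system would need far richer, guaranteed output-preserving transformation operators:

```python
# Toy sketch of the "cloud of equivalent programs" idea: start from one
# arithmetic expression (standing in for a program), apply output-preserving
# rewrites to grow a cloud of equivalents, then look at where the short
# forms are. The rewrite rules below are hypothetical stand-ins for real
# semantics-preserving transformation operators.

def rewrites(expr):
    """Yield candidate expressions intended to evaluate to the same value."""
    yield f"({expr})"      # redundant parentheses
    yield f"({expr}+0)"    # additive identity
    yield f"({expr}*1)"    # multiplicative identity
    if expr.startswith("(") and expr.endswith(")"):
        yield expr[1:-1]   # try stripping one outer paren layer

def grow_cloud(seed, generations=4):
    """Grow the cloud, keeping only rewrites verified to preserve the output."""
    target = eval(seed)    # the program's "output" (our data set)
    cloud = {seed}
    for _ in range(generations):
        new = set()
        for e in cloud:
            for r in rewrites(e):
                try:
                    if eval(r) == target:
                        new.add(r)
                except SyntaxError:
                    pass   # paren stripping can produce invalid text; skip it
        cloud |= new
    return cloud

cloud = grow_cloud("(2+3)*4")
print("cloud size:", len(cloud))
print("shortest equivalents:", sorted(cloud, key=len)[:3])
```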
----------
#7. Again, good catch! Just a few days ago I added an extended reply to Noson Yanofsky in which I did some exploration of the idea that over time, the amount of meaning per message
increases. By “time” I should note that I mean not just the past few centuries or even millennia, but over the history of the entire universe. The end result for more common message types would be just one bit per message, but even in that case the
meaning per bit — the impact on the physical world — would continue to increase over time.
----------
#6. I like and agree with your point that it is way past time for the Standard Model to undergo a good discontinuous extended community reorganization of conceptual knowledge, or DECROCK. :) And yes, I just now made up that phrase and acronym because I cannot bring myself to slide two ten cent coins on a table after decades of hearing that once-noble phrase overused and misused for sales and research funding purposes. And besides, decrock — let’s make it a verb instead of an acronym, so DECROCK has now been officially deprecated after just one sentence of existence; sorry about that DECROCK, such are modern times! — sounds like someone tipping over a crockpot to dump out aging bits of this and that that have been simmering for way too long. Dumping is just as much a part of decrocking as creativity, since one of the critical features of such an event is that the explosion of creativity is different from ordinary, individual-level creating. Decrocking creativity is instead a community-wide crystallization effect in which previously disparate bits of data and isolated concepts suddenly start fitting together smoothly, pushing out and displacing the older, less useful ideas that had been obscuring and blocking the crystallization process, much as water that is too dirty can slow the formation of sugar crystals that otherwise might have formed spontaneously. Such a “sudden fitting of the pieces” happened both conceptually and quite literally in the case of the plate tectonics decrocking that took place in the early 1970s in the US. (In many other countries it happened years earlier.)
(Belatedly initiating a Google deconfliction search… hmm… oh wow, really?… oh well, good enough, it’s a very minor conflict community-wise, and it’s not a verb…)
So: It’s way past time for the Standard Model to undergo a deep-dip decrocking! And as an extra benny, you get to keep your two dimes and shifty fingers in your pockets.
----------
#5. I too hope that folks will begin to realize that spin statistics is a very deep and important issue, one that I would judge is playing some hidden and critical role in preventing a deeper consolidation of the Standard Model. This is quite literally a Nobel Prize and worldwide fame waiting to happen for anyone who can find it.
#4. I hope also that someone can make some progress on that wonderful, beautiful little equation, Euler’s identity: e^(iπ) + 1 = 0.
#2. Applying Kolmogorov minimization to histories of theories may be both doable and interesting, since such histories are data with structure. I would hesitate however to characterize the trampoline effect as similar to the slow accumulation of both stale facts and new facts that collectively lead to a new synthesis. The trampoline effect is pathological, creating something more akin to a huge boil full of, uh, we’ll euphemistically call it
fluid, that contains only expansions and variations of pathogenic tangents that lack the kind of new universe-inspired facts that cause a real Kuhn crisis to decrock the past and crystallize a brand new fabric of deeper comprehension.
#1. You are saying something interesting there, I think, but I have to confess that I didn’t quite get the idea.
Cheers,
Terry
Fundamental as Fewer Bits by Terry Bollinger (Essay 3099)
Essayist’s Rating Pledge by Terry Bollinger
Armin Nikkhah Shirazi replied on Feb. 21, 2018 @ 13:32 GMT
Dear Terry,
Thank you for your extended reply. I just wanted to acknowledge the following:
3. I am glad to see that connections between the variational principles and Kolmogorov Complexity have been discovered already. Applied specifically to the path integral, I believe that there is a more fundamental principle at work. In an amended form of Philip Gibbs' phrasing, it could be stated as "Nothing actual means everything potential", and in a formulation that I have called the default specification principle (and which I think is a bit more precise), it can be stated as "The absence of an explicit specification entails all possible default specification outputs". The idea is that when something is not in some way specified or pinned down, then of all the consequences that could result from performing that specification, all are available as "live possibilities". This has an essentially tautological character, except that it captures a distinction between things which are specified "explicitly" and things which are specified "by default", i.e. they are specified due to background constraints. For instance, if I wait until I throw a regular six-sided die, I know that, say, the number 7 or King of clubs are not among the possible outcomes of a throw. I don't know enough about information theory to be able to tell, but it seems to me that the realization that there is such a distinction, which is essentially ontological in character, still remains to be made. Possibly, if and when it is made, it could help by shifting some of the complexity of, say, a message to the background constraints.
7. I find the notion of an increase of meaning per message very interesting, in my mind it seems something analogous to a second-order effect, but possibly it could be conceptualized in terms of just the kind of background constraint I mentioned in my previous point. To give an (albeit rather hokey) analogy with the throw of a die: When I say that I hold a die in my hand which I am about to throw, but nothing further about the nature of the die, the possibilities for outcomes of a throw can be large. Without specifying the number of sides or what is on the sides, even the number 7 and the King of clubs are possibilities. Perhaps over time, you learn somehow that my die only contains numbers on its sides, in which case King of clubs is no longer a live possibility, but the number 7 still is. When you finally learn that I only ever throw six-sided dice, you can then also eliminate the number 7, and thereby simplify the encoding of the possible outcomes. Conversely, by somehow incorporating these background constraints, you could increase the "meaning per message" by referring (as before) to a "die throw" but shifting the complexity to the background instead of having it contained in the message itself. (Incidentally, I will give a talk on the default specification principle at the APS March meeting and plan on filming it, if you are interested, let me know and I'll notify you when I upload it.)
6. I honestly did not realize that "paradigm change" has become a dirty phrase, but then I may not have been exposed to its abuse as much as you have. DECROCK is certainly a humorous take, but however it is called, I agree that it will involve discarding ideas which are no longer useful.
2. I think in Kuhn's model, the "slow accumulation of both stale facts and new facts" is actually still part of normal science. As I understand it, the crisis period refers to the one in which there is a competition between a number of different candidates for a paradigm without a clear favorite.
1. This was simply meant as a straightforward generalization of the example with the sequence in pi you gave: Instead of just pi, consider a set of irrational numbers, instead of a decimal expansion consider all expansions (binary, ternary, etc.) up to some number that is deemed useful, and instead of just some random number of digits in the expansion consider some standard. Then use this to create a giant look-up table in which the specification of an address is less complex than the specification of a given sequence itself. Like I said, this may be more naive or trivial than you might have thought.
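A minimal Python sketch of that look-up scheme, with only pi in base 10 actually tabled (the mpmath precision, the helper names, and the sample sequence are all illustrative choices):

```python
from mpmath import mp

# Minimal sketch of the look-up-table idea: represent a digit sequence by
# the coordinates (constant, base, offset, length) of its first occurrence
# in a precomputed expansion. Only pi in base 10 is tabled here; the full
# scheme described above would table many irrationals in many bases.

mp.dps = 100_000                          # working precision, in digits
PI_DIGITS = str(mp.pi).replace(".", "")   # "31415926535..."

def compress(seq):
    """Return ('pi', 10, offset, length) if seq occurs in the table."""
    offset = PI_DIGITS.find(seq)
    return ("pi", 10, offset, len(seq)) if offset >= 0 else None

def decompress(coords):
    _name, _base, offset, length = coords
    return PI_DIGITS[offset:offset + length]

target = PI_DIGITS[500:508]               # a sequence known to occur
coords = compress(target)
assert decompress(coords) == target
print(target, "->", coords)               # e.g. ('pi', 10, 500, 8)
```

One standard caveat, by a simple counting argument: for most sequences the offset coordinate needs about as many digits as the sequence itself, so the table cannot compress on average; it wins only for sequences that happen to occur early in a tabled expansion.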
All the best,
Armin
Anonymous wrote on Feb. 19, 2018 @ 19:42 GMT
Dear Heinrich Luediger,
I took the liberty to read
your “Context” essay before attempting to respond to your comments, to make sure that I understood fully what you are attempting to say. If you have read enough of my posting comments for this year’s (2017) contest, you will surely be aware that I hold philosophy as an approach to life in high regard, and that some of my favorite essays this year were written by philosophers.
My first warning that your essay might be rather unique was when you quoted a line from Kant that eloquently restates what every mother or father of an enquiring child already knows, which is that we humans like to ask “why” in situations where no one has an answer. Here is the Kant line you quoted:
“… it is the peculiar fate of human reason to bring forth questions that are equally inescapable and unanswerable.”
From that simple observation you somehow (I do not yet see how) inferred this:
“… we may read Kant’s disturbing assertion as: human knowledge is without false floor, irreducible and hence not tolerating questions.”
I would estimate that well over 95% of readers would instead interpret that line from Kant as a gentle and basically humble reminder of how deeply ingrained curiosity is in most of us, and that the hard questions that such curiosity engenders are a good thing, rather than something to be discouraged. That you instead interpreted his comment as an assertion that people should
stop asking questions is very unexpected.
Thus I was genuinely curious to find out why you interpreted this line in this way, and so read your essay in detail to find out why.
As best I can understand your worldview from that careful reading, you believe sincerely that special relativity, general relativity, and quantum mechanics are all unreal mathematical fantasies whose complex, incomprehensible mathematical structures are used by a small group of people, positivist scientists mostly, in positions of power and privilege. In contrast you believe that only the older Newtonian physics that is more accessible to your direct senses is valid. Finally, you believe that the same group that uses these false mathematical frameworks to maintain positions of privilege is also very worried that people such as yourself might join together to ask hard questions that would uncover the falseness of their mathematical fantasies, and so undermine their positions. You believe therefore that this same group works actively to suppress people like you from even speaking about the falseness of their QM, SR, and GR frameworks.
Let me be specific about which lines in your essay led me to the above inferences:
Pages 2-3: “Since both SR/GR and QM are not associated with phenomena whatsoever, modern physics, by having taken us into the never-Here and never-Now, has become speechless, i.e. cannot translate logic and mathematics back to meaning other than by fantastic speculation and daring artistic impression.”
Page 3: “Hence it doesn’t come as a surprise that mathematically driven physics moves tons of data just to remain void of experience. In other words, much of modern physics stands in false, namely affirmative-logical, relations to the rest of human knowledge.”
Pages 3-4: “So, I’m absolutely convinced that classical physics has not been falsified in the sense of contradicting human experience.”
Page 4: “Of course I’m not denying that there are instrumental observations that don’t agree with classical physics, but that is not what theories primarily are about. Rather they are meant to ‘make observable’ novel domains of experience and in order not to ‘sabotage’ established domains of experience they are to be incommensurable, i.e. orthogonal, and thus additive.”
Page 4: “Positive, that is, logical knowledge does not permit rhetorical questions for the reason of creating strings or networks of affirmations and precipitating as unscientific whatever is not tractable by its analytical methodology. And by successively entraining us into its network we are becoming ants in an ant colony, fish in a fish school and politically-correct repliants of ever-newer, the less intuitive the better, opinions.”
The next-to-last quote above is to me the most fascinating. I was genuinely scratching my head as to how you were handling instrumental observations that do not agree with classical physics, of which there are, shall we say, quite a few. I see that you do not deny the existence of such observations — I was actually a bit surprised by that — but that you instead seem to interpret them as ultimately irrelevant data that have very little impact on everyday Newtonian-framework reality and observation, and so do not really mean much… except to the positivists, who jumped on them collectively (additively) to create huge nonsensical mathematical fantasies that make bizarre and incomprehensible predictions that are unrelated to reality.
However, I think it is the last quote above that is the most evocative of how you feel about what you perceive as the situation, and your level of anger about it. You seem convinced in that quote that this group has dedicated itself to ensuring that even that tiny remaining group of true, reality-believing inquirers such as yourself, the ones who still believe in the readily observable reality of the Newtonian world of physics, will be scooped up relentlessly, utterly isolated, driven to silence, and made into nothing more than mindless, unquestioning ants.
Such a perspective helps make more comprehensible your unexpected view of the simple observation from Kant, the one about the incessant and unanswerable curiosity of most humans. I suspect (but am not sure) that you are reading Kant’s line not as some gently intended general observation on the nature of curiosity in both children and adults, but as some sort of subtle warning from Kant to his followers that there exist people such as yourself who understand what he and his followers are really up to — creating indecipherable scientific fantasies that they can then use to build up a power base — and that this group needs to be shut down to keep them from asking unanswerable questions that would expose the unreal nature of their mathematical fantasies.
I’ll end by pointing out that I think you have a serious inconsistency in your beliefs, one that leaves you with two choices.
You say you do not accept the reality of quantum theory, yet your daily actions powerfully contradict that assertion. Even as you read this text you are reaping enormous personal benefits from these supposedly imaginary mathematical frameworks.
Why? Well, are you or are you not using a personal computer, laptop, or cell phone to read this posting, and to post your own ideas?
The problem is that the semiconductor chips on which all of these devices depend cannot even exist within classical physics. They can only be understood and applied usefully through materials science and quantum theory. So, if you insist that only objects you can see with your own senses are real, look at what you are doing right now on your electronic devices. Ask anyone you can find with a solid-state electrical engineering background how such devices work. Take the time and effort to let them teach you the basic design of devices that you can see are real and right in front of you, both at the laptop level and by using a Newtonian microscope to look at the complexity of the resulting silicon chips. Let your own senses convince you, with the help of someone you can trust—and surely you can find at least one electrical engineer whom you know well enough on a personal basis that you trust them to be honest about how those clearly real chips were designed and built?
There are other examples. Do you have lights that turn on automatically at night? Einstein helped create quantum mechanics when he explained the photoelectric effect, which is why the light sensors in such devices cannot be explained by classical waves.
Do you recall the old cathode-ray tubes? Were you aware that the electrons that write images on the screens of such devices travel fast enough that you cannot design such devices without taking special relativity into account?
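The arithmetic behind that CRT claim is quick to check, assuming a typical color-CRT anode voltage of roughly 25 kV (the voltage is an illustrative assumption):

```python
import math

# Worked check of the CRT claim, assuming a ~25 kV anode voltage.
e   = 1.602e-19    # electron charge, C
m_e = 9.109e-31    # electron rest mass, kg
c   = 2.998e8      # speed of light, m/s
V   = 25e3         # anode voltage, volts (typical for a color CRT)

gamma = 1 + e * V / (m_e * c**2)           # relativistic energy factor
v_rel = c * math.sqrt(1 - 1 / gamma**2)    # correct electron speed
v_cls = math.sqrt(2 * e * V / m_e)         # naive classical speed

print(f"gamma = {gamma:.4f}")              # ~1.05: a ~5% mass increase
print(f"relativistic: {v_rel / c:.3f} c")  # ~0.30 c
print(f"classical:    {v_cls / c:.3f} c")  # ~0.31 c, already off by a few %
```

A few percent shift in the deflection is quite visible on a screen, which is why such designs had to take the relativistic correction into account.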
But if you insist that none of this is real, I must ask: Shouldn’t you then stop buying and using all such devices? Their very existence undermines your fundamental premise that the mathematics they are based upon is not real and is designed only to perpetuate power. How then can you continue using them?
The only other alternative I can suggest is that you examine more closely why you feel there is a conspiracy.
For whatever it’s worth, I assure you as someone whose friends will testify to my honesty, and who has worked in high tech and science areas for decades, that until I read your essay today, I had never before encountered the idea that QM, SR, and GR might be fantasies that some group of people uses to maintain power and suppress questions. The people I have known just found these mathematical constructs to be incredibly useful for building things (QM hugely, but also SR) and for understanding observational data (GR for astronomy). They would have been horrified (and literally unable to do their jobs) if someone had taken those tools away from them.
Since you seem to be a thought leader for this idea that QM, SR, and GR are part of a large, centuries-old mathematical power conspiracy, I don’t seriously expect you to be persuaded to abandon your belief in a conspiracy to promulgate false mathematics as physics. But I can attest, from my decades-long personal experience at many levels of science and applied technology, that I simply have not encountered anything that corresponds to the kind of false math or false intent that you describe. So, I at least want to point out to you the option of changing your mind.
Sincerely,
Terry Bollinger
Fundamental as Fewer Bits (Essay 3099)
Essayist’s Rating Pledge by Terry Bollinger
Heinrich Luediger replied on Feb. 20, 2018 @ 12:27 GMT
Dear Terry (if that’s you),
Thanks for giving so much thought to my essay!
To begin with: the comment I left on your site should already make clear that I’m not under the impression of an ongoing conspiracy, but rather believe that much of science has got “lost in math”, to quote Sabine Hossenfelder. However, unlike Hossenfelder I take the title of her book literally, namely that certain branches of physics have ended up in a blind alley by having moved (in Kantian terms) beyond possible experience. Hence my comment was triggered by your plain assertion that the universe ‘indisputably’ exists independent of mankind. I was simply shocked to find eclipsed the knowledge and (logically negative) experiences of people I guess we equally admire (Hilbert, Goedel, Tarski, etc. and also Wittgenstein).
Though I admit that my essay is fairly provocative, and obviously arousing your dissent, you shouldn’t claim of the essay what, expressis verbis, it doesn’t say. You say that I interpret Kant’s view (of the peculiar fate of human knowledge) as the assertion “that people should stop asking questions”, whereas I say that “…for us to be human the scientific-rhetorical question, while it has no answer, is yet the condition sine qua non,…”. So, what I say is that the question is very important, but that we should let go of all hope that it can be answered, for the reason of being made up from incommensurables, i.e. containing a priori elements. Hence the question is the ground from where to think beyond it.
I happily use my computer for the reason that it is not quantum but wonderfully deterministic. The behavior of electronic components has been derived from Bohr’s model of the atom. The foundations of the electronic band structure were developed by Bloch, Bethe, Peierls and A. Herries Wilson between 1928 and 1931, who all were students of Sommerfeld or Heisenberg. So, much quantum, but little mechanics there.
Last, in my essay I say that modern physics offers explanations and models for instrumental observations deviating from classical physics. And that’s absolutely fine with me unless these mathematical devices are being reified (as e.g. space-time or configuration space), for then they begin to ‘predict’ things beyond possible experience.
You see, no conspiracy only lost in math…
Heinrich
Author Terry Bollinger replied on Feb. 21, 2018 @ 10:08 GMT
Heinrich,
Thank you for such a thoughtful (and cheerful) reply to my critique! Reading your reply also makes me feel better about your essay itself, since it shows a side to your views that perhaps does not come through in your more narrowly focused essay.
I think the saying that we can agree to disagree works here. But doesn't using a fully classical computer sometimes get a little Bohring?... :)
Finally, I just have to mention that your first name, Heinrich, stands out for me because it was a very common name in my family eight generations ago, when they first came to the New-To-Europeans-World from Germany. They were from the same area in Europe, near a large lake (can't recall the name), as Bollinger sandstone and Heinrich Bullinger. So, probably some interesting history there. The name was transformed to Henry once they settled in Missouri.
(Regarding my phrase New-To-Europeans-World: I think it is quite likely that the native Americans had already noticed both that their world was there, and that in terms of generations of ancestors, it was not even particularly new. They were quite observant about such things, often more so than folks who walk around all day with smart phones in front of them... :)
Cheers,
Terry
Heinrich Luediger replied on Feb. 21, 2018 @ 13:40 GMT
Dear Terry,
Maybe it’s in the German genes that we prefer to think it out over trying it out...
Heinrich
P.S. From the hints you gave, your family once came from Switzerland. Gruezi!
Heinrich Luediger replied on Feb. 21, 2018 @ 15:38 GMT
Dear Terry,
just one point I overlooked in my previous response. “I think is quite likely that the native Americans had already noticed both that their world was there, and that in terms of generations of ancestors, it was not even particularly new”.
There still exist almost zero-contact Indian tribes deep in the Amazonian jungle having words for exactly one generation up, one generation down. So, there is no word for e.g. grandfather. Hence there is good reason to assume that North-American Indians, prior to colonization, had similar kinship recognition, i.e. I don’t think they had any idea that their world “was not particularly new” (famous is Whorf’s analysis of the absence of ‘time’ in Hopi). What NA Indians indeed had (or at least most) were creation myths, that is, principled explanations about the coming in place of their existence, typically beginning with ravens, eggs, deer and coyotes ...today it’s Big Bang and Inflation…
Heinrich
peter cameron wrote on Feb. 20, 2018 @ 09:27 GMT
Hello Terry,
imo your contestant pledge is right on target, makes specific some of the concerns and disappointments i've felt in exploring many of the threads, and particularly the offers to barter good scores. Only point on which i hesitate is your contention that one should avoid rating an essay highly because its conclusions are agreeable to a given reader's perspective. After all the rationalizations are done most folks just do what they feel like doing, and to accept that reality seems to yield a less complex world view.
Many thanks for the short and clear explanation of Kolmogorov complexity. Agree it is a good metric.
Your three challenges seem for the most part well chosen and relevant to the present muddle in particle physics theory.
the first, the Euler identity, seems perhaps the most difficult, as the compression is greatest there. Expressing it in terms of sin and cos suggests amplitude and phase of the wavefunction. And the presence of '1' suggests unitarity. Beyond that there seems to be only our desire to see what connections might 'pop out' in a deeper understanding of the physics, for which we as yet have no clear perspective.
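For reference, the decomposition being pointed to is just Euler's formula, of which the identity is the θ = π case:

```latex
e^{i\theta} = \cos\theta + i\sin\theta,
\qquad\text{so at } \theta = \pi:\quad e^{i\pi} + 1 = 0 .
```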
the second, the quest for a simple explanation of fermion-boson statistics, also remains to be had. My sense again is that we need a deeper understanding of the wavefunction. Point particle quarks and leptons with intrinsic internal properties leave us lost in almost meaningless abstraction.
and the third, to 'refactor' SM without adding gravity or complexity... Certainly to simplify, to reduce rather than increase complexity in our models, is an essential aspect. Agree with the hope that such a simplification would have a natural place for gravity, that it would not be necessary to put it in 'by hand' so to speak.
It seems to me that to meet your challenges will require improved models of the wavefunction.
Finally i think it is good to keep in mind that the geometric interpretation of Clifford algebra, geometric algebra, has shown the equivalence of GR in curved space with 'gauge theory gravity' in flat space. Introduction of the concept of curved space came not from the physicists, Einstein in particular, but from the math folks. Einstein was looking for math tools to express his physics understanding. Geometric interpretation was lost with the early death of Clifford and the ascendance of the vector formalism of Gibbs, and was not rediscovered until the work of Hestenes in the 1960s. What was available to Einstein was the tensor calculus. History is written by the winners, and Einstein's true perspective has perhaps been distorted by those who most readily embraced the formalism he adopted, the math folks and their acceptance of Riemann's view of his creation.
Author Terry Bollinger replied on Feb. 20, 2018 @ 19:46 GMT
Greetings Peter and Michaele,
Thank you for this marvelous and extremely interesting set of comments! I did not know of the existence of viXra.org, which seems to have the same free-access goals that arXiv.org originally intended to provide. Once I found it (with some difficulty; Google Scholar does not index it) and your spot there, I downloaded a large sampling of your papers.
Each number (n) indicates your comment paragraph to which I am responding:
(1) That’s a good catch on my Pledge. The italicized part of my line about not making the conclusion everything shows my intent was what you just said it should be, but my second line sort of contradicted that. I’ve updated the Pledge to v1.3 to fix the second line; please take a look and see if it works.
(2) Thanks! To be honest, looking at Kolmogorov more closely for the purposes of this contest helped me understand it better, too. Recognizing that the Kolmogorov minimum model is isomorphic to a formal model for lossless data compression was fun, sort of like a little “aha!” light going off in my head.
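That isomorphism also gives a practical handle: the true Kolmogorov minimum is uncomputable, but any off-the-shelf lossless compressor yields a computable upper bound on it. A small sketch (zlib here is just a convenient stand-in for any compressor):

```python
import os
import zlib

# The Kolmogorov minimum itself is uncomputable, but any lossless
# compressor gives a computable upper bound on it:
#   K(x) <= len(compress(x)) + O(1).

structured = b"0123456789" * 1000   # highly regular, 10,000 bytes
random_msg = os.urandom(10_000)     # incompressible with high probability

for name, msg in (("structured", structured), ("random", random_msg)):
    bound = len(zlib.compress(msg, 9))
    print(f"{name:>10}: {len(msg)} bytes -> upper bound {bound} bytes")
```

The structured message lands orders of magnitude below the random one of the same length, which is exactly the "fewer bits" ordering the essay uses.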
(3) That is encouraging feedback on my three challenges; thanks!
(4) The Euler equation challenge was in some ways the most interesting to me, in no small part because it is a pure and pristine outcome of the argument in the essay. Unlike the other two, I have absolutely no idea where it might connect into physics. But if I believe my own arguments about Kolmogorov compression, then there is a very good chance that somehow it
does, and we just do not see it. Certainly the sin-cos breakdown seems like a hint, I agree. I’ve always found that equation interesting, but now my curiosity is even higher.
You do realize that your own impedance reformulation of quantum math may provide a new way to look at Euler’s equation, yes? Sometimes something as simple as flipping the numerator and denominator provides a whole new way to look at old problems, as you clearly have noticed by using impedance instead of the more traditional conductance. So who knows, perhaps you and Michaele (I confess I have no idea how to pronounce her name) will nail that one!
(5) You said “… Point particle[s] … leave us lost in almost meaningless abstraction.”
Yep, especially since QM assures us that point particles do not exist anywhere in the real universe. So why then do we insist on using them in our math, which unavoidably results in infinity artifacts? (By “artifacts” I mean computational results that are not really part of the problem, but instead are just noise generated by the particular method we are using to model the problem.) I am not in the least surprised that you were able to get rid of renormalization costs in your impedance approach, since by flipping your primary fraction upside down you halted the model-induced generation of point-particle infinitesimal artifacts. If you’ve written or plan to write any software for your model, I would anticipate that such software will prove to be hugely more computationally efficient for the same reason.
I think a lot more folks need to hear about your impedance reformulation of QM, and to take its potential computational properties seriously. You do realize that more efficient quantum modeling software can be worth
lots of money in areas such as pharmaceuticals and materials research? If your impedance reformulation can increase computer based quantum modeling efficiency by eliminating the costly renormalization steps, you could well be sitting on top of a little gold mine there without even realizing it.
(6) I too would love to see that simpler Standard Model! Simpler versions of it would almost certainly clarify one way or the other how gravity fits in.
(7a) You are preaching to the choir! I love the Clifford and (more cryptic) Grassmann works. I too have never quite forgiven Gibbs, but in my case more specifically for his bad-programmer artificial deconstruction of the gorgeous and dimensionally unique symmetries of Hamilton’s quaternions to create dot and cross products. The easily dimensionally generalized dot products of vectors I’m sort of OK with, but the 3D-locked cross products, which did things like arbitrarily invert signs, are to me a mess that likely covers up something simpler.
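That split can be made concrete: implement Hamilton's product directly from the i, j, k multiplication table, and for pure quaternions (zero scalar part) it visibly contains both of Gibbs's later products at once, uv = -(u·v) + u×v. A small check (the helper names are mine):

```python
import numpy as np

# Hamilton's product, written out from the i,j,k multiplication table
# alone. For "pure" quaternions (zero scalar part) the one product packs
# both later Gibbs products together:  u v = -(u . v) + (u x v).

def quat_mul(p, q):
    """Hamilton product of quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

w, *vec = quat_mul((0.0, *u), (0.0, *v))
assert np.isclose(w, -u.dot(v))            # scalar part = -(dot product)
assert np.allclose(vec, np.cross(u, v))    # vector part =  (cross product)
print("uv =", (w, *vec))                   # (-32.0, -3.0, 6.0, -3.0)
```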
Maxwell did after all write all of his laws in quaternions, and they worked beautifully. It was Heaviside who massively transformed and compressed them into their current vector form. Remarkably, Heaviside then insisted that the much more compact and massively transformed set of equations still be credited to Maxwell. But despite this selfless act of generosity from Heaviside’s soul (which perhaps went up, up, up, up to the Heaviside layer? and yes, the ionosphere really was named for that same Heaviside, but only humans in Cats outfits seem to recall that), the conversion had some issues: the quaternion and vector versions of Maxwell’s laws are not quite isomorphic, due to the Gibbs delinking of the dot and cross products, and to his reinterpretation of the cross product. I suspect that this is the source of certain minor anomalies in our modern use of Maxwell’s equations.
It was the subsequent attempts to make Gibbs’ cross product just as easily generalizable to higher dimensions as the dot product that resulted in Grassmann and Clifford algebras. I have some trouble with that. Since the very first cross product had already had its original subtle quaternion symmetries mangled by Gibbs, artifacts and some obscuration had already begun before Grassmann and Clifford tried to generalize the concept further. To me that speaks of the likely loss of more subtle symmetries that exist only in the 3+1 space of quaternions, and at least some insertion of artifact noise into Clifford and Grassmann algebras.
Alas, many a physics PhD student has crashed their thesis into the stubborn wall of figuring out how quaternions may be relevant to more than just Maxwell’s equations. But I’m going to give you a bit of a hint here: Given what you are doing and are trying to do, you
really need to look a bit more closely at the true origin of this entire generalization mess, which is the quaternions. And by “mess” I am including not just the dot-product vectors, which at least generalized cleanly, but also the cross-product Clifford algebras, which frankly did not come off nearly as well after the Gibbs-induced Great Split. 3-space is quite unique for its vector-spin equivalence, and only quaternions truly capture that. Clifford algebras are nice, but can never recapture that unique set of 3-space relationships at higher dimensionalities, simply because those relationships do not exist in any of the higher (or lower) dimensionalities. I don’t think it’s an accident that our space is a 3-space.
(7b) Regarding both your impedance model of matter and your observation that there are viable mathematical alternatives to curved space, you may find this essay of interest:
A Classical Reconstruction of Relativity by Declan Andrew Traill.
My comments on his essay may help explain why I suspect Declan’s ideas are relevant to yours.
BTW, having looked at his early papers closely, Einstein really was, as many have asserted over the years, not that great at math. So as you noted, he tended to use whatever was available and that others could help him with. Oddly, he did not seem to be particularly visual either, since for example it was Minkowski who came up with spacetime. As best I can tell, Einstein was instead sort of like a human physics simulator. That is, he could almost intuitively model and understand how physics would work in a given situation, and then use that understanding to look for and fix problems in the simulation. The hard part for him was the extreme difficulty he tended to have when attempting to convert those insights into words or equations. I cannot help but think of it as a bit like some form of autism, only one focused around physics. A very unique mind, Einstein, which I guess should be no surprise to anyone.
Cheers,
Terry
Fundamental as Fewer Bits by Terry Bollinger (Essay 3099)
Essayist’s Rating Pledge by Terry Bollinger
peter cameron replied on Feb. 22, 2018 @ 18:05 GMT
Terry,
Like your practice of numbering responses by paragraphs, a useful analytic. Had not thought of approaching my reading in that manner. Like that free Acrobat permits highlighting and commenting.
Many interesting thoughts in your responses. Skipping beyond a few of them to
(4) Euler's identity - yes, apparent deepness of connection between math and matter befuddles. re inversion, i did try to work with conductance early in exploration of impedance quantization. Shoulda been trivial. Seemed more sensible to work in terms of what goes easy rather than what resists. Simple thing in mathcad (left a link for you to pdf of an early mathcad file in our thread on my essay) to work either way. Was very puzzling. Gave it a lot of attention, as somehow it seemed a philosophical item of importance, to opt for conductance rather than resistance. Could not make any sense of it, still don't understand why. Couldn't get the numbers to work. One has to calculate to figure anything out. Couldn't calculate. Had to give up, switch back to impedances. Can't believe there is anything fundamental in the modeling that would cause this. Wish i had a student to chase it.
(5) renormalization - you are calling conductance 'the primary fraction'? Like that, but of itself it is not enough to result in finiteness if i understand correctly. Impedance has both capacitive and inductive components with pi/2 phase shift between them in 'free space', a consequence of how wave function of interest excites the eight-component 3D Pauli vacuum wavefunction. Capacitive impedance is zero at the singularity, inductive is infinite. Going from ohms to mhos doesn't get rid of the singularity, just shifts its phase. However either extreme results in an infinite mismatch.
(5a) Regarding applications of quantum impedance matching in amo/condensed matter, agree there are possibilities as yet unimagined. Been humping on that for years. Only way to understand the inertia of mainstream is to experience it. Would seem obvious. The computer holding me a willing captive at this very moment is chock full of impedance matches. How could a quantum computer be any different? So far in our investigation of impedance quantization it seems the concept is firmly grounded in reality. Computers require impedance matching. Quantum computers require quantum impedance matching. Have a Buddhist friend that likes the phrase 'not no mind'. Me, i go with compassion for ignorance, otherwise would hate my not no self and go lusting for the not no gold mines' kitty kats. What a funny world we live in. Mortgages and back taxes.
(6) dude! We got gravity. That was our black hole info paradox paper/poster for the 2013 Rochester Conference on Quantum optics, information, and measurement. Vetted by Optical Society of America, world guardian of quantum information theory/experiment. Poster is perhaps quickest click. http://vixra.org/pdf/1306.0102v1.pdf
(7a) excellent. You know math better, perhaps i have found a teacher, please. The uniqueness of 3D space is an area pretty new to me, and i've never seriously looked at quaternions. When trying to model particle physics one picks and chooses what to learn very carefully, as there is infinity of beauty and complexity in every direction. For the purposes of tying together loose ends it seldom suffices to simply chase down the ends and knot them, unless one agrees to be satisfied with the tangle that connects them. Lacking that, the universal PhD path dives in and starts untangling. The village idiot just tracks down the ends and ties the knot, and poof, the fairy tale tangle vanishes and he steps in a cow pie. So it goes. Michaele and i are more modelers than typical theorists, and def not math types. Calculators.
(7b) thanks for link to Traill, Peter Jackson also recommended him and I took a look but was not able to wrap the mind around it, will try again.
re intuition, i think much of it has to do with what one experiences in life, what the Buddhist might call dependent arising. For me it was working with my brother, designing/building/operating vibratory piledrivers, synchronized spinning eccentric weights. Standing next to them, feeling the energy transfer,... Made one want to laugh, to dance and sing. That and dad was an electronics guy, we built the electronic analog and ran it on the bench. Mechanical and electrical impedances. Quantization comes easy once one gets that.
peter cameron replied on Feb. 22, 2018 @ 18:18 GMT
took a look at the Rochester poster. Had forgotten superheavies (top/higgs/Z/W) line is in the wrong place by a power of alpha in figure 4. Better reference for that figure is the big bang/bounce paper http://vixra.org/abs/1501.0208
Cristinel Stoica wrote on Feb. 20, 2018 @ 23:20 GMT
Dear Terry,
I enjoyed very much your essay, and I take the opportunity to say that your pledge is great and we should all adopt it. I think the idea "Fundamental as Fewer Bits", using Kolmogorov complexity, is great, and I am also using it to propose to identify the simplest theory in section 5 of
this reference. Of course, this is not an absolute measure, because each equation has behind it implicit definitions and meanings. Your examples, E=mc2 and Euler's identity, reveal this relativity when you try to explain them to someone who doesn't know mathematics and physics. But there is a theorem showing that the Kolmogorov complexity of the same data expressed in two different languages differs only by a constant (which is given by the size of the "dictionary" translating from one language into the other). So modulo that constant, Kolmogorov complexity indicates well which theory is simpler. This is a relativity of simplicity if you want, and of fundamentalness, because it depends on the language. But the difference is irrelevant when the complexity of the theory exceeds considerably the length of the dictionary. One may wonder: what if the most fundamental unified theory is simpler than any such dictionary? Well, in this case the difference becomes relevant, but I think that if the theory is so simple, then we should use the minimal language required to express it. So if the dictionary is too large, it means we are not using the best formulations of the theory. This means that once we find the unified theory, it may be the most compressed theory, but we can optimize further by reformulating the mathematics behind it. For example, Schrödinger's equation is a partial differential equation, but the language is simplified if we use the Hilbert space formulation. Compression by reformulation occurs also by using group representations for particles, fiber bundle formulation of gauge theories, and Clifford algebras.
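For readers who want the theorem being cited, it is the standard invariance theorem of Kolmogorov complexity: for any two universal description languages U1 and U2 there is a constant depending only on the pair (roughly, the length of the translating "dictionary"), not on the data, such that

```latex
\left| K_{U_1}(x) - K_{U_2}(x) \right| \;\le\; c_{U_1, U_2}
\qquad \text{for every string } x .
```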
Another idea I liked in your essay is the trampoline effect. I would argue that the trampoline is again relative, in the following sense. Let's see it as an elastic wall rather than a trampoline, or if you wish, as a potential well. Once you go beyond the wall, or outside the potential well, the trampoline accelerates you instead of bouncing you back. I would take as an example each major breakthrough. Once the wall between Newtonian mechanics and special relativity was left behind, special relativity reached a new compression level, unifying space and time, energy and momentum, the electric and magnetic fields in a single tensor, etc. Then other trampolines appeared, which separated special relativity from general relativity and from quantum mechanics. Similarly, when we moved from nonrelativistic to relativistic quantum mechanics, the description of particles became simply a matter of representations of symmetry groups, the Poincaré and gauge groups. It is true that we have been hitting for decades the wall which separates our present theories from quantum gravity, and there's a similar wall separating them from a unified theory of particles, and at least another wall beyond which we expect to find the unified theory. So my hypothesis, based on previous history, is that the trampoline bounces us back as long as we don't break through; once we do, it accelerates both the discovery and the simplification. And until we get to the terminus, more complexity may await us beyond each wall, as has usually happened so far, since every time we found a new simplicity, new phenomena were discovered too. It's a rollercoaster. But I believe at the end there will be a really short equation, and underlying it some simple mathematical structure that is initially not so simple to express. We will see.
Thank you for the great essay, and good luck in the contest!
Best wishes,
Cristi Stoica, Indra's net
Author Terry Bollinger replied on Feb. 21, 2018 @ 09:44 GMT
Cristi,
Thank you for such kind remarks, and I’m glad you liked my essay!
Your first paragraph above is a very good analysis of issues that, both for reasons of essay length limits and to keep the focus on a general audience, I decided not to put into the essay.
One way I like to express such issues is that the full Kolmogorov complexity can be found only by treating the functionality of the particular computer, quite literally its microprogramming in some cases, as part of the message. That's really not all that surprising, given that one of the main reasons for creating high-level instructions within a processor is to factor out routines that keep showing up in the operating system, or in the programs themselves.
I like your analysis of a two-language approach. Another way to standardize and ensure complete comparisons is to define a standardized version of the Turing machine, then count everything built on top of it as part of the message. That way, basic machine functions and higher-level microcoded instructions all become part of the full set of factoring opportunities in the message.
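To make that concrete, here is a toy sketch in Python (my own illustration; the function name and the idea of charging a flat bit-price for the decompressor are mine, and zlib is only a stand-in for a real reference machine):

import os
import zlib

def approx_description_length(message: bytes, machine_size_bits: int) -> int:
    # Toy stand-in for Kolmogorov complexity: the compressed size of the
    # message plus the size of the "machine" (here, a zlib decompressor)
    # needed to expand it. True Kolmogorov complexity is uncomputable;
    # this only gives an upper bound relative to the chosen machine.
    return 8 * len(zlib.compress(message, 9)) + machine_size_bits

# A highly regular message compresses far below its raw length...
print(approx_description_length(b"abc" * 10_000, machine_size_bits=100_000))
# ...while noise stays close to 8 bits per byte, no matter the machine.
print(approx_description_length(os.urandom(30_000), machine_size_bits=100_000))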
Incidentally, a Turing-based approach also opens up opportunities for very unexpected insights, including at the physics level.
Why? Because many of the very instructions we have pre-programmed into computers contain deep assumptions about how the universe works. Real numbers are a good example, since their level of precision amounts to an inadvertent invocation of Planck's constant when they are applied to data for length, momentum, energy, or, perhaps most importantly, time. If you are trying to be fundamental, a lot more caution is needed about how such issues are represented at the machine level, since there are multiple ways to approach the numeric representation of external data, and the operations on it.
Here’s an example: Have you ever thought about whether a bundle of extremely long binary numbers might be sorted without having to “see” the entire lengths of the numbers first?
Standard computers always treat long numbers as temporally atomic; that is, they always process them as a whole. This means you have to complete processing of each long unit before moving on to the next one, and it’s the main reason why we also use shorter bit lengths to speed processing.
But as it turns out, you can sort bundles of numbers of any length, even ones infinite in length, by using what are called comparators. I looked into these a long time ago, and they can be blazingly fast. They don’t need to see the entire number, because our number systems (also parts of the total program, and thus of our assumptions!) require that the digits to the right can never add up to more than one unit of the digit we are looking at. That means that once a sort order is found, no number of follow-up bits can ever change what it is.
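Here is a minimal sketch of that idea (my own toy illustration, assuming the numbers arrive as binary fractions delivered most-significant bit first):

from itertools import zip_longest

def stream_compare(a, b):
    # Compare two binary fractions delivered most-significant bit first,
    # reading no further than the first differing bit. Later bits cannot
    # reverse the result, because all remaining digits together can never
    # add up to one unit of the digit where the streams first differ.
    for bit_a, bit_b in zip_longest(a, b, fillvalue=0):
        if bit_a != bit_b:
            return -1 if bit_a < bit_b else 1
    return 0  # only reachable for finite, identical streams

# 0.101... vs 0.100...: the order is decided at the third bit.
print(stream_compare(iter([1, 0, 1, 1, 0]), iter([1, 0, 0, 1, 1])))  # 1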
But all of this sounds pretty computer-science-abstract and number-crunchy. Could ideas that deep in computing theory really affect the minimum size of a Kolmogorov message about, say, fundamental physics?
Sure they could. For example, for any bundle of infinite-length integers there are only so many sorted orders possible, and so only so many states needed in the computing machines that track those numbers and their sorted order. What if those states corresponded to states of the quantum numbers for various fermions and the infinite lengths to their progression along a worldline, or alternatively to the various levels of collapse of a quantum wave function?
I really am just pulling those examples out of a hat, so anyone reading this should please not take them as hints! But that said, such radically different approaches to implementing and interpreting real numeric values in computers are good examples of the kind of thinking that likely will be needed to drop fundamental physics to smaller message sizes.
That’s because the Planck relationships, like length-momentum and time-energy, argue powerfully that really long numbers with extreme precision can only exist in the real world at high cost in terms of other resources. Operating systems that do not “assume” infinitely precise locations in space or time to be cost-free are likely closer to reality than operating systems that inadvertently treat infinitely precise numbers as “givens” or “ideals” that the machine then only approximates. It’s really the other way around: computers that make precision decisions both explicit and cost-based are likely a lot closer to what we see in the quantum model, where quantum mechanics similarly keeps tabs on precision versus costs. Excessive use of real numbers, in contrast, can become a very narrow but very bouncy example of the trampoline effect, causing the computation costs of quantum models that use them to soar by requiring levels of precision that go far beyond those of the natural systems they are intended to model.
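A quick back-of-the-envelope calculation, using the approximate value of the Planck length, shows just how finite those precision budgets really are (the function name is mine, purely for illustration):

import math

PLANCK_LENGTH_M = 1.616e-35  # approximate value, in meters

def bits_to_locate(interval_m, resolution_m=PLANCK_LENGTH_M):
    # Bits needed to pin down one position within interval_m to resolution_m.
    return math.log2(interval_m / resolution_m)

print(bits_to_locate(1.0))     # ~116 bits: one meter at Planck resolution
print(bits_to_locate(8.8e26))  # ~205 bits: the whole observable universe

Even a position spanning the observable universe at Planck resolution fits in a couple of hundred bits, so a representation that silently assumes unbounded precision is paying for bits that nature itself may never spend.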
Getting back to your comments about higher-level reformulations in terms of, e.g., gauge theories and Clifford algebras: Absolutely! Those very much are examples of the “factoring methods” that, if used properly, often can result in dramatic reductions in size, and thus bring us closer to what is fundamental. The only point of caution is that those methods themselves may need careful examination, both for whether they are the best ones and for whether they, much like the real-number examples I just gave, contain hidden assumptions that drive them away from simpler mappings of messages to physics.
Regarding your second paragraph: Trampolines as multi-scale potential wells, heh, I like that! I think you have a pretty cool conceptual model there. I’m getting this image of navigating a complex terrain of gravity fields that are constantly driving the ship off course, with only a very narrow path providing fast and accurate navigation. I particularly like your multi-scale (fractal even?) structuring, since it looks at the Kolmogorov minimum path at multiple levels of granularity, treating it like a fractal that only looks like a straight line from a distance. That’s pretty accurate, and it’s part of why a true minimum is hard to find and impossible to prove.
Thanks again for some very evocative comments and ideas!
Cheers,
Terry
Fundamental as Fewer Bits by Terry Bollinger (Essay 3099)
Essayist’s Rating Pledge by Terry Bollinger
Peter Jackson wrote on Feb. 21, 2018 @ 12:35 GMT
Terry,
Did you see my 17.2.18 post above & the 100-second video deriving non-integer spins from my essay's mechanism resolving the EPR paradox? (I've just found the 'duplet state' confirmation in the Poincaré sphere.)
That all emerged from a 2010 SR model (http://fqxi.org/community/forum/topic/1330), finally able to resolve the ecliptic plane & stellar aberration issues and a tranche of others (expanded on in subsequent finalist essays).
i.e. you'll be aware of George Kaplan's USNO circular (p. 6) following the IAU discussions.
(Of course everyone, including editors, dismisses such progress as impossible, so it's still not in a leading journal!)
Hope you can look & comment
Peter
Peter Jackson replied on Feb. 24, 2018 @ 14:23 GMT
Terry,
I like your definition (quote?) of QM. The thing about history is that nobody can see it as history at the time.
There's history being written in my essay that you've so far missed due to normal embedded assumptions. To make it more visible I've posted the checklist below, which the ontology builds on. Hope you can find the time to look with a fresh mind.
AS MOST STRUGGLE WITH THE CLASSICAL SEQUENCE (TOO MUCH TO HOLD IN MIND ALL AT ONCE), A QUICK OUTLINE INTRO IS HERE:
1. Start with Poincaré sphere OAM, with 2 orthogonal momenta pairs, NOT 'singlets'.
2. Pairs have antiparallel axes (random shared y,z). (Photon wavefront sim.)
3. Interact with identical (polariser electron) spheres rotatable by A,B.
4. Momentum exchange as actually proved, by cos latitude at the tangent intersection.
5. Result 'SAME' or 'OPP' direction. Re-emit polarised, with amplitude phase-dependent.
6. Photomultiplier electrons give a 2nd cos distribution & 90° phase values.
7. The non-detects are all below a threshold amplitude at either channel angle.
8. Statisticians then analyse using CORRECT assumptions about what's 'measured'!
The numbers match CHSH > 2 and steering inequality > 1, as in the matching computer code & plot in Declan Traill's short essay. All is Bell-compliant, as he didn't falsify the trick with reversible green/red socks (the TWO pairs of states).
After deriving it in last year's figures, I only discovered that the Poincaré sphere already existed thanks to Ulla M during this contest. I hope that helps introduce the ontology.
Very best. Peter
Author Terry Bollinger wrote on Feb. 21, 2018 @ 13:57 GMT
Peter Jackson, Eckard Blumschein, Wayne R Lundberg, James Lee Hoover, Marc Séguin, and Jeffrey Michael Schmitz:
This is to let you know that I am aware that all six of you have unanswered postings on my essay-level posting thread. I will strive mightily today Wed 21 Feb to provide at least short replies to all of your postings. My replies will be in the form of direct subthread replies to your postings. I will also try but not promise to assess your essays, unless you have requested me not to. If I assess your posting, it will be as a new post under your essay-level posting thread.
As many of you have likely noticed, my problem is that I tend to do a pretty deep analysis of each posting and essay, including looking up author papers if they exist. So even when I try hard to be "brief", I tend not to be! I also do most posting composition and editing offline to reduce the chances of loss, check spelling better, and make sure my sentences are whole. That too slows the process.
If you don't believe that I tend to be overly talky... well, take a look at this "brief alert to unanswered authors" that you are reading right now... :)
Cheers, Terry Bollinger
"Quantum mechanics is simpler than most people realize. It is no more and no less than the physics of things for which history has not yet been written."
Philip Gibbs wrote on Feb. 21, 2018 @ 19:06 GMT
Terry, thanks for a very clear and interesting essay.
It seems there are two types of information covered in your essay. There is the information required to describe a theory such as the standard model, and there is the information in the state space of the theory. Mostly you are talking about the former, but for example, when you talk about redundancy in symmetry that is about the latter.
Do you make any distinction between the roles played by these two types of information?
Author Terry Bollinger replied on Feb. 24, 2018 @ 19:12 GMT
Philip,
Thank you for your kind comments, and my apologies — I lost this one!
I will provide a longer reply after I take a look at your essay, which I had already independently listed as one that I definitely wanted to read.
Cheers,
Terry
Fundamental as Fewer Bits by Terry Bollinger (Essay 3099)
Essayist’s Rating Pledge by Terry Bollinger
Author Terry Bollinger replied on Feb. 25, 2018 @ 04:30 GMT
Philip,
You ask whether there is a distinction between descriptive (which I interpret as more “English-like”) data and data that can be reduced through symmetry groups.
The best answer I can give is that (a) I really don’t know, and (b) I nonetheless rather strongly suspect that even the most random-looking descriptive parts of a theory are just finer-scale compositions of symmetry operations. That is because I have difficulty visualizing data compression processes that do not at some level invoke symmetries. Saying that two pieces of data are really one is, after all, just another way of stating a symmetry.
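A toy illustration of that point (my own): run-length encoding compresses exactly by declaring that all the characters in a run are "really one" character, i.e., by stating the run's internal symmetry:

from itertools import groupby

def run_length_encode(s):
    # Each run of repeated characters is "really one" character plus a
    # count: an explicit statement of the run's internal symmetry.
    return [(ch, len(list(group))) for ch, group in groupby(s)]

print(run_length_encode("aaaabbbcc"))  # [('a', 4), ('b', 3), ('c', 2)]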
I read your essay and found your approach intriguing and resonant with some of my own perspectives. I was struck in particular by this description:
“In category theory a generalisation can be formulated using coequivalence and epimorphisms. The assimilation of information is an algebraic process of factorisation and morphisms. In algebraic terms then, the ensemble of all possibilities forms a freely generated structure in a universal algebra. Information about the world that forms part of life experience defines substructures and epimorphisms onto further algebraic structures that represent the possible universes that conform to the observed information.”
The image that brought to mind for me was Kolmogorov compression with a focus on free algebras, applied not just to observed data in our universe, but to the definition of all possible universes. Intriguing!
I note from the book chapter below that there seems to have been some coverage of algebraic generalizations of quantum mechanics (or at least of Hilbert space) in some of the side branches of physics, even if they are not dominant topics in the mainstream:
Landsman N.P. (2009) Algebraic Quantum Mechanics. In: Greenberger D., Hentschel K., Weinert F. (eds) Compendium of Quantum Physics. Springer, Berlin, Heidelberg.
Cheers,
Terry
Fundamental as Fewer Bits by Terry Bollinger (Essay 3099)
Essayist’s Rating Pledge by Terry Bollinger
Thomas Howard Ray wrote on Feb. 21, 2018 @ 19:39 GMT
Terry,
I've been mulling this over. If I accept the Kolmogorov (Kolmogorov-Chaitin) complexity as the ultimate foundation standard, let me understand:
You would have me believe that the world is fundamentally made of information bits that are algorithmically compressible. Okay, I'll entertain that notion.
Except that you used the example of Einstein, E=mc^2, to serve as a minimum Kolmogorov complexity, arguing that mathematical conciseness is the standard.
The equation, however, is not irreducible. The meaning of the equation is in the expression E = m. The second-degree term tells us that the relations in the equation are dynamic, that energy and mass may take infinite values. The binding energy was then discovered through experiment, setting a practical limit.
So I find myself moving ever closer to Brian Josephson's premise that meaning itself is fundamental. And meaning seems to be that which contains the requisite first degree information to "Be fruitful and multiply" as the Bible has it. So I suspect that meaning precedes construction. Or compression.
Enjoyed the essay.
Best,
Tom
Author Terry Bollinger replied on Feb. 23, 2018 @ 05:55 GMT
Tom,
Thank you for your well-stated questions about information versus meaning.
Information (bits) and meaning are not the same thing at all, nor does the idea of binary compression create meaning. All compression does is eliminate bits that are not part of the primary meaning of the message.
To get at the meaning, you have to have some much broader context by which to interpret those bits. Since you mentioned the Bible, an example would be a Unicode version of the Bible in, say, German. Until you understand both Unicode and the German language, that bit string remains just that: a string of bits. The meaning only comes from that broader context.
Or for another analogy, compression is more like panning for gold. It helps pull out the gold, sure, but the value of that gold depends entirely on the person doing the panning.
For more on the relationship between data and meaning, please see this posting I did about how the meaning of a given string of bits can vary over time.
Thanks again for some well-stated questions!
Cheers,
Terry Bollinger
Fundamental as Fewer Bits by Terry Bollinger (Essay 3099)
Essayist’s Rating Pledge by Terry Bollinger
Thomas Howard Ray replied on Feb. 23, 2018 @ 12:04 GMT
Terry,
Perhaps because I think like a complex-systems scientist, I agree with your gold-panning metaphor -- applied to information. Information without waste and redundancy is efficient and useful. On the other hand, waste and redundancy are assets to creativity. The meaning that one assigns to information is a subjective judgement; it does not necessarily contain the requisite information to "be fruitful and multiply."
A priori meaning is that which precedes information, and continues without the user's knowledge. For example, Leslie Lamport said, "A distributed system is one in which the failure of a computer you didn't even know existed can render your own computer unusable." Unusable, not meaningless. For in the context of the system, rejected information is useful somewhere else in the system.
Thanks for bringing the dialogue to a higher level. It is most welcome.
Best,
Tom
Steven Andresen wrote on Feb. 22, 2018 @ 07:14 GMT
Dear Terry
If you are looking for another essay to read and rate in the final days of the contest, will you please consider mine? I read all essays from those who comment on my page, and if I can't rate an essay highly, then I don't rate it at all. In fact I haven't issued a rating lower than ten. So you have nothing to lose by having me read your essay, and everything to gain.
Beyond my essay’s introduction, I place a microscope on the subjects of universal complexity and natural forces. I do so within the context that clock operation is driven by quantum mechanical forces (atomic and photonic), while clocks also serve to measure general relativity’s effects (spacetime, time dilation). In this respect clocks can be said to possess a split personality, giving them the distinction that they are simultaneously a study in QM, while GR is a study of clocks. The situation stands whereby we have two fundamental theories of the world, but just one world. And we have a singular device which serves the study of both those fundamental theories. Two fundamental theories, but one device? Please join me and my essay in questioning this circumstance.
My essay goes on to identify natural forces in their universal roles: how they motivate the building and maintaining of complex universal structures and processes. When we look at how star fusion processes sit within a “narrow range of sensitivity”, such that stars are neither led to explode nor to collapse under gravity, we think how lucky we are that the universe is just so. We can also count our lucky stars that the fusion process that marks the birth of a star also leads to an eruption of photons from its surface. And again, how lucky we are! For if it didn’t, then gas accumulation wouldn’t be halted and the star would again be led to collapse.
Could a natural organisation principle have been responsible for fine-tuning universal systems? Faced with how lucky we appear to have been, shouldn’t we consider this possibility?
For our luck surely didn't run out there, for these photons stream down on Earth, liquifying oceans which drive the geochemical processes that we "life" are reliant upon. The Earth is made up of elements that possess the chemical potentials that life is entirely dependent upon. Those chemical potentials are not expressed in the absence of water solvency. So again, how amazingly fortunate we are that these chemical potentials exist in the first instance, and additionally within an environment of abundant water solvency such as Earth, able to express these potentials.
My essay attempts something audacious. It questions the fundamental nature of the interaction between space and matter, Gμν = Tμν, and hypothesizes that the equality between space curvature and atomic forces is due to a common process: space gives up a potential in exchange for atomic forces in a conversion process, which drives atomic activity. Furthermore, baryons only exist because this energy potential of space exists and is available for exploitation. Baryon characteristics and behaviours, and their complexity of structure and process, might then be explained in terms of being evolved and optimised for this purpose and existence, removing the need for so many layers of extraordinary luck to eventuate our own existence. The essay attempts an interpretation of the above-mentioned stellar processes in these terms, but also extends much further. It shines a light on the molecular structure that binds matter together as potentially being an evolved agency that enhances rigidity, and therefore the persistence of universal systems. We then turn a questioning mind towards Earth's unlikely geochemical processes (to which we living things owe so much) and look at their central theme and propensity for molecular rock-forming processes: the existence of chemical potentials and their diverse range of molecular bond-formation activities, and the abundance of water solvent on Earth, without which many geochemical rock-forming processes could not be expressed. The question of a watery Earth is then implicated as part of an evolved system that arose for purpose and reason, alongside the same reason and purpose for which molecular bonds and chemistry processes arose.
By identifying atomic forces as having their origin in space, we have identified how they perpetually act and deliver work products. Forces drive clocks, and clock activity is shown by GR to dilate. My essay details the principle of force dilation and applies it to a universal mystery. It raises the possibility that nature, in possession of a natural energy potential, will spontaneously generate a circumstance of Darwinian emergence. It did so on Earth, and perhaps it did so within a wider scope. We learnt how biology generates intricate structure and complexity, and now we learn how it might explain intricate structure and complexity within universal physical systems.
To steal a phrase from my essay “A world product of evolved optimization”.
Best of luck for the conclusion of the contest
Kind regards
Steven Andresen
Darwinian Universal Fundamental Origin
James Lee Hoover wrote on Feb. 23, 2018 @ 05:27 GMT
Terry,
Not sure what you mean:
a time to tear and a time to mend, a time to be silent and a time to speak,
My comments above did not apply to you, but basically to the contest in general.
Jim
Author Terry Bollinger wrote on Feb. 24, 2018 @ 00:07 GMT
All,
I just did an evaluation of Karl Coryat’s excellent essay The Four Pillars of Fundamentality. It is both funny and profound, and I recommend it highly!
For anyone interested, I once again inadvertently got “into the zone” while contemplating Karl’s Pillar #3 (Relations), resulting in another one of my on-the-fly mini-papers. This one addresses two topics: (a) the deep physics-level fundamentality of “relations”, which is the topic of Karl’s Pillar #4, and (b) a years-old space-as-entanglement idea from my personal physics notes.
I had not intended to present the space-as-entanglement idea here, but it just seemed too relevant. It is equivalent to a hugely simplified, non-holographic approach to constructing 3-space out of a direct 3D (not 4D) web of group-level entanglements. The entangled “unit of space” is an overlooked direction-only conjugate component of particle spin. Since these were just personal musings, I was genuinely surprised to find out that a lively community for exploring the idea that space is a form of 4D holographic entanglement has existed for years. My version is much simpler (3D), much more direct (just a web), and I think kind of fun to read as a mind-stretching exercise if nothing else!
Cheers,
Terry
Fundamental as Fewer Bits by Terry Bollinger (Essay 3099)
Essayist’s Rating Pledge by Terry Bollinger
Author Terry Bollinger wrote on Feb. 24, 2018 @ 01:19 GMT
All,
Another very well written essay that I must recommend is Marc Séguin’s Fundamentality Here, Fundamentality There, Fundamentality Everywhere.
It was one of my most enjoyable reads. It is lucid, learned, well-stated, well-ordered, addresses the topic in an interesting and engaging way, and has a sly self-deprecating sense of humor that had me chuckling multiple times. It is also spot-on for the question that FQXi asked this year.
On looking back at my assessment of Marc’s essay, it looks like I got a bit carried away again. This time the topic was the nature of qualia. That is the word for the internal sensations and emotions that you can bring up in your mind without external sensory inputs. Try it: close your eyes and imagine red and green lights, alternating. Those are qualia.
Notice that even though your optical system consistently maps the external light frequencies that we call red and green into the corresponding qualia in your head, the very fact that you can bring up the qualia without any external stimulation shows that all that is going on here is mapping: the light frequencies get mapped into those “somethings” in your head that you can also bring up from memory. For all you or I know, what red light brings up in my head might be what you would have called green. That sort of thing happens all the time for folks with synesthesia (which makes me jealous!).
So if you happen to have any interest in qualia, you can see what I wrote in my comments on Marc’s essay.
Cheers,
Terry
Fundamental as Fewer Bits by Terry Bollinger (Essay 3099)
Essayist’s Rating Pledge by Terry Bollinger
Member Marc Séguin replied on Feb. 24, 2018 @ 03:09 GMT
Dear Terry,
Thank you for the kind words about my essay! To keep the ball rolling, may I recommend another excellent essay,
"What if even the Theory of Everything isn’t fundamental" by Paul Bastiaansen
fqxi.org/community/forum/topic/3063
I too got carried away with my comments on his thread... I used, of course, your very helpful and honest "what I liked/what I liked less" approach, and even referred to your essay contestant pledge!
Cheers,
Marc
Author Terry Bollinger wrote on Feb. 24, 2018 @ 16:37 GMT
The Crowther Criteria for Fundamental Theories of Physics
The source for this consolidated and lightly edited list is the 2017 FQXi essay When do we stop digging? Conditions on a fundamental theory of physics, by Dr Karen Crowther of the University of Geneva. You can download her essay and read the discussion about it here:
https://fqxi.org/community/forum/topic/essay-download/3034/__details/Crowther_Crowther_-_when_do.pdf
To qualify under the Crowther Criteria, a fundamental theory of physics must be:
CC#1. Unified: It must address all of reality using a single set of self-consistent premises.
CC#2. Unique: It should be the only possible theory once its premises have been stated formally.
CC#3. UV complete: There should not exist any phenomena that are outside its formal scope.
CC#4. Non-perturbative: Its formalisms should be exactly solvable rather than merely perturbative.
CC#5. Internally self-consistent: It should be well-defined formally, and should not generate singularities.
CC#6. Scale smooth: Its explanation of reality should be continuous across all scales (levels) of space and time, with no gaps, overlaps, or other discontinuities.
CC#7. Fully generative: It requires no pre-existing fixed or “given” structures, such as space itself, that have complex and non-trivial properties.
CC#8. Natural: It should require no arbitrary, inexplicable “fine-tuning” of numeric parameters.
CC#9. Not weird: The underlying premises should be simple, easily comprehensible, and subject to Occam’s razor.
Thomas Howard Ray wrote on Feb. 24, 2018 @ 18:52 GMT
Terry,
If #7 were true, physics would have no foundational theories. Complex, non-trivial properties are often the result of dynamics with specified boundary conditions.
Paul Bastiaansen wrote on Feb. 24, 2018 @ 19:30 GMT
Dear Terry,
An original and daring idea, to define a numerical measure to answer the question ‘what is fundamental’. It is really interesting to literally view scientific theories as a concise way to represent measurement data.
I have a few comments. Your example of the decimals of pi nicely illustrates that an unambiguous measure of Kolmogorov complexity is not possible. The example doesn’t suffice, however, because the string of decimals is too short. I’m quite sure that in the space of 20 decimals, you cannot write a program that enumerates the decimals of pi. So probably the shortest way to represent your example string is the string itself (or a zipped variant). But of course, if you take a much longer string of decimals of pi, the example does work.
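Indeed: even a fairly compact pi-digit program runs to several hundred characters. As a rough sketch (my own illustration, using Machin's formula with scaled integer arithmetic; the function names are just for illustration):

def arctan_inv(x, scale):
    # arctan(1/x), scaled to an integer, via the alternating Gregory series.
    total, term, n, sign = 0, scale // x, 1, 1
    while term:
        total += sign * (term // n)
        term //= x * x
        n += 2
        sign = -sign
    return total

def pi_digits(ndigits):
    # Machin's formula: pi = 16*arctan(1/5) - 4*arctan(1/239).
    scale = 10 ** (ndigits + 10)  # ten guard digits absorb truncation error
    return str(16 * arctan_inv(5, scale) - 4 * arctan_inv(239, scale))[:ndigits]

print(pi_digits(20))  # 31415926535897932384

So for a 20-digit string the program loses to the literal string; only for much longer strings does the fixed program size win out.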
A more serious objection to your view of physics as information theory is that I would want to see arguments why it makes sense to view a physical theory as a concise way of reproducing data. This view misses the semantic part, the meaning of the theory. In practice, the question of how to check measurement data against the predictions of a theory is not a straightforward one. A lot of theory interpretation is needed to calculate what outcome the theory predicts for a certain measurement. This aspect is absent from your view.
Let me put it in a different way: if I understand you right, I can rephrase your claim that the most fundamental physical theory is somewhat like the best compression algorithm. Both are the shortest possible way to represent a set of data. But there is an important difference, because any scientific theory is a finite description of an infinite set of data, whereas the size of a compressed set of data still scales with the size of the original data. This indicates that there is an important difference between the two.
In the end, I tend to think that how fundamental a theory is, is not a concept that can be given a numerical measure. But I admit that the idea is really interesting.
Let me read up on the Spekkens principle, because I don’t think I understand it. And I really like the three challenges you conclude with. I must confess, as a physicist, that I never appreciated the mysteries around spin and the difference between fermions and bosons. My bad, because indeed this must be profound.
All the best,
Paul Bastiaansen
Author Terry Bollinger wrote on Feb. 24, 2018 @ 22:34 GMT
All,
I’m introducing a new FQXi process idea here, which is this: I want to create a new format for capturing important essay contest conversations in a more explicit, more accessible form that makes them easier to cite and reference.
Specifically, I will be posting for reference a number of supplemental "mini-essays" that capture, clean up, and document some of the particularly interesting ideas that have emerged from what have been for me very stimulating interactions with other essays and their authors. My goal is to make these synergistic outcomes more explicit and easier to reference in the future. Putting aside the competitive aspects of the FQXi Essay contests, I would judge that the greatest value of these contests emerges instead from the interactions between essay authors. Our essays are far more valuable as an interactive whole than they are if viewed only in isolation.
The Crowther Criteria posting is my first example of such a mini-essay, though it is more an example of a reference summary than a mini-essay. For any of the mini-essays I post, please feel free to add your thoughts with (preferably) a reply-to at that posting. Note however that in the case of the Crowther Criteria I'm just trying to capture her ideas in a simple format. So if you want to debate any of the points in her list, you should go to her essay thread rather than mine.
I’m posting the Crowther Criteria in part because I need them as a reference for a rather unusual mini-essay that I will be posting soon. To be frank, that mini-essay ends up directly contradicting her CC#4. I didn't expect to wind up there, but such an unexpected journey is worth documenting!
Each time I post a mini-essay I will try quickly (I was slow this time) to post a short addendum that provides links to the mini-essay. Below is my link addendum for the Crowther Criteria post.
--------------------
To link to the mini-essay titled:
The Crowther Criteria for Fundamental Theories of Physics… Please copy and paste either the named link above or the direct URL below:
https://fqxi.org/community/forum/topic/3099#post_145551
Wayne R Lundberg replied on Mar. 3, 2018 @ 19:21 GMT
Terry, while I liked her essay and criteria a lot, I'm sure there are more of them than really necessary, especially when you consider that the mathematical uniqueness criterion can only be fulfilled by a cosmology with 11 dimensions, one of which is a cyclic variable. I don't know of any other besides my own theta-mass-time.
WRL
Gordon Watson wrote on Feb. 24, 2018 @ 22:45 GMT
Terry, some quick short notes as I work my way to your essay:
1. FQXi Essay Contestant Pledge = Suggested FQXi Voting Pledge
Your Pledge is so refreshing that I've hot-linked it above. The LHS wording of the title is yours; to me, it reads as "official" and is thus too hopeful (for now). The RHS is my suggested edit as we work with FQXi to improve things!
2. Under current circumstances, my own position is clear:
(i) As an independent researcher, I'm here to discuss, learn, teach, debate, respond to every question, critique others, etc. Result = fail; e.g., next-to-no questions, few responses.
(ii) I'm not here for the votes. Result = just as well; e.g., given a 0 without explanation, how can I learn, respond, correct, defend, revise, acknowledge, etc.?
3. While we await (with many others) FQXi improvements, why don't we develop an OPEN voting system? Add to your Pledge a (say, for argument's sake) 5-category [each numbered; #1-5] scoring sheet [maximum vote per category = 2??] with space for explanations, plus an identifier (say, for you, a hot-linked Terry Bollinger [or with hot-linked email addresses also allowed]) so that we ALWAYS get an alert -- with easy-return access. [You get the idea.]
Recipient can respond to Terry Bollinger#2, for all to see: thus promoting open learning, debate, progress, support for one view or the other, or a middle view, etc. Given the teaching/learning, who then here, as a serious researcher, would focus on "fake-scores"?
The advantage of this OPEN proposal is that you, with your background, could lead us to something truly useful and actionable within the current rules: a worthwhile experiment, ready for the next "contest" (surely the wrong word here) -- which FQXi can monitor before refining (if need be) and accepting as the new gold standard in OPEN teaching/learning/essay exchange, etc., ready for the next + 1 "contest"!
4. To your (for me) excellent essay:
(i) I counted 8 important fundamental symbols in Challenge #1.
(ii) Re Challenge #2: in my [hurried] essay, see hot-linked Reference [12], p. 639! It's part of my theory.
(iii) NB: Your editorial red pen will be very welcome there at any time; hopefully after you've read [in the first thread] the background to my theory (which dates from 1989).
(iv) Maybe, with hard work and insight, you might just become the person who finds a hidden gemstone of simplicity by unravelling the threads of misunderstanding that for decades have kept it hidden.
PS: Terry, if/when you reply to my post (at any time), please copy it to my essay thread so that I'm alerted to it. I will do likewise.
Enough (for now): With many thanks and much appreciation for your lovely work;
Gordon Watson
More realistic fundamentals: quantum theory from one premiss.
Author Terry Bollinger wrote on Feb. 24, 2018 @ 22:49 GMT
… and of course, not having a line return at the end of the text URL caused the FQXi software to invalidate the URL by placing its final digit on a separate line. I’m pretty sure that did not show up in the preview, but maybe I just didn’t notice it. Tsk, why didn’t I anticipate such an obvious bug in advance?... :)
Trying again:
--------------------
To link to the mini-essay titled:
The Crowther Criteria for Fundamental Theories of Physics… Please copy and paste either the named link above or the direct URL below:
https://fqxi.org/community/forum/topic/3099#post_145551
Author Terry Bollinger wrote on Feb. 24, 2018 @ 22:56 GMT
AAAAARRRRRGGGGGHHHHH!!!!! EVEN IF I WIN THIS IS NOT WORTH $10,000!!
--------------------
To link to the mini-essay titled:
The Crowther Criteria for Fundamental Theories of Physics… Please copy and paste either the named link above or the direct URL below:
https://fqxi.org/community/forum/topic/3099#post_145551
Mary had a little lamb, I hope it eats the bug…
Author Terry Bollinger wrote on Feb. 24, 2018 @ 23:03 GMT
(Why THANK YOU, nice uniformed people that my wife just called in! Yes, I would just LOVE to wear that nice white jacket to help keep my arms from spontaneously beating my own head! Just be sure to send the bill to FQXi!)
Thomas Howard Ray replied on Feb. 25, 2018 @ 14:23 GMT
You're new here, aren't you? :-)
Being also retired from the DoD, you must have experienced the difficulties making legacy software work with rapidly advancing technology and updates to Windows. You're suffering from PTSD.
The world is catching up with us, Terry.
Author Terry Bollinger replied on Feb. 25, 2018 @ 22:38 GMT
Oh yes indeedy! The stories either of us could tell... DISA alone...
But in recent years I had the true privilege of working almost entirely with (a) Leading-edge commercial tech (I saw Google's Earth tech before Google owned it, and some amazing drones long before anyone had them at home); and (b) AI and robotics research. In short, I got spoiled!
Author Terry Bollinger wrote on Feb. 25, 2018 @ 22:43 GMT
Fundamental as (Literally) Finding the Cusp of Meaning
Terry Bollinger, 2018-02-25
NOTE: The purpose of a mini-essay is to capture some idea, approach, or even a prototype theory that resulted from idea sharing by FQXi Essay contestants. This mini-essay was inspired primarily by two essays:
The Perception of Order by Noson S Yanofsky
The Laws of Physics by Kevin H Knuth
Relevant quotes:
Yanofsky (in a posting question): “I was wondering about the relationship between Kolmogorov Complexity and Occam's razor? Do simpler things really have lower KC?”
Knuth: “Today many people make a distinction between situations which are determined or derivable versus those which are accidental or contingent. Unfortunately, the distinction is not as obvious as one might expect or hope.”
Bollinger: “…the more broadly a pattern is found in diverse types of data, the more likely it is to be attached deeply within the infrastructure behind that data. Thus words in Europe lead ‘only’ back to Proto-Indo-European, while the spectral signatures of elements on the other side of the visible universe lead all the way back to the shared particle and space physics of our universe. In many ways, what we really seem to be doing there is (as you note) not so much looking for ‘laws’ as looking for points of shared origins in space and time of such patterns.”
Messages, Senders, Receivers, and Meaning
All variations of information theory include not just the concept of a message, but also of a sender who creates that message, and of a receiver who receives it. The sender and receiver share a very special relationship: they both understand the structure of the message in a way that assigns to it yet another distinct concept, that of meaning.
Meaning is the ability to take specific, directed (by the sender) action as the result of receiving the message. Meaning, also called semantics, should never be confused with the message itself, for two reasons. The first is that a message in isolation is nothing more than a meaningless string of bits or other characters. In fact, if the message has been fully optimized — that is, if it is near its Kolmogorov minimum — it will look like random noise (the physical incarnation of entropy) to any observer other than the sender and receiver. The second is that the relationship between messages and meaning is highly variable. Depending on how well the sender and receiver “understand” each other, the same meaning can be invoked by messages that vary wildly in length.
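That “looks like random noise” claim is easy to check empirically. Here is a small sketch (mine; the word list is arbitrary and the function name is just for illustration) that measures single-byte Shannon entropy before and after compression:

import math
import random
import zlib
from collections import Counter

def bits_per_byte(data):
    # Empirical Shannon entropy in bits per byte; 8.0 means the bytes are
    # indistinguishable from random noise at the single-byte level.
    counts, n = Counter(data), len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

words = ["cusp", "protocol", "meaning", "message", "sender", "receiver",
         "photon", "electron", "quasar", "carbon", "helium", "entropy"]
message = " ".join(random.choice(words) for _ in range(50_000)).encode()

print(bits_per_byte(message))                 # ~4 bits/byte: redundant text
print(bits_per_byte(zlib.compress(message)))  # ~8 bits/byte: noise-like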
Message-length variability is a common phenomenon in human relationships. Couples who have lived together for decades can often convey complex meaning by doing nothing more than subtly raising an eyebrow in a particular situation. The very same couple in the distant past might well have argued (exchanged messages) for an hour before reaching the same shared perspective. Meaning and messages are not the same thing!
But the main question here is this: What makes the sender and receiver so special?
That is, how does it come to be that they alone can look at a sequence of what looks like random bits or characters, and from it implement meaning, such as real-world outcomes in which exquisitely coordinated movements by the sender and receiver accomplish joint goals that neither could have accomplished on their own?
In short: How does meaning, that is, the ability to take actions that forever alter the futures of worlds both physical and abstract, come to be attached to a specific subset of all the possible random bit or character strings that could exist?
Information Theory at the Meta Level
The answer to how senders and receivers assign meaning to messages is that at some earlier time they received an earlier set of messages that dealt specifically with how to interpret this much later set of messages. Technologists call such earlier deployments of message-interpretation messages protocols, but that is just one name for them. Linguists, for example, call such shared protocols languages. Couples who have been together for many years just call their highly custom, unique, and exceptionally powerful set of protocols understanding each other.
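A toy sketch of that relationship (mine, and deliberately trivial): once the codebook, that is, the earlier and longer "protocol" message, has been shared, a single character suffices to direct action:

# The earlier "protocol" message: a codebook telling the receiver how
# all later one-character messages are to be interpreted.
protocol = {"0": "attack at dawn", "1": "hold position", "2": "retreat"}

def receive(message, codebook):
    # Alone, "2" is a meaningless character; only the previously shared
    # codebook turns it into a directed action.
    return codebook[message]

print(receive("2", protocol))  # retreat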
But it doesn’t stop there. Physicists also uncover and identify shared protocols, protocols that they had no part in creating. They have, however, slowly learned how to interpret some of them, and so can now read some of the messages that these shared protocols enable. Physicists call such literally universal protocols the “laws” of physics, and use them to receive messages from the other side of the universe. For example, these shared protocols enable us to look at the lines in light spectra and, amazingly, discern how the same elements that we see on Earth can also be entrained within the star-dismantling heat and power of a quasar polar plasma jet billions of light years distant in both space and time.
Protocols as Meaning Enablers
While the word “protocol” has a mundane connotation as the rules and regulations by which either people or electronic equipment interact in clear, understandable ways (share information), I would like to elevate the stature of this excellent word by asserting that at the meta-level at which all forms of information theory first acquire their senders and receivers, a protocol is a meaning enabler. That is, to create and distribute a protocol is to create meaning. Protocols enable previously isolated components of the universe, at any scale from fundamental particles to light from distant quasars, to adopt new sets of coordinated, “future selective” behaviors that no longer leave the future entirely open to random chance. This in turn means that the more widely a protocol is distributed and used, the “smarter” the universe as a whole becomes. The enhancements can vary enormously in scale and scope, from the tiny sound-like handshakes that enable electrons to pair up and create superconductive materials, through the meaning exchanged by an aging couple, up to scales that are quite literally universal, such as the shared properties of electrons. The fact that those shared electron properties define a protocol can be seen by imagining what would happen if electrons on the other side of the universe did not have the same quantum numbers and properties as the electrons we know. The protocol would be broken, and the light that we see would no longer contain a message that we understand.
Historically, such protocol deficiencies, that is, a lack or misunderstanding of the protocols that enable us to assign meaning to data, are the norm rather than the exception. Even in the case I mentioned earlier of how the electrons-photons-and-elements protocol enabled us to know what elements are in a quasar on the other side of the universe, there was a time in the 1800s when scientists mourned that we would never be able to know the composition of distant stars, which by that time they had realized were forever unreachable by any means of transportation they could envision. It was not until the electrons-photons-and-elements protocol was deciphered that the availability of this amazing information became known.
And even then, that new information created its own mysteries! The element helium should have been named “helion” had it been known on Earth at the time of its discovery in solar spectra. That is because “-ium” indicates a metal (e.g. titanium), while “-on” indicates a gas (e.g. argon). In this case the newly uncovered electron-photon-element protocol sent us a message we did not yet understand!
Many more such messages are still awaiting protocols, with biology, especially at the biochemical level, being a huge and profound area in need of more protocols, of more ways to interpret with meaning the data we see. Thus, for example, despite our having successfully unearthed the protocol for how DNA codes amino acids and proteins at the connection level, we remain woefully lacking in protocols for understanding how the non-protein components of DNA really work, or even how those amino acids, once strung together, almost magically fold themselves into a working protein.
Naturally Occurring Protocols
To understand the full importance of protocols, however, it is vital, as Kevin Knuth strongly advocates in his essay, that we get away from the human-centric view that calls such discoveries of meaning “laws” in the human sense. In particular, the emergence and expansion of meaning-imbuing protocols is not limited to relationships between humans (the aging couple), nor does it end with human-only receivers (we blew it for helion). The largest and most extensive protocols exist entirely independently of humans, in domains that include physics and especially biology.
In the case of physics, the protocols that count most are the shared properties and allowed operations on those properties that enable matter and energy to interact in a huge variety of extremely interesting, and frankly bizarrely unlikely, ways. Kevin Knuth dives into some of these anthropic issues in his essay, primarily to point out how remarkable and, at this time at least, inexplicable they are. But in any case they exist, almost literally like fine-tuned machinery custom made to enable still more protocols, and thus still more meaning, to emerge over time.
The First Open-Ended Protocol: Biochemistry
Chemistry is one such protocol, with carbon-based biochemistry as an example in which the layering of protocols — the emergence of compounds and processes whose very existence depends on earlier protocols, such as proteins out of amino acids — is essentially unlimited.
It is flatly incorrect to view computer software and networks as the first example of open-ended protocols that can be layered to create higher and higher levels of meaning. The first example of truly open-ended protocols capable of supporting almost unlimited increases in meaning was the remarkable cluster of basic protocols centered around the element carbon. Those elemental protocols — their subtleties include far more than just carbon, though carbon is literally the “backbone” upon which the higher-level protocols obtain the stability they require to exist at all — enabled the emergence of layer upon layer of chemical compounds of increasing complexity and sophistication. As exploited by life in particular, these compounds grow so complex that they qualify as exceptionally powerful machines capable of mechanical action (cutting and splicing DNA), energy conversion (photosynthesis), lens-like quantum calculation (chlorophyll complexes), and information storage and replication (DNA again).
Each of these increasingly complex chemical machines also enables new capabilities, which in turn enable new, more sophisticated protocols, that is, new ways of interpreting other chemicals as messages. This interplay can become quite profound, and it has the same ability to “shorten” messages that is seen in human computer networking. Fruit, for example, responds to the gas ethylene by ripening faster, a protocol that arose to produce enticing (at first!) smells to attract seed-spreading animals. The brevity of the message, the shortness of the ethylene molecule, is a pragmatic customization by plants to enable easy spreading of the message.
Humans do this also. When after an extended effort (think of Yoda after lifting Luke Skywalker’s spaceship out of the swamp) we inhale deeply through our nose, we are self-dosing with the two-atom vasodilator nitric oxide, which our nasal cavities generate slowly over time for just such purposes.
Cones Using Shared Protocols (Cusps)
To understand Kevin Knuth’s main message, it’s time to take this idea of protocols to the level of physics, where it recursively becomes a fundamental assertion about the nature of fundamental assertions.
Minkowski, the former professor of Albert Einstein who more than anyone else created the geometric interpretation of Einstein’s originally algebraic work, invented the four-dimensional concept of the light cone to describe the maximum limits for how mass, energy, and information spread out over time. A 4D light “cone” does not look like a cone to our 3D-limited human senses. Instead, it appears as a ball of included space whose spherical surface expands outward at the speed of light. Everything within that expanding ball has potential access to — that is, detailed information about — whatever event created that particular cone. The origin of the light cone becomes the cusp of an expanding region that can share all or some subset of the information first generated at that cusp. Note that the cusp itself has a definite location in both space and time, and so qualifies as a well-defined event in spacetime, to use relativistic terminology.
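In standard textbook notation (my gloss, not Minkowski's own wording), the future light cone of a cusp event $(t_0, \mathbf{x}_0)$ is the set
$$ C^{+}(t_0, \mathbf{x}_0) \;=\; \left\{ (t, \mathbf{x}) : \lVert \mathbf{x} - \mathbf{x}_0 \rVert \le c\,(t - t_0),\; t \ge t_0 \right\}, $$
whose spatial cross-section at each later instant $t$ is exactly the ball of radius $c\,(t - t_0)$ described above.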
Protocols are a form of shared information, and so form a subset of the types of information that can be shared by such light cones. The cusp of the light cone becomes the origin of the protocol, the very first location at which it exists. From there it spreads at speeds limited by the speed of light, though most protocols are far less ambitious and travel only slowly. But regardless of how quickly or ubiquitously a new protocol spreads, it must always have a cusp, an origin, an event in spacetime at which it comes into being and thereby creates new meaning within the universe. Whether that meaning is trivial, momentous, weak, powerful, inaccurate, or spot-on remains to be determined, but in general it is the protocols that enable better manipulations of the future that will tend to survive. Meaning grows, with stronger meanings competing against and generally overcoming weaker ones, though as in any ecology the final outcomes are never fixed or certain. The competitive multi-scale ecosystem of meaning, the self-selection of protocols as they vie for receivers who will act upon the messages they enable, is a fascinating topic in itself, but one for some other place and time.
In an intentional double entendre, I call these regions of protocol enablement via the earlier spread of protocols within a light cone “cones using shared protocols”, or cusps. (I hate all-cap acronyms, don’t you?) A protocol cusp is thus both the entire region of spacetime over which the protocol applies or is available, and the point in spacetime — the time and location — at which the protocol originated.
Levels of Fundamentality as Depths of Protocol Cusps
And that is where Kevin Knuth’s focus on the locality and contingency of many “fundamental” laws comes into play. What we call “laws” are really just instances where we are speculating, with varying levels of confidence, that certain repeated patterns are messages with a protocol that we hope will give them meaning.
Such speculations can of course be incorrect. However, in some instances they prove to be valid, at least to the degree that the data can demonstrate. Thus the existence of the Indo-European language group was at first just a speculation, but one that proved remarkably effective at interpreting words in many languages. From it, the cusp or origin of this truly massive “protocol” for human communications was given a name: Proto-Indo-European. The location of this protocol cusp in space was most likely the Pontic-Caspian steppe of Eastern Europe, and the time was somewhere between 4,500 BCE and 2,500 BCE.
Alphabets have cusps. One of the most amazing and precisely located examples is the Korean phonetic alphabet, the Hangul, which was created in the 1400s by Sejong the Great. It is a truly masterful work, one of the best and most accessible phonetic alphabets ever created.
Life is full of cusps! One of the earliest and most critical cusps was also one of the simplest: the binary choice between the left and right chiral (mirror-image) subsets of amino acids, literally to prevent confusion as proteins are constructed from them. Once this choice was made it became irrevocable for the entire future history of life, since any organism that went against it faced instant starvation. Even predators cooperate in such situations. The time and place of this cusp remain a deep mystery, one which some (the panspermia hypothesis) would assign to some other part of the galaxy.
The coding of amino acids by DNA is another incredibly important protocol, one whose features are more easily comparable to the modern communications network concept of a protocol. The DNA-amino protocol is shared with minor deviations by all forms of life, and is very sophisticated. It has been shown to perform superbly at preventing the vast majority of DNA mutations from damaging the corresponding proteins. The odds of that property popping up randomly in the DNA-to-amino-acid translation mechanism are roughly one million to one. I recall from as recently as my college years reading works that disdained this encoding as random and an example of the “stupidity” of nature. It is not, though its existence does provide a proof of how easily stupidity can arise, especially when accompanied by arrogance.
The Bottom Line for Fundamentality
In terms of Kevin Knuth’s concepts of contingency and context for “fundamental” laws (protocols) and rules, the bottom line in all of this is surprisingly simple:
The fundamentality of a “law” (protocol for extracting meaning from data) depends on two factors: (1) How far back in time its cusp (origin) resides, and (2) how broadly the protocol is used.
Thus the reason physics gets billed so often as having the most fundamental rules and “laws” is that its cusp dates to the same time as the universe itself, presumably the big bang, and that its protocols are so widely and deeply embedded that they enable us to “read” messages from the other side of the universe.
Nearly all other protocol cusps, including those of life, are of a more recent vintage. But as Kevin Knuth points out in his essay, and as I gave examples of through the very existence of physics-enabled open protocols in biochemistry, deeper anthropic mysteries remain afoot: strictly in terms of what we can see, the nominally “random” laws of physics were in fact direct predecessor steps necessary for life to begin creating its own upward-moving layers of protocols and increased meaning.
It was a huge mistake to think that DNA-to-amino coding was “random.”
And even if we haven’t a clue why yet, it is likely also a huge mistake to assume that the protocols of physics just happen to lead so directly and perfectly into the protocols of life. We just do not understand yet what is going on there, and we likely need to do a better job of fully acknowledging this deeply mysterious continuity before we can make any real progress in resolving it.
Anonymous replied on Feb. 26, 2018 @ 02:21 GMT
Hi Terry,
I read your mini-essay and like it.
I consider such mini-essays and / or addenda as very helpful - after one has read dozens of different essays with different ideas and at least I would need a somewhat more compact summary of the main ideas of the many different authors.
A couple of thoughts about your mini-essay:
'Protocols' sounds like a rather mechanical term to catch the distinction between message and meaning. It is really a big puzzle how 'meaning' can arise from rather mechanical processes. 'Meaning' traditionally is connected to awareness of the orderedness of the external reality - and additionally the orderedness of the internal reality of a subject that is capable of being aware of something. With this, the circle of meaning is closed. I suspect that 'meaning' is a tautology somewhat similar to the one I describe in my own addendum to my essay: meaning self-confirms itself in the same manner as my purported idea of fundamental truths does.
I think you are totally on the right track to suspect that 'meaning' has exactly the meaning we ascribe to it: by finding some meaning in nature, we find a certain truth that speaks to us through nature. By finding some meaning that we have epistemologically facilitated by means of our preference for emotionally concluding something, we may gain some truth or some falsehood about this 'something'. In summary: whereas meaning about the external reality is more likely to be stable and to point to some objective truths, with the meaning of more subjective conclusions about very specific circumstances that do not really justify making a general rule out of them, we are more in danger of concluding something that could be objectively false or at least incomplete.
Your example with the aging couple is to the point, since it shows that the problem of subjective conclusions and their real meaning is solved over time by compressing the message as far as possible: raising an eyebrow then has a very precise meaning, regardless of whether the couple loves one another or is in permanent confrontation. In either case one's emotions are perfectly understood by the other via the compressed message, which has a well-suited meaning for the couple.
Interestingly this could be a complementary example of 'internalizing some external reality as a model, as a set of symbols', as is done by modelling some of the brain's perceptual abilities in order to understand the latter in information-theoretic terms. The raised eyebrow does the complementary thing: it *externalizes* not a model, but a precise emotional state, by means of a compressed and very specific symbol / action. Together with the model one has of the emotional landscape of the partner, one can even reliably deduce how to further interpret the raising of the eyebrow, since the latter can be interpreted in general as disliking something, and the internal model can further specify what the disliking is specifically all about in the actual situation.
Another interesting aspect of protocols, it seems to me, is that they limit or exclude other possibilities. This is what we all want to achieve by searching for some more fundamental level of nature. Limiting the options that are left makes it easier to determine the more fundamental level.
Just a couple of thoughts :-)
Best wishes,
Stefan Weckbach
Conrad Dale Johnson replied on Feb. 26, 2018 @ 15:18 GMT
Terry –
That was wonderfully clear and readable, not to mention vast in scope - an excellent summary of what I think are the key issues here. I agree with pretty much everything, except – there’s a basic missing piece to your concept of meaning. Naturally, it happens to be what I’ve been trying to articulate in my essays.
You write, “To create and distribute a protocol is to create meaning.” This describes the aspect of information-processing that’s well-understood: data gets transferred from sender to receiver and decoded through shared protocols – a very good term for the whole range from laws of physics to human philosophies. But this concept of meaning takes it for granted that the underlying data is distinguishable: that there are physical contexts – for both sender and receiver – in which the 1’s and 0’s (or any of the many different kinds of information that actually constitute the physical world) make an observable difference.
This is hard not to take for granted, I know – both because such contexts are literally everywhere we look, and because it’s very difficult to describe them in general terms. But I’ve argued both on logical grounds and empirically, from “fine-tuning”, that it takes an extremely special kind of universe to make any kind of information physically distinguishable.
The physical world is essentially a recursive system in which information that’s distinguished (measured) in one context gets communicated out to help set up other contexts, to distinguish more information. Quite a number of distinct protocols are apparently needed to make this work, and I’ve tried to sort some of them out in my current essay, to suggest how they might have emerged. In my 2017 essay I compared the way this system works with the other two basic recursive systems that make up our world, biological evolution and human communication.
Regarding biological and human systems, you’re right that there’s “natural selection” for meanings that “enable better manipulations of the future.” But while this also applies to the evolution of human theories about the physical world, I don’t think it’s quite right for the generation of meaning in the physical world itself. Rather, the meanings that get selected are the ones that keep on enabling the future itself – that is, that constantly set up new situations in which the same protocol-system can operate to create new meaning.
I don’t mean to detract at all from your remarkable mini-essay – I give it a 10. But please fix your next-to-last sentence. I think you mean that it’s a mistake to suppose the protocols of physics just happen to support the protocols of life. That’s a complex issue… that can’t become clear, I think, until we have some idea where the protocols of physics come from.
Thanks for your many eye-opening contributions to this contest – once again, I’m in awe.
Conrad
Author Terry Bollinger wrote on Feb. 25, 2018 @ 22:53 GMT
To link to the mini-essay titled:
Fundamental as (Literally) Finding the Cusp of Meaning
…please copy and paste either the named link above or the direct URL below (beware of possible line breaks in the URL):
https://fqxi.org/community/forum/topic/3099#post_145761
Author Terry Bollinger wrote on Feb. 26, 2018 @ 00:28 GMT
FQXi Essay Contestant Pledge
Author: Terry Bollinger. Version 1.3, 2018-02-15
----------------------------------------
When evaluating essays from other FQXi Contest participants, I pledge that I will rate and comment on essays based only on the following criteria:
-- My best, most accurate judgement of the quality of the essay, without regard to how my ratings and comments on that essay could affect my own contest status.
-- How well the essay makes its argument to back up its answer.
-- How accurately and reliably an essay uses reference materials.
-- How focused the essay is on answering the question as posed and intended by FQXi. (This is secondary to the criteria above.)
Furthermore, I will consciously strive to:
-- Avoid rating an essay low just because it has a novel approach.
-- Avoid rating an essay low because I disagree with its answer. Instead, I will focus on how well the essay argues for that answer.
-- Avoid rating an essay high solely because I like its conclusion. Even if I agree, my rating will reflect the overall essay quality.
-- Avoid ratings inflation. If an essay does very poorly at arguing its conclusion, I pledge to give it the appropriate low rating, versus an inflated “just being nice” number such as a 5 or 6.
-- Avoid reprisal behavior. I pledge that I will never knowingly assign unfair point ratings or make false comments about another essay as a form of reprisal against another contestant who gave my essay low ratings or negative comments.
-- Avoid rudeness towards other contestants. If other contestants become abusive, I will appeal to FQXi to intervene, rather than attempt to respond in kind on my own.
Author Terry Bollinger wrote on Feb. 26, 2018 @ 00:33 GMT
To link to the above mini-essay, please copy and paste either of the following links:
FQXi Essay Contestant Pledge
https://fqxi.org/community/forum/topic/3099#post_145779
Gordon Watson wrote on Feb. 26, 2018 @ 10:31 GMT
Terry, I'm copying my "voting suggestions, etc" above to my essay-thread. I'm hoping to get others involved.
So that I'm alerted, please post a note there if/when you reply.
Thanks; Gordon
More realistic fundamentals: quantum theory from one premiss.
Author Terry Bollinger replied on Feb. 26, 2018 @ 12:56 GMT
Gordon,
Thank you for supporting the Pledge!
Your title is intriguing; look at my signature line and its single-concept definition of QM and you can see why. My queue on this last day is long, but I will follow your link and take a look at your essay.
Cheers,
Terry
Fundamental as Fewer Bits by Terry Bollinger (Essay 3099)
Essayist’s Rating Pledge by Terry Bollinger
"Quantum mechanics is simpler than most people realize. It is no more and no less than the physics of things for which history has not yet been written."
Author Terry Bollinger replied on Feb. 26, 2018 @ 13:29 GMT
Gordon,
Wow! That is one of the best arguments for locality that I think I’ve seen. I like your Bell-ish style of writing and focus on specifics. You are of course in very good company, since both Einstein and Bell were localists.
I can’t do a detailed assessment today — too many equations that would need careful examination to assess your argument meaningfully — but what I’ve seen at a quick look seems pretty solid.
That said, there is an expanding class of pro-entanglement data anomalies that you need somehow to take into account:
ID230 Infrared Single-Photon Detector Hybrid Gated and Free-Running InGaAs/InP Photon Counter with Extremely Low Dark Count
This field has moved way beyond the Aspect studies. A lot of hard-nosed business folks figured out years ago that arguments against the existence of entanglement don’t matter much if they can simply build devices that violate Bell’s inequality. Which they did, and now they sell them to some very smart, physics-savvy customers who use them on a daily basis to encrypt some critical data transmissions. Many of these customers would be, shall we say, upset in interesting ways if some company sold them equipment that did not work.
Again, thanks for a well-argued essay! I’ll try (no promises though) to take a closer look at your essay at some later (post-commenting-close) date. Again assuming the equations are solid, yours is the kind of in-depth analysis needed to sharpen everyone’s thinking about such topics.
Cheers,
Terry
Gordon Watson replied on Mar. 1, 2018 @ 05:12 GMT
REPOSTED TO CORRECT FORMATTING ERROR NOT PRESENT IN PREVIEW! Adding: my comments below are to minimize some apparent misunderstandings.
Terry: NB: your time is valuable to me, so no need to rush! Seeking to minimize misunderstandings from the get-go, your comments follow below [with some editing for efficiency], with some bolding for clarity (and sometimes emphasis).
TB: "Your title is intriguing; look at my signature line and its single-concept definition of QM and you can see why."
GW: Here it is: "(i) Quantum mechanics is simpler than most people realise. (ii) It is no more and no less than the physics of things for which history has not yet been written."
We agree: 'Quantum mechanics is simpler than most people realise.' I would add: It's little more than an advanced [and experimentally-supported] probability/prevalence theory. But please, for me, translate your 2nd sentence (ii) into a few more words: "(ii) It is no more and no less than the physics of things for which history has not yet been written = ..." ??
TB: "My queue on this last day is long."
GW: Rightly so! But (NB) the threads can remain open for years!!
TB: "But I will follow your link and a look at your essay."
GW: Please take your time with the essay and communicate directly by email (it's in the essay) when you have difficulties; especially if you're rusty with delta-functions in ¶13. I am here for critical feedback and questions, etc. And I cannot be offended.
TB: "Wow! That is one of the best arguments for locality that I think I’ve seen. I like your Bell-ish style of writing and focus on specifics."
GW: Tks.
TB: "You are of course in very good company, since Einstein was a localist."
GW: Yes; without doubt!
TB: "And Bell was a localist."
GW: ??? Not from my readings! For me, a true localist would have reviewed his theorem and spotted the error. Further, here's Bell's dilemma from as late as 1990:
‘I cannot say that AAD is required in physics. I can say that you cannot get away with no AAD. You cannot separate off what happens in one place and what happens in another. Somehow they have to be described and explained jointly. That’s the fact of the situation; Einstein's program fails ... Maybe we have to learn to accept not so much AAD, but the inadequacy of no AAD. ... That's the dilemma. We are led by analyzing this situation to admit that, somehow, distant things are connected, or at least not disconnected. ... I don't know any conception of locality that works with QM. So I think we're stuck with nonlocality ... I step back from asserting that there is AAD and I say only that you cannot get away with locality. You cannot explain things by events in their neighbourhood. But, I am careful not to assert that there is AAD,' after Bell* (1990:5-13); emphasis added.
*Bell, J. S. (1990). “Indeterminism and nonlocality.” Transcript of 22 January 1990, CERN, Geneva. In Driessen, A. & A. Suarez (1997), Mathematical Undecidability, Quantum Nonlocality and the Question of the Existence of God, 83-100.
TB: "I can’t do a detailed assessment today — too many equations that would need careful examination to assess your argument meaningfully — but what I’ve seen at a quick look seems pretty solid."
GW: PLEASE: Do not get bogged down; send me emails when you have difficulties. For me, your time is precious!
TB: That said, there is an expanding class of pro-entanglement data anomalies that you need somehow to take into account:
ID230 Infrared Single-Photon Detector Hybrid Gated and Free-Running InGaAs/InP Photon Counter with Extremely Low Dark Count
GW: Terry: My theory expects "entanglement" to be strengthened with better equipment; and you [thankfully] next supply the supporting evidence!
TB: "This field has moved way beyond the Aspect studies. A lot of hard-nosed business folks figured out years ago that arguments against the existence of entanglement don’t matter much if they can simply build devices that violate Bell’s inequality. Which they did, and now they sell them to some very smart, physics-savvy customers who use them on a daily basis to encrypt some critical data transmissions."
GW: We agree, 100%.
TB: "Many of these customers would be, shall we say, upset in interesting ways if some company sold them equipment that did not work."
GW: NBB Why wouldn't it work? My theory would be kaput if it didn't!
TB: "Again, thanks for a well-argued essay! I’ll try (no promises though) to take a closer look at your essay at some later (post-commenting-close) date. Again assuming the equations are solid, yours is the kind of in-depth analysis needed to sharpen everyone’s thinking about such topics."
GW: Please take your time; every word of criticism is like a kiss from my wife.
Tingling in anticipation; with my thanks again; Gordon
More realistic fundamentals: quantum theory from one premiss.
Author Terry Bollinger replied on Mar. 5, 2018 @ 04:38 GMT
Gordon,
Good comments, wow. I've had some difficulty (external factors) getting back to my queue, and this is not a complete reply. But two quick items:
-- When I say "QM is the physics of that for which history has not yet been written," probably the best way to explain it is Feynman's integral-of-all-possible-histories QED concept. What that concept says is remarkably simple: If you track
every possible way that an event could happen from its start to all points in the future that could be touched by that event, then add them all together using particle phase along those paths, you end up (voila!) with, well, the quantum wave function for the event. The paths whose phases match up reinforce each other, and give the highest probability outcomes.
That is, every wave function can also be interpreted as a "bundle" of possible histories, but only if the wave function has not yet been "collapsed". And by "collapse", I really mean only this: you poke the wave function hard enough to force it to say which of those many possible histories has to produce an actual event or particle. Extracting such information, such as by letting an electron wave function hit a photodetector, creates history. There really is no meaningful distinction between the two: information is history.
In most presentations the "history" or path implications of collapsing a wave function are not emphasized, in part, I think, because people are uncomfortable with the idea that some parts of the past have not yet been set. But if you detect a photon whose wave function is a hundred light years in diameter (happens all the time!), you are inevitably also setting a "history" for that photon that causes it to land on earth and not on some distant star. For pilot wave folks this is flat-out trivial: The "real" particle was always headed to earth! For folks like me who respect but cannot accept the pilot wave model, it gets... complicated, and requires a rather ragged-edge concept of when the past finally gets set.
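(As a toy sketch of that "sum the phases over histories" bookkeeping, my illustration only and not a real QED calculation: a free particle travels from x=0 to x=1 in unit time via one random intermediate point, and each such "history" contributes exp(iS), with units chosen so m = ħ = 1.)

```python
# Toy sketch of "add up all possible histories with phase" (not real QED):
# a free particle goes from x=0 at t=0 to x=1 at t=1 through one random
# intermediate position, and each path contributes exp(i*S) with m = hbar = 1.
import cmath, random

def action(path, dt):
    # Free-particle action: sum of (1/2)*velocity^2 * dt over path segments.
    return sum(0.5 * ((b - a) / dt) ** 2 * dt for a, b in zip(path, path[1:]))

random.seed(0)
dt = 0.5                                # two segments, each half the trip time
amplitude = 0 + 0j
for _ in range(100_000):
    x_mid = random.uniform(-3.0, 4.0)   # one random intermediate point
    amplitude += cmath.exp(1j * action([0.0, x_mid, 1.0], dt))

# Paths whose phases "match up" (those near the straight classical path)
# reinforce each other; wildly kinked paths largely cancel out.
print(abs(amplitude))
```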
-- Bell: Argh, I don't recall the reference, but I can assure you with something like 99% confidence that Bell was trying to disprove entanglement. He was a pilot wave person and proud of it, saying it helped him come up with his theorem (that part at least I think is from Speakable and Unspeakable). The reason he comes across as the opposite is, I'm pretty sure, a case where he leaned over very hard to not seem biased. He truly did not want to be one of those people who adamantly finds what they want to find; he wanted the data to speak for itself.
Enough, it's late...
Cheers,
Terry
Giovanni Prisinzano wrote on Feb. 26, 2018 @ 15:44 GMT
Dear Terry,
Thank you for your beautiful, thoughtful, and profound essay, as well as for your great and very balanced contribution to the forum!
All the best,
Giovanni
Author Terry Bollinger replied on Feb. 27, 2018 @ 05:31 GMT
Giovanni,
Well... hmm, it's Feb 27 but this is still working, at least for a while.
Thank you for your very kind remarks! I'll be sure to read your essay, as I try to do whenever anyone posts, even though the rating period is over.
(Or can we still post, just not rate? Sigh. I must read the rules again...)
Cheers,
Terry
Giovanni Prisinzano replied on Feb. 27, 2018 @ 18:11 GMT
Dear Terry,
there is no hurry to read my essay, if you want to do it. The forum remains open until the nomination of the winners (and even beyond), although I fear it will be very little frequented from now on.
Mine is the modest contribution of a non-specialist. Read it without obligation, when you have time.
Regarding the scoring system, I know it well enough, having participated in the last three contests. I feel able to say (and I'm not the only one) that it works pretty badly and is the worst aspect of the contest. The problem is that almost none of us uses a rigorous and correct voting pledge like yours, and scores are often given out of sympathy, or resentment, or to return a high mark, or because absurd alliances and consortia emerge.
As a rule, I have never asked anyone to score my essay, but I have certainly sometimes been influenced by the requests of others, or by a too-high rating that I received, or by the desire not to disappoint someone, and I have certainly ended up rating too highly some essays that perhaps did not deserve it, or that I simply could not understand. My mistake, no doubt.
Fortunately, I rarely participate in the scoring and, unfortunately, having difficulties with English, rarely in the discussions either; but others are not so restrained, and this way of doing things negatively affects the final community ranking. Thus, some objectively mediocre essays often end up in the upper part of the ranking, while other objectively valid ones end up in undeservedly low positions. Your own essay, in my opinion one of the best, if not the best, deserved to end up in a position much higher than the one it had (after blasts of 1 or 2 given without adding any motivation). But I also think of other contributions, like that of Karl Coryat, which you have appreciated and discussed in detail. Or of even more neglected essays, like that of A. Losev, which seemed to me very interesting and original. Or the suggestive one by Joe Becker (founder of the Unicode system!), who may have been penalized, as well as by his very shy and humble attitude, by his clearly holistic and metaphysical perspective (but similar to that of a great visionary scientist and philosopher like Leibniz). Or that of Bastiaansen, which certainly offers food for thought. But there are certainly many others, perhaps even lower scored, but certainly valid, which I have forgotten or have not even read, because there are 200 essays and time is lacking.
You will ask me: why do you put this in my thread, instead of writing it in a more appropriate and general context? In fact, these considerations may be out of context here, and I apologize for this. But they came to me immediately after the closing of the community vote, while I was reading some of your posts. Moreover, I have a little hope that your tireless, qualified, very correct contribution to this year's contest-forum can serve to make the FQXi community better, avoiding the risk of its becoming a confused and scientifically sterile ground of personalism and preconceptions.
Thanks again for all your contributions and, in particular, for the latest precious mini-essays, which will be for me a material for reading and reflection, in the coming days or weeks.
Cheers,
Giovanni
Author Terry Bollinger replied on Feb. 28, 2018 @ 01:50 GMT
Giovanni,
I have finally figured out how to find posts like yours! I simply search by date, e.g. "Feb. 27" for this one. It has been very hard for my poor brain to find entries when they show up in the middle of the blogs, both in mine and in others.
Thank you for your positive and constructive comments! Also, thanks for that bit of info on how just the ratings close, not the commenting. I for one will be more likely to show up, not less. The ratings part is designed like a Hunger Games incentive program, so having it gone makes me feel like a more unfettered form of synergistic interaction is now possible.
I am particularly appreciative of your quick list of essays worth examining. I plan to look at them, hopefully all of them! I keep finding unexpectedly interesting points in so many of these essays.
Finally, please feel very free to post in my essay thread anytime you want to. It never even occurred to me that it might not be the right "spot" for you to do so. (Come to think of it, considering some of the humongous posts that I've put on other folks' threads, I guess it's sort of a given that I'm not too worried about people cross-posting, isn't it?)
Cheers,
Terry
Peter Jackson wrote on Feb. 26, 2018 @ 17:52 GMT
Terry,
In our (Feb 17th) string above we didn't resolve the non-integer spin video matter: 100 sec video Classic QM. It's just occurred to me that you were after a POLAR spin 1/2, 2 etc! Now that's not quite what the original analysis implied, but, lest it may have been, YES, the 3 degrees of freedom also produce that.
Just one y axis rotation with each polar rotation gives spin 1/2; Imagine the polar axis horizontal. Now rotate around the vertical axis to switch the poles horizontally. HALF a polar rotation at the same time brings your start point back.
Now a y axis rotation at HALF that rate means it takes TWO rotations of the polar axis to return to the start point.
Occam never made a simpler razor! It's a unique quality of a sphere that there's no polar axis momentum loss from y or z axis rotations.
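(For comparison, here is the textbook quantum-mechanical version of the "two turns to get back" property, a minimal sketch of the standard SU(2) fact rather than of any particular model in this thread: a spin-1/2 state picks up a factor of -1 under a 360-degree rotation and only returns to itself after 720 degrees.)

```python
# Textbook SU(2) fact: a spin-1/2 rotation about z acts on its two basis
# states as multiplication by exp(-i*theta/2) and exp(+i*theta/2).
import cmath, math

def rotate_z(theta):
    """Diagonal entries of the spin-1/2 rotation operator about z."""
    return (cmath.exp(-1j * theta / 2), cmath.exp(1j * theta / 2))

print(rotate_z(2 * math.pi))  # approximately (-1, -1): sign flip after 360 deg
print(rotate_z(4 * math.pi))  # approximately (+1, +1): restored after 720 deg
```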
Was there anything else? (apart from confusing random number distributions explained in Phillips's essay with real 'action at a distance'!) Of course tomography works but within strict distance limits. Just checked through Karen's list again and can't find one the DFM doesn't qualify for apart from a few particle physics bits. Can you check & see if I can stop digging now and leave those to the HEP specialists!?
Peter
PS; Not sure if that link hasn't suddenly died!
Author Terry Bollinger replied on Feb. 27, 2018 @ 05:37 GMT
Peter,
Thank you for the follow-up, but at 12:30 AM I'm not quite sure I followed all of that? I assume you did see my long posting at your site? I'll try to read your posting above again when I'm awake... :/ zzz
Cheers,
Terry
Ulla Marianne Mattfolk wrote on Feb. 26, 2018 @ 19:23 GMT
Hi,
This is a wonderful essay, with deep fundamental knowledge. I am impressed.
Nothing to ask for now.
Ulla Mattfolk https://fqxi.org/community/forum/topic/3093
Author Terry Bollinger replied on Feb. 27, 2018 @ 05:39 GMT
Ulla,
Thank you for your generous and kind remarks! It's past the rating period now, but I'll be sure to take a look at your essay tomorrow (today?)
Cheers,
Terry
Author Terry Bollinger wrote on Feb. 27, 2018 @ 03:37 GMT
The Illusion of Mathematical Formality
Terry Bollinger, 2018-02-26

Abstract. Quick: What is the most fundamental and least changing set of concepts in the universe? If you answered “mathematics,” you are not alone. In this mini-essay I argue that far from being eternal, formal statements are actually fragile, prematurely terminated first-steps in perturbative sequences that derive ultimately from two unique and defining features of the physics of our universe: multi-scale, multi-domain sparseness and multi-scale, multi-domain clumping. The illusion that formal statements exist independently of physics is enhanced by the clever cognitive designs of our mammalian brains, which latch on quickly to first-order approximations that help us respond quickly and effectively to survival challenges. I conclude by recommending recognition of the probabilistic infrastructure of mathematical formalisms as a way to enhance, rather than reduce, their generality and analytical power. This recognition makes efficiency into a first-order heuristic for uncovering powerful formalisms, and transforms the incorporation of a statistical method such as Monte Carlo into formal systems from being a “cheat” into an integrated concept that helps us understand the limits and implications of the formalism at a deeper level. It is not an accident, for example, that quantum mechanics simulations benefit hugely from probabilistic methods.
----------------------------------------
NOTE: A mini-essay is my attempt to capture and make more readily available an idea, approach, or prototype theory that was inspired by interactions with other FQXi Essay contestants. This mini-essay was inspired by:
1. When do we stop digging? Conditions on a fundamental theory of physics by Karen Crowther
2. The Crowther Criteria for Fundamental Theories of Physics
3. On the Fundamentality of Meaning by Brian D Josephson
4. What does it take to be physically fundamental by Conrad Dale Johnson
5. The Laws of Physics by Kevin H Knuth
Additional non-FQXi references are listed at the end of this mini-essay.
----------------------------------------
Background: Letters from a Sparse and Clumpy Universe
Sparseness [6] occurs when some space, such as a matrix or the state of Montana, is occupied by only a thin scattering of entities, e.g. non-zero numbers in the matrix or people in Montana. A clump is a compact group of smaller entities (often clumps themselves of some other type) that “stick together” well enough to persist over time. A clump can be abstract, but if it is composed of matter we call it an object. Not surprisingly, sparseness and clumping tend to be closely linked, since clumps often are the entities that occupy positions in some sparse space.
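(A minimal sketch of sparseness in the matrix sense, with invented example values: only the occupied positions are stored at all.)

```python
# A million-by-million "matrix" with three non-zero entries, stored the way
# Montana stores its population: only the occupied spots are recorded.
sparse = {(3, 17): 2.5, (999_999, 0): -1.0, (42, 42): 7.0}

def entry(matrix, i, j):
    """Everything not explicitly stored is zero, i.e. empty space."""
    return matrix.get((i, j), 0.0)

print(entry(sparse, 42, 42))  # 7.0
print(entry(sparse, 0, 0))    # 0.0, the overwhelmingly common case
```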
Sparseness and clumping occur at multiple size scales in our universe, using a variety of mechanisms, and when life is included, at varying levels of abstraction. Space itself provides a universal basis for creating sparseness at multiple size scales, yet the very existence of large expanses of extremely “flat” space is still considered one of the greatest mysteries in physics, an exquisitely knife-edged balancing act between total collapse and hyper expansion.
Clumping is strangely complex, involving multiple forces at multiple scales of size. Gravity reigns supreme for cosmic-level clumping, from involvement (not yet understood) in the 10 billion lightyear diameter Hercules-Corona Borealis Great Wall down to kilometer scale gravel asteroids that just barely hold together. From there a dramatically weakened form of the electromagnetic force takes over, providing bindings that fall under the bailiwick of chemistry and chemical bonding. (The unbridled electric force is so powerful it would obliterate even large gravitationally bound objects.) Below that level the full electric force reigns, creating the clumps we call atoms. Next down in scale is yet another example of a dramatically weakened force, which is the pion-mediated version of the strong force that holds neutrons and protons together to give us the chemical elements. The protons and neutrons, as well as other more transient particles, are the clumps created by the full, unbridled application of the strong force. At that point known clumping ends… or does it? The quarks themselves notoriously appear to be constructed from still smaller entities, since for example they all use multiples of a mysterious 1/3 electric charge, bound together by unknown means at unknown scales. How exactly the quarks have such clump-like properties remains a mystery.
Nobel Laureate Brian Josephson [1] speculates that, at least for higher-level domains such as biology and sociology, the emergence of a form of stability that is either akin to or leads to clumping is always the result of two or more entities that oppose and cancel each other in ways that create or leave behind a more durable structure. This intriguing concept can be translated in a surprisingly direct way to the physics of clumping and sparseness in our universe. For example, the mutually cancelling positive and negative charges of an electron and a proton can combine to leave an enduring and far less reactive result, a hydrogen atom, which in turn supports clumping through a vastly moderated presentation of the electric forces that it largely cancels. More generally, the hydrogen atom is an example of incomplete cancellation, that is, cancellation of only a subset of the properties of two similar but non-identical entities. The result qualifies as “scaffolding” in the Josephson sense due to its relative neutrality, which allows it for example to be a part of chemical compounds that would be instantly shredded by the full power of the mostly-cancelled electric force. Physics has many examples of this kind of incomplete cancellation, ranging from quarks that mutually cancel the overwhelming strong force to leave milder protons and neutrons, protons and electrons that then cancel to leave charge-free hydrogen atoms, unfilled electron states that combine to create stable chemical bonds, and hydrogen and hydroxide groups on amino acids that combine to enable the chains known as proteins. At higher levels of complexity, almost any phenomenon that reaches an equilibrium state tends to produce a more stable, enduring outcome. The equilibrium state that compression-resistant matter and ever-pulling gravity reach at the surface of a planet is another more subtle example, one that leads to a relatively stable environment that is conducive to, for example, us.
Bonus Insert: Space and gravity as emerging from hidden unified-force cancellations
It is interesting to speculate whether the flatness of space could itself be an outcome of some well-hidden form of partial cancellation. If so, it would mean that violent opposing forces of some type of which we are completely unaware (or have completely misunderstood) largely cancelled each other out except for a far milder residual, that being the scaffolding that we call “flat space.” This would be a completely different approach to the flat space problem, but one that could find support in existing data if that data were examined from Josephson’s perspective of stable infrastructure emerging from the mutual cancellation of far more energetic forces.
The forces that cancelled would almost certainly still be present in milder forms, however, just as the electric force continues to show up in milder forms in atoms. Thus if the Josephson effect — ah, sorry, that phrase is already taken — if the Josephson synthesis model applies to space itself, then the mutually cancelling forces that led to flat space may well already be known to us, just not in their most complete and ferocious forms. Furthermore, if these space-generating forces are related to the known strong and electric forces — or more likely, to the Standard Model combination of them with the weak force — then such a synthesis would provide an entirely new approach to unifying gravity with the other three forces.
Thus the full hypothesis in summary: Via Josephson synthesis, it is speculated that ordinary xyz space is a residual structural remnant, a scaffolding, generated by the nearly complete cancellation of two oppositely signed versions of the unified weak-electric-strong forces of the Standard Model. Gravity then becomes not another boson force, but a topological effect applied by matter to the “surface of cancellation” of the unified Standard Model forces.
Back to Math: Is Fundamental Physics Always Formal?
In her superb FQXi essay When do we stop digging? Conditions on a fundamental theory of physics, Karen Crowther [2] also created an exceptionally useful product for broader use, The Crowther Criteria for Fundamental Theories of Physics [3]. It is a list of nine succinctly stated criteria that in her assessment need to be met by a physics theory before it can qualify as fundamental.
There was however one criterion in her list about which I was uncertain, which was the fourth one:
CC#4. Non-perturbative: Its formalisms should be exactly solvable rather than probabilistic.
I was ambivalent when I first read that one, but I was also unsure why I felt ambivalent. Was it because one of the most phenomenally accurate predictive theories in all of physics, Feynman’s Quantum ElectroDynamics or QED, is also so deeply dependent on perturbative methods? Or was it the difficulty that many fields and methods have in coming up with closed equations? I wanted to understand why, if exactly solvable equations were the “way to go” in physics for truly fundamental results, some of the most successful theories in physics were nonetheless perturbative. What does all that work really imply?
As it turns out, both the multi-scale clumpiness and sparseness of our universe are relevant to this question, because they lurk behind such powerful mathematical concepts as renormalization. Renormalization is not really as exotic or even as mathematical as it seems in, say, Feynman’s QED theory. What it really amounts to is an assertion that our universe is, at many levels, “clumpy enough” that many objects (and processes) within it can be approximated when viewed from a distance. That “distance” may be real space or some other more abstract space, but the bottom line is that this sort of approximation option is a deep component of whatever is going on. I say that in part because we ourselves, as discrete, independently mobile entities, are very much part of this clumpiness, as are the large, complex molecules that make up our bodies… as are the atoms that enable molecules… as are the nucleons that enable atoms… and as are the fundamental fermions that make up nucleons.
This approximation-at-a-distance even shows up in everyday life and cognition. For example, let’s say you need an AA battery. What do you think first? Probably you think “I need to go to the room where I keep my batteries.” But your navigation to that room begins as room-to-room navigation. You don’t worry yet about exactly where in that room the batteries are, because that has no effect on how you navigate to the room. In short, you will approximate the location of the battery until you navigate closer to it.
The point is that the room is itself clumpy in a way that enables you to do this, but the process itself is clearly approximate. You could in principle super-optimize your walking path so that it minimizes your total effort to get to the battery, but such a super-optimization would be extremely costly in terms of the thinking and calculations needed, and yet would provide very little benefit. So, when the cost-benefit ratio grows too high, we approximate rather than super-optimize, because the clumpy structure of our universe makes such approximations much more cost-beneficial overall.
What happens after you reach the room? You change scale! That is, you invoke a new model that tells you how to navigate the drawers or containers in which you keep the AA batteries. This scale is physically smaller, and again is approximate, enabling tolerance for example of highly variable locations of the batteries within a drawer or container.
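(A toy sketch of that two-scale search; every room and drawer name here is invented for illustration.)

```python
# Two-scale search: a coarse room-level memory gets you to the right room,
# and only then do you switch to the finer drawer-level model.
house = {
    "kitchen": {"junk drawer": ["string", "AA batteries"],
                "utensil drawer": ["spoons"]},
    "office":  {"desk drawer": ["pens"]},
}
coarse_map = {"AA batteries": "kitchen"}   # room-level knowledge only

def find(item):
    room = coarse_map[item]                          # scale 1: rooms
    for drawer, contents in house[room].items():     # scale 2: drawers
        if item in contents:
            return room, drawer

print(find("AA batteries"))  # ('kitchen', 'junk drawer')
```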
This works for the same reason that Feynman’s QED is incredibly accurate and efficient at modeling an electron probabilistically. The electron-at-a-distance can be safely and very efficiently modeled as a point particle with a well-defined charge, even though that is not really correct. That is the room-to-room level. As you get closer to the electron, that model must be replaced by a far more complex one that involves rapid creation and annihilation of charged virtual particle pairs that “blur” the charge of the electron in strange and peculiar ways. That is the closer, smaller, dig-around-in-the-drawers-for-a-battery level of approximation. In both cases, the overall clumpiness of our universe makes these special forms of approximation both very accurate and computationally efficient.
At some deeper level, one could further postulate that this may be more than just a way to model reality. It is at least possible (I personally think it probable) that this is also how the universe actually works, even if we don’t quite understand how. I say that because it is always a bit dangerous to assume that just because we like to model space as a given and particles as points within it, reality must work that way. Those are in the end just models, ones that actually violate quantum mechanics by postulating points that cannot exist in real space due to the quantum energy cost involved. A real point particle would require infinite energy to isolate, so a model that invokes such particles to estimate reality really should be viewed with a bit of caution as a “final” model.
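(A back-of-the-envelope sketch of that energy cost, my illustration using the standard value ħc ≈ 197.327 eV·nm: confining anything to a region of size Δx costs energy of order ħc/Δx, which diverges as Δx → 0.)

```python
# Heisenberg-style estimate: localizing a particle within dx costs momentum
# ~hbar/dx, hence energy ~hbar*c/dx, which diverges as dx -> 0.
HBAR_C_EV_NM = 197.327  # hbar*c expressed in eV*nanometers

def confinement_energy_ev(dx_nm):
    return HBAR_C_EV_NM / dx_nm

for dx_nm in (1.0, 1e-3, 1e-6):  # 1 nm, 1 pm, 1 fm
    print(dx_nm, "nm ->", confinement_energy_ev(dx_nm), "eV")
# ~197 eV, ~197 keV, ~197 MeV... and on toward infinity at a true point.
```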
So bottom line: While Karen Crowther’s Criterion #4 makes excellent sense as a goal, our universe seems weirdly wired for at least some forms of approximation. I find that very counterintuitive, deeply fascinating, and likely important in some way that we flatly do not yet understand.
Perturbation Versus Formality in Terms of Computation Costs
Here is a hypothesis: In the absence of perturbative opportunities, the computational costs of fully formal methods for complete, end-to-end solutions trend towards infinity.
The informal proof is that full formalization implies fully parallel combinatorial interaction of all components of a path (functional) in some space, that being XYZ space in the case of approaching an electron. The computational cost of this fully parallel optimization then increases both with decreasing granularity of the path segment sizes used, and with path length. The granularity is the most important parameter, with the cost rapidly escalating towards infinity as the segment length decreases towards the limit of representing the path as a continuum of infinitely precise points.
Conversely, the ability to use larger segments instead of infinitesimals depends on the scale structure of the problem. If that scale structure enables multiscale renormalization, then the total computational cost remains at least roughly proportional to the level of precision desired. If no such scale structure is available, the cost instead escalates towards infinity.
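(A small numerical illustration of the cost claim, my example: approximating the arc length of a sine curve, the "path", with segments of size h costs about 1/h work while the answer converges, so pushing toward the continuum limit pushes the cost without bound.)

```python
# Approximating the arc length of sin(x) on [0, pi] with n straight segments:
# work grows linearly with n, while the answer converges to ~3.8202.
import math

def arc_length_of_sine(n):
    h = math.pi / n
    xs = [i * h for i in range(n + 1)]
    return sum(math.hypot(h, math.sin(b) - math.sin(a))
               for a, b in zip(xs, xs[1:]))

for n in (4, 64, 1024, 16384):       # 4096x more work across this range...
    print(n, arc_length_of_sine(n))  # ...for ever-smaller changes in the value
```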
But isn't the whole point of closed formal solutions that they remain (roughly) linear in computational cost versus the desired level of precision? Yes... but what if the mathematical entities we call "formal solutions" are actually nothing more than the highest-impact granularities of what are really just perturbative solutions made possible by the pre-existing structure of our universe?
Look for example at gravity equations, which treat stars and planets as point-like masses. However, that approximation completely falls apart at the scale of a planet surface, and so is only the first and highest-level step in what is really a perturbative solution. It's just that our universe is pre-structured in a way that makes many such first steps so powerful and so broadly applicable that it allows us to pretend they are complete, stand-alone formal solutions.
A More Radical Physics Hypothesis
All of this leads to a more radical hypothesis about formalisms in physics, which is this: All formal solutions in physics are just the highest, most abstract stages of perturbative solutions that are made possible by the pre-existing clumpy structure of our universe.
But on closer examination, even the above hypothesis is incomplete. Another factor that needs to be taken into account is the neural structure of human brains, and how they are optimized.
The Role of Human Cognition
Human cognition must rely on bio-circuitry that has very limited speed, capacity, and accuracy. It therefore relies very heavily in the mathematical domain on using Kolmogorov programs to represent useful patterns that we see in the physical world, since a Kolmogorov program only needs to be executed to the level of precision actually needed.
Furthermore, it is easier and more compact to process suites of such human-brain-resident Kolmogorov programs as the primary data components for reasoning about complexity, as opposed to using their full elaborations into voluminous data sets that are more often than not beyond neural capacities. In addition to shrinking data set sizes, reasoning at the Kolmogorov program level has the huge advantage that such programs capture in direct form at least many of the regularities in such data sets, which in turn allows much more insightful comparisons across programs.
We call this “mathematics.”
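(A minimal sketch of a "Kolmogorov program" standing in for an unbounded data set, my example using the Thue-Morse sequence: a few lines of generator replace an infinite expansion, and we elaborate it only to the precision a given problem actually needs.)

```python
# The Thue-Morse sequence as a "Kolmogorov program": a few bytes of code
# stand in for an infinite binary data set, run only as far as needed.
from itertools import islice

def thue_morse():
    n = 0
    while True:
        yield bin(n).count("1") % 2  # parity of 1-bits in n
        n += 1

print(list(islice(thue_morse(), 16)))  # elaborate just 16 terms, not infinity
```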
The danger in not recognizing mathematics as a form of Kolmogorov program creation, manipulation, and execution is that as biological intelligences, we are by design inclined to accept such programs as representing the full, to-the-limit forms of the represented data sets. Thus the Greeks assumed the Platonic reality of perfect planes, when in fact the physical world is composed of atoms that make such planes flatly impossible. The world of realizable planes is instead emphatically and decisively perturbative, allowing the full concept of “a plane” to exist only as the unobtainable limit of the isolated, highest-level initial calculations. The reality of such planes falls apart completely when the complete, perturbative, multi-step model is renormalized down to the atomic level.
That is to say, exactly as with physics, the perfect abstractions of mathematics are nothing more than top-level stages of perturbative programs made possible by the pre-existing structure of our universe.
The proof of this is that whenever you try to compute such a formal solution, you are forced to deal with issues such as scale or precision. This in turn means that the abstract Kolmogorov representations of such concepts never really represent their end limits, but instead translate into huge spectra of precision levels that approach the infinite limit to whatever degree is desired, but only at a cost that increases with the level of precision. The perfection of mathematics is just an illusion, one engendered by the survival-focused priorities of how our limited biological brains deal with complexity.
Clumpiness and Mathematics
The bottom line is this even broader hypothesis: All formal solutions in both physics and mathematics are just the highest, most abstract stages of perturbative solutions that are made possible by the pre-existing “clumpy” structure of our universe.
In physics, even equations such as E=mc² that are absolutely conserved at large scales cannot be interpreted “as is” at the quantum level, where virtual particle pairs distort the very definition of where mass is located. E=mc² is thus more accurately understood as a high-level subset of a multi-scale perturbative process, rather than as a complete, stand-alone solution.
In mathematics, the very concept of an infinitesimal is a limit that can never be reached by calculation or by physical example. That makes the very foundations of the mathematics of real numbers into a calculus not of real values, but of sets of Kolmogorov programs for which the limits of execution are being intentionally ignored. Given this indifference to, and often outright unawareness of, the implementation spectra that are necessarily associated with all such formalisms, is it really that much of a surprise how often unexpected infinities plague problems in both physics and math? Explicit awareness of this issue changes the approach, and even the understanding, of what is being done: math in general becomes a calculus of operators, of programs, rather than of absolute limits and concepts.
One of the most fascinating implications of the hypothesis that all math equations ultimately trace back to the clumpiness and sparseness of the physical universe is that heuristic methods can become integral parts of such equations. In particular they should be usable in contexts where a "no limits" formal statement overextends computation in directions that have no real impact on the final solution. This makes methods such as Monte Carlo into first-order options for expressing a situation correctly. As one example, papers by Jean Michel Sellier [7] show how carefully structured "signed particle" applications of Monte Carlo methods can dramatically reduce the computation costs of quantum simulation. Such syntheses of theory (signed particles and negative probabilities) with statistical methods (Monte Carlo) promise not only to provide practical algorithmic benefits, but also to provide deeper insights into the nature of quantum wavefunctions themselves.
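As a concrete and deliberately generic illustration of the Monte Carlo point (this sketch is mine, and is emphatically not Sellier's signed-particle scheme), note how a statistical method buys exactly as much precision as you pay for, with the error falling off like one over the square root of the sample count:

import random

def mc_integral(f, n_samples):
    # Plain Monte Carlo: estimate the integral of f over [0, 1] by
    # averaging f at uniformly random points. Precision is purchased
    # sample by sample (error ~ 1/sqrt(n_samples)); the computation is
    # never pushed further than the answer actually requires.
    return sum(f(random.random()) for _ in range(n_samples)) / n_samples

# Example: the integral of 4/(1+x^2) over [0, 1] is pi, so a couple of
# hundred thousand samples give pi to roughly three decimal places.
print(mc_integral(lambda x: 4.0 / (1.0 + x * x), 200_000))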
Possible Future Expansions of this Mini-Essay

My time for posting this mini-essay here is growing short. Most of the above arguments are my original stream-of-thought arguments that led to my overall conclusion. As my abstract shows, I have a great many more thoughts to add, but likely not enough time to add them. I will therefore post this following link to a public Google Drive folder I've set up for FQXi-related postings.

If this is OK with FQXi — basically if they do not strip out the URL below, and I'm perfectly fine if they do — then I may post updated versions of this and other mini-essays in this folder in the future:

Terry Bollinger's FQXi Updates Folder

----------------------------------------
Non-FQXi References

6. Lin, H. W., Tegmark, M., and Rolnick, D. Why does deep and cheap learning work so well? Journal of Statistical Physics, 168:1223-1247 (2017).

7. Sellier, J. M. A Signed Particle Formulation of Non-Relativistic Quantum Mechanics. Journal of Computational Physics, 297:254-265 (2015).
Anonymous wrote on Feb. 27, 2018 @ 04:33 GMT
An Exceptionally Simple Space-As-Entanglement Theory
Terry Bollinger, 2018-02-26

Abstract. There has been quite a bit of attention in recent years to what has been called the holographic universe. This concept, which originated somehow from string theory (!), postulates that the universe is some kind of holographic image, rather than the 3D space we see. Fundamental to this idea is space as entanglement, that is, the idea that the fabric of space is built out of the mysterious "spooky action" links that Einstein so disdained. In keeping with its string theory origins, the holographic universe also dives down to the Planck foam level. The point of this mini-essay is that except for the point about space being composed of entanglements between particles, none of this complexity is needed: there are no holograms, and there is no need for the energetically impossible Planck foam. All you need is group entanglement of the conjugate of particle spin, which is an overlooked "ghost direction" orthogonal to spin. Particles form a mutually relative consensus on these directions (see Karl Coryat's Pillar #3) that allows them to ensure conservation of angular momentum, and that consensus becomes xyz space. Instead of a complicated hologram, its structure is that of an exceptionally simple direct-link web that interlinks all of the participating particles. It is no more detailed than it needs to be, and that level of detail is determined solely by how many particles participate in the overall direction consensus. Finally, it is rigid in order to protect and preserve angular momentum, since the overriding goal in all forms of quantum entanglement is absolute conservation of some quantum number.
----------------------------------------
NOTE: A mini-essay is my attempt to capture an idea, approach, or prototype theory inspired by interactions with other FQXi Essay contestants. This mini-essay was inspired by:

1. The Four Pillars of Fundamentality by Karl Coryat

----------------------------------------
Introduction

For this mini-essay I think the original text gives the thought pretty well "as is," so I am simply quoting it below. My thanks again to Karl Coryat for a fun-to-read and very stimulating essay.

A Quote from My Assessment of Karl Coryat's Pillar #3

If space is the fabric of relations, if some vast set of relations spread out literally across the cosmos, defining the cosmos, are the true start of reality instead of the deceptive isolation of objects that these relations then make possible, then what are the components of those relations? What are the "bits" of space?
I don't think we know, but I assure you it's not composed of some almost infinite number of 10^-35 meter bubbles of Planck foam. Planck foam is nothing more than an out-of-range, unbelievably extrapolated extremum created by pushing to an energetically impossible limit the rules of observation that have physical meaning only at much lower energies. I suspect that the real components of space are much simpler, calmer, quieter, less energetic, and, well, more space-like than that terrifying end-of-all-things violence that is so casually called "Planck foam."
I’ll even venture a guess. You heard it here first… :)
My own guess is that the units of space are nothing more radical than the action (Planck) conjugation complements of the angular momenta of all particles. That is, units of pure direction, which is all that is left after angular momentum scarfs up all of the usual joule-second units of action, leaving only something that at first glance looks like an empty set. On closer examination, though, a given spin must leave something behind to distinguish itself from other particle spins, and that "something" is the orientation of the spin in 3-space, a ghostly orthogonality to the spin plane of the particle. But more importantly, it would have to be cooperatively, relationally shared with every other particle in the vicinity and beyond, so that their differences remain valid. Space would become a consensus fabric of directional relationships, one in which all the particles have agreed to share the same mutually relative coordinate system — that is, to share the same space. This direction consensus would be a group-level form of entanglement, and because entanglement is unbelievably unforgiving about conservation of conserved quantum numbers such as spin, it would also be extraordinarily rigid, as space should be. Only over extreme ranges would it bend much, to give gravity, which thus would not be an ordinary quantum force like photon-mediated electromagnetism. It would also be loosely akin to the "holographic" concept of space as entanglement, but this version is hugely simpler and much more direct, since neither holography, nor higher dimensions, nor Planck-level elaborations are required. The entanglements of the particles just create a simple, easily understood 3-space network linking all nodes (particles).
But space cannot possibly be composed of such a sparse, incomplete network, right? After all, space is also infinitely detailed as well as extremely rigid, so there surely are not enough particles in the universe to define space in sufficient detail! Many would in fact argue that this is precisely why any phenomenon that creates space itself must operate at the Planck scale of 10^-35 meters, so that the incredible detail needed for 3-space can be realized.
Really? Why?
If only 10 objects existed in the universe, each a meter across, why would you need a level of spatial detail that is, say, 20 orders of magnitude finer for them to interact meaningfully and precisely with each other? You would still be able to access much higher levels of relational detail, but only by asking for more detail, specifically by applying a level of energy proportional to the level of detail you desired. Taking things to the absolute limit first is an incredibly wasteful procedure, and incidentally, it is emphatically not what we see in quantum mechanics, where every observation has a cost that depends on the level of detail desired, and even then the cost is paid only at the time of the observation. There are good and deeply fundamental quantum reasons why the Large Hadron Collider (LHC) that found the Higgs boson is 8.6 km in diameter!
The bottom line is that in terms of as-needed levels of detail, you can build up a very-low-energy universal “directional condensate” space using the spins of nothing more than the set of particles that exist in that space. It does not matter how sparse or dense those particles are, since you only need to make space “real” for the relationships that exist between those particles. If for example your universe has only two particles in it, you only need one line of space (Oscillatorland!) to define their relationship. Defining more space outside of that line is not necessary, for the simple reason that no other objects with which to relate exist outside of that line.
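To be clear about how modest a structure this requires, here is a deliberately crude Python toy (my illustration only; nothing here is established physics, and all the names are mine): space exists only as relations among the particles actually present, and the detail of any one relation is created on demand, in proportion to the "energy" spent asking about it.

class RelationalSpace:
    # Toy model: "space" is nothing but a lazily built set of pairwise
    # relations among participating particles. No relation, no space.
    def __init__(self, particle_ids):
        self.particles = set(particle_ids)
        self.relations = {}   # (a, b) -> digits of detail created so far

    def relate(self, a, b, digits):
        # Detail is created only when asked for, and only for this pair;
        # asking for more digits stands in for spending more energy.
        key = tuple(sorted((a, b)))
        self.relations[key] = max(self.relations.get(key, 0), digits)
        return self.relations[key]

# A two-particle universe ("Oscillatorland") needs exactly one relation:
# no space is ever defined anywhere else, because nothing else exists.
space = RelationalSpace(["p1", "p2"])
space.relate("p1", "p2", digits=6)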
So regardless of how space comes to be — my example above mostly shows what is possible and what kinds of relationships are required — its very existence makes the concept of relations between entities as fundamental as it gets. You don’t end with relations, you start with them.
Conclusions

Quite a few of the people reading this likely do not even believe in entanglement! So for you I am the cheerful ultimate heretic, the fellow who not only believes fervently in the reality of entanglement, but would make it literally into the very fabric of space itself. Sorry about that, but I hope you can respect that I have my reasons, just as I very much respect localism. Two of my top physicist favorites of all time, Einstein and Bell, were both adamant localists!

If you are a holographic universe type, I hope you will at least think about some of what I've said here. I developed these ideas in isolation from your community, and frankly was astonished when I finally realized that your community existed. I deeply and sincerely believe that you have a good and important idea there, but history has convoluted it in very unfortunate ways. Take a stab at my much simpler 3D web approach, and I think interesting things could start popping out fairly quickly.

If you are a MOND or dark matter enthusiast, think about the implications of space being a direct function of the presence or absence of matter. One of my very first speculations on this topic was that as this fabric of entanglement thins, you could very well get effects relevant to the anomalies that both MOND and dark matter attempt to explain.

Finally, I gave this fabric a name a long time ago, a name with which I pay respect to a very great physicist who literally did not get respect: Boltzmann. I call this 3D fabric of entanglements the Boltzmann fabric, represented by a lower-case beta with a capital F subscript, β_F. His entropic concepts of time become cosmic through this fabric.
Author Terry Bollinger wrote on Feb. 27, 2018 @ 04:39 GMT
To link to the above mini-essay, please copy and paste either of the following links:

An Exceptionally Simple Space-As-Entanglement Theory
https://fqxi.org/community/forum/topic/3099#post_146100
Peter Jackson replied on Feb. 28, 2018 @ 13:18 GMT
Terry,
Going back to what spawned string theory and Len Susskind's thoughts, an even simpler interpretation in another direction seems to yield a whole lot more useful stuff without infinite recursion; i.e. here:

VIDEO: Time Dependent Redshift. Are we locked in a circular one-way street without the exit of helical paths?

My present classic QM derivation emerged from a test of the model and SR components, via the 2015 top scorer: The Red/Green Sock Trick. Might it not be time to step back and review other routes?
Peter
Author Terry Bollinger wrote on Feb. 27, 2018 @ 04:57 GMT
It's Time to Get Back to Real String Theory
Terry Bollinger, 2018-02-26

Abstract. There is a real string theory. It is experimentally accessible and verifiable, at scales comparable to ordinary baryons and mesons, as opposed to the energetically impossible Planck foam version of string theory. It most likely has perhaps 16 or so solutions, as opposed to the 10^500 vacua of Planck foam string theory. It was abandoned in 1974. It's time we got back to it.
----------------------------------------
NOTE: A mini-essay is my attempt to capture an idea, approach, or prototype theory inspired by interactions with other FQXi Essay contestants. This mini-essay was inspired by:

A well-founded formulation for quantum chromo- and electro-dynamics by Wayne R Lundberg

----------------------------------------
A Long Quote from My Lundberg Essay Assessment

Most folks aren't aware of it, but nucleons like protons and neutrons have additional spin states that appear like heavier particles built from the same set of quarks. Thus in addition to uud forming a spin 1/2 proton, the same three quarks can also form a heavier particle with spin 3/2 (1 added unit of spin) and spin 5/2 (2 added units of spin). These variations form a lovely straight line when plotted as spin versus mass squared, which in turn implies a fascinatingly regular relationship between mass and nucleon spin.

These lines are called Regge trajectories, and back in the late 1960s and early 1970s they looked like a promising hint for how to unify the particle zoo. Analyses of Regge trajectories indicated that string-like stable resonance states were creating the extreme regularity of the trajectories. These "strings" consisted of something very real, the strong force, and their vibrations were highly constrained by something equally real, the quarks that composed the nucleons (and also the mesons, which have Regge trajectories of their own). These boson-like resonances of a string-like incarnation of the strong force were highly unexpected, extremely interesting, and experimentally accessible. Theorists were optimistic.
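For readers who want the standard form of these lines, a Regge trajectory is conventionally written as linear in the square of the mass (this is the textbook relation, added here only for concreteness; the slope is the commonly quoted empirical value for light hadrons):

% Regge trajectory: spin J grows linearly with mass squared, so the
% uud states with J = 1/2, 3/2, 5/2 line up when plotted against M^2.
J \;=\; \alpha(0) \;+\; \alpha' M^{2},
\qquad \alpha' \approx 0.9\ \mathrm{GeV}^{-2}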
Then it all went to Planck.
Specifically, the following paper caught on like wildfire (slow wildfire!) and ended up obliterating any hope of future funding for understanding the quite real, experimentally accessible, proton-scale, strong-force-based string vibrations behind Regge trajectories. It did this by proposing what I like to call the Deep Dive:

Scherk, J. & Schwarz, J. H. Dual Models for Non-Hadrons. Nuclear Physics B, 81:118-144 (1974).
So what was the Deep Dive, and why did they do it?
Well, it "went down" like this: Scherk and Schwarz noticed that the overall signature of some of the proton-sized, strong-force vibrations behind Regge trajectories was very similar to the spin 2 signature of the (still) hypothetical gravitons that were supposed to unify gravity with the other three forces of the Standard Model. Since the emerging Standard Model was having breathtaking success in that time period at explaining the particle zoo, quantum gravity and the Planck-scale foam were very popular at the time… and very tempting.
So, based as best I can tell only on the resemblance of these very real vibration modes in baryons and mesons to gravitons, Scherk and Schwarz made their rather astonishing, revelation-like leap: They decided that the strong-force-based vibrations behind Regge trajectories were in fact gravitons, which have nothing to do with the strong force and are most certainly not "composed" of the strong force. The Planck-scale vibrations of string theory are instead composed of… well, I don't know what, maybe intense gravity? I've never been able to get an answer out of a string theorist on the question of "what is a string made of?" This is not an unfair question, since for example the original strings behind Regge trajectories are "composed" of the strong force, and have quite real energies associated with their existence.
I still don't quite get the logic behind the Deep Dive, since gravity had exactly zero to do with either the substance of the strings (a known force) or the nature of the skip-rope-like, quark-constrained vibrations behind Regge trajectories. Nonetheless they did it. They took the Deep Dive, and it ended up costing physics the following:
… 20 orders of magnitude of shrinkage in size, since protons are about 10^-15 meters across, and the gravitons were nominally at the Planck foam scale of 10^-35 meters (!!!), which is a size scale that is inaccessible to any conceivable direct measurement process in the universe; plus:

… 20 orders of magnitude of increased energy costs, which are similarly universally inaccessible to any form of direct measurement; plus:

… a complete liberation from all of those annoying but experimentally validated vibration constraints that were imposed in real nucleons and mesons by the presence of quarks and the strong force. That's a cost, not a benefit, since it explodes the range of options that have to be explored to find a workable theory. Freeing the strings from… well… any appreciable experimental or theoretical constraints… enabled them instead to take on the nearly infinite number of possible vibration modes that a length or loop of rope gyrating wildly in outer space would have; and finally:

… just to add yet a few more gazillion unneeded and previously unavailable degrees of freedom, a huge increase in the number of available spatial dimensions, always at least 9 and often many more.
And they wonder why string theory has 10^500 versions of the vacuum… :)
Oh… did I also mention that the Deep Dive has cost the US (mainly NSF, plus matching funds from other institutions) well over half a billion dollars, with literally not a single new experimental outcome, let alone any actual working new process or product, as a consequence?
This was only to be expected, since the Deep Dive plunged all research into real string-like vibrations down to the utterly inaccessible level of the Planck foam. Consequently, the only product of string theory research has been papers. This half a billion dollars' worth of papers has built on itself, layer by layer of backward citations and references, for over 40 years. In many cases, the layers of equations are now so deep that no human mind could possibly verify them. Errors only amplify over time, and if there is no way to stop their propagation by catching them through experiments, it's the same situation as trying to write an entire computer operating system in one shot, without having previously executed and validated its individual components.

In short, what the US really got for its half billion dollars was a really deep stack of very bad programming. Our best hope for some eventual real return on string theory investments is that at least a few researchers were able to get some real, experimentally meaningful research in amid all of that, to produce some real products that don't depend on unverifiable non-realities.
Juan Ramón González Álvarez wrote on Mar. 4, 2018 @ 23:44 GMT
"The World’s Most Famous Equation" is also one of the most misunderstood. First, it was first derived by Poincaré, not Einstein, and it is better written as
E
0 = mc
2"Thus the 20 digit sequence could in principle be replaced by a short binary program that generates and indexes pi". Which would consume more memory than simply storing the original 20 digit...
view entire post
"The World’s Most Famous Equation" is also one of the most misunderstood. First, it was first derived by Poincaré, not Einstein, and it is better written as
E
0 = mc
2"Thus the 20 digit sequence could in principle be replaced by a short binary program that generates and indexes pi". Which would consume more memory than simply storing the original 20 digit string.
"In physics the sole criterion for whether a theory is correct is whether it accurately reproduces the data in foundation messages." A theory can be refuted without even running a single experiment. We have other criteria to evaluate data, including internal consistency checks.
"The implication is that a better way to think of physics is not as some form of axiomatic mathematics, but as a type of information theory". It is neither.
Challenge 2. Bosons are in reality virtual combinations of fermions that arise in the formalism when one switches from a non-local real picture to the approximate local picture of QFT. All the properties of bosons are derived from the properties of fermions, including spin. E.g. for photons the available spin states are (±1/2) - (±1/2) = 0, 1, -1, 0.
The Standard Model needs to postulate the properties of bosons: mass, spin, charge. I can derive those properties from first principles.
"There are after powerful theoretical reasons for arguing that gravity is not identical in nature to the other forces of the Standard Model. That reason is the very existence of Einstein’s General Theory of relativity, which explains gravity using geometric concepts that bear no significant resemblance to the quantum field models used for other forces". Gravity can be formulated non-geometrically. So there is nothing special about it regarding this. On the other hand the gauge theory used in QFT for the other interactions can be given a geometrical treatment with the gauge derivatives playing a role similar to the covariant derivatives in GR, and the field potentials playing a role similar to the Christoffel symbols.
Author Terry Bollinger replied on Mar. 5, 2018 @ 03:01 GMT
Juan Ramón González Álvarez,
Thank you for your interesting comments! It took me a while to realize that your essay was back in 2012 (must have been an interesting year!), and that FQXi grants forward commenting access to all prior participants. That’s good to know.
Poincaré was amazing! His math was so advanced in comparison to that of Einstein (who had to get his wife's help even to do the somewhat repetitious math of his SR paper) that I wonder how well Einstein could have followed it. Einstein's path to E = mc^2 was in any case very different, and kind of weird; Einstein just did not think that way.
In sharp contrast, Poincaré's more Maxwell-based argument in "La théorie de Lorentz et le principe de réaction" ("The Theory of Lorentz and the Principle of Reaction") is lucid, mathematically clear, and makes beautiful use of the work of both Maxwell and Lorentz. More than his equation per se, I like Poincaré's straightforward assertion that:

"… if any sort of device produces electromagnetic energy and radiates it in a particular direction, that device must recoil just as a cannon does when it fires a projectile."
----------
Regarding 20 digits from pi: sure, a full array at either end would be huge. If you wanted the analogy to be exact, though, you would instead take the processing-storage tradeoff and spend huge amounts of processing time to re-generate the pi sequence up to that point. It would be an insane way to compress data for any practical use, but of course that's not the point. The issue is that you have to be very careful about saying "this is the most compressed form of any data." So even if it took a month to generate the 20 digits, the short program for doing it that way still fully qualifies as a more compact way of telling someone at a remote site how to obtain those 20 digits.
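Here is a minimal Python sketch of exactly that processing-for-storage trade, using the standard mpmath library (the function name and the choice of plain regeneration are mine; true compute-a-digit-in-place schemes such as the BBP formula do exist, but only for base 16):

from mpmath import mp, nstr

def pi_slice(start, length):
    # Re-generate a slice of pi's digits rather than storing them all:
    # the "message" is just this short program plus (start, length),
    # and the receiver pays in compute time instead of stored bits.
    mp.dps = start + length + 10                       # guard digits
    digits = nstr(mp.pi, start + length + 5).replace(".", "")
    return digits[start:start + length]

# The 20 digits starting at position 0 cost almost nothing to specify,
# but real processing time to reproduce at the receiving end.
print(pi_slice(0, 20))   # -> 31415926535897932384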
----------
I do like, and feel there is some real conceptual merit to, thinking of bosons as "combinations" in some sense of two mutually-canceling charged fermions, that is, the photon as in "some sense" a combination state of the positron and electron. But the math reigns in the end, as with any conceptual model.

For example: In your reply to Challenge 2, the spin set created by combining the spin 1/2 electron and the spin 1/2 positron is indeed {0, -1, +1}, but photons are of course always spin magnitude 1, never spin 0. Perhaps you are talking about their measured spins at a detector? In any case, the question is not whether you can express photons as fermion pairs, but how that expression would induce the symmetric-antisymmetric relationship that so sharply distinguishes fermions from bosons. If you feel that the composite-fermion approach can lead to that, I'd suggest you try to provide a detailed argument for why.
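For reference, the standard angular-momentum bookkeeping behind this exchange (textbook material, included only to pin down the terms):

% Two spin-1/2 particles combine into a singlet plus a triplet:
\tfrac{1}{2} \otimes \tfrac{1}{2} \;=\; 0 \oplus 1
% i.e. one S = 0 state and three S = 1 states (m_s = -1, 0, +1).
% A real photon has spin magnitude 1 (helicity +-1 only), so any
% fermion-pair picture must also explain the absent S = 0 channel.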
----------
I've downloaded your 2012 essay and briefly scanned it. From it, and from some of your unexplained assertions above, I have a sneaking suspicion that your immediate reaction to quite a few ideas in physics is extreme skepticism? I'll try to look at your essay more closely as time permits, with the qualification that I have a long queue of both comments and essays from 2017 that I need to get to first.
Thanks again,
Terry
Author Terry Bollinger wrote on Mar. 6, 2018 @ 03:40 GMT
Biomolecular Renormalization: A New Approach to Protein Chemistry
Terry Bollinger, 2018-03-05

Abstract. In every cell in your body, hundreds of proteins with very diverse purposes float in the same cytosol fluid, and yet somehow rapidly and efficiently carry out their equally diverse tasks, including synthesis, analysis, demolition, replication, and movement. Based on an earlier mini-essay from the 2017 FQXi Essay contest on the importance of renormalization, and on the Nature paper below, I propose here that the many protein chemistry pathways that go on simultaneously in eukaryotic and prokaryotic cells are enabled, made efficient, and kept isolated by a multi-scale biomolecular renormalization process that breaks each interaction into scale-dependent steps. I conclude by discussing ways in which this concept could be applied both to understanding and to creating new biomolecules.
----------------------------------------

NOTE: A mini-essay is my attempt to capture an idea, approach, or prototype theory inspired by interactions with other FQXi Essay contestants. This mini-essay was inspired by:

1. What does it take to be physically fundamental by Conrad Dale Johnson
2. What if even the Theory of Everything isn't fundamental by Paul Bastiaansen
3. The Laws of Physics by Kevin H Knuth
4. The Crowther Criteria for Fundamental Theories of Physics
5. The Illusion of Mathematical Formality by Terry Bollinger (mini-essay)

Non-FQXi References

6. Extreme disorder in an ultrahigh-affinity protein complex, Nature 555(7694):61-66 (March 2018). Article in ResearchGate project Novel interaction mechanisms of IDPs.
7. Extreme disorder in an ultrahigh-affinity protein complex, Nature 555(7694):61-66 (March 2018). NOTE: This article is behind a (large) paywall.

----------------------------------------
Background: Scale-Dependent Protein Interactions

In the March 2018 Nature paper Extreme disorder in an ultrahigh-affinity protein complex, the authors provide a fascinating and extremely detailed description of how certain classes of "intrinsically disordered proteins" (IDPs) can bind together, based initially on large-scale charge interactions that are then followed by complex and remarkably disorderly bindings at smaller size scales. The purpose of this essay is not to analyze this specific paper in detail — this excellent paper does that very well for its intended biochemistry audience — but to show how an external, physics-derived, scale-dependent renormalization framework can be used not only to provide an alternative way to look at the interactions of these proteins, but also to understand a broad range of large-molecule interactions in a new and potentially more unified and analytical fashion. This broader framework could in principle lead to new approaches to both understanding and designing proteins and enzymes for specific objectives, such as how to bind to a wide range of flu viruses.
The Importance of Approximation-At-A-Distance

The initial approach of two IDP proteins via a simple, large-scale difference of electrical charge appears to be an example of biological, multi-scale, physics-style "renormalization." By that I mean that the proteins are interacting in a hierarchical fashion in which large, protein-level charge attractions initiate the process while the proteins are still at some distance from each other and details are irrelevant due to charge blurring. This is the central concept of renormalization in, say, the QED theory of electron charge: You can at large distances (scales) approximate the electron charge as a simple point, much as the complex protein charge is approximated as a "lump charge" in this first stage.
As the proteins approach, more detailed patterns grow close enough to become visible, and the initial lump-protein-charge model fails. One must at this point "renormalize," that is, drop down to a smaller, more detailed scale that allows analysis in terms of smaller patterns within smaller regions of the protein. In the case of the dynamic and exceptionally disorganized IDP proteins, these later stages result in surprisingly strong bindings between the proteins. More will be said later about this intriguing feature, which I believe can be reinterpreted as a more complicated process that only appears to be random and disorganized from an outside perspective. It is at least possible, based on a renormalization analysis, that this "randomness" is actually a high-density, multi-level transfer of data. This transfer would be enabled by the large number of mobile components of the protein behaving more like cogs and wheels in a complicated machine than as truly random parts. Alternatively, however, if binding truly is the top priority for the proteins, the moving parts could also accomplish that without using the resulting bindings as data.
Broadening the Model: Multi-Level Attraction and Rejection

Even more interesting than detailed binding when proteins grow closer is the possibility that the interactions at that level reject rather than encourage further interactions. Such cases might also be very common, possibly even dominant. You would have a "dating service" that allows the proteins to spend a small amount of time and mobility resource to check out a potential match, but then quickly (and this is important) realize at low cost that the match will not work. Amplify such low-cost rejections by huge numbers of protein types and individual instances, and the result is a very substantial overall increase in cellular efficiency.

If however the next level of charge-pattern detail does encourage closer attraction, the result would be to head down the path of repeated downward renormalization of scale, as individual sheets and strands move close enough to "see" more detail. If the proteins were exact matches to begin with, then renormalization (which in this context just means "scaling down to see greater levels of charge pattern detail") would proceed all the way down to the atomic charge level. The "dating service" would be a success, and the match accomplished. But more importantly, it would be accomplished with high efficiency by avoiding getting into too much detail too quickly. A small algorithmic sketch of this coarse-to-fine process follows below.
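Here is that coarse-to-fine logic in Python (my toy illustration of the scheme, not a real biochemistry model; it assumes two equal-length, one-dimensional "charge profiles", with a good match meaning the profiles are roughly equal and opposite, and the function name, scales, and tolerance are all mine):

def multiscale_match(profile_a, profile_b, scales=(16, 4, 1), tol=0.2):
    # Toy "dating service": compare charge profiles coarsely first, and
    # renormalize down to finer scales only if the coarse match holds.
    def blur(profile, window):
        # Coarse-grain by averaging over `window` sites: the analogue
        # of charge blurring when the proteins are still far apart.
        return [sum(profile[i:i + window]) / window
                for i in range(0, len(profile) - window + 1, window)]

    for window in scales:                  # largest scale first
        a, b = blur(profile_a, window), blur(profile_b, window)
        # Attraction means opposite charges, so a good match has a ~ -b.
        mismatch = sum(abs(x + y) for x, y in zip(a, b)) / len(a)
        if mismatch > tol:
            return False                   # cheap early rejection
    return True                            # matched at every scale

# Example: equal-and-opposite profiles of length 32 match at all scales.
prof = [(-1) ** i * 0.5 for i in range(32)]
print(multiscale_match(prof, [-q for q in prof]))   # -> True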
Broader Implications of Multi-Scale Protein Interactions

There are a number of very interesting potentials in such a renormalization interpretation of protein-to-protein binding. Importantly, most of these potentials apply to pretty much any form of large-bio-molecule binding, emphatically including DNA, and (to me even more interesting) the enzymatic creation of novel molecules. These potentials include:

o Efficient, low-time-cost, multi-stage elimination of non-matches. Proteins (or DNA) would be able to approach at the first scale level based on gross charge, then quickly realize there is no match, and so head off to find the right "machinery" for their tasks. The efficiency issue is huge: Repeated false matches at high levels of detail would be very costly, causing the entire cell to become very inefficient.
o Increased probability of correct protein surface matchups. Or, conversely: Lower probabilities of protein matchup errors. A huge but somewhat subtle advantage of multi-scale attraction is that it gives each new level of smaller detail a chance to "reorient" its components to find a better local match. One way to think of this advantage is that the earlier larger-scale attractions are much like trip directions that tell you which interstate highway to take. You don't need detail at that level, since there will in general be only one interstate (one "large group area match") that gets you to the general region you need for a more detailed matchup. Only after you "take that interstate" and approach more closely do the detailed "maps" show up and become relevant.
o Complex "switch setting" during the multi-scale matchup process.Since proteins are not just static structures but nano-scale machines that can have complex levels of local group mobility (more later on the implications of that), such lower-scale matchups can be more than just details showing up at the finer scales. They can also re-orient groups and structures, which in turn can potentially "activate" or "change the mode" of one or both proteins, much like turning a switch once you get close enough to do so. These "switches" would of course themselves be multi-scale, ranging e.g. from early large-scale reorientations of entire beta sheets down to later fine-scale rotations of side groups of individual amino acids. What is particularly interesting about this idea is that you potentially could program remarkably complex sequences in time and space of how such switches would be reset. There is potential in multi-scale, multi-time switch setting for a remarkable degree of relevant information to be passed between proteins.
o Multi-scale adjustment of both specificity and "stickiness". As with gecko feet, if the goal of the protein is aggressive "grabbing" of some broad class of proteins, this can be programmed in fairly easily via the multi-scale model. It works like this: If the purpose of the protein is to bind an entire class of targets based on overall large-scale charge structure (and please note the relevance of this idea to e.g. ongoing efforts for universal flu vaccines), then the next lower level of scale in the protein should be characterized by extreme mobility of the groups that provide matching, so that they can quickly rotate and translate into positions that allow them to match essentially any pattern in the target molecule.

Conversely, if certain patterns at lower scales indicate that the target is wrong, then those parts of the program should present a rigid, immobile charge pattern upon closer approach. Mobility of groups thus becomes a major determinant both of how specific the protein is, and of how tightly it will bind to the target.
o Energetic activation of low-probability chemical reactions. This is more the enzyme interpretation, but it's relevant because multi-scale matching provides a good way to "lock in" just the right sequence of group repositionings to create highly unlikely binding scenarios. Imagine first large groups, then increasingly smaller and more specific groups, all converging in attraction down to a point where some small group of atoms is forced into an uncomfortable positioning that normally would never occur. (This is a version of the multi-level-switch scenario, actually.)

At that point a good deal of energy is available due to the higher-level matchups that have already occurred; the target atoms are under literal pressure to approach in ways that are not statistically likely. And so you get a reaction that is part of the creation of some very unlikely molecule. This is really quite remarkable given the simplicity and generally low overall energy level of amino-acid-based sequences, yet it comes about fairly easily via the multi-level model.

Another analogy can be used here: Imagine trying to corral wild horses that have a very low probability of walking into your corral spontaneously. Multi-scale protein matchup energetics then are like starting with large-scale events, say helicopters with loudspeakers, as the first and largest-scale way of driving the horses into a certain region. After the horses get within a certain smaller region, the encirclement process is then scaled down (renormalized) to use smaller ground vehicles. The process continues until "high energy" processes such as quickly setting up physical barriers come into play, ending with full containment.
o Enablement and isolation of diverse protein reaction systems within the same cytosol medium. The idea that molecules can immediately, and at low cost, reject interactions not relevant to their purpose is another way of saying that even if a huge variety of molecules with very diverse purposes are distributed within the same cytosol, they can behave in effect as if they do not "see" any other molecules except the ones with which they are designed to react. These subnetworks thus can focus on their tasks with efficiency and relative impunity against cross-reactions.
There is a fascinating and I think rather important corollary to this idea of multi-scale-enabled isolation of protein chemistry subnetworks, which is this: It only works if the proteins are pre-structured to stay isolated. That is, on average I would guess that high levels of mutual invisibility between protein reaction subnetworks are not likely to arise by chance, and that the subnetworks must in advance agree to certain "multi-scale protocols" for how to distinguish themselves from each other. This distinction would begin, and be most critical, at the largest and most efficient scales of charge blurring, the same levels that your paper abstract describes.

So, a prediction even: Careful analysis of the charge profiles of the many types of proteins found in eukaryotic cells (and prokaryotic ones, likely more accessible but not your main bailiwick) will reveal a multi-scale isolation of multiple subnetworks of interactions, based first on high-level, "blurred" charge profiles between the proteins, with additional isolations at lower scales. It will be shown statistically that the overall level of isolation between the subnetworks is extremely unlikely without all such reaction paths sharing the charge-profile equivalent of a registry in which each reaction subgroup has its own multi-scale "charge address".
o Possible insights into the protein folding problem. Finally, it is worth noting that the hierarchical guidance concept that underlies biomolecular renormalization could well have relevance to the infamous multi-decade protein folding problem, which is this: How does a simple string of amino acids fold itself into a large and precisely functioning protein "machine"? This feat is roughly equivalent to a long chain of links of about twenty different types somewhat magically folding itself into some form of complicated machine with moving parts.
Either directly through multi-scale attractions or indirectly through helper molecules, it is at least plausible that biomolecular renormalization plays a role in this folding process. With regard to helper molecules, one intriguing hypothesis (nothing more) is that previously folded proteins of that same type could provide some form of multi-scale guidance for how to fold new proteins.

While an intriguing idea, it is also frankly unlikely, for the following reason: Such assistance would almost certainly require the existence of some class of "form transfer" helper molecules that would read the folds of the existing molecules and present that information to the folding process. It is hard to imagine that such a system could exist and not have already been noticed.
Nonetheless, the concept of folding-begets-folding has an intriguing appeal from a simple information transfer perspective. And in one area it would provide a very interesting resolution to a long-term mystery of large biomolecules: prions. Prions are proteins that have folded or refolded into destructive forms. Once established, these incorrectly folded proteins show a remarkable, even heretical ability to reproduce themselves by somehow "encouraging" correctly folded proteins to instead adopt the deleterious prion folding.

Folding-begets-folding would help to explain this mysterious process by making it a broken version of some inherent mechanism that cells have for reproducing the folding structures of proteins. Whether any of this is possible, and whether if so it is related to biomolecular renormalization, is an entirely open question.
Conclusions and Future Directions

As a concept, biomolecular renormalization appears to have good potential as a framework not only for understanding known and recently uncovered protein behaviors, but also for providing a more theory-based approach to designing proteins and enzymes. It may also provide insights into cell-level biological processes that have previously seemed opaque or mysterious under other forms of analysis.
Author Terry Bollinger wrote on May. 1, 2018 @ 02:37 GMT
Hi folks,
I've been so quiet that at least one FQXi community member was worried about me (which I really appreciated, incidentally).
For the last couple of months I've been working on a paper on special relativity (SR). While some of the ideas in it are ones I've been exploring for years, it was my need for a good reference paper relevant to several 2017 FQXi essays that provoked me to make completion of the paper a priority. The paper will have more explanatory graphics than most physics papers, since SR is an intensely geometric theory that requires good graphics to describe and explore properly. I'll provide status updates on the paper here.
I've been trying to practice what I advocated in my 2017 FQXi essay, Fundamental as Fewer Bits. Focusing on conciseness leads in turn to an assertion with which I think most physicists would agree in principle, but which can be surprisingly difficult to apply in practice:
Physics is both extremely efficient and minimally redundant.

It sounds reasonable, right? It is, after all, just a variant of Occam's Razor.
However, if you apply the above assertion to current physics in a ruthless, machine-logic-level, history-indifferent way, it can be devastating in deeply interesting ways. I'll leave it to the reader to wonder why and how.
Cheers, Terry Bollinger
Author Terry Bollinger wrote on May. 2, 2018 @ 21:26 GMT
Hi folks,
For anyone who is disappointed in not winning this year, and who may also have thought that my essay had a pretty good shot, don't feel too badly. The FQXi Essay Winners direct-contact announcement day has come and gone, and like most of you I didn't even get a mention.
That is of course disappointing. I really did think it was a pretty solid piece of work, enough so that I will most definitely use those ideas elsewhere. But the nice thing is that my involvement with all of you in this incredibly diverse community of off-the-beaten path thinkers was so stimulating that it helped me to look at my own years of private physics notes in new ways, and to develop a renewed enthusiasm for capturing certain recurring themes in the form of papers intended for journal publication. For that I thank all of you!
I will continue on occasion to post notices here about papers, ideas, or online postings of figures and such. It's possible but a bit unlikely that I may submit a new essay in some future FQXi Essay Contest. However, I think working towards getting published in appropriate journals is a better goal for me now, especially since I don't have much interest in the prize money part of this contest. FQXi questions are often delightfully inspiring, but perhaps that is the best way to view them: as personal research challenges, and not necessarily as part of a contest per se.
Finally, my sincere congratulations to whoever this year's contest winners turn out to be! I have some personal favorites, but if the winners are not whom I expected, I will focus all the more on reading your winning essays carefully to understand better your perspectives and insights.
Cheers, Terry
Giovanni Prisinzano wrote on May. 3, 2018 @ 15:04 GMT
Dear Terry,
I'm happy to read new posts from you, even if there's a bit of disappointment in the last one. That I can certainly understand, because your essay is perhaps the most inspiring and convincing among those I read in the contest, and it certainly deserves to be among the winners. I am still using the present tense because the announcement has not yet happened, even if the date indicated for direct notifications has already passed. But we cannot completely rule out the possibility of a delay, since - as I remember with certainty - the names of the winners of the previous contest were posted with a delay of at least a week, although of course I cannot know whether the times of personal notifications were respected.

I can only agree with your intention to publish your future contributions in appropriate journals. I would probably do that too, if I had the opportunity and the necessary skills. Participating in an FQXi contest is a very engaging and exciting adventure, but perhaps some changes in the guidelines and in the rating criteria would be appropriate to maintain the same interest in those who, like me and others, have already participated more than once.
I too would like to offer my sincere congratulations to the winners and above all I hope to continue reading about you and your works, Terry, in the near future.
Cheers,
Giovanni
Author Terry Bollinger wrote on May. 3, 2018 @ 17:07 GMT
Giovanni,
It is good to hear from you again! Your essays and those you recommended (Coryat, Losev, Becker, and Bastiaansen, I think it was), along with others (Tejinder Singh for example), were among those that most helped me appreciate, and to at least some degree better understand, broader and more philosophical approaches to understanding reality. Given the paucity of answers that science has to offer even on why we are here at all, that is a humbling and valuable perspective.
In fact, the title of one of my planned mini-essays is Time, Life, and Awareness: A Physics Perspective. The only reason I have not completed and posted it is that I realized that I needed to complete a physics paper on the nature of time and time symmetries in special relativity, to make my points about life and awareness more than just speculation. The nature of time is quite a bit different from most speculations about it, yet the arguments for how it really works are neither complicated nor easily dismissed once you frame the physics so as to avoid unnecessary redundancies and inefficiencies.
Our universe is almost unbelievably accommodating of inefficient theories, allowing (as Spekkens speculated in his 2012 FQXi essay) an incredibly diverse range of theories to predict the same results. Sharpening some of the classic arguments of special relativity to better address the causality that Spekkens postulated as a unifying principle can at the very least provide some insights. Remarkably, those insights are also relevant to understanding the deep relationship that living organisms have with time, and the inverse relationship of awareness that allows us to connect more directly with how the universe works.
That sounds like a highly abstract statement, but I assure you it is not: There is a very real, experimentally accessible difference between intelligence using only classical time, and intelligence focusing on causal state change as defined by the laws of all of physics. In that broader perspective, the use of classical time is an important but incomplete subset of how the universe changes state, a simplification necessary for surviving in the (very friendly overall) part of the universe in which we exist.
More later. Paper on causal symmetry in special relativity first!
Thank you also for the information on how the FQXi announcements work. I think the official date for posted announcements is May 8? In any case, for me it's all water under the bridge at this point, since I feel much happier assuming that it's all said and done, and so time to move on.
Cheers,
Terry
Giovanni Prisinzano wrote on May. 4, 2018 @ 20:21 GMT
Hi Terry,
Thank you for your kind, prompt, and articulate response, to which I intended to respond "by speedpost" (I don't know if I can say so; there is an Italian saying, "a stretto giro di posta", that is perhaps untranslatable), but then I started reading the essay by Tejinder Singh, which you mentioned and which I had unfortunately neglected so far. And I got a bit "lost", not only in the essay as such, but especially in the conversation between him and you in his thread, which is very interesting but requires time and attention to be followed as well as it deserves. But I'm a bit slow, in reading and writing, and I don't have much free time by now.
I also began to think, after reading about your planned mini-essay, about the nature of time, or rather to re-think about it, because for the 2015 FQXi contest I had written an essay on this, and before that even a book, in Italian. In them I argued that time, as well as space, has a mathematical nature, but I also tried to maintain its difference with respect to space, and to suggest a possible explanation of the passage from the past to the future, the "arrow" of time, which science has never so far fully succeeded in clarifying. Later, I thought that my approach was too speculative, but after reading other contributions on the relationship between mathematics and the world, such as that of Tegmark, or even, in a different perspective, of Singh himself, I don't think it is too much...
But on this I will return. First I have to complete reading the thread of your conversation with Tejinder Singh!
Cheers,
Giovanni
Author Terry Bollinger wrote on May. 5, 2018 @ 04:45 GMT
Hi Giovanni,
I think the American equivalent of "a stretto giro di posta", at least for older Americans like me, is "by FedEx", that is, by Federal Express. This phrase was used most often back when FedEx was the only company capable of shipping items overnight, via their centralized receive-sort-ship facility in Memphis, Tennessee. The founder of FedEx, Frederick Smith, famously got an average-only grade when he first proposed this idea in a college paper.
You are exactly correct that space is different in a critical way from time, and that way is this: A space-only separation between two objects can always be made completely symmetric, while a time-only separation between two points (e.g. two images of the same object) can never be made fully symmetric, due to one image always being farther back in time than the other.

That broken symmetry is saying something very important about the nature of time, which to be specific is that space is more fundamental than time, not less. Or to be more precise: Space-time symmetry is at its roots all about how objects in space change relative to each other, with time emerging as a result of that set of relationships. Thus in special relativity, the merger of space and time that Minkowski so delighted in (more so than Einstein, at first) is due not to the independence of the two, but to the fact that time has no meaning at all without matter and energy changing state in space.
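The standard special-relativity bookkeeping behind that asymmetry, added here only for concreteness, is the invariant interval:

% Invariant interval between two events (sign convention +, -):
s^{2} \;=\; \Delta x^{2} \;-\; c^{2}\,\Delta t^{2}
% Spacelike separation (s^2 > 0): some frame makes \Delta t = 0, and a
% boost can even reverse the events' temporal order, so the separation
% can always be made symmetric. Timelike separation (s^2 < 0): \Delta t
% never vanishes in any frame and the order of the two events is the
% same for every observer, so it can never be made symmetric.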
"Changing state" I should note is
not the same thing as classical time, since classical time has an entire suite of precise and often space-like features that are narrower and more constrained than the concept of state change in isolation. Giving away a bit of my argument in
Time, Life, Awareness: A Physics Perspective, this is also why less "scientific" practitioners of traditions that emphasize instantaneous sensor perception over history-based processing and interpretation of that sensory data are on to something non-trivial: They are recognizing that traditional classical time is far more of an in-your-head information processing construct than is generally realized, and that any full theory of how the universe changes state
necessarily must extend beyond that model. I say "necessarily" because, trivially, neither classical not quantum concepts of time are sufficiently broad to enable full envelopment of how the universe actually works. Only a change model that enables classical and quantum time to emerge from a broader set of state change rules can suffice, and that model is, remarkably, a lot closer to instantaneous awareness than it is to ponderous (but also incredibly useful) entropic-based past-and-future perception and modeling of time.
This is of course delightfully in opposition to the views of a great many physicists, to whom the supremacy of time as reality is psychologically paramount and simply not something to be questioned. But then again, that is exactly how you end up with both cultural and individual cognitive log jams: The very assumption that must be questioned in order to break the log jam seems so obvious, so fundamental, so sacred, that no person in that culture dares even to suggest otherwise.
Concerning that disregard, in my approach to physics I rely significantly on my involvement with new research directions in machine cognition, that is, with artificial intelligence. Good machine cognition is culturally oblivious in a way that is extraordinarily difficult for humans. Machines don't worry about what thesis advisors will think, or where funding will come from, or whether like the sad but brilliant Boltzmann they will be ostracized and driven to suicide by smart but vile geniuses like Mach. Instead, they worry about mathematical symmetries, data reduction, exploration paths, and heuristics for reducing the total search space. Machines don't forget or minimize those annoying dangling threads that have never been explained well. Instead, they place them at highest priority for both further exploration and path assessment, precisely because such unresolved issues are the gaps in the armor of existing theory that are most likely to lead to breakthroughs.
Again, good talking to you, even though I will confess to doing a bit of a monologue from my side! I look forward to looking up your 2015 FQXi essay and reading it on Saturday, Cinco de Mayo. In the US this is a popular holiday for eating all types of Mexican food. Tacos and Time, a delicious dual delight!
Cheers,
Terry
Author Terry Bollinger wrote on May. 5, 2018 @ 15:49 GMT
Kolmogorov Minimums and Wheeler-Feynman Emission-Absorber Theory [ABSTRACT ONLY]
Terry Bollinger, 2018-05-05

Abstract. While the predominant reason for arguing the existence of a block universe is the Einstein assumption that there is no other way to reconcile the many possible multiple-angle space foliations of special relativity, quantum mechanics also contains theories that appear to argue powerfully for the simultaneous existence of all past and future points. Foremost among these is Wheeler-Feynman emission-absorption theory, in which the acceleration (recoil) of a photon-emitting electron in the present is modeled as the result of another type of photon that has traveled backwards in time from the electron that eventually receives the first photon. This second, absorbing electron could easily exist billions of years in the future. In this mini-essay, both the forward-in-time (retarded) and backward-in-time (advanced) photons of Wheeler-Feynman theory are reinterpreted as examples of complementary "path expansions" of some still-unknown Kolmogorov Minimum (KM) representation that integrates both types of photons into a single atemporal "photon interaction" class of state changes. Wheeler-Feynman theory is then examined as an example of how KMs can explain many types of physics dualities as stemming from the necessity to pair each path "out" from a KM with a structurally very similar path back "in" to the KM representation of the event. This duality is especially explicit in Wheeler-Feynman theory, in which the assumption that all radiation events are direct particle-to-particle interactions drives the need for dual theories of forward-in-time retarded photons and backward-in-time advanced photons. Finally, the argument that Wheeler-Feynman theory supports a block universe is shown to be tautological, a direct result of the initial Wheeler-Feynman assumption that photon emissions must be triggered by direct, electron-to-electron interactions across indefinitely large spans of time. In KM theory this assumption instead becomes an "expansion driver" for elaborating and complicating the compact but highly adaptable KM model of photon interaction. In the Wheeler-Feynman case, these outward-and-return elaborations result in dual theories of very similar but reversed-in-time advanced and retarded photons that largely cancel out each other's effects, but still accomplish the underlying atemporal goal of a photon exchange between two electrons.