CATEGORY:
Questioning the Foundations Essay Contest (2012)
TOPIC:
Recognising Top-Down Causation by George F. R. Ellis
Author George F. R. Ellis wrote on Jul. 17, 2012 @ 11:23 GMT
Essay Abstract: One of the basic assumptions implicit in the way physics is usually done is that all causation flows in a bottom-up fashion, from micro to macro scales. However, this is wrong in many cases in biology, and in particular in the way the brain functions. Here I make the case that it is also wrong in the case of digital computers – the paradigm of mechanistic algorithmic causation – and in many cases in physics, ranging from the origin of the arrow of time to the process of quantum state preparation. I consider some examples from classical physics, from quantum physics, and the case of digital computers, and then explain why this is possible without contradicting the causal powers of the underlying micro physics. Understanding the emergence of genuine complexity out of the underlying physics depends on recognising this kind of causation. It is a missing ingredient in present day theory, and taking it into account may help understand such mysteries as the measurement problem in quantum mechanics.
Author Bio: George Ellis is a relativist and cosmologist residing in Cape Town, South Africa. His books include The Large Scale Structure of Space-Time, co-authored with Stephen Hawking. In addition to contemplating relativistic and philosophical aspects of cosmology, he is now engaged in trying to understand how complex systems such as you and me can arise out of the underlying physics.
Download Essay PDF File
John Merryman wrote on Jul. 17, 2012 @ 17:27 GMT
George,
While I would agree top down effects are under-appreciated in the fundamental sciences, I suggest the real point to be made is the inseparable dichotomy of top down vs. bottom up relations. Often information systems (maths) congeal into top down platonic systems, where the underlying medium becomes immaterial and is dismissed as non-existent. I think it is the feedback between bottom up actions and top down interactions that really forms and informs reality. As Newton said, "For every action, there is an equal and opposite reaction." Logically actions are linear, but the reactions tend to be non-linear feedback. I don't think it is entirely true that we cannot understand top down activity in terms of bottom up actions, because the initial action creates the potential for feedback and reaction, and when this is multiplied across massive scales, feedback becomes infinitely variable. So top down and bottom up are inseparable sides of the same coin. Energy (bottom up) manifests information (top down), as information defines energy.
It is only as processes and states become ever more complex that we start to lose sight of the fundamental processes still motivating them. The evolution of the brain is a good example: E. O. Wilson described the insect brain as a thermostat, yet it has been shown that varieties of ants use counting footsteps as a navigation tool, so they also have an inherent sequential function as well. I would argue these two functions underlie the conventional divisions of the brain, with the right hemisphere as emotional and intuitive, while the left is analytical and logical. Essentially the right brain is a very evolved thermostat, while the left is an equally evolved sequencer, or clock. So these two very basic functions, measuring energy levels and identifying sequential patterns within this environment, are the salient features of this most evolved and complex manifestation of the universe.
Paul Reed replied on Jul. 18, 2012 @ 05:25 GMT
John
There is a presumption in here that, to put it simply, the future can be affected. Which it cannot. Because it does not exist, and is therefore not available to be affected. Or put another way, "reactions" are just the next set of "actions". The brain, etc is irrelevant to physical existence.
Paul
John Merryman replied on Jul. 18, 2012 @ 10:31 GMT
Paul,
Since the present is cause to future effect, the future is affected. If I were to break a leg today, it would certainly affect what I will be doing tomorrow.
"Reactions" may be just the next set of "actions," but notice you included the plural. The action, as we tend to perceive it, is singular, while the environmental reactions to it are plural. The feedback from a complex environment to a simple action is complex.
The brain evolved out of physical existence and is a reflection of it, thus both time and temperature are foundational to its functions.
Paul Reed replied on Jul. 18, 2012 @ 13:11 GMT
John
"If I was to break a leg today, it would certainly affect what I will be doing tomorrow".
Yes, the present that subsequently occurs is different from what it would have been. But what it would have been never existed. The future does not exist, so you cannot affect it.
As above, reactions are the next actions; the fact that I used the plural form of these words is irrelevant. My point was a repetition of the above. While they can be depicted as reactions, there is no form of reversal of physical existence. Everything could be described as a reaction to something.
Brains are physically existent, not a "reflection" of it; so are eyes, ears, etc. We are not somehow external to physical existence. What is different in sentient organisms is that they possess a processing capability that enables them to be aware of physical existence.
Paul
Author George F. R. Ellis replied on Jul. 18, 2012 @ 15:03 GMT
John, thanks for that. You state "I think it is the feedback between bottom up actions and top down interactions that really forms and informs reality." We agree. It is the inter level feedback loops that generate real complexity.
As to the nature of time issue: I basically agree with you. See
http://arxiv.org/abs/gr-qc/0605049
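As an illustrative aside on the inter-level feedback point above: the minimal Python sketch below is not taken from the essay or from any of the posts, and the agent count, coupling strength, and noise level are arbitrary hypothetical choices. It shows a macro variable that is generated bottom-up from the micro states while also acting top-down as the context for every micro update.

```python
import random

def run_feedback_loop(n_agents=100, steps=50, seed=1):
    """Toy inter-level feedback: micro variables generate a macro average
    (bottom-up), and that macro value then enters every micro update (top-down)."""
    random.seed(seed)
    micro = [random.uniform(-1.0, 1.0) for _ in range(n_agents)]  # lower-level variables
    macro_history = []
    for _ in range(steps):
        macro = sum(micro) / n_agents                # bottom-up: macro state emerges from micro
        micro = [x + 0.1 * (macro - x) + random.gauss(0.0, 0.05)  # top-down: macro sets the
                 for x in micro]                                  # context of each micro change
        macro_history.append(macro)
    return macro_history

if __name__ == "__main__":
    trace = run_feedback_loop()
    print("macro variable after 50 steps:", round(trace[-1], 3))
```

The point of the toy is only that the two directions are entangled: the macro value is nothing over and above the micro states, yet each micro change is conditioned by it.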
John Merryman replied on Jul. 18, 2012 @ 16:01 GMT
George,
Thank you for the link. While I haven't read it, from the preface it would seem to be the topic you presented in your Nature of Time entry. While I was in basic agreement with the premise, the problem I have is that the past is not, physically or perceptually, unchanging. To quote from my own entry in this contest: "While we naturally think of the entire universe as proceeding from its universal past into its universal future, with this present as a stage on that vector, those prior and subsequent stages do not physically exist and the material by which they were manifested has been cycled back into other forms. It is as though the thread of time is being woven from strands frayed off from what had previously been woven and the past ultimately becomes as unknowable as the future."
To summarize: the continual creation of past events doesn't push the present into the future; rather it pushes prior events further into the past, thus ever changing our relative temporal position to those events. While they cannot be changed in their own context, any relevant perception of them continues to evolve, as do any physical remnants of the actions which occurred.
As I see it, the present isn't physically moving along a vector from past to future, but that the changing configuration of what exists, collapses future potential into past circumstance.
Paul,
Rather than clutter up George's thread with our discussions, I'll take those points over to your thread.
Author George F. R. Ellis replied on Jul. 19, 2012 @ 04:27 GMT
My FQXi essay a few years ago was on the nature of time. I don't want to go into that again here. You'll find an in-depth presentation of my view there.
John Merryman replied on Jul. 19, 2012 @ 10:24 GMT
George,
That's understandable. Thanks for the reply.
Daniel L Burnstein wrote on Jul. 17, 2012 @ 20:08 GMT
Dear George,
An intriguing and very well written essay that raises interesting questions and/or formulates them in new interesting ways.
Though I believe that any and all interactions can be expressed and described in terms of the fundamental aspects of reality, we lack the theory to do so. And even if we did have such a theory, one that would show all higher scale interactions to be emerging from the fundamental interactions, the amount of data necessary to track every elementary particle and force would prohibit the description of even the simplest systems.
My understanding is that objects are structurally bound if, within a given scale of reality and under the effect of a given force associated with that scale, they behave as a single object. So the mathematical models of a particular scale of physical reality can treat composite objects as "virtually fundamental", in such a way that top-down or bi-directional causalities not only make sense, but become the only workable alternative to tracking the interactions between the fundamental particles composing the interacting structures.
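A small side sketch of what treating a bound structure as "virtually fundamental" can look like in practice (the class names and particle values below are my own hypothetical choices, not anything from this post): rather than tracking every constituent, a model carries only the composite's total mass, centre of mass, and centre-of-mass velocity.

```python
from dataclasses import dataclass

@dataclass
class Particle:
    mass: float
    position: float
    velocity: float

def coarse_grain(particles):
    """Replace a bound cluster by a single 'virtually fundamental' object:
    total mass, centre of mass, and centre-of-mass velocity."""
    total_mass = sum(p.mass for p in particles)
    com = sum(p.mass * p.position for p in particles) / total_mass
    com_velocity = sum(p.mass * p.velocity for p in particles) / total_mass
    return Particle(mass=total_mass, position=com, velocity=com_velocity)

# Illustrative numbers only: a two-particle "molecule" treated as one object.
cluster = [Particle(1.0, 0.0, 0.5), Particle(3.0, 1.0, -0.1)]
print(coarse_grain(cluster))  # Particle(mass=4.0, position=0.75, velocity=0.05)
```

At the coarse level the cluster's internal degrees of freedom simply drop out of the description, which is what makes the higher-level model tractable.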
That said, the subject of your essay deserves a lot of consideration. I will certainly explore some of the avenues it opens.
Author George F. R. Ellis replied on Jul. 18, 2012 @ 15:09 GMT
Dear Daniel, thanks for that.
As to your second para, the issue is what are the "fundamental aspects of reality". One does not have to agree that the only such aspects are those described by physics: what about mathematics, for example? Or logic?
Your third para is more or less agreeing with me. See my answers to others below.
Daniel L Burnstein replied on Jul. 18, 2012 @ 21:43 GMT
"As to your second para, the issue is what are "fundamental aspects of reality". One does not have to agree that the only such aspects are those described by physics: what abut mathematics for example? Or logic?"
Interesting questions...
How we define “fundamental” determines how we interpret data and build models.
While in mathematics one can arbitrarily chose any...
view entire post
"As to your second para, the issue is what are "fundamental aspects of reality". One does not have to agree that the only such aspects are those described by physics: what abut mathematics for example? Or logic?"
Interesting questions...
How we define “fundamental” determines how we interpret data and build models.
While in mathematics one can arbitrarily choose any consistent set of axioms as the basis of an axiomatic system, the axioms in a physics theory should represent fundamental aspects of reality. This raises the essential question: What constitutes a fundamental aspect of reality?
What I am exploring (which I briefly discuss in my essay and at length in another work) is the idea that reality obeys a principle of strict causality. From the principle of strict causality, it follows that an aspect of reality is fundamental if it is absolutely invariable. That is, regardless of interactions or transformations it is subjected to, a fundamental aspect of reality remains unaffected.
Reality, I suggest, can be thought of as an axiomatic system in which fundamental aspects correspond to axioms and non-fundamental aspects correspond to theorems.
The empirical method is essentially a method by which we try to deduce the axiom set of reality, the fundamental components and forces, from theorems (non-fundamental interactions). There lies the problem. Even though reality is a complete and consistent system, the laws extracted from observations at different scales of reality and which form the basis of physics theories do not together form a complete and consistent axiomatic system.
The predictions of current theories may agree with observations at the scale from which their premises were extracted, but they fail, often catastrophically, when it comes to making predictions at different scales of reality.
This may indicate that current theories are not axiomatic in the sense I described above; that is, they are not based on true physical axioms: the founding propositions of the theories do not correspond to fundamental aspects of reality (as per the above definition of "fundamental"). If they were, then the axioms of distinct theories could be merged into consistent axiomatic sets. There would be no incompatibilities.
Also, if theories were axiomatic systems in the way we describe here, their axioms would be similar or complementary. True axioms can never be in contradiction.
This raises important questions in regards to the empirical method and its ability to extract true axioms from the theorems it deduces from observations. Even theories which are based on the observations of phenomena at the microscopic scale have failed to produce true axioms (if they had, they would explain interactions at larger scales as well). The reason may be that everything we hold as fundamental, the particles, the forces, etc., are not. So we end up with theorems which can be applied successfully to the scale they were extracted from, but not to other scales.
Also, theories founded on theorems rather than axioms cannot be unified. That suggests that the grand unification of the reigning theories, which has been the dream of generations of physicists, may be mathematically impossible because their axiom sets are incompatible or mutually exclusive.
So, what I find interesting is that our approaches are diametrically opposed. While you propose a top-down model of causality and the representations that model it, which I see as a deconstructive approach (gathering observational and experimental data and mathematically processing it in an attempt to extract or deduce from it the fundamental laws of the Universe), I propose a bottom-up approach where physical reality emerges from the smallest and simplest possible axiom set. An axiomatic approach, as I define it, is the opposite of an empirical method. I suspect that there may be a limit to heuristics, a limit to the empirical method, and when this limit is attained, physics may have to rely on an axiomatic approach. The exploration of top-down causality may actually help find the "heuristical" limit, if such a limit exists. That limit is the point at which reality is unobservable, and somewhere beyond would be the true fundamental scale.
But the fact that fundamental reality is unobservable does not imply we can't design a physics theory that describes it. It may very well be possible to devise a complete and consistent set of axioms to which interactions at all scales of reality can be reduced. This means that even if the fundamental scale of reality remains unobservable, an axiomatic theory would make precise predictions at scales that are.
Author George F. R. Ellis replied on Jul. 19, 2012 @ 04:26 GMT
"Reality, I suggest, can be thought as an axiomatic system in which fundamental aspects correspond to axioms and non-fundamental aspects correspond to theorems."
- a very old dream, and one that is probably unattainable both because of Godel's theorem (on the logical side, showing th eproblems with axiomatic systems) and because of the issues Laughlin raises (on the physical side, showing the limits of bottom up deduction; see his quote in the appendix to my essay).
In any case suppose it were true, this raises a whole new set of issues:
* in what way do these axioms and theorems exist, and where do they exist? Are they Platonic forms for example?
* what decides the form they have? (there are various possible forms of logic: who chose this one?)
* how do they have the power to create any physical entity whatever?
Actually axiomatic systems are rather limited in their powers and in their ability to represent reality. I suggest you take a look at Eddington's book The Nature of the Physical World regarding our use of mental models, and the limits to their use. They are partial representations of reality, and should not be confused with reality itself.
Daniel L Burnstein replied on Jul. 20, 2012 @ 21:21 GMT
" a very old dream, and one that is probably unattainable both because of Godel's theorem"
Gödel's incompleteness theorems are often invoked as an argument against the possibility of complete and consistent axiom set from which all interactions at all scales of physical reality can be derived. The problem is, the incompleteness theorem apply to the formulation meta-mathematical statements...
view entire post
" a very old dream, and one that is probably unattainable both because of Godel's theorem"
Gödel's incompleteness theorems are often invoked as an argument against the possibility of a complete and consistent axiom set from which all interactions at all scales of physical reality can be derived. The problem is, the incompleteness theorems apply to the formulation of meta-mathematical statements about systems (principally arithmetic). But one has to remember that while, aside from basic rules of composition, there are constraints to the making of such meta-mathematical statements, or theorems (nothing prevents false statements or statements that can't be derived from any given finite axiom set), physical reality strictly constrains any system so that it must be consistent with the fundamental laws that govern forces and other interactions. So do Gödel's incompleteness theorems really preclude any possible answer to Hilbert's 6th problem?
If the Universe is made of a finite set of fundamental objects which combine, in accordance with a finite set of laws, through a finite number of fundamental interactions to produce physical reality, then it follows that Gödel's first incompleteness theorem is, at least in its present form, wrong when applied to reality. Also, if you believe that the fundamental components and laws are consistent and that the Universe is a coherent system, then a physical interpretation of Gödel's second incompleteness theorem must also be wrong.
"in what way do these axioms and theorems exist, and where do they exist? Are they Platonic forms for example?"
We need to distinguish the axioms from the fundamental aspects of reality they would stand for. And by definition, axioms cannot be proven. They are merely defined. Once that is done, the axiomatic system may be put to the test. If the axiomatic system is complete and consistent, then all interactions should be derivable from it. It should also enable the emergence of falsifiable predictions.
" what decides the form they have? (there are various possible forms of logic: who chose this one?)"
That is the tricky part. Any choice must be made based on a number of assumptions. There can be a number of viable axiomatic systems that may be used, but whatever the choice, it must be self-consistent and all interactions must be either derivable from or reducible to it.
"how do they have the power to create any physical entity whatever? Actually axiomatic systems are rather limited in their powers and in their ability to represent reality."
If the Universe is found to be both consistent and complete, that is, the fundamental particles and the laws that govern them are consistent (consistency) and all that they produce remains part of the Universe (completeness), then all physical processes are emergent. It can then be shown that it is possible to create an axiomatic system that represents the fundamental aspects of reality, and that representations of all interactions can be derived within such an axiomatic system. If the Universe is consistent and complete, then an axiomatic representation can certainly be powerful enough to represent it.
Though it is a work in progress, I believe that I have shown that a hypothetical universe that is comparable to our Universe in complexity can emerge from a simple axiom set. My essay, titled "Questioning the Assumption that Space is Continuous", shows one way that can be done (my essay is based on a larger work, part of which is freely available).
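As a generic illustration of the kind of claim being made here, namely that intricate structure can grow out of a small fixed rule set, the sketch below runs a standard elementary cellular automaton (rule 110). It is not the model discussed in this post; the lattice width and step count are arbitrary choices.

```python
def rule110_step(cells):
    """One synchronous update of elementary cellular automaton rule 110 on a ring."""
    n = len(cells)
    rule = {(1, 1, 1): 0, (1, 1, 0): 1, (1, 0, 1): 1, (1, 0, 0): 0,
            (0, 1, 1): 1, (0, 1, 0): 1, (0, 0, 1): 1, (0, 0, 0): 0}
    return [rule[(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])] for i in range(n)]

width = 64
cells = [0] * width
cells[width // 2] = 1                  # a single 'on' cell as the whole initial condition
for _ in range(20):
    print("".join("#" if c else "." for c in cells))
    cells = rule110_step(cells)
```

The printed rows show structured patterns spreading from one cell under eight fixed local rules, which is the narrow sense in which "complexity from a simple rule set" is meant here.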
Daniel L Burnstein replied on Jul. 20, 2012 @ 21:24 GMT
Correction. I meant to write:
[...]But one has to remember that while, aside from basic rules of composition, there are ***no*** constraints to the making of such meta-mathematical statements,[...]
Paul Reed replied on Jul. 21, 2012 @ 05:21 GMT
Daniel
Agreed. Any such concepts as incompleteness, etc., are a reflection of our failure to comprehend all that existed, not a feature of physical reality, which existed in a definite form as at any point in time, and is, by definition, a sequence which is being driven by the lowest level of that which it comprises.
Paul
J. C. N. Smith wrote on Jul. 17, 2012 @ 22:29 GMT
Dear Professor Ellis,
It was a pleasure to read your thought-provoking essay.
I'd like to comment on just one aspect of your essay: the arrow of time. As I've argued in my essay in this competition, Rethinking a Key Assumption About the Nature of Time, what we perceive as and refer to as "the flow of time" is, in reality, nothing more and nothing less than the evolution of the physical universe, an evolution governed by rules which we strive to understand and which we refer to as the laws of physics. When seen in this light, the so-called "arrow of time" is seen to be inevitable, if not almost trivial.
To say that the laws of physics are reversible is, in my opinion, a red herring. Yes, of course they are reversible! The fact of the matter, however, is that the universe is comprised of a great deal of macroscopic "stuff" (to use the technical term) which is, for the most part, in motion. We observe that the physical universe is evolving. This evolution represents a great deal of stuff in motion, i.e., a great deal of momentum. Yes, of course it is true that particle A or object A theoretically *could* be moving from right to left from your perspective, but if it is, in fact, moving from left to right from your perspective, then that is hard, objective reality! Particle A or object A can't be moving in both directions! Taken on the scale of the entire universe, it is this evolution of "stuff" that we perceive as "the flow of time," or, alternatively, as what we call "the arrow of time." This of course does not rule out reversibility at the microscopic, thermodynamic level of a gas, for example.
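A toy illustration of exactly this point, offered only as a side sketch (the ring size, marking probability, and step count are arbitrary assumptions): the Kac ring model has strictly reversible micro-rules, and running them backwards recovers the initial state exactly, yet the macroscopic colour imbalance relaxes in one direction on the forward run.

```python
import random

def step_forward(colors, marked):
    """Every ball moves one site clockwise and flips colour when it crosses a
    marked edge: exactly invertible micro-dynamics."""
    n = len(colors)
    return [colors[(i - 1) % n] ^ marked[(i - 1) % n] for i in range(n)]

def step_backward(colors, marked):
    """Exact inverse of step_forward: move counter-clockwise, flip at the same edges."""
    n = len(colors)
    return [colors[(i + 1) % n] ^ marked[i] for i in range(n)]

def imbalance(colors):
    """Macroscopic observable: (number of white - number of black) / total."""
    return 1.0 - 2.0 * sum(colors) / len(colors)

random.seed(0)
n = 2000
start = [1] * n                                                  # far from equilibrium: all black
marked = [1 if random.random() < 0.1 else 0 for _ in range(n)]   # fixed set of flip edges

state = start
for _ in range(30):
    state = step_forward(state, marked)
print("imbalance after 30 forward steps:", round(imbalance(state), 3))  # has relaxed toward 0

for _ in range(30):
    state = step_backward(state, marked)
print("initial state recovered exactly:", state == start)               # True: the rules are reversible
```

The forward run looks irreversible at the level of the macro observable even though every micro step can be undone, which is one sense in which reversible underlying laws are compatible with a robust macroscopic arrow.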
If I've correctly understood the point of your essay, these observations support your thesis.
Best regards,
jcns
Paul Reed replied on Jul. 18, 2012 @ 05:32 GMT
JCN
"This of course does not rule out reversibility at the microscopic, thermodynamic level of a gas, for example"
Of course it does. Physical alteration occurs, it cannot then be reversed. The sequence can involve a subsequent state which is identical to a previous one, but that is not reversal.
Paul
J. C. N. Smith replied on Jul. 18, 2012 @ 12:56 GMT
Paul,
With apologies to Professor Ellis for what probably is a distraction from the main point of his essay, insofar as it may bear at least tangentially on his topic I will reply to your post here, but if we wish to pursue this debate further we should move the discussion to one of our own blogs. You wrote:
"The sequence can involve a subsequent state which is identical to a previous one, but that is not reversal."
This is the thing you've never appeared to comprehend, Paul. According to my view of time (which I believe is consistent with Julian Barbour's view, in this regard at least), a particular time is identically equivalent to a particular configuration of the universe. This is my preferred wording of the concept which Barbour expresses by stating that "The relative configurations, or shapes, of the Universe do not occur at instants of time . . . they are the instants of time."
By this way of thinking, if the configuration of the universe were, hypothetically, to oscillate between two identically equivalent configurations, then time would oscillate between those two particular times. That said, however, things would get sticky because of the momentum involved in such an oscillation. The precise moments representing the end points of the oscillatory motion would be identically equivalent configurations and identically equivalent particular times. This sort of thing is easier to envision if we think of the universe as comprised entirely of three not-further-reducible billiard balls in a not-further-reducible shoebox.
I have no desire whatsoever to belabor this argument further, Paul, here or elsewhere, but if you insist on doing so, please pick another blog (yours or mine) where we may do so. Thanks.
jcns
Paul Reed replied on Jul. 18, 2012 @ 13:15 GMT
JCN
As requested I will copy this and a response across to my blog, but hopefully people will follow that, because there is not much point in us two having a repeat of previous exchanges amongst ourselves.
Paul
Author George F. R. Ellis replied on Jul. 18, 2012 @ 15:15 GMT
Thanks for those comments. Yes they seem to support my thesis.
My views on the flow of time are set out at http://arxiv.org/abs/gr-qc/0605049.
I'll be doing more on this later this year.
Karl Coryat wrote on Jul. 17, 2012 @ 23:10 GMT
Hello George, great job on the essay, and thank you for participating. I was surprised that you didn't specifically mention rescuing free will, as this seems to be what you are getting at in the section on state vector preparation. An experimenter's brain/body, in choosing a polarization angle for example, is imposing top-down causation upon the apparatus and ultimately upon the micro systems prepared, correct?
I couldn't help noticing similarities between your diagrams 1/6/7 and Figure 1 in my essay, which is based on a graphical formalism developed by Bob Coecke at Oxford. I tried to come up with a sketch for a top-down causation mechanism by tracing the flow of contextual information, where complex systems impose context-specific boundary conditions upon measurement events, thereby generating further-enriched complexity in the process. This results in a universe of ever-increasing complexity. I hope you have the time to give it a look.
Author George F. R. Ellis replied on Jul. 18, 2012 @ 15:18 GMT
Thanks for that. Yes, if I had had more space I would have added that the discovery of the Higgs is an example of top-down causation from the human mind to the level of particles, causing them to smash together in a preplanned way in the LHC.
Your emphasis on the contextual nature of information is in accord with my view.
Alan Lowey replied on Jul. 19, 2012 @ 10:54 GMT
Paul Reed wrote on Jul. 18, 2012 @ 05:17 GMT
George
“A sensible view is that the entities at each classical level of the hierarchy (Table 1) are real”
While this reflects the normal, and indeed practical, view of physical reality, it is ontologically (physically) incorrect. Because certain existent, but superficial, physical characteristics are deemed to constitute any given ‘it’ (eg computer, you, etc). That ‘it’ is then thought to remain in existence, albeit with changes occurring to it, until one or more of the defining characteristics is no longer manifest. However, this just depicts reality at a higher level than the actuality, though it could be correct, in itself, at that level. Physically, that ‘it’ is a sequence of physically existent states. The higher level of differentiation just giving the appearance of less change than there physically is, and the illusion of a level of persistence to existence which does not physically occur.
For physical reality to occur, and alter, there must ultimately be a physically existent state at any given point in time. This can be defined as the state of the properties of the elementary particles involved, and their spatial position, as at that point in time. So there is a “fixed set of lower level entities”. And, by definition, any “event” could be tracked back to alterations at that level. That is, physically it must be ‘bottom up’.
The point here is that while that is physically what occurs, we could never establish it in such detail, and would probably all go mad trying. So, we conceptualise up some levels. But we must maintain ontological/physical correctness about the direction of the process. What is ultimately causing alteration in the properties of elementary particles and hence a change in their spatial position is another issue.
Another point to bear in mind is that what exists is a present (ie that which was physically existent as at a given point in time). Previous existences have ceased (the past) as at that point in time. Successive existences (the future) do not exist. In other words, one does not affect the future; what happens is that a present occurs which is different to the one which would have otherwise done so.
Paul
Author George F. R. Ellis replied on Jul. 18, 2012 @ 15:13 GMT
Hi Paul
Thanks for the comments
> “A sensible view is that the entities at each classical level of the hierarchy (Table 1) are real”. While this reflects the normal, and indeed practical, view of physical reality, it is ontologically (physically) incorrect.
• well you are stating the standard fundamentalist reductionist viewpoint. It may or may not be true.
> Because certain existent, but superficial, physical characteristics are deemed to constitute any given ‘it’ (eg computer, you, etc). That ‘it’ is then thought to remain in existence, albeit with changes occurring to it, until one or more of the defining characteristics is no longer manifest. However, this just depicts reality at a higher level than the actuality, though it could be correct, in itself, at that level.
• Your definition of actuality. Not mine or for example Feynman’s or Anderson’s or Schweber’s. Yes it is correct at that level. And as that level has causal powers, it is real (see my definition 2). Please look at your computer in order to confirm for yourself that higher levels have causal powers (unless you believe the computer came into existence without cause). Or maybe you claim the computer does not really exist? I can’t deal with that kind of obscurantism.
> Physically, that ‘it’ is a sequence of physically existent states. The higher level of differentiation just giving the appearance of less change than there physically is, and the illusion of a level of persistence to existence which does not physically occur.
• Your definition of existence. Not mine or your bank manager’s. This kind of statement reminds me strongly of some Eastern religions. The level of persistence is real, it is the basis of daily life. Assuming of course that daily life exists. If it does not, then physicists and physics experiments don’t exist.
> For physical reality to occur, and alter, there must ultimately be a physically existent state at any given point in time. This can be defined as the state of the properties of the elementary particles involved, and their spatial position, as at that point in time.
• This statement has not taken present day quantum theory on board. It is precisely at the particle level that reality is unclear, because of (i) the uncertainty principle, (ii) wave-particle duality, and (iii) entanglement. Many quantum field theorists claim there are no particles, only fields. And string theorists claim they are vibrations in superstrings. Is that “real”? On your reductionist view, it follows that the particles don’t exist either: they are “nothing but” excitations of strings (if we believe those exist). A present day view should at least take quantum physics into account, if not string theory (which is not a solid foundation, as it is not even well defined, let alone proven to be right).
> So there is a “fixed set of lower level entities”. And, by definition, any “event” could be tracked back to alterations at that level. That is, physically it must be ‘bottom up’.
• Well you seem not to have read the part of my essay where I carefully explain that there are often not fixed lower level entities: their nature, or indeed their existence, is dependent on their higher level context. In the case of string theory, the nature of particles depends on the string theory vacuum – a non-local higher context for their existence. Their properties are not invariant, they depend on this vacuum. You are using a billiard ball metaphor that does not apply to "fundamental reality", i.e. the lowest levels of existence we can understand.
> The point here is that while that is physically what occurs, we could never establish it in such detail, and would probably all go mad trying. So, we conceptualise up some levels. But we must maintain ontological/physical correctness about the direction of the process. What is ultimately causing alteration in the properties of elementary particles and hence a change in their spatial position is another issue.
• what process? conceptualisation? Actually we conceptualise down, on the basis of our physics experiments, from the level of daily life to the micro level. That’s the real direction of the process of physics theorising. As to “causing alterations” – the heart of causation - ultimately, it is top-down effects that decide what changes in lower level entities will take place, because they set the scene for the lower level actions. That context determines the specific outcomes that occur.
> Another point to bear in mind is that what exists is a present (ie that which was physically existent as at a given point in time). Previous existences have ceased (the past) as at that point in time. Successive existences (the future) do not exist.
• Agreed
> In other words, one does not affect the future, what happens is that a present occurs which is different to the one which would have otherwise done so.
• Strange phraseology but more or less in accord with my proposal of an Evolving Block Universe (EBU): see http://arxiv.org/abs/gr-qc/0605049
I did not have space to give all the references I would have liked to include. One that is great is Eddington’s book The Nature of the Physical World, which is far deeper than many more recent writings (see his earlier chapters for the relation between physical reality and the mathematical models that some people mistake for reality). I’ll put up Feynman’s writing on this theme of levels of reality in a separate post.
However you have not responded to my main challenge. How does the existence of computer programs relate to your concept of actuality? Do you claim
• They don’t exist? – then their outcomes, such as aircraft designed via computers, are uncaused and just appear magically
• They exist and are made up of elementary particles? – if so, what are these particles, and in what way do they constitute a computer program?
• They exist and are not made up of particles? – this of course is my position. Can’t see that either of the others makes sense.
George
Paul Reed replied on Jul. 18, 2012 @ 19:47 GMT
George
Rather than respond to all that as such, let me express it so:
Having eradicated all metaphysical possibilities, we have two knowns: 1 Physical existence is independent of sensory detection. 2 Physical existence involves alteration. This means physical existence is a sequence, and that can only occur one at a time, because the successor cannot occur unless the predecessor ceases. In other words, there is a definite physically existent state as at any given point in time (timing, a point in time, ie the unit of timing, that being the fastest rate of change in reality).
What was physically existent as at any given point in time is known as the present. Difference involves: 1) substance (ie what it was), 2) order (ie order of occurrence), 3) frequency (ie the rate at which differences occur). That is, the number of changes, irrespective of type, which occurred in any given sequence, compared to any other number that occurred meanwhile. The latter could be in any sequence (including the former), and either occurred concurrently, or otherwise. This is timing.
Now, this involves a vanishingly small degree of change and duration, but it must be so. Otherwise physical existence cannot occur. The key point here is that it reveals the falsity of attributing the concept of time to being a characteristic of a reality (ie a physically existent state). It is concerned with the difference between realities, not of a reality. Physically, there is alteration, and the timing system calibrates the rate at which change of any type occurs.
Paul
PS: I will have a look at your ref to time
Author George F. R. Ellis replied on Jul. 19, 2012 @ 04:34 GMT
Paul
I really don't want to go into the issue of time here; it is a separate issue from what I am focusing on in my essay. Nevertheless I'll respond this time:
* I agree with your first main paragraph, interpreted as regards the passage of time along world lines in spacetime
* In the third paragraph, you state "this involves a vanishingly small degree of change and duration, but it must be so." This is a physics assumption that may or may not be true. Many assume spacetime is quantised, in which case there is a minimum unit of time, and what you say is not true.
George
Paul Reed replied on Jul. 19, 2012 @ 05:37 GMT
George
My problem here is that the very way in which physical reality occurs, which is what I am really writing off, albeit generically, involves an implication for time, ie the lack of it therein. And an understanding as to what timing is reveals the same point. Anyway, there is no "passage of time", in the sense that time is 'something'. Physically, there is alteration, in a sequence, and one aspect of that is the rate at which that occurs, for which we can use timing to calibrate. But that concerns difference between realities, not a feature of a reality. Spacetime is an invalid model of physical reality. Time, or more precisely, timing, is extrinsic thereto.
That was not a "physics assumption". All we can know, and physical reality is what we can know of it, is that there is something independent of sensory systems, because they receive it (it being the result of a physical interaction between other pyhsically existent phenomena) and when such inputs are compared, difference is identifiable. A unit of time is, by definition, the fastest change to occur, because timing is rating change, per se.
Paul
Thomas Howard Ray wrote on Jul. 18, 2012 @ 12:18 GMT
George, this is just a wonderful piece of work, far worthier of more honor than any prize competition could bestow. I suppose that should come as no surprise -- if anyone were capable of distilling the history and dynamics of the entire universe into 10 pages, it would be you.
" ... life would not be possible without a well-established local arrow of time." So well put. And "Emergence of genuine complexity is characterised by a reversal of information flow from bottom up to top down." If you get a chance to vist my essay site, I would hope to convince you that both conditions are satisfied by topological orientability in a coordinate-free locally realistic model.
Again, thanks for this great paper.
Tom
George Ellis replied on Oct. 6, 2012 @ 05:23 GMT
Thanks Tom for this very positive comment, much appreciated.
george
Joe Fisher wrote on Jul. 18, 2012 @ 14:46 GMT
Dear Professor Ellis,
Your meticulously crystal clear cogent reasoned essay seems to me to be one of the more superior reads of the essays published at this website so far. That said, I think the Universe is the simplest structuring emerging. There is and ever will be only one Universe once, although it seems to be having three differing aspects, only one of which is seeing appearing. As I have thoughtfully pointed out in my essay, Sequence Consequence, identical snowflakes have never existed, so it is reasonable to assume that identical physical states cannot ever exist. I contend that when you pressed down the letter A on your computer keyboard, the A you produced on your computer screen was not only minutely different from all of the other A's on your computer screen, it would also have to be different than any other A that has ever appeared on any computer screen in the past, different than any A presently appearing on any computer screen located anywhere on earth, and also different than any A that will ever appear on any computer screen that will ever become operational in the future. There is only one of anything once in the one real Universe once. Each one of anything will seem to be having three aspects, only one of which one can see here and now.
Author George F. R. Ellis replied on Jul. 19, 2012 @ 04:49 GMT
Dear Joe
Thanks for that. Your comments on identity strike to the heart of what I say about multiple realisability. You are right, the keyboard letters are never exactly identical: yet the abstract letter "A" represented by them is still the letter "A" despite all the variations you mention.
It is also the letter "A" if
* you change font (Times New Roman to Helvetica)
* you change to bold or italic
* you change size of the font
* you change colour of the font
* you change the medium from light on a computer screen to ink on paper
One of the key problems in Artificial Intelligence is to assign all these different representations to the same abstract entity that they all represent. The way varied lower level representations of a higher level entity occur is characteristic of top-down causation: what matters is the equivalence class of all these representations, which is the characteristic of the higher level entity, not which particular representation has been chosen.
So all those different appearances represent the same thing. And our minds easily handle this and recognize the higher level abstract thing all these phenomena represent, whether it is the letter "A" or the plan for a jumbo jet airliner. Those higher level entities (such as the plan for the airliner) really exist as entities in their own right. Proof: jumbo jet airliners exist. They could not do so unless the abstract plan, with all its multiple representations, were real.
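A small sketch of the equivalence-class point (the Representation fields and the three example entries are hypothetical choices of mine, not taken from the essay): many distinct lower-level representations are all projected onto one higher-level abstract symbol, and it is membership of that class, not the particular rendering, that carries the higher-level meaning.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Representation:
    glyph: str      # the character as rendered, e.g. "A" or "a"
    font: str       # e.g. "Times New Roman", "Helvetica"
    style: str      # e.g. "regular", "bold", "italic"
    medium: str     # e.g. "screen", "paper"

def abstract_letter(rep: Representation) -> str:
    """Project a lower-level representation onto the higher-level entity it stands for."""
    return rep.glyph.upper()

reps = [
    Representation("A", "Times New Roman", "regular", "screen"),
    Representation("A", "Helvetica", "bold", "screen"),
    Representation("a", "Helvetica", "italic", "paper"),
]

# All three fall in the same equivalence class: the abstract letter "A".
print({abstract_letter(r) for r in reps})   # {'A'}
```

The higher-level entity is characterised by the class as a whole; swapping one member of the class for another changes nothing at that level.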
Paul Reed replied on Jul. 19, 2012 @ 05:54 GMT
Joe
Exactly, as I have said elsewhere to you, and this is my fundamental point. As at any given point in time (as in timing), there is a specific physically existent state. To discern it, we would have to identify the particular state of the properties, and the relative spatial position, of every elementary particle involved. An impossible task, but our inability to do that does not detract from the fact that that is what constitutes physical reality (aka the present) as at that point in time. Even your A is more than one of these physically existent states. Misconceptualising this leads to problems. Neither does sensory detection have any impact on that, because it occurred before it was sensed (something which the Copenhagen interpretation does not recognise).
Paul
Sridattadev wrote on Jul. 18, 2012 @ 15:26 GMT
Dear Professor Ellis,
We are in between the bottom up and top down approaches of information flow. The relativistic universe is in a constant flux of this information flow and it is in an infinite feedback loop. Conscience is both at the absolute center and outer periphery of the universe (holographic effect). We as individual beings are caught in between these two equivalent states (singularity) and perceive the relativistic universe as a virtual reality.
Please see the essay Conscience is the cosmological constant.
Love,
Sridattadev.
J. C. N. Smith wrote on Jul. 18, 2012 @ 18:22 GMT
Dear Professor Ellis,
You wrote, "The degree of complexity that can arise by bottom-up causation alone is strictly limited. Sand piles, the game of life, bird flocks, or any dynamics governed by a local rule . . . do not compare in complexity with a single cell or an animal body. The same is true in physics: spontaneously broken symmetry is powerful . . . but not as powerful as symmetry breaking that is guided top-down to create ordered structures (such as brains and computers). Some kind of coordination of effects is needed for such complexity to emerge"
While I'm aware of the stringent constraints on the length of our essays, your essay appears to cry out for some discussion of how your ideas square with the concept of Darwinian natural selection. We're familiar with the top-down influence which created computers, but what top-down influence created brains? I'd welcome your thoughts on these points.
Your essay recalled to my mind the following words of David Deutsch: ". . . everything that is not forbidden by laws of nature is achievable, given the right knowledge. . . . This is the cosmic significance of explanatory knowledge -- and hence of people, whom I shall henceforward define as entities that can create explanatory knowledge." ('The Beginning of Infinity', p. 56)
Thank you again for an excellent essay!
jcns
Author George F. R. Ellis replied on Jul. 18, 2012 @ 20:06 GMT
Dear jcns
Adaptive selection is one of the most important types of top-down causation. I did not have space to go into that aspect of things in the essay, but it is discussed in two papers accessible as follows:
On the nature of causation in complex systems,
Top down causation and emergence: some comments on mechanisms (http://www.mth.uct.ac.za/~ellis/Top_down_gfre.pdf)
Adaptive selection is top-down because the selection criteria are at a different level than the objects being selected: in causal terms, they represent a higher level of causation. Darwinian selection is the special case when one has repeated adaptive selection with heredity and variation. It is top-down because the result is crucially shaped by the environment [as demonstrated by numerous experiments: e.g. a polar bear is white because the polar environment is white].
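A minimal sketch of this idea, under purely illustrative assumptions (the one-dimensional trait, the single environmental target value, and the population sizes below are all hypothetical): the selection criterion lives at the level of the environment, and it is that higher-level context which decides which lower-level variants persist.

```python
import random

def select_generation(population, environment, keep=50):
    """Top-down step: the environment (higher-level context) scores lower-level
    variants and decides which survive; the criterion lives at the higher level."""
    scored = sorted(population, key=lambda trait: abs(trait - environment))
    return scored[:keep]

def vary(population, spread=0.1):
    """Bottom-up step: heredity with random variation among the survivors."""
    return [t + random.gauss(0.0, spread) for t in population for _ in (0, 1)]

random.seed(2)
environment = 0.9                                     # e.g. "whiteness" favoured by a polar environment
population = [random.random() for _ in range(100)]    # initial lower-level variants

for generation in range(20):
    population = vary(select_generation(population, environment))

print("mean trait after selection:", round(sum(population) / len(population), 2))  # near 0.9
```

Heredity and variation happen bottom-up, but the outcome, a population clustered near the environmental value, is fixed by the higher-level criterion, which is the sense in which the result is shaped top-down.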
However adaptive selection occurs far more widely than that; e.g. it occurs in state vector preparation, as I indicate in the essay.
Hope that clarifies this.
George
Author George F. R. Ellis replied on Jul. 18, 2012 @ 20:09 GMT
J. C. N. Smith replied on Jul. 18, 2012 @ 21:01 GMT
George,
Thank you very much for the references. I'll take a close look at your paper 'Top-down causation and emergence: some comments on mechanisms' as well as your paper 'Physics in the Real Universe: Time and Spacetime.' It's my preliminary sense that we share more than a few ideas in common about the nature of time. More later.
jcns
Avtar Singh wrote on Jul. 18, 2012 @ 21:56 GMT
Dear George:
Excellent paper, clearly written to provide a wholesome perspective of reality as provided by top-down causation. In other words, the sum of parts is not the Whole, which could be more than and different from the linear sum of parts.
The theme of your paper is vindicated by the fact that a top-down causation model with simple boundary conditions is shown to predict the observed expansion of the universe and galaxies without any bottom-up causation as used in the standard model of particle physics. As described in my posted paper, "From Absurd to Elegant Universe", the current paradoxes, singularities, and inconsistencies in the standard cosmology are shown to be artifacts of the absence of the top-down wholesome approach. The proposed Relativistic Universe Expansion (RUE) model, based on the top-down conservation of the relativistic mass-energy-space-time continuum, accurately predicts the observed accelerated expansion of the universe, dark energy or the cosmological constant, and galactic star velocities without the concept of dark matter. It also predicts the dilation and creation of mass without any anti-matter, and eliminates the black hole singularity without the need for any super luminous inflation. The model also explains/predicts the inner workings of quantum mechanics and resolves paradoxes of the measurement problem, quantum gravity and time, and inconsistencies with relativity theory.
The evidence presented in my paper directly and mechanistically vindicates the following statements regarding the top-down causation in your paper:
“ …the foundational assumption that all causation is bottom up is wrong, even in the case of physics.”
“The key feature is that the higher level dynamics is effectively decoupled from lower level laws and details of the lower level variables:……you don’t have to know those details in order to predict the higher level behavior.”
My paper also proves as true the following concluding statement in your paper:
“…. recognizing this feature will make it easier to comprehend the physical effects underlying emergence of genuine complexity, and may lead to useful new developments, particularly to do with the foundational nature of quantum theory. It is a key missing element in current physics.”
I am delighted to read your paper as it mirrors the overall theme and results of my paper. I would greatly appreciate and welcome your comments on my paper.
Sincerely,
Avtar Singh
Avtar Singh replied on Jul. 18, 2012 @ 22:16 GMT
Here is the link to my paper:
Please visit From Absurd to Elegant Universe, or http://fqxi.org/community/forum/topic/1317
Georgina Parry wrote on Jul. 18, 2012 @ 21:57 GMT
Dear George Ellis,
I found your essay very comprehensible, succinct and eloquent, as others have also found. It was enjoyable to read. The subject matter is interesting to me. I was also interested to see, in your comments, that you have written another essay in which natural selection is discussed. I have just touched on this kind of pattern control at the end of my essay and if there had been space I would have liked to have discussed it further. So it was really good to see your essay here because you have done a really thorough and clear job of getting the very important concept across.
One complaint often given by intelligent design supporters is that complex forms or functions cannot arise by random chance. I think we are both saying that the outcome is not chance but a consequence of the organisation that already exists -and- the rules of physics and biology. Your Photoshop example made me think of how an egg shell is formed. Calcium carbonate from ground up oyster shell or cuttlefish bone may be input to a bird (organised structure) and a beautifully formed eggshell is output. That egg form would not occur without the complex bird organism. It is a product of the organisation and rules, not just self-assembly of atoms.
I really like that you have considered this over many different scales from the smallest to the largest. There seems to be organisation at whatever scale is investigated, and I think we agree that to concentrate on the smallest scales, and to expect all of the answers to come from there, is "myopic". It is also really good that you have explained your work in this discussion thread. I have found your comments helpful and think your full participation and patience is admirable.
You are sure to have many more appreciative readers. Good luck in the competition.
Author George F. R. Ellis replied on Jul. 19, 2012 @ 04:58 GMT
Dear Georgina Parry
many thanks for that. I like your eggshell example - yes it is a nice illustration. For an in depth discussion of top-down causation in developmental biology, the book by Gilbert and Epel ("Ecological developmental biology") is excellent.
The key point about adaptive selection (once-off or repeated) is that it lets us locally go against the flow of entropy, and this lets us build up useful information. In this regard, I can't resist the following comment: it is often said that you can't unscramble an egg. Yes you can. How? By feeding the omelette to a chicken! (You get less egg than you started with: that's the Second Law in operation.)
Good luck to you too.
Author George F. R. Ellis replied on Jul. 19, 2012 @ 05:00 GMT
Typo: you can't unscramble an omelette
J. C. N. Smith wrote on Jul. 19, 2012 @ 14:58 GMT
Dear George,
The paper to which you referred me, 'Top down causation and emergence: some comments on mechanisms,' did indeed help to answer my earlier question about squaring your ideas with natural selection. Thank you.
I'd like to comment on the point you made in your example illustrated by the question: "Why is an aircraft flying?" You wrote, "And why was it designed to fly? Because it will potentially make a profit for the manufacturers and the airline company! Without the prospect of that profit, it would not exist. This is the topmost cause for its existence."
I question whether there may be an even higher level cause: some human somewhere along the line posed the question "If birds can fly, why can't I?" And then our fellow humans refused to stop seeking until they found a satisfactory answer. Human curiosity about the way things work.
We might ask why all these essays have been written and submitted to the FQXi essay competition. Was it primarily because all these authors hope to win some easy money? I suspect not. More likely it is because they all have thought about the workings of the universe and have developed their own ideas and explanations that they believe are sensible, and they seek to share their ideas with similarly thoughtful people and, hopefully, perhaps to receive validation in the form of recognition and appreciation, regardless of any potential monetary reward.
Is it possible that human curiosity and creativity and eagerness for constructive collaboration are among the top of the topmost causes?
jcns
John Merryman replied on Jul. 19, 2012 @ 16:33 GMT
jcns,
With George's point about planes existing because they make a profit for airlines and manufacturers, it is a top down logic of careful analysis of the situation and how it might be incrementally expanded. With your observation about flight being a consequence of human curiosity, it leans more toward a bottom up evolutionary striving, where all possible options get tried and those which succeed are the most repeated. Obviously there is no clear line between the two, but a constant feedback between experimentation and planning.
One might define the basis or bottom, as simple, while the elevated state is simply more complex, rather than "higher." So that initial question, "If birds can fly, why can't I?" is not so much a higher cause, but a more elemental cause. George's top down position is rather a vantage point from where one might plan on how to push even further up.
John Merryman replied on Jul. 19, 2012 @ 19:17 GMT
It should be noted that complexity tends to multiply, until it becomes unstable, which is where our banking system currently is. The reason for this particular exponential complexity has been the advantage it provides those managing banking to drain resources from the rest of the economy. Obviously this is not to the benefit of society, or even the long term health of banking, which is built on trust, so the question it brings up is as to whether there is such a thing as a "top," from which one might look down, or is that always just a completely subjective point of reference?
J. C. N. Smith replied on Jul. 20, 2012 @ 13:06 GMT
John,
You have raised some interesting points. Rather than comment directly on them myself I'd prefer to get George's own views; he clearly has given this topic far more thought than I have, and probably far more than both of us combined.
It certainly is a fascinating topic. David Deutsch has offered what strikes me as a classic comment on the topic in his book 'The Fabric of Reality,' as follows:
"For example, consider
one particular copper atom at the tip of the nose of the statue
of Sir Winston Churchill that stands in Parliament Square in
London. Let me try to explain why that copper atom is there. It
is because Churchill served as prime minister in the House of
Commons nearby; and because his ideas and leadership contributed
to the Allied victory in the Second World War; and because it is
customary to honor such people by putting up statues of them;
and because bronze, a traditional material for such statues,
contains copper, an so on. Thus we explain a low-level physical
observation-- the presence of a copper atom at a particular
location-- through extremely high-level theories about emergent
phenomena such as ideas, leadership, war and tradition.
There is no reason why there should exist, even in
principle, any lower-level explanation of the presence of
that copper atom than the one I have just given."
jcns
Paul Reed replied on Jul. 20, 2012 @ 16:05 GMT
JCN
But this is not a physical explanation as to why it is there, is it?
Paul
John Merryman replied on Jul. 20, 2012 @ 19:05 GMT
jcns,
A lot of it could be described as wave action. To compress the analogy somewhat, the stokers on an old coal ship may not know, or at least it doesn't matter if they know, where the ship is going. Even the cells in your arm don't understand the higher-order function of you writing at the computer. Yet in some ways even those further up the conceptual food chain may be oblivious to higher-order intentions, right up to those seemingly at the top. To use the banking analogy, there is the personal motivation of making money among bankers, yet there is a higher-order function of circulating value within the economy. It is when the bankers focus primarily on their own intention of making money and lose sight of that higher-order function that the wave crests, or goes into a terminal bubble phase. This can run back down the scale too: if the cells in your arm stopped serving their higher-order function and only sought nutrition, your arm would cease to function, whether through exhaustion, disease, or the like.
We could take this analogy much further up the chain and suppose that life on this planet is trying to form a functioning central nervous system, with human civilization as its particular medium, the copper of the statue, so to speak. Yet that would presume some even higher-order purpose, such as seeding the universe, and then you are back down at the foundational functions of basic life and how it propagates, like fungi coming together to scatter spores. There are those endless feedback loops....
Author George F. R. Ellis replied on Jul. 20, 2012 @ 21:25 GMT
Dear jcns
I agree very much with your first posting: the topmost level in the human motivational system is purpose or meaning. One should carefully distinguish two things here. In his book "Affective Neuroscience", Jaak Panksepp identifies the SEEKING system as one of the genetically determined primary emotional systems; it drives us to search for understanding and meaning. The kinds of meaning we attribute to ourselves, to life, and to the universe are higher understandings that arise out of this emotional basis but are themselves of an intellectual nature; they are formulations of what is meaningful to us, embracing ethical and aesthetic issues and our purpose in relation to them. For many people this highest purpose is indeed money! But for many others it is the kind of higher ethical purpose you refer to. This is the highest level in our hierarchy of goals, for it is such ethical understandings that determine which lower-level goals are desirable or acceptable.
So you ask "Is it possible that human curiosity and creativity and eagerness for constructive collaboration are among the top of the topmost causes?" These are hugely important in terms of motivation, yes; but the issue of whether they trump economics or not is an ethical decision or stance (some people let the one rule their lives, others the other). So I'd place that highest.
The quote from Deutsch is great. I had not read it before. It is about how the ethical level (the level of Telos, or purpose) drives the rest in a top-down way. And of course this applies to scientists too, for the basic issue there is why do they do science? Why spend one's life on that pursuit? And on what kind of science (blue sky or applied)? It is one's purpose - a non-physical entity - that shapes it all.
George
Paul Reed replied on Jul. 21, 2012 @ 05:41 GMT
George/JCN/John
All the above may be correct from the perspective being taken (ie non-physical). But, at the physical level, there is no influence on the next state which will exist (commonly known as the future), because it does not exist. What happens is that a state occurs which is different from that which would otherwise have occurred. Nothing has been changed, physically, because nothing existed physically to change. In terms of sequence, what might be seen as 'oscillation', 'reaction', etc, is just re-occurrence. For example, A B C B D is not, physically, some form of 'return' to B, but a re-occurrence of B. And for those who would ask the next question - what if it was A B C B D D D E? - the answer is that D continued to exist for more than one point in time, a point in time being, by definition, the level at which the sequence is fully differentiable, because it is the fastest rate at which any change occurs, and D was a state which did not alter at that speed.
Paul
Lawrence B. Crowell wrote on Jul. 19, 2012 @ 17:25 GMT
What sort of role might Erdos-Renyi networks play here? The sort of nearest neighbor approach with probability weights is a neural model of sorts. These networks are the basis for percolation theory and mean field theory. When the number of connected nodes reaches some threshold the properties of the system can change. In the case of percolation theory this can lead to a rapid failure of a material.
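A minimal Python sketch of the threshold behaviour mentioned here may be useful (the network size and average degrees are arbitrary illustrative choices, not taken from the discussion): the largest connected component of an Erdős–Rényi random graph stays tiny while the average degree n·p is below 1 and grows abruptly once it passes 1.

import networkx as nx

n = 1000  # number of nodes (arbitrary illustrative choice)
for avg_degree in (0.5, 0.9, 1.0, 1.1, 1.5, 2.0, 4.0):
    p = avg_degree / n                       # connection probability in G(n, p)
    G = nx.erdos_renyi_graph(n, p, seed=1)   # Erdos-Renyi random graph
    giant = max(len(c) for c in nx.connected_components(G))
    print(f"average degree {avg_degree:3.1f}: largest component has {giant:4d} of {n} nodes")

Percolation-style failure of a material is the same threshold read in reverse: remove enough links and the giant component disintegrates.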
Author George F. R. Ellis wrote on Jul. 20, 2012 @ 21:55 GMT
Hi Lawrence
there are some similarities, because the brain is a structured network, but it is not like Erdos-Renyi networks: those are carefully constructed to be random, whereas the brain is not. Its connections embody the results of our interaction with the world, encoding our knowledge and learning. While the usual statistical approaches to networks are illuminating to some degree, they miss key issues such as identifying the structural motifs that enable brain circuits to function as they do. Uri Alon's writings are illuminating in this regard; see his book "Introduction to Systems Biology: Design Principles of Biological Circuits".
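For readers unfamiliar with Alon's "network motifs", a toy Python sketch may help (the little graph below is invented purely for illustration): it counts feed-forward loops, one of the motifs Alon highlights; in real analyses such counts are compared against randomised networks to show that the wiring is structured rather than random.

from itertools import permutations

# Toy directed graph, an edge (x, y) meaning "x regulates y"; invented for illustration.
edges = {("X", "Y"), ("X", "Z"), ("Y", "Z"), ("Z", "W"), ("Y", "W")}
nodes = {v for edge in edges for v in edge}

def feed_forward_loops(edges, nodes):
    # A feed-forward loop is a triple (x, y, z) with x->y, x->z and y->z.
    return sum(1 for x, y, z in permutations(nodes, 3)
               if (x, y) in edges and (x, z) in edges and (y, z) in edges)

print(feed_forward_loops(edges, nodes))  # 2 in this toy graph: (X, Y, Z) and (Y, Z, W)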
Lawrence B. Crowell replied on Jul. 21, 2012 @ 14:36 GMT
In thinking about this I wonder whether random networks subjected to some sort of selection process could evolve into the kind of form suggested here. This is, in a manner of speaking, a sort of Darwinian process. A random network might be compared to a white-noise system, but a sample of possible random networks subjected to a culling process plus some "survival criterion" would result in networks which are less random and whose output might be called pink noise.
Thanks for the reference.
Cheers LC
Daniel L Burnstein wrote on Jul. 20, 2012 @ 22:04 GMT
[corrected] Please ignore earlier reply.
@ Prof. Ellis
As a follow-up to our exchange.
" a very old dream, and one that is probably unattainable both because of Godel's theorem"
Gödel's incompleteness theorems are often invoked as an argument against the possibility of a complete and consistent axiom set from which all interactions at all scales of physical reality can be derived. The problem is, the incompleteness theorems apply to the formulation of meta-mathematical statements about systems (principally arithmetic). But one has to remember that, aside from basic rules of composition, there are no constraints on the making of such meta-mathematical statements or theorems (nothing prevents false statements, or statements that cannot be derived from any given finite axiom set).
Physical reality, on the other hand, strictly constrains any phenomenon so that it must be consistent with the fundamental laws that govern forces and other interactions. So do Gödel's incompleteness theorems really preclude any possible answer to Hilbert's 6th problem?
If the Universe is made of a finite set of fundamental objects which combine, in accordance with a finite set of laws and through a finite number of fundamental interactions, to produce physical reality, then doesn't it follow that Gödel's first incompleteness theorem is, at least in its present form, wrong when applied to reality?
Also, if you believe that the fundamental components and laws are consistent and that the Universe is a coherent system, then shouldn't any physical interpretation of Gödel's second incompleteness theorem also be wrong?
"in what way do these axioms and theorems exist, and where do they exist? Are they Platonic forms for example?"
We need to distinguish the axioms from the fundamental aspects of reality they would stand for. And by definition, axioms cannot be proven; they are merely defined. Once that is done, the axiomatic system may be put to the test. If the axiom set and rule set that make up the axiomatic system are complete and consistent, then all interactions should be derivable from it. It should also yield falsifiable predictions.
" what decides the form they have? (there are various possible forms of logic: who chose this one?)"
That is the tricky part. Any choice must be made based on assumptions. There can be a number of viable axiomatic systems that may be used, but whatever the choice, it must be self-consistent, and all interactions must be either derivable from it or reducible to it.
"how do they have the power to create any physical entity whatever? Actually axiomatic systems are rather limited in their powers and in their ability to represent reality."
If the Universe is found to be both consistent and complete, that is, the fundamental objects and the laws that govern them are consistent (consistency) and all that they produce remains part of the Universe (completeness), then all physical processes are emergent. It can then be shown that it is possible to create an axiomatic system that represents the fundamental aspects of reality, and that representations of all interactions can be derived within such an axiomatic system. If the Universe is consistent and complete, then wouldn't it follow that an axiomatic system powerful enough to describe it can be devised?
Though it is a work in progress, I believe that I have shown that a hypothetical universe comparable to our Universe in complexity can emerge from a simple axiomatic set. My essay, titled "Questioning the Assumption that Space is Continuous", shows one way that can be done (my essay is based on a larger work, part of which is freely available).
Author George F. R. Ellis replied on Jul. 21, 2012 @ 12:43 GMT
I don't want to enter into the territory of Gödel's theorem; Roger Penrose is the person to talk to about that. Rather I'll just comment on one statement:
"If the Universe is found to be both consistent and complete, that is, the fundamental objects and the laws that govern them are consistent (consistency) and all that they produce remains part of the Universe (completeness), then all physical processes are emergent."
This is a non sequitur. "Completeness" above means that you can't get out of the universe by application of physical laws. It does not guarantee that everything in the universe can be attained in this way. Some physical processes are not emergent but are entailed in a top-down way. For example, there is no process by which the computer memory states embodying a Quicksort algorithm can emerge from the underlying physics acting in a purely bottom-up way. Indeed the same is true of the processes leading to the creation of a teacup or a pair of spectacles (see my Nature article "Physics, complexity and causality", 435, 743 (2005)). If you believe this is wrong, please advise me of a physical law or process that unambiguously determines how a teacup can be created in a purely bottom-up way.
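For concreteness, a textbook Quicksort is sketched below in Python (the example list is arbitrary; this is only an illustration of the algorithm named above). The point of the argument is that this abstract specification, not anything in the underlying physics, is what selects the particular memory states that embody it.

def quicksort(xs):
    # Standard recursive Quicksort: partition around a pivot, then sort the parts.
    if len(xs) <= 1:
        return xs
    pivot, rest = xs[0], xs[1:]
    smaller = [x for x in rest if x < pivot]
    larger = [x for x in rest if x >= pivot]
    return quicksort(smaller) + [pivot] + quicksort(larger)

print(quicksort([5, 3, 8, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]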
Member Hector Zenil replied on Sep. 23, 2012 @ 21:17 GMT
Dear George,
I couldn't add a new post at the end of the discussion page, so I am trying in this thread, which seems somehow related.
How robust is the hierarchy depicted in your Table 2 for a digital computer system in the light of Turing universality? From Turing universality we know that for a computation S with input i we can always write another computer program S' with empty input computing the same function as S does for i. Also one can always decompose a computation S into S and i, so data and software are not of essential (ontological?) different nature. I also wonder whether statistical mechanics is itself an acknowledgement that the view you are arguing against is not the general assumption in the practice of science.
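Hector's point about folding an input into a program (in effect the s-m-n theorem) can be made concrete with a minimal Python sketch; the function names and the example computation below are invented purely for illustration.

def S(i):
    # Some computation S applied to an input i (an arbitrary example function).
    return sum(range(i))

def specialise(program, i):
    # Fold the input into the program: return a zero-input program S_i
    # that computes the same value as program(i).
    def S_i():
        return program(i)
    return S_i

S_42 = specialise(S, 42)
assert S_42() == S(42)   # same function value, but no input is needed
print(S_42())

Whether the 42 is regarded as data or as part of the software is then a matter of how the computation is described, which is the force of the question about the hierarchy in Table 2.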
Thomas Howard Ray replied on Sep. 24, 2012 @ 11:57 GMT
Hector,
I hope I don't seem too presumptuous or rude for jumping in here -- I am also interested in George's reply, and your question is important to me as well. You write, " ... one can always decompose a computation S into S and i, so data and software are not of essential (ontological?) different nature ..." which I think gets down to the "murky" level 1 of Table 2, and the quantum measurement problem.
George's hypothesis is that "Emergence of genuine complexity is characterised by a reversal of information flow from bottom up to top down." This agrees with complex systems research that I am aware of, accounting for multi-scale variety, bounded rationality and lateral distribution of information in the complex network. As Ellis says, "Some kind of coordination of effects is needed for such complexity to emerge ..." Bar-Yam says, "In considering the requirements of multi-scale variety more generally, we can state that for a system to be effective, it must be able to coordinate the right number of components to serve each task, while allowing the independence of other sets of components to perform their respective tasks without binding the actions of one such set to another." [Bar-Yam, Y. (2004). "Multiscale Variety in Complex Systems." Complexity vol 9, no 4, pp 37-45]
So the level of a Turing machine randomly recording finite states is continuous with the system opportunistically coordinating those independent states for specific task performance consistent with evolutionary advantages (which implies Ellis's contention that there is no life without a local arrow of time, but that is a whole other discussion!). As George writes, "spontaneously broken symmetry is powerful, but not as powerful as symmetry breaking that is guided top-down to create ordered structures (such as brains and computers)."
Not to be too self-promoting, but my own essay ("The Perfect First Question") shows why broken symmetry at the most fundamental level of binary decision making (Wheeler's "it from bit") transforms perfect randomness into perfectly determined states, such that the local continuous measurement function (including quantum correlations) is always oriented in one of two directions. That supports both Ellis's arrow of time and complex-system coordinating effects at multiple scales.
Can the quantum measurement problem be simply an artifact of computation, only tangentially related to physics?
Best,
Tom
Avtar Singh wrote on Jul. 20, 2012 @ 23:50 GMT
Dear George:
Following up on my earlier comments, there are additional physical and mechanistic arguments that support your statement –
“ …the foundational assumption that all causation is bottom up is wrong, even in the case of physics……… The key feature is that the higher level dynamics is effectively decoupled from lower level laws and details of the lower level variables:……you don’t have to know those details in order to predict the higher level behavior.”
Bottom-up causation is used in standard cosmology to describe universal reality based on the quantum reality exhibited by individual particles/fields. However, quantum reality represents only a partial reality, due to the quantum measurement problem. A classical measuring instrument interprets quantum phenomena (V~C) from a Newtonian (V~0) frame of reference; hence quantum reality represents a truncated partial reality, resulting in the observed weirdness and inconsistencies. The quantum phenomenon being observed occurs in a dilated space-time due to V~C, while the Newtonian space-time frame of reference of the observer remains fixed and undilated due to V~0. In order to describe the non-truncated, wholesome universal reality, a top-down causation approach such as the one described in my paper "From Absurd to Elegant Universe" is essential, one that satisfies the laws of conservation of mass, energy, space-time, and momentum via proper inclusion of relativistic effects. Paradoxes of quantum measurement and quantum reality (entanglement, tunneling, multiverses, multi-dimensions, anti-matter etc.) are artifacts of the quantum observational limitations imposed by the fixed space-time and of not satisfying the overall conservation laws at the universe level. The top-down causation approach allows universal connectivity and non-locality via governing eternal and omnipresent conservation laws throughout the universe, which are missed due to the consideration of only local or discrete realities of particles/fields in fixed space-time in the bottom-up approach.
And yes, bodies and brains can be created as well as annihilated by top-down causation. As explained and described in my paper, both the creation and dilation of matter are predicted by the top-down causation model without the need for any nucleosynthesis or anti-matter concepts used in the standard model.
The seriousness of the impact of top-down causation should not be underestimated. Without the top-down causation consideration, it is impossible to determine the universal wrongness or correctness of any assumption or theory. From bottom-up causation, only a partial or local, and not universal, appropriateness of any assumption/theory can be estimated. For this very reason, the widely successful bottom-up quantum theory at the worldly level fails to predict 96% (dark matter and dark energy) of the wholesome universe.
Sincerely,
Avtar Singh
Author George F. R. Ellis replied on Jul. 25, 2012 @ 18:31 GMT
Dear Avtar
I am glad we agree on top down causation.
George Ellis
Author George F. R. Ellis wrote on Jul. 21, 2012 @ 12:24 GMT
Some while ago I said I'd post a Feynman quotation which is illuminating as to his view on which level, if any, is fundamental. Here it is:
In his book "The character of physical law" , on pp.124-125,
Richard Feynman summarises the hierarchy of structure, starting with
the fundamental laws of physics and their application to protons,
neutrons, and electrons, going on to atoms and heat, and including
waves, storms, stars, as well as frogs and concepts like `man',
`history, `political expediency', `evil', `beauty', and `hope'. He
then says the following (pp. 125-126):
"Which end is nearer to God, if I may use a religious metaphor.
Beauty and hope, or the fundamental laws? I think that the right
way, of course, is to say that what we have to look at is the whole
structural interconnection of the thing; and that all the sciences,
and not just the sciences but all the efforts of intellectual kinds,
are an endeavour to see the connections of the hierarchies, to
connect beauty to history, to connect history to man's psychology,
man's psychology to the working of the brain, the brain to the
neural impulse, the neural impulse to chemistry, and so forth, up
and down, both ways. And today we cannot, and it is no use making
believe we can, draw carefully a line all the way from one end of
this thing to the other, because we have only just begun to see that
there is this relative hierarchy."
"And I do not think either end is nearer to God. To stand at either
end, and to walk off that end of the pier only, hoping that out in
that direction is the complete understanding, is a mistake. And to
stand with evil and beauty and hope, or with fundamental laws,
hoping that way to get a deep understanding of the whole world, with
that aspect alone, is a mistake. It is not sensible for the ones who
specialize at one end, and the ones who specialize at the other, to
have such disregard for each other ... The great mass of workers in
between, connecting one step to another, are improving all the time
our understanding of the world, both from working at the ends and
from working in the middle, and in that way we are gradually
understanding this tremendous world of interconnecting hierarchies."
J. C. N. Smith replied on Jul. 21, 2012 @ 13:57 GMT
Dear George,
Thank you for that Feynman quote; it nails the topic perfectly, in his inimitable way.
On the theory that one good quote deserves another, here's another of my favorites by David Deutsch, this one from his book 'The Beginning of Infinity' (p.75).
"Like an explosive awaiting a spark, unimaginably numerous environments in the universe are waiting out there, for aeons on end, doing nothing at all or blindly generating evidence and storing it up or pouring it out into space. Almost any of them would, if the right knowledge ever reached it, instantly burst into a radically different type of physical activity: intense knowledge-creation, displaying all the various kinds of complexity, universality and reach that are inherent in the laws of nature, and transforming that environment from what is typical today into what could become typical in the future. If we want to, we could be that spark."
jcns
Edwin Eugene Klingman wrote on Jul. 21, 2012 @ 20:28 GMT
Dear George Ellis,
A very interesting read. In a reply to Paul you note that "reality is unclear" at the particle level because of uncertainty, wave-particle duality, and entanglement. In this sense any change in understanding of these aspects of reality might be expected to have some effect on the conception of 'the bottom' (although equivalence classes might not change). For this reason I invite you to read and hopefully comment on my current essay,
The Nature of the Wave Function.
At the other end of the spectrum, in a comment to jcns, you bring 'meaning and purpose' into the picture. This brings up the question, "where is the top?". Do you make an assumption here, or is the top an open ended concept?
I agree that most discussions of emergence do NOT treat 'top down causation' and you are to be commended for doing so.
Edwin Eugene Klingman
Paul Reed replied on Jul. 22, 2012 @ 05:08 GMT
"In a reply to Paul you note that "reality is unclear" at the particle level because of uncertainty, wave-particle duality, and entanglement"
Indeed, and I have responded with the point that this cannot be so, otherwise there would be no physical existence, which there is, and no alteration to it, which there is. Whatever reality 'ultimately' is, which we can never know because we too are part of it, what we certainly do know is that there is 'something out there' ('out' being extrinsic to sensory detection systems) and that it alters. The whole process of sensory detection (ie seeing, hearing, etc) involves the physical receipt of physically existent phenomena (eg light, noise, vibration), which are themselves the result of an interaction between other physically existent phenomena (one of which we tend to label the reality). That is the fundamental physics.
So physical reality obviously occurs in a specific physically existent state. It does not exist in some "unclear" manner. The issue is our inability to identify that state. The sensory systems evolved to ensure the survival of organisms, not the sensing of the very constitution of reality ('the bottom'). The Copenhagen interpretation, and any other theory that assumes there is no 'bottom', or that sensing affects the 'bottom', is invalid. In the latter case, it is sheer nonsense: not only do organisms not receive reality anyway, but when sensing, by definition, reality has already occurred for them to be able to sense it!
The question then becomes, having swept away metaphysical presumptions and invalid theories, what constitutes the 'bottom'? My definition, and I am perfectly happy with improvements thereto - just not the incorrect assertion that there is not one - is: "the physically existent state which occurs as at any given point in time, is a function of the particular state of the properties of the elementary particles involved, and their spatial position, as at that point in time"
Paul
Edwin Eugene Klingman replied on Jul. 22, 2012 @ 22:27 GMT
Paul,
You have repeated your beliefs on FQXi probably more often than any one else. My question was addressed to George Ellis, who has not flooded FQXi with his opinions, and whose thread this is.
Vladimir F. Tamari replied on Jul. 23, 2012 @ 02:25 GMT
Well said Edwin.
Paul has a gift for systematic analysis of statements by physicists. This could be put to very good use, for example in writing a monograph on how, historically, various physicists changed their own positions on subjects such as SR, the ether, time, etc. That would be really interesting.
Cheers to both of you
Vladimir
Paul Reed replied on Jul. 23, 2012 @ 04:56 GMT
Edwin
“You have repeated your beliefs on FQXi probably more often than any one else”
First, why this concept of “repeated”? In your post you picked up, and mentioned, a response to me. To which I responded, particularly since it is a fundamental point in this topic, but received no response.
Second, why the concept of “beliefs”? If what I write is belief, particularly childish ones, then as I said before, and as would have been a better response here, why don’t you point out, factually, where I am, obviously, wrong?
Paul
Paul Reed replied on Jul. 23, 2012 @ 05:08 GMT
Vladimir
You, and indeed many others (ie it is not a personal point) keep making statements about SR. I post, with evidence, that SR might not be what people think it is.
In particular, when relevant, I ask if people can please read my posts on my blog, ie 11/7 1933 & 13/7 11.24, which since I now have a blog, I took the opportunity to post. These are 8 pages of analysis of the subject. So, although it is not quite the subject of the monograph you suggest, it is a monograph that is more relevant. I could be wrong of course, but as yet, I am not even aware of anyone having read them. And incidentally, this is not my essay, so it is not as if I am making a point in order to market my essay.
Paul
Author George F. R. Ellis replied on Jul. 23, 2012 @ 14:55 GMT
Paul this is the last time I will reply to any of your extremely repetitive comments.
You say "physical reality obviously occurs in a specific physically existent state. It does not exist in some "unclear" manner. ... The Copenhagen interpretation, and any other theory that assumes there is no 'bottom', or that sensing affects the 'bottom', is invalid."
You seem not to understand either wave particle duality or entanglement. The way experiments are done does indeed affect the properties of the bottom-most particles we can access. Please spend a bit of time reading Feynman or any other good text on basic quantum physics. I do believe that Heisenberg, Bohr, and Feynman understood the physics considerably better than either I or you do.
You continue "The question then becomes, having swept away metaphysical presumptions and invalid theories, what constitutes the 'bottom'? My definition .. is: " the physically existent state which occurs as at any given point in time, is a function of the particular state of the properties of the elementary particles involved, and their spatial position, as at that point in time".
I repeat what I have already said to you: in quantum field theory, particles are not the fundamental entities: they are just excitations of fields. They don't have either definite positions or momenta, according to the uncertainty principle. Your Newtonian model of basic reality is 90 years out of date.
As for time, I have already agreed with your statement "What physically happens is that a different physically existent state subsequently occurs from that which would have otherwise occurred." True. The future does not exist now but it will exist later on. Yes.
I can't see the point of all the further argumentation about this. Whatever else it is about time that bugs you is unclear to me, and repeating it yet again won't help. Please don't repeat it again on this particular forum.
Author George F. R. Ellis replied on Jul. 23, 2012 @ 15:11 GMT
Dear Edwin Eugene Klingman
you ask "where is the top?" A very good question, and the answer depends on context:in the case of structure of the human brain, no top is currently identifiable: things near the top seem to be non-localised. In the case of cosmology, one ends up with philosophy because the largest physical scales are unobservable (there are observational horizons in the real universe) so you can say anything you want about it and no on will ever be able to make observations that contradict what you say. So the claims I make are local claims (for any pairs of related levels) and independent of any global claims about any topmost level.
This applies also to the bottom - there may or may not be a bottom-most level; if there is one we don't know what it is (it may be string/M theory, but then again it may not). Indeed all the levels we deal with in ordinary physics are effective levels, not fundamental;and this does not matter. This is just as well , else we could not do physics.
George Ellis
Anton W.M. Biermans wrote on Jul. 22, 2012 @ 07:34 GMT
George,
I'm afraid that you (and everybody else, for that matter) confuse causality with reason.
If we understand something only if we can explain it as the effect of some cause, and understand this cause only if we can explain it as the effect of a preceding cause, then this chain of cause-and-effect either goes on ad infinitum, or it ends at some primordial cause which, as it cannot be reduced to a preceding cause, cannot be understood by definition.
Causality therefore ultimately cannot explain anything. If, for example, you invent Higgs particles to explain the mass of other particles, then you'll eventually find that you need some other particle to explain the Higgs, a particle which in turn also has to be explained etcetera.
If you press the A key on your computer keyboard, then you don't cause the letter A to appear on your computer screen but just switch that letter on with the A key, just as when you press down the door handle you don't cause the door to open, but just open it. Similarly, if I let a glass fall out of my hand, then I don't cause it to break as it hits the floor; I just use gravity to smash the glass, so there's nothing causal in this action.
Though chaos theory is often thought to say that the antics of a moth at one place can cause a hurricane elsewhere, if an intermediary event can cancel the hurricane, then the moth's antics can only be a cause in retrospect, if the hurricane actually does happen, so they cannot be said to cause the hurricane at all. Though events certainly are related, they cannot always be understood in terms of cause and effect.
The flaw at the heart of Big Bang Cosmology is that in the concept of cosmic time (the time passed since the mythical bang) it states that the universe lives in a time continuum not of its own making, that it presumes the existence of an absolute clock, a clock we can use to determine what in an absolute sense precedes what.
This originates in our habit in physics to think about objects and phenomena as if looking at them from an imaginary vantage point outside the universe, as if it is legitimate scientifically to look over God's shoulders at His creation, so to say.
However, a universe which creates itself out of nothing, without any outside interference does not live in a time continuum of its own making but contains and produces all time within: in such universe there is no clock we can use to determine what precedes what in an absolute sense, what is cause of what.
For a discussion why big bang cosmology describes a fictitious universe, see my essay 'Einstein's Error.'
Anton
Paul Reed replied on Jul. 22, 2012 @ 10:35 GMT
Anton
I am not going to make a judgement on the validity of your general point, but will alight on “then this chain of cause-and-effect either goes on ad infinitum, or it ends at some primordial cause which, as it cannot be reduced to a preceding cause, cannot be understood by definition”.
Now, there are two issues here:
1 Cause must involve physically existent phenomena. In simple language, cause is not something which is somehow ‘separate’ from physical existence (and I am not implying you are saying that). So it is definitive and knowable, and must have correspondence with physically existent phenomena.
2 We are concerned with knowledge of reality, not reality itself. In other words, assuming a valid closed system can be identified (which it can - sensory detection in all organisms), then there is a 'limit/confine' within which all is, potentially, knowable (only practical problems in the sensory detection process prevent this from being so, not metaphysical considerations). There is a valid limit to the knowledge that is potentially available to us. The confusion is in not understanding that we are ultimately dealing with knowledge of the actuality, not the actuality.
Paul
Anton W.M. Biermans replied on Jul. 23, 2012 @ 01:41 GMT
Paul,
What is or happens within a perfectly closed system has no physical reality to someone outside of it: it does not belong to his universe, is unobservable so he cannot say anything about it. The same goes for the second law of thermodynamics: if a system is perfectly closed, that is, if there's no physical communication possible with what's inside of it, then it doesn't even make sense to ask how much entropy it contains. As in a self-creating universe the observation interaction affects the observed, there is no absolute, i.e., objectively observable reality at the origin of our observations. It isn't that our observation is imperfect; the point is that in a universe where particles create one another, their properties are as much the effect as the cause of their interactions so the observation interaction unavoidably affects the nature of the thing to be observed. Here we cannot really distinguish between the properties of a fundamental particle and their expression. In such universe there is no reality separate from its observation, though imperfect observational equipment or methods of course blur observations.
Anton
Paul Reed replied on Jul. 23, 2012 @ 05:27 GMT
Anton
“What is or happens within a perfectly closed system has no physical reality to someone outside of it”
Exactly. And we are part of physical reality; we cannot extricate ourselves from it. We can only know that which is 'outside', ie independent of, sensory detection. Which means we have a validated closed system. That determines the 'boundary' between scientific knowledge and belief. Bearing in mind that we have to hypothecate to overcome known practical problems in the physics of the sensory detection process, but must reference this back to validated direct experience, ie avoid belief when doing this.
“As in a self-creating universe…” But this is not the reality of which we are a part. When we sense something, we are receiving a physically existent phenomenon, ie it exists independently of our sensing of it.
Paul
Author George F. R. Ellis replied on Jul. 23, 2012 @ 15:22 GMT
Anton
"Causality therefore ultimately cannot explain anything." If so please explain to me how you go about your daily life. If you are unable to cause any changes about you in your daily existence, then you don't exist as a person (and you certainly won't be able to get a job).
I explained carefully at the start of my paper that there are always numerous causes in action, and we get a useful concept of "the cause" by taking all except a few for granted. This produces a valid local theory of causation. You don't have to solve problems of ultimate causation to understand local physical effects (e.g. heating water causes it to boil). Your complaint seems to be that if you can't explain the entire universe you can't explain such local phenomena. The whole practice of science disagrees with you.
George
Georgina Parry replied on Jul. 24, 2012 @ 22:04 GMT
Dear George,
I agree with your point. It is also necessary to beware of linking unrelated events and so inventing a causation story when it didn't happen that way. It can happen in science (including social science) that correlation is mistaken for causation. It probably occurs more often than is realised. Particularly important and well known is the placebo effect, where getting better may have nothing to do with the treatment given.
I read an interesting item a long time ago about a study into the effect of high-fat diets on rabbits. Rather than getting more unwell, the badly fed rabbits remained healthy. Eventually it was found that the animal handler looking after the badly fed rabbits was giving them extra attention and fondling, presumably reducing the animals' stress levels and making them better able to handle the bad diet.
The book "Freakonomics" by Steven D. Levitt and Stephen J. Dubner has many amusing stories of how correlation might be mistaken for causation. Fascinating.
Author George F. R. Ellis replied on Jul. 25, 2012 @ 18:29 GMT
Dear Georgina,
yes indeed, in general separating out correlation from causation is very difficult, which is why I stated my definition of causation at the start of my essay in terms of the reliable consequences of an action. That makes clear what the initiating event is.
The placebo effect is fascinating; of course from my viewpoint it's a case of top-down action from beliefs (abstract entities) to physical systems (human bodies). Placebos are certainly effective, which is why drug treatments are compared to placebo treatments.
I like your story about the rabbits. The welfare of human children is similarly crucially affected by being given attention; it affects their bodily weight and even their survival.
Georgina Parry replied on Jul. 26, 2012 @ 03:35 GMT
Dear George,
re the placebo, I should have said: nothing to do with a pharmacological effect of the treatment, not nothing to do with the treatment. The treatment might include the lengthy and concerned consultation, diagnosis and prescription, which can make a person feel important, valued and cared about, and so affect neurotransmitter levels, their balance, and thus psychology.
The brain has executive control of the body, including maintenance of health. The function of the organs, tissues and biochemistry of the body can be affected by changes of activity within the very complex neural networks of the brain, which are in turn altered by changes in neurotransmitter availability and balance. It seems to me that the interaction of the complex external environment and social interaction upon the complex, organised brain and its interaction with the body causes the change in health, not the simple sugar pill and mere (abstract) belief in its power.
As you have pointed out, the effect is relevant to your top-down control concept - I think even more so than you have intimated. It is a good example of a specific effect (output) arising from complexity and organisation, not from very simple inputs. It can't be explained as the result of the simple "sugar pill" input.
Author George F. R. Ellis replied on Jul. 26, 2012 @ 06:58 GMT
Dear Georgina
nicely put.
George
T H Ray replied on Jul. 26, 2012 @ 09:30 GMT
The placebo effect and the rabbit story may also be explained in terms of hidden variables, may they not? Just as correlation is not necessarily causation, information is not necessarily knowledge.
At the end of the day, the questions we mean to answer -- Is treatment no better than a placebo? Does diet affect well being? -- depend on suppressing some information in order to validate contradictory information.
A self organized universe, on the other hand, does not demonstrably suppress information. It appears to be an uncontrolled experiment and infinitely creative. In terms of theory, then, a hierarchical structure (vice laterally distributed information) would seem to impose, a priori, a restriction on causality that leaves nature no choice. If conscious IGUS (information gathering and utilizing systems) are co-creative with self organized nature, however, their choices are co-variant with evolving nature and a hierarchical structure would seem to be superfluous.
Tom
Georgina Parry replied on Jul. 26, 2012 @ 11:47 GMT
Tom,
Information shouldn't be suppressed to add validity to other information. To be valid, a study needs as many parameters controlled as possible, and those that cannot be controlled should be pointed out. All of the rabbits had to be kept under identical conditions except for the dietary difference under consideration. As they were not, the study was invalid. If the rabbits had not been expected to become unwell, it is unlikely the error would have been spotted.
The result of such a study shows what happens to rabbits under those conditions, not what will happen to human beings living under very different and far more variable conditions. The answer to the questions "Is treatment no better than a placebo? Does diet affect well-being?" has to be: it depends. The answer is far more complex than many respectable scientific papers, and pseudo-science papers, would suggest, IMO.
Tom, you said "A self organized universe, on the other hand, does not demonstrably suppress information". I'm probably not thinking about information as you are, but if there is deposition or accumulation of material, then couldn't the information within be considered suppressed (even if not entirely hidden), because interaction happens mainly at surfaces, which is where the information is accessed? Surfaces are particularly important in biology.
I agree with George that some things can't happen if the organisation does not exist to give the output. Complex proteins such as enzymes are assembled within cells because there is the DNA code for them, the mRNA copies that are read and the ribosomes where the proteins are assembled. The complex enzyme proteins do not spontaneously assemble as an uncontrolled experiment.
I think I can understand what you mean about IGUS' being co-creative with nature and their choices co-varying as nature "evolves" but I don't understand what you mean about "the hierarchical structure" being superfluous in that case.
Thomas Howard Ray replied on Jul. 26, 2012 @ 12:19 GMT
Hi Georgina,
"Information shouldn't be suppressed to add validity (to) other information."
Ah, but that's what a controlled experiment does, in principle. There's no way to tell that one has controlled for all possible variables; we can only do so to a reasonable certainty. Compare a controlled experiment to a theoretical prediction, e.g., Einstein's correction to the precession of Mercury's orbit, or gravitational lensing.
"I think I can understand what you mean about IGUS' being co-creative with nature and their choices co-varying as nature 'evolves' but I don't understand what you mean about 'the hierarchical structure' being superfluous in that case."
I mean that any hierarchy (such as a human being, the corporation of cooperating cells in which the brain has executive control, as you put it) may be undermined by an unexpected event. In Darwinian evolution, this might be a random mutation. A mutation may be beneficial, harmful or neutral -- however, we cannot know which except in the larger context of perfect information, to which we do not have access. I was impressed many years ago, in reading Kurt Vonnegut's *Breakfast of Champions*, that the protagonist mused over a glass of champagne, wondering if the yeasts -- who consumed sugar and excreted alcohol until they drowned in their own excrement -- had any consciousness of what they were creating. (In the same respect, consider Edna St. Vincent Millay's lines: "My candle burns at both ends; it will not last the night; but ah, my foes, and oh, my friends -- it gives a lovely light!")
Tom
Thomas Howard Ray replied on Jul. 26, 2012 @ 12:23 GMT
I forgot to credit Murray Gell-Mann and Jim Hartle for coining the acronym IGUS.
Georgina Parry replied on Jul. 26, 2012 @ 13:26 GMT
Hi Tom ,
I take your point about having to simplify what is considered. My original point made to George was about wrongly attributing causal relationships. If only a few parameters are considered, a very simplistic, and just incorrect, view of what is occurring can be fabricated. Serious effort must be made to avoid that if the science is to be good.
OK, I think I see what you mean. Small change, big effect. I gave the example of dwarfism in my essay. Some other examples: an island population wiped out by a volcanic eruption; a molecular change to DNA in an egg giving a non-viable embryo. Yes, an individual or population can become superfluous to the universe as the result of a seemingly random event. I don't think being organised gives invulnerability, but it does allow some things to happen in the universe that could not otherwise occur. So it's not just top-down control, but not just bottom-up either.
T H Ray replied on Jul. 26, 2012 @ 13:51 GMT
Hi Georgina,
I think George has it right, that causality is top down. We just don't always know where the top originates, so we assign boundary conditions.
Given that limitation, I agree with you.
Given no boundary conditions, however, the top of the hierarchy is in "the mind of God." Getting inside that mind is what I mean by co-creation and covariance with nature. The means of such co-creation, and the covariant results it generates, are, in theoretical language, the abstractions of which George speaks -- the true "top" of the hierarchy -- independent of the hierarchical subsystems for which we prescribe boundary conditions.
Tom
Georgina Parry replied on Jul. 27, 2012 @ 00:45 GMT
Tom,
that isn't very clear to me. If you mean we explain things by chopping them up into smaller systems we can understand, then yes, I agree. Is that what you mean by assigning boundary conditions? Ultimately everything is at the mercy of what is happening in the whole universe, at all scales, it seems to me. If our star were to explode, the organisation of the living beings and machines on Earth would become totally irrelevant. If subatomic particles and atoms did not have the properties they have, the universe would not be what it is.
It isn't necessary to use galactic or universal explanations, or explanations involving complexity, or subatomic explanations for every occurrence. It is possible at various scales to see factors or processes that have a particular, disproportionate influence on output. For example, it is more helpful to describe the cause of a rate of photosynthesis as its current limiting factor (CO2, water, light), taking for granted that there is a plant in which photosynthesis can occur, and not bringing in the universal conditions that have allowed complex plant life to evolve on Earth or the properties of different chemical elements.
Maybe all causation stories are a distillation and piecing together of available information by humans for specific human purposes, not what is really happening in the universe independently of that information processing. I might be agreeing with you, but saying it in my own way.
Georgina Parry replied on Jul. 27, 2012 @ 03:02 GMT
Tom,
Does assigning boundary conditions mean specifying or at least deciding what will and will not be considered as part of the problem, the limit of the particular investigation? If so then I agree we do do that. Though we could have open ended investigations where gradually more and more parameters and variables are added to the boundary of the investigation, giving further complexity to the answer. Gaps could also be filled at different scales and linked, until theoretically everything is causally linked to everything else. Like a universal, physical and biological, ecosystem. Except we would hit lots of boundaries, which are not assigned by choice, but are the limits of our capabilities as a species.
T H Ray replied on Jul. 27, 2012 @ 11:28 GMT
Georgina,
"Does assigning boundary conditions mean specifying or at least deciding what will and will not be considered as part of the problem, the limit of the particular investigation?"
No. It means arbitrarily limiting a continuous function to an interval of analysis that does not result in infinities or singularities.
"If so then I agree we do do that. Though we could have open ended investigations where gradually more and more parameters and variables are added to the boundary of the investigation, giving further complexity to the answer."
That's the problem, though. One cannot extend boundaries with sufficient ad hoc parameters (adjustable variables) to assure a unique solution. Consider the various climate change models in this context -- there's a raft of parameters that produce a great number of predictions; however, the outcomes are all dependent on which variables are physically manifest. Information doesn't increase knowledge -- (the source of Einstein's aphorism "Imagination is more important than knowledge") -- knowledge is realized in the correspondence of theory (what we imagine) to physical result (what happens).
"Gaps could also be filled at different scales and linked, until theoretically everything is causally linked to everything else."
We already know that everything is causally linked to everything else -- at least, we assume so. If there are uncaused effects, the implication is that some force external to nature ("God of the gaps" argument) tinkers with creation at either random or unknown intervals. We can't treat such a case by scientific method, even if it should happen to be true. Science depends on replication of results in an objective way.
Suppose I have a pain in my stomach, and a physician determines that a signal from my brain is causing the sensation. She gives me a chemical to block the signal. No pain -- yet can we assign the cause of the pain to a random brain impulse? Probably not -- suppose an infection has created pressure on nerve endings in my gut -- the pain is a symptom, created by positive feedback between nerve endings and brain. The brain is not controlling the activity causing the pain; it is relaying information that compels the body to create a negative feedback loop, bringing into play a range of responses that release various chemicals and white cells to attack the infection and restore normal function. Any treatment intervention is only an extension of this negative feedback. If the body's responses and the treatment fail, an unbroken positive feedback loop leads to extinction of the organism.
Locally, positive feedback is always disagreeable. Consider the "squeal" of a positive feedback loop between a microphone and amplifier. We cannot identify the cause of the squeal (microphone or amplifier) though we know the effect is not uncaused.
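The contrast between runaway positive feedback and stabilising negative feedback can be made concrete with a toy Python iteration (the gains and step count below are arbitrary illustrative choices, not anything from the discussion):

def iterate(step, x0=1.0, steps=8):
    # Apply a feedback rule repeatedly to an initial disturbance x0.
    x, trace = x0, [round(x0, 3)]
    for _ in range(steps):
        x = step(x)
        trace.append(round(x, 3))
    return trace

positive = lambda x: x + 0.3 * x   # the response reinforces the disturbance
negative = lambda x: x - 0.5 * x   # the response removes part of the disturbance

print("positive feedback:", iterate(positive))   # grows without bound
print("negative feedback:", iterate(negative))   # settles back toward zero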
By top-down causation, I think George Ellis implies that any ultimate cause must result in negative feedback -- i.e., there exists a universal control mechanism that accounts for the coherence and comprehensibility of the universe as we experience it. (I suggested in my ICCS 2007 paper, "Time, change and self organization" that gravity itself qualifies as such a mechanism, because it operates in but one direction, toward the center of mass.) The subsystems of the universe (such as we creatures) therefore must be endowed a priori with the means of cognitively choosing a direction that escapes the positive feedback loop that leads to extinction. As George says explicitly, "... life would not be possible without a well-established local arrow of time." The property of consciousness cannot be separated from all of the properties of life itself, whether organic or inorganic.
Tom
Georgina Parry replied on Jul. 27, 2012 @ 14:02 GMT
Tom,
thank you for trying to help me understand. Was I talking about continuous functions?
You said:"information doesn't increase knowledge". I think data acquisition is one important aspect of science but utilising the data to give solutions or comprehension of what is going on is another different aspect. There is perhaps a rush to interpret data and give it meaning or significance it doesn't have- because that seems like its creating new knowledge.
I thought about giving an analogy of tuning a very out of tune piano, but its long and complicated. When its right though the whole thing, which could be 120 strings, works together. Thats what science should be like eventually.I think it is possible to build up understanding. Certainly food chains or other kinds of ecological webs can be constructed staring by looking at a few species interactions and niches and then adding more and more. It would be possible to have two different partial webs that have one of the same species in them, but are otherwise totally different. Both can be correct. I don't think that is a problem unless it is assumed that because these studies have been done scientifically each one is by itself the absolute truth.
Tom I think I'm agreeing with you and George. There is a control at the largest scale, passage of time (as J.C.N Smith and I have been describing it) necessary for anything else to happen but not able to create the complexity of the universe alone, there also has to be continual motion of matter and particles (and I say in my essay and elsewhere that that is the cause of gravity not curvature of space-time.) Yes I think George is talking a lot of sense but what you have quoted also seem to me really obvious things said eloquently. Which isn't a bad thing. Maybe they need saying and saying well.
Author George F. R. Ellis replied on Jul. 27, 2012 @ 19:24 GMT
Thank you both for this dialogue.
"But what you have quoted also seem to me really obvious things said eloquently."
- Yes it is all obvious, once you have seen it! But many have not seen it yet. They therefore do need saying.
Vladimir F. Tamari wrote on Jul. 23, 2012 @ 02:11 GMT
Dear George
(As a courtesy to Professor Ellis, and to keep discussions focused I hope other posters will not respond to this on his page unless he does.)
You make a convincing case for being wary of simplistic down-up causation. Your arguments make sense, but only in the context of present-day physics, which is far from being a harmonious conceptual whole where one theory applies both to the very large and the very small - please see my present FQXi essay Fix Physics! about that. I am convinced that if such a simple theory of everything were to be found, causation would be always local and linear at the smallest scales and the effects of large systems will be the resultant of effects transmitted locally and causally down to the local level and vice-versa simultaneously in a balanced way:
Think of a ripple tank where one 'simple local' point sends out waves to a 'complex system' consisting of points on a surrounding perimeter. Anywhere in the space between the 'local or down point' and those of a larger 'higher complex perimeter' system the resulting interference pattern will be caused by both systems simultaneously. Here is my tentative approach to such a ToE
Beautiful Universe Theory .
With best wishes from Vladimir
Author George F. R. Ellis replied on Jul. 23, 2012 @ 15:27 GMT
Vladimir you state
"I am conviced that if such a simple theory of everything were to be found, causation would be always local and linear at the smallest scales and the effects of large systems will be the resultant of effects transmitted locally and causally down to the local level and vice-versa simultaneously in a balanced way'
I agree with you completely. My more technical article on the way quantum theory works is based on precisely that premise. You will find it here.
George
Vladimir F. Tamari replied on Jul. 23, 2012 @ 16:46 GMT
Thanks George
I just downloaded your paper and it will take me some time to read. My first impression is that it differs from my Beautiful Universe paper in scope and intention (and the presence of non-linearity it appears) as you will see if you read it. Another difference is that mine is the work of someone who has waded in deeper waters than he was trained for!
Cheers
Vladimir
nmann wrote on Jul. 23, 2012 @ 15:59 GMT
Dr. Ellis,
"In addition to contemplating relativistic and philosophical aspects of cosmology, he is now engaged in trying to understand how complex systems such as you and me can arise out of the underlying physics."
Could this be made to work without a thorough simulation, down to the particle interaction level, of the underlying physics -- in this case, of condensed matter physics?
One assumes the higher-level effects ("epiphenomena" seems to be a word to avoid) emerge as interactive constraints upon the substrate physics. (Let's not get into the contentious issue of substrate independence, which isn't a required topic at this point.) Whether or not their emergence is inevitable (and why shouldn't it be?) we know they wouldn't exist, at any rate to begin with, in the absence of the underlying physics. And the more complex the higher-level emergent phenomena, the more you need to know about the operational physics in order to map the emergence ... or is that a fallacious assumption?
Anyway, what about the fermion minus-sign problem? And thanks for your provocative and knowledgeable essay.
Author George F. R. Ellis replied on Jul. 24, 2012 @ 05:59 GMT
You don't need simulation to establish effective laws at any particular level. For example you can establish the effective gas laws without using any simulations, through two routes: (a) experimentally, (b) by use of kinetic theory. The latter does not need to involve simulations, nor does it need quantum field theory, much less M-theory or string theory.
"One assumes the higher-level effects ... emerge as interactive constraints upon the substrate physics." I'd phrase it this way: the high level structure emerges somehow (it may be spontaneous, or may be manufactured, or may emerge through developmental processes) and then sets constraints on the underlying physics.
Yes, this emergence would not take place in the absence of the underlying physics. "The more complex the higher-level emergent phenomena, the more you need to know about the operational physics in order to map the emergence" - well, not really. Digital computers are really good examples: they are the most complex things we have built. You don't need to know anything about the underlying physics to design the computer itself; you just need to know that it establishes the possibility of the existence of transistors and hence of various kinds of gates. On that basis you can work out the logic of integrated circuits and make CPUs, memory banks, etc. Hence computer scientists are not taught quantum theory as part of their computer science courses. Someone else needs to know how the transistor works, but you don't need to: you can take the transistors as the bottom level, for your purposes. And it's crucial that we are able to do so, for as I've already said we don't know what the bottom level is: we'd be unable to do most of present-day physics if it were requisite that we understand the quantum gravity foundation layer first. The key question is, which is the operational level? It's the one that is convenient for you to choose as the lowest level in your particular analysis.
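To make that layering concrete, here is a minimal toy sketch (my own illustration in Python; the gate functions are invented for the example, not anything from the essay): once the NAND gate is taken as the effective bottom level, everything above it is defined purely in terms of the level below, with no reference to the physics that realises the gate.

# Take NAND as the effective "bottom level"; no semiconductor physics appears below this line.
def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

# Each higher level is defined only in terms of the level beneath it.
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b): return and_(or_(a, b), nand(a, b))

def half_adder(a, b):
    # One rung higher again: arithmetic built from gates, gates built from NAND.
    return xor_(a, b), and_(a, b)   # (sum, carry)

assert half_adder(1, 1) == (0, 1) and half_adder(1, 0) == (1, 0)

Swap in a different realisation of nand (relays, fluidics, molecules) and nothing above it changes: that is the sense in which the transistor level can be treated as the bottom.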
The fermion minus-sign problem is to do with quantum Monte Carlo simulations; a technique for trying to understand specific types of emergent systems. I cannot meaningfully comment on that technique and problem, except to say that I don't think it helps understand systems such as the brain or a computer.
nmann wrote on Jul. 24, 2012 @ 16:11 GMT
Thanks for the extended response. We'll simply have to disagree that manufactured artifacts are useful analogues of complex nonequilibrium systems -- complex natural processes -- to the extent I read you as believing them to be.
"Hence computer scientists are not taught quantum theory as part of their computer science courses."
Actually, that's not true in the case of quantum computation itself. And perhaps overly optimistic though it may seem, qcomp programming is taught on the theoretical level. You can't understand quantum algorithms without some fundamental knowledge of QM. Anyway, my own paradigm these days is the role of proton-coupled electron transfer (PCET) in photosynthesis. Now, to be sure, a plant doesn't need to understand anything at all about quantum tunneling in order to do its photosynthesizing thing, but if you're designing an artificial leaf (vide the Nocera team's ambitious project) you definitely do. Anyway you and I, a couple of complex systems, generate information no computer can, no matter how sophisticated its programming.
Author George F. R. Ellis replied on Jul. 25, 2012 @ 18:08 GMT
"We'll simply have to disagree that manufactured artifacts are useful analogues of complex nonequilibrium systems -- complex natural processes -- to the extent I read you as believing them to be." Well I don't want to overdo the analogy, but for me a key similarity is they both are hierarchically structured modular systems with information hiding. This kind of structuring is very nicely described by Grady Booch in his book on object oriented programming. The mechanisms used in these two kinds of systems are quite different, but some of the logic is similar. And of course you can get the digital system to simulate many aspects of the physical system to high accuracy, basically because digital computers are universal computers (Turing).
Yes of course I agree about quantum computing. I should have added the caveat "classical computing" in all above.
Proton-coupled electron transfer seems fascinating. I'd be really interested to know how it relates to quantum state vector reduction. And I agree with you about the limitations of computers (though many don't).
nmann wrote on Jul. 24, 2012 @ 18:28 GMT
Note. Anticipating a possible objection here. An artificial leaf, albeit a manufactured artifact, doesn't stand in relation to a real leaf as a digital computer does to a brain. An artificial leaf reproduces a known process selected from a real leaf's repertoire of physics, whereas a computer can't be demonstrated to reproduce any process selected from the physics or systemic functionality of the brain.
An artificial leaf copies to some degree a real leaf. The computer is sui generis, a physically realized TM, as anyone familiar with Turing's papers from the 1930s realizes. Is the brain a TM? A lot of people seem to think so, but haven't proven it.
Author George F. R. Ellis replied on Jul. 25, 2012 @ 18:19 GMT
"a computer can't be demonstrated to reproduce any process selected from the physics or systemic functionality of the brain." The physics functionality, I agree; the systemic functionality, perhaps not. The brain is an embodied brain certainly; but it also carries out pattern recognition and prediction processes that can be digitally simulated to some degree (indeed that's where Artificial Neural Nets came from). Furthermore computers can indeed learn to some degree, through adaptive programs such as genetic algorithms.
The brain is not a Turing Machine inter alia because emotion plays a key role in its functioning (see for example Damasio's writings). You can to some degree simulate those effects but I certainly don't believe you can reproduce them.
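On adaptive programs, a toy genetic-algorithm sketch may help fix ideas (purely illustrative, my own example): the selection criterion imposed at the top determines which low-level bit patterns survive, which is "learning" of a very limited kind.

import random

TARGET = [1, 0, 1, 1, 0, 0, 1, 0]        # the criterion imposed "from above"

def fitness(genome):
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.1):
    return [1 - g if random.random() < rate else g for g in genome]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(20)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]           # selection acts top-down on the variants
    population = survivors + [mutate(random.choice(survivors)) for _ in range(10)]

best = max(population, key=fitness)
print(best, fitness(best))                # typically reaches the target pattern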
Steve Dufourny replied on Jul. 28, 2012 @ 21:15 GMT
It becomes interesting. :) It was time to have concrete discussions.
Mr Ellis.
Artificial intelligence, that said, can be made if biology is inserted. The information can be encoded with a kind of sorting of that information. But of course it becomes intriguing. The emotions are indeed the result of a specific biological evolution. Can we reproduce these emotions? I think not, but we can create a process of evolution and sorting implying a kind of artificial intelligence. I wonder how the synaptic messages could be inserted? How does the diffusion of information work inside a closed system? The brain is fascinating. It is easy to insert parameters of movement. But this entanglement, correlated with the evolving human brain for example, is so complex. So I agree: we cannot reproduce emotions, but we can imply a kind of correlated behaviour, like a specific algorithm of selectivity over these behaviours. It is all intriguing.
The brain is more than a Turing machine; we are 13.7 to 15 billion years old! It is evident that biology is very complex in all its combinations. The brain and the DNA are wonderful creations.
Regards
nmann wrote on Jul. 25, 2012 @ 20:06 GMT
"Proton-coupled electron transfer seems fascinating. I'd be really interested to know how it relates to quantum state vector reduction."
This topic has never been mentioned in any paper I've seen. The question for me has been to what extent is PCET a coherent quantum phenomenon. You don't even need to believe in wave function collapse to wonder about that, as one is interested also in the whole emerging discussion of room-temperature and high-temperature quantum coherence. (See Vlatko Vedral on the ubiquity of entanglement. Seth Lloyd on entanglement as crucial to ordinary bivalent bonds. And when you visited the IQOQI you were doubtless regaled with an account of the super-hot double-slit buckyball experiment.) Nocera & Co. are adamant that PCET's a coherent quantum tunneling effect, and appear to have established this contention to the satisfaction of those sectors of the scientific community that have paid attention.
Here's a now-defunct webpage from Daniel Nocera's MIT site, undoubtedly not technical enough for you but okay for most kibbitzers, which a friend of mine downloaded and converted to pdf a while back. It lays out the basic stuff:
http://www.dancing-peasants.com/Proton-Coupled_Electron_Transfer.pdf
I can cite deeper material if desired.
nmann wrote on Jul. 25, 2012 @ 20:09 GMT
looked okay in the preview
http://www.dancing-peasants.com/Proton-Coupled_Electron_Transfer.pdf
Author George F. R. Ellis replied on Jul. 27, 2012 @ 19:17 GMT
Here it is I hope:
Proton-Coupled Electron Transfer. It's interesting, thanks. From my viewpoint, it's a really nice further illustration of how local context underlies real quantum detection events, as claimed in my paper
arXiv:1108.5261, because (1) it is indeed a detection event [a photon causes an electron to be released that then causes further reactions down the line] which (2) takes place because of the specific molecular structures R1 and R2 within which the electron is embedded. These are higher level structures, i.e. at a larger scale than the electron, that channel the electron's interactions by setting its local context: a form of top-down constraint. Thus it's another very nice illustration of top-down effects: what happens at the electron's level would not happen if the specific molecular structures were not there. At the quantum level, this must cause a collapse of the wave function, because specific classical events occur as discussed in this reference (it is striking that this paper uses essentially classical models, enhanced by the concept of tunnelling and the idea of the photo-electric effect: there's no wave function, for example).
Edwin Eugene Klingman wrote on Jul. 25, 2012 @ 20:23 GMT
Dear George Ellis,
In a comment above, you state, "Proton-coupled electron transfer seems fascinating. I'd be really interested to know how it relates to quantum state vector reduction."
I do invite you to read my current essay,
The Nature of the Wave Function, as I address the issue of "quantum state vector reduction" and I would very much appreciate your thoughts on my approach.
Edwin Eugene Klingman
Author George F. R. Ellis replied on Jul. 29, 2012 @ 10:31 GMT
Well I'm very puzzled. I wrote a response to your essay and thought I'd posted it over on your thread. Seems not to be there - I wonder what happened.
George
Author George F. R. Ellis wrote on Jul. 26, 2012 @ 06:42 GMT
Paul Davies and Sara Walker have put up two very useful papers on the internet that will interest those of you involved on the biology side. They are
Evolutionary Transitions and Top-Down Causation
and
The Algorithmic Origins of Life. Here is the abstract of the latter paper:
"Although it has been notoriously difficult to pin down precisely what it is that makes life so distinctive and remarkable, there is general agreement that its informational aspect is one key property, perhaps the key property. The unique informational narrative of living systems suggests that life may be characterized by context-dependent causal influences, and in particular, that top-down (or downward) causation -- where higher-levels influence and constrain the dynamics of lower-levels in organizational hierarchies - may be a major contributor to the hierarchal structure of living systems. Here we propose that the origin of life may correspond to a physical transition associated with a fundamental shift in causal structure. The origin of life may therefore be characterized by a transition from bottom-up to top-down causation, mediated by a reversal in the dominant direction of the flow of information from lower to higher levels of organization (bottom-up), to that from higher to lower levels of organization (top-down). Such a transition may be akin to a thermodynamic phase transition, with the crucial distinction that determining which phase (nonlife or life) a given system is in requires dynamical information and therefore can only be inferred by identifying causal relationships. We discuss one potential measure of such a transition, which is amenable to laboratory study, and how the proposed mechanism corresponds to the onset of the unique mode of (algorithmic) information processing characteristic of living systems."
nmann replied on Jul. 26, 2012 @ 16:00 GMT
Information is probably just as real and physical as Energy. Maybe Claude Shannon stands as something like the Sadi Carnot of Information. (Incidentally, one is always interested in a distinguished physicist's take on the Brukner-Zeilinger quantum revision of Shannon and also on Jan Kåhre's work.) But we don't yet have an Infodynamics on the order of Thermodynamics.
The Elephant in the Room throughout all of this, one suddenly realizes, has been Macrorealism. "Recognising Top-Down Causation" might be characterized as a defense of same, accomplished by means of conscripting Hierarchy Theory and subtly decoupling QM. It's a pretty good defense.
T H Ray replied on Jul. 27, 2012 @ 14:21 GMT
Hi nmann,
" ... we don't yet have an Infodynamics on the order of Thermodynamics."
Yes, we do. Shannon's information entropy is perfectly modeled by the same mathematics as thermodynamic entropy. Applied to a network of communication nodes, a dynamic system emerges.
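As a one-line illustration of that formal parallel (a toy calculation only, constants and units set aside): the same -sum p log p expression serves equally for a message source and for a set of microstate probabilities.

from math import log2

def entropy(probabilities):
    # H = -sum p*log2(p), in bits; Gibbs entropy has the same form up to k_B and the log base.
    return -sum(p * log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))    # 1.0 bit: a fair yes/no source
print(entropy([0.25] * 4))    # 2.0 bits: four equally likely microstates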
Tom
nmann replied on Jul. 27, 2012 @ 15:25 GMT
Hi Tom,
What we don't have is an understanding of informational transduction. There's a sense in which energy comes coded too (for example, you can't make a computer operate directly on any form of energy other than electrical) but we've identified and defined electrical energy and understand how it as well as kinetic energy, thermal energy, mechanical energy etc. are linked and are able to be converted from one form to another.
But how does the electrochemical information coursing around in your brain relate to the symbolic information your brain outputs, as represented for instance by your post? How many transduction processes are involved, and what the heck are they?
T H Ray replied on Jul. 27, 2012 @ 16:05 GMT
Hi nmann,
Certainly, computers operate on as many varieties of energy as are available -- a computer as simple as an abacus uses mechanical energy, and one can conceive of constructing a more complex computer with, e.g., water or another fluid substituting for electricity flowing through logic gates.
" ... how does the electrochemical information coursing around in your brain relate to the symbolic information your brain outputs, as represented for instance by your post?"
George answered that in a reply to Georgina: "The placebo effect is fascinating; of course from my viewpoint it's a case of top-down action from beliefs (abstract entities) to physical systems (human bodies)." This can be generalized; the relation between thought and action is mediated by cognitive differentiation, between states of being, which is how a computer -- though by programming rather than cognition -- converts differential equations (difference equations, actually, because the computer is a finite state machine) to discrete output (information). The information can then be used as new input.
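As a concrete (and entirely toy) illustration of that last point, with numbers of my own invention: the machine never integrates dx/dt = -x exactly; it iterates a difference equation and emits a finite, discrete sequence of states.

def euler_decay(x0=1.0, rate=1.0, dt=0.1, steps=10):
    # Difference-equation version of dx/dt = -rate*x:  x[n+1] = x[n] - rate*x[n]*dt
    x, outputs = x0, []
    for _ in range(steps):
        x = x - rate * x * dt
        outputs.append(round(x, 6))     # finite-precision, discrete output
    return outputs

print(euler_decay())                     # discrete states standing in for a continuous decay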
"How many transduction processes are involved, and what the heck are they?"
I don't think we need transduction to explain information processes. Though some favor the view that humans are only computers made of meat, I think that nature is more subtle, incorporating a structure by which information is continuous and infinite, which makes the meat computer -- the finite state -- view untenable. My essay in this contest explains why.
Tom
T H Ray replied on Jul. 27, 2012 @ 18:28 GMT
George,
Thanks for these links. Having read the first, I am impressed that the top down model driving " ... transition from a group of independent low-level entities to the emergence of a new higher level (collective) entity" is a much more robust model of evolution than the linear "warm little pond" myth that our generation were taught in school.
The description embodied in figure 2 also supports a dynamic self organized model of growth self-limited by boundaries of scale. Marvelous.
Tom
Anonymous replied on Jul. 27, 2012 @ 19:41 GMT
An interesting interaction...
"Recognising Top-Down Causation" might be characterized as a defense of Macrorealism, accomplished by means of conscripting Hierarchy Theory". Yes indeed: a nice description.
nmann, as Tom says, computers can function on any substrate (mechanical, electrical, electronic, fluid, molecular): what remains constant is the logical operations realised by whatever physical substrate is used.
Shannon's entropy measure quantifies how much information is transmitted at a certain level, but completely fails to relate to meaning, and so somehow misses the point of what information is about. For example a single message "yes" or "no" might be encoded in a single bit; and it might be "yes" or "no" to dropping a nuclear bomb on a city and so initiating World War III, or it might be a decision about going to a show tonight or not. Context is the key to what it is about. The length of the message (in Shannon's terms) is decoupled from its implications. I like the discussion of these issues in Juan Roederer's book "Information and its Role in Nature".
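To put the yes/no example in the barest terms (a toy calculation, nothing more): the Shannon measure assigns exactly the same one bit to both messages, whatever they are about.

from math import log2

def answer_bits(p_yes):
    # Shannon content of a yes/no answer with prior probability p_yes of "yes".
    p_no = 1 - p_yes
    return -(p_yes * log2(p_yes) + p_no * log2(p_no))

print(answer_bits(0.5))   # 1.0 bit, whether the question is about the bomb...
print(answer_bits(0.5))   # ...or about going to the show: the measure cannot tell.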
Author George F. R. Ellis replied on Jul. 27, 2012 @ 19:43 GMT
Sorry guys, the above was from me. I thought I was logged in when I submitted it.
George
nmann replied on Jul. 27, 2012 @ 20:57 GMT
Tom --
"Certainly, computers operate on as many varieties of energy as are available -- a computer as simple as an abacus uses mechanical energy, and one can conceive of constructing a more complex computer with, e.g., water or another fluid substituting for electricity flowing through logic gates."
I've tried for years to buy a hydraulic computer but whenever I go into the Apple Store and ask about it they laugh at me. My laptop only operates on electricity.
George --
"nmann, as Tom says, computers can function on any substrate (mechanical, electrical, electronic, fluid, molecular): what remains constant is the logical operations realised by whatever physical substrate is used."
Sure, for better or worse that's basic functionalism. My issue is the coding of the information. You wouldn't maintain, for example, that we actually eavesdrop on the internal communications of DNA and RNA simply because we've defined some of their operations and coded them in our own code ... one hopes you wouldn't, anyway.
T H Ray replied on Jul. 28, 2012 @ 11:20 GMT
nmann,
I expect that while you may be an expert on computers, you don't know much about computing.
I'm reminded of being at a conference a few years ago, being self-conscious about using the old-fashioned method of overhead projection on cels to present, while most who were much younger than I had prepared fancy PowerPoints. Marvin Minsky was a plenary speaker, however, and kept the assembly waiting while an overhead projector was rolled to the podium, set up and adjusted. He responded to an unasked question, "I just work with computers. I don't like them."
Tom
T H Ray replied on Aug. 7, 2012 @ 15:04 GMT
Unlike Paul Davies' popular books, which often read like good detective stories, the Walker/Davies piece "The Algorithmic Origins of Life" is kind of tough sledding. At least for me.
Take the statement, "To say that information is 'instructional' (or algorithmic) and 'coded' represents a crucial conceptual leap -- separating the biological from the non-biological realm -- implying that a gene is 'for' something."
Even though I strongly subscribe to the view -- as I believe both Ellis and Davies also do -- that the universe is suffused with meaning and consciousness, I just can't get my mind around what the statement above logically entails: that " ... coded instructions are useless unless there is a system that can decode, interpret and act on those instructions."
In fact, we don't have a warrant to believe that the world is algorithmically compressible. If it isn't, there is no possible non-arbitrary demarcation between organic and inorganic life. Self-replicating systems, demonstrably, are sustained on the concept of adaptation alone. In my local ecosystem, a mosquito is useless to me, while globally, my continued existence may depend on the mosquito larvae on which the fish feed and on which I in turn feed. I agree with the authors that analog systems are less adaptable than digital-switching memory processing, such as a CNS-endowed creature possesses; however, analog processes in complex systems allow robust network switching of useful resources for required task performance. So I have to disagree that " ... in informational terms ... analog systems are not as versatile or as stable as digital systems and as such likely have very limited evolutionary capacity." In fact, the evolutionary capacity of the complex system is measured in variety and redundancy of resources. Nature trades efficiency for creativity, and those created products are manifestly analog systems which provide new input for creating more novel digital mechanical systems producing new analog creations.
I don't know how -- with this piece -- Davies escapes joining the side of biological determinism (the "gene machine" of Dawkins) which in *The Matter Myth* he and John Gribbin criticized: "Many people have rejected scientific values because they regard materialism as a sterile and bleak philosophy, which reduces human beings to automatons and leaves no room for free will or creativity." Personally, I still regard myself as a materialist and reductionist, though like Gell-Mann, I find no conflict between a continuum of consciousness (quarks to jaguars) and free will. If one refrains from drawing boundaries between life and non-life, algorithmic subroutines that define life and imbue its creatures with free will are not discontinuous with the complex system by which such life is sustained, though which itself is not demonstrably algorithmically compressible.
I support the "information narrative." I think I'm more prone, though, to accept an approach that treats the narrative *itself* as an evolutionary continuum, such as Gregory Chaitin's newly published *Proving Darwin: Making Biology Mathematical.*
As always, though, Davies is a stimulating and provocative thinker. Thanks for providing this link to the Sara Walker--Paul Davies paper.
Tom
Domenico Oricchio replied on Sep. 2, 2012 @ 22:49 GMT
I read today the article by Walker and Davies that you recommend in this blog.
I think that the first organisms on Earth may have contained a single mixed genetic code: DNA+RNA in the same helix.
The same structure can contain both the hardware and the software: for example, a mechanical clock (or astrolabe) is built to carry out certain functions, and there is no software, because the planner holds the software (the idea).
Is it possible that an analogue mechanism of great complexity can self-replicate, in an environment that provides the right substances? I think that each sensor and function can be made in analogue form, so a self-replicating robot could be made; but is this a life form? There is self-replication, there is evolution (because of construction errors), but there is no software (only a mechanical structure that performs a task); it is possible that the robot can perform alternative tasks (I think of some nanotechnological machines that self-replicate with elementary chemistry and perform some tasks): the problem is that there is no code (DNA), only a structure.
It would be interesting to try a test for life like the Turing test (a top-down life test): can an artificial ant (with the same external functions as an ant) be recognized as artificial (without dissection)?
It is possible to build, with a supercomputer, self-replicating programs (that write themselves into some memory location, isolated from the external environment) of ever greater complexity, until an artificial life (like the Game of Life) of great complexity is obtained (artificial software micro-organisms like viruses, or better bacteria), where the artificial environment performs the selection (top-down selection); only real interaction (with sensors and mechanisms) permits an artificial life (an artificial insect) without self-replication.
I think it may be possible to build artificial robots (nanotechnology) using complex chemical reactions (bottom-up construction), using automatic robots to try chemical reactions that simulate different complex reactions (in the far future, computer simulation of different chemical reactions to build life may be possible); I think alternative chemical reactions for obtaining life (without organic chemistry) may be possible: the Universe is vast, so if an alternative chemical route to life is possible, then somewhere in the Universe those reactions happen (I am sure of this for the elementary processes, not all the way up to the creation of life).
Saluti
Domenico
Domenico Oricchio replied on Sep. 4, 2012 @ 22:40 GMT
I think that life can be a bottom-up process with a simple starting point.
It is possible, in biology, to search for the minimal chemical structure capable of reproducing itself; it is possible to place this process in a noisy system (with radiation, thermal noise, etc.) so that the reproduction is not perfect; then there is evolution, a continuous change of the structure that becomes ever more complex, until life.
It is possible, in computer science, to write the minimal program capable of reproducing itself; it is possible to add noise in the writing process (an imperfect drive, low-quality discs, etc.), and after some time (on parallel computers the process can be accelerated) artificial life could be obtained; if there are self-reproducing programs (so that programs interact in the environment), is there then an ethical life (programs that interact well with other programs)?
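A toy version of that idea (illustrative only; the "genome" string and the error rate are invented for the example): a string copied with a small per-character error rate drifts away from the original over generations, giving the variation on which selection could then act.

import random

ALPHABET = "abcdefghijklmnopqrstuvwxyz "

def noisy_copy(genome, error_rate=0.02):
    # Copy the genome, replacing each character with a random one at the given error rate.
    return "".join(random.choice(ALPHABET) if random.random() < error_rate else c
                   for c in genome)

genome = "a minimal self reproducing program"
for generation in range(20):
    genome = noisy_copy(genome)
print(genome)    # imperfect replication: the raw material for evolution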
Is life connected with self-reproduction?
Some animals cannot reproduce (for example the mule, or some pandas), so are these not life? If a virus appeared in the world (a thought experiment) that destroyed the human ability to reproduce, would human beings no longer be life?
Is it possible to recognize a microbial cyst as life? There is no metabolism, no movement, none of the characteristics of life.
It is complex, in both a bottom-up test and a top-down test, to recognize life.
I write these thoughts because they seem (to me) interesting on the biological side.
Saluti
Domenico
Peter Jackson wrote on Jul. 27, 2012 @ 19:03 GMT
George
Excellently written piece, well argued and very agreeable. I see scale as a full 2 way causal street. But I have questions;
1. Has anybody argued otherwise? I agree few think about it, and need to, but you don't falsify a counter argument. I wonder if there even is one?
2. The point about complexity is anyway, I agree, worth teasing out and considering. I've suggested one step more: simply that, because there are so many small particles, complexity is so great it resembles 'chance' to us. If we were the size of a proton, might we not find nature simple, as we do macro nature now?
3. This suggests the 'bottom' may be only assumed the one way source of causality as we see, so feel we 'better' understand the top end. Do you agree?
4. Can mathematics using just 'point' particles really properly describe the effects of evolution of interaction between waves and 'real' particles over non zero time when negotiating a medium boundary in relative motion?
5. As a relativist, do you consider that understanding the quantum universe better will allow us to unite physics? - by providing a quantum mechanism to produce the macro effects we term relativity?
I've derived a 'two way' mechanism discussed in my essay. The motion of one medium or 'system' within another will give rise to quantum effects, which then in turn implement the postulates of special relativity and curved space time. This seems to resolve a number of astronomical anomalies, and a causality issue with assumptions about refraction not previously identified.
I'd be extremely grateful if you were able to read my essay and give your views.
http://fqxi.org/community/forum/topic/1330
I've thrown in some kinetic theatre to break up the density.
Best wishes
Peter
Author George F. R. Ellis replied on Jul. 28, 2012 @ 05:28 GMT
Dear Peter
1. "Has anybody argued otherwise?" Oh yes: it is the basic assumption of many, e.g. Francis Crick in his book The Astonishing Hypothesis; Lewis Wolpert in response to talks I have given; Jonathan Shock, to name a few.
2. "If we were the size of a proton might we not find nature simple" well yes: quantum theory is linear, that's its key feature. But it only applies on small scales.
3: We understand the top end (i.e. the scale of everyday life) better because that's our scale! It's only on this scale that we can easily test and probe and experience.
4: You have to take the properties of the boundary into account as well. You regard it as a macro entity, i.e. you don't try to describe its constitution in detail.
5. Understanding the quantum level does not per se make relativity emerge - you have to put it in by hand. That's the difference between quantum theory and quantum field theory.
I enjoy the theatre in your essay.
George
Peter Jackson wrote on Jul. 27, 2012 @ 19:11 GMT
George
5. Should read; ...the 'bottom' may be only assumed to be the (one way) source of causality because we can actually SEE the top end, so feel we better understand it. Do you agree?
Peter
Peter Jackson replied on Jul. 28, 2012 @ 18:04 GMT
George
Thank you. I'm not too astonished some argue against; in Quantum and Classical there must of course always be someone who's convinced black is white.
Point 4. You say "You have to take the properties of the boundary into account as well. You regard it as a macro entity, i.e. you don't try to describe its constitution detail." Interesting view. I know you're currently thinking in a different area, but my essay is actually ALL about the constitution of the quantum boundaries of 'space time geometries' (frames) and how the real interactions there (with non point particles and temporal evolution) produce all the classical macro scale effects we term Relativity.
I'm a little surprised and disappointed that this did not emerge for you. I had hoped you might try to falsify the ontology, as we have had no success doing so to date.
Have you actually read it all yet?
Best wishes
Peter
Avtar Singh wrote on Jul. 27, 2012 @ 22:05 GMT
Dear George:
What are your thoughts on the role of consciousness or free will as the top-down causation that gives rise to human beings? I have expressed some of my thoughts below, based on my paper “From Absurd to Elegant Universe”:
Causation vs. Free Will – What is Fundamental?
The following arguments support the conclusion that Free Will or Spontaneity or Consciousness is the fundamental or root cause process of all physical phenomena and the widely used assumption that particles/strings are fundamental reality is wrong as evidenced by its failure to predict/describe 96% of the universe and resulting in the prevailing paradoxes/inconsistencies.
An outcome of an event is determined by the input parameters and the governing law (or equation). The governing laws are the fundamental universal laws of conservation of mass, energy, momentum, space, and time which are existent at Free Will without any external cause. The input is also chosen at the free will of the observer or operator. In some cases, the input is determined by the outcome of a preceding event such as in the Domino Effect. But even in those cases, the originating or primary root input is always determined at the free will of the originator or source. Hence, the universe is not a Clockwork Universe wherein its fate is predetermined. The evolution of the material or manifested universe is subject to the free-willed laws and inputs.
The widely used assumption that particles or strings of matter are the most fundamental elements of universal reality is incorrect. The particles are known to be born spontaneously out of or decay spontaneously into the so-called vacuum or nothingness. Hence, the fundamental reality, both top-down and bottom-up, is vacuum (or the Zero-point state of the mass-energy-space-time continuum, as described in my paper). This state is synonymous with the implicit eternal and omnipresent laws of the universe.
The fundamental physical process that leads to spontaneous (no causation) birth or decay of particles is the free will or spontaneity in the universe. A universal theory that does not entail this free-will dimension allowing spontaneous conversion of mass-energy-space-time continuum will remain incomplete and unable to describe the universal reality. This is vindicated in my paper wherein it is demonstrated that allowance of such spontaneous process in conjunction with general relativity leads to the correct prediction of the observed universe, creation and dilation of matter, and classical as well as quantum behavior of particles eliminating black hole singularities and paradoxes related to inner workings of quantum mechanics.
Regards
Avtar Singh
Author George F. R. Ellis replied on Jul. 28, 2012 @ 05:10 GMT
Dear Avtar
while I believe in free will - inter alia, because science is not rationally possible if we do not have some meaningful kind of free will, as pointed out for example by Anton Zeilinger - I do not believe it is manifested by particles in themselves. Quantum uncertainty is not the same as free will, it is arbitrary, while free will entails purpose and meaningful choice.
Regards
George
nmann replied on Jul. 28, 2012 @ 23:48 GMT
Zeilinger has indeed said that. However, he has also said that something which sounds to me a lot like superdeterminism cannot be ruled out. And his group's paper describing their experimental violation of [a slightly tweaked version of] the Leggett inequality [ArXiv 0704.2529, page 7] says this:
"... Furthermore, one could consider the breakdown of other assumptions that are implicit in our reasoning leading to the inequality. These include Aristotelian logic, counterfactual deniteness, absence of actions into the past or a world that is not completely deterministic. ..."
The first time I read the sentence I thought maybe they needed an editor for English clarity but no: it says what it says.
nmann replied on Jul. 28, 2012 @ 23:53 GMT
That's "counterfactual deFIniteness" of course. Careless cutting and pasting on this poster's part.
Author George F. R. Ellis replied on Jul. 29, 2012 @ 10:25 GMT
Well, various views on quantum mechanics, going back to Wheeler and Feynman, suggest it could involve action into the past in some conditions/on some scales; see
here for a view on this. Also, "a world that is not completely deterministic" is the standard view, is it not? As I've said, from my viewpoint that gives room for adaptive selection to take place and generate stuff not uniquely implied by the initial data.
T H Ray replied on Jul. 29, 2012 @ 11:32 GMT
As I argue in my essay, if the source of all information is the point at infinity (which exists at every point of four dimensional spacetime), we have local and simultaneous access in principle to everything in Wheeler's world built of information alone, though the act of measurement orders recorded events into our unique worldline.
One recalls the arithmetic theorem that a single point may simultaneously approach any other set of points provided that the point is far enough away.
Tom
nmann replied on Jul. 29, 2012 @ 22:47 GMT
"Also 'a world that is not completely deterministic' is the standard view, is it not?"
"A world that is not completely deterministic" is "one of the other assumptions that are implicit in our reasoning leading to the inequality" which the results of the Leggett experiment (resulting in a violation of Leggett's inequality) might be regarded as having put at hazard ... made subject to "breakdown" as the experimenters put it. Along with other stuff I find personally find somewhat easier to live without ("Aristotelian logic, counterfactual definiteness" even "absence of actions into the past").
In other words, the experimental outcome brings forward the possibility of a world that IS completely deterministic. Zeilinger copped to that somewhere, as I recall, but stated that he personally found it unimaginable. Of course there'd be no possibility at all of "free will." Which isn't the standard view or else 't Hooft wouldn't be particularly controversial.
A lot of double negatives, agreed.
Avtar Singh replied on Jul. 30, 2012 @ 22:21 GMT
Dear George:
Thanks for replying to my post on Free Will. I have responded to your comments on my paper under my posting “From Absurd to Elegant Universe”. Please let me know if I addressed all your comments/questions satisfactorily.
I agree with your statement: "... I do not believe it is manifested by particles in themselves. Quantum uncertainty is not the same as free will". Quantum uncertainty is caused by measurement error or the inability to measure a quantum phenomenon, while Free Will is only possible because of the certainty of the universal laws. If the laws were uncertain, no free will would be possible, because it would be all chaos without certain laws. Often, the certainty of the laws is confused or mistaken for Determinism or fixed fate. Both the free-willed input and the laws determine the outcome or fate, which is not fixed in advance.
Regards
Avtar
Steve Dufourny replied on Aug. 11, 2012 @ 11:18 GMT
Hello dear thinkers,
It is a relevant discussion. I agree also with the determinism.
Mr Singh,
You say "A universal theory that does not entail this free-will dimension allowing spontaneous conversion of mass-energy-space-time continuum will remain incomplete and unable to describe the universal reality."
What is, for you, a free-will dimension, physically speaking ... scalars, vectors, proportions, causes ...?
Regards
Avtar Singh replied on Aug. 13, 2012 @ 18:54 GMT
Hi Steve and Friends:
Thanks for your comment on Determinism.
Your Question: “What is for you a free will dimension, physically speaking,..... scalars , vectors,proportions, causes,.... ???”
Answer:
Free Will in a physical theory is not a spatial or time-like dimension but a Degree of Freedom that allows spontaneous conversion of mass to energy or vice versa without any external condition or cause. Such a Degree of Freedom is necessary to allow the equivalence of mass and energy and to integrate the missing physics of the spontaneous decay and birth of particles from the Zero-point state of the so-called vacuum or dark energy, wherein mass, space, and time are fully dilated, as described in my paper “From Absurd to Elegant Universe”.
Since this Zero-point state is the most fundamental state of the universe from which particles are born and into which the matter decays over time, the physics of all these phenomena are fundamentals that must be included in any universal theory to avoid any singularities and paradoxes such as those experienced by the current theories – general relativity and quantum mechanics.
I see a lot of questions and discussions going on in this forum regarding paradoxes that should not arise if the above physics is integrated as shown in my paper. I would greatly appreciate review and comments on my paper from all the participants in this forum so as not to miss the important insights regarding the missing physics that could resolve the ills of physics and cosmology today and avoid unnecessary as well as irrelevant questions that are nothing but artifacts of the missing physics. The universe is a lot simpler to understand than portrayed by current incomplete theories.
Regards
Avtar
I am also posting this as a main comment blog below.
Steve Dufourny replied on Aug. 30, 2012 @ 10:55 GMT
Hello,
It is an interesting answer. Free will is relevant indeed if and only if rational determinism is the torch of the hidden variables, considering the finite groups. The quantization appears more easily.
The degrees of freedom are always so rational. It is logical, in fact. The sortings and synchronizations appear with a pure universal determinism.
Regards
J. C. N. Smith wrote on Jul. 30, 2012 @ 18:06 GMT
Dear George Ellis,
Apropos of nothing other than the topic of your essay, I just now stumbled upon another delightful quote which I think you might appreciate in the event you've not already seen it.
"We seek reality, but what is reality? The physiologists tell us that organisms are formed of cells; the chemists add that cells themselves are formed of atoms. Does this mean that these atoms or these cells constitute reality, or rather the sole reality? The way in which these cells are rearranged and from which results the unity of the individual, is not it also a reality much more interesting than that of the isolated elements, and should a naturalist who had never studied the elephant except by means of the microscope think himself sufficiently acquainted with that animal?" - - Henri Poincare, 'The Value of Science,' originally published in 1913, translated by George Bruce Halstead, Cosimo Classics, ISBN: 978-1-60206-504-8, p.21.
Cheers,
jcns
Georgina Parry replied on Jul. 30, 2012 @ 20:27 GMT
Hi J.C.N Smith,
It's another good one, well found.
I find it interesting that increase in scale and complexity of pattern also affects how the arrangement is able to interact with its environment. The variety of ways in which it can interact seems to increase with complexity. The size of the impact on the immediate environment increases with scale. (Though shape (as form is related to function) and populations also need consideration.)
Author George F. R. Ellis replied on Jul. 31, 2012 @ 03:43 GMT
Dear J.C.N.Smith,
very nice, thank you. Here's another one for you, one of my favourites:
"All Truth is shadow except the last, except the utmost; yet every Truth is true in its own kind. It is substance in its own place, though it be but shadow in another place (for it is but a reflection from an intenser substance); and the shadow is a true shadow, as the substance is a true substance."
Isaac Pennington (1653).
Somehow that seems to state things very nicely.
George
Author George F. R. Ellis replied on Jul. 31, 2012 @ 04:03 GMT
Dear Georgina
indeed - one of the characteristics of truly complex systems is that higher level variables are not always just coarse grainings of lower level variables; they are sometimes crucially related to the details of the structure.
Hence in highly ordered structures, sometimes changes in some single micro state can have major deterministic outcomes at the macro level (which is of course the environment for the micro level); this cannot occur in systems without complex structure.
Examples:
(i) a single error in microprogramming in a computer can bring the whole thing to a grinding halt (got that at the moment in my laptop);
(ii) a single swap of bases in a gene can lead to a change in DNA that results in predictable disease;
(iii) a single small poison pill can debilitate or kill an animal, as can damage to some very specific micro areas in the brain.
This important relation between micro structure and macro function is in contrast to statistical systems, where micro changes have no effect at the macro level, and chaotic systems, where a micro change can indeed lead to a macro change, but it's unpredictable.
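The contrast can be made concrete with a toy comparison (my own illustration, with arbitrary numbers): one changed element in a large random sample barely moves the mean, while a tiny change in a chaotic map's starting value produces a completely different, and unpredictable, outcome.

import random

# Statistical system: a single micro change is invisible at the macro level.
random.seed(0)
sample = [random.random() for _ in range(1_000_000)]
mean_before = sum(sample) / len(sample)
sample[0] += 1.0                                   # one micro change
mean_after = sum(sample) / len(sample)
print(mean_after - mean_before)                    # of order 1e-6: negligible

# Chaotic system: a tiny micro change gives a large but unpredictable macro change.
def logistic(x, r=4.0, steps=50):
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

print(logistic(0.2), logistic(0.2 + 1e-9))         # the two trajectories end up unrelated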
Cheers
George
J. C. N. Smith replied on Jul. 31, 2012 @ 12:56 GMT
Dear George and Georgina,
On the topic of complex systems, certainly among the most interesting of complex systems are those which have the attribute which we call sentience. If we may take a flight of fancy for a moment it is interesting to consider this topic, which certainly seems germane to the topic of this essay.
Sentience clearly is an emergent quality; it appears to require an awareness of an environment and, ideally, an ability to react to that environment. This awareness and ability to react are not possible in the absence of ensembles of atoms sufficiently complex to function as sensors and as (at least) rudimentary data processors and actuators. In other words, they "emerge" only from complex ensembles of atoms.
It seems that necessary (but not sufficient) requirements for an ensemble of atoms to have the attribute we call sentience, therefore, are the following: it must be sufficiently complex to include a sensor, a data processor, and (desirable but not absolutely necessary) a servo-mechanism to act on the output from the data processor. A traffic control device embedded in the road at a traffic signal has these three attributes, but we would not call it sentient. What more is required? The ability of the ensemble to react to its environment in novel, unpredictable, and creative ways to preserve the integrity of its own being, perhaps?
Regardless, sentient portions of the universe (e.g., people, for example) afford the universe a form of partial self-awareness. The actions of sentient portions of the universe represent the universe influencing its own future in purposeful ways. Can this somehow afford the universe, when viewed as a whole, some sort of "advantage," whether evolutionary or otherwise, relative to a totally non-sentient universe? It is difficult to envision how this could be the case, or, even if it were the case, how it could matter.
The universe, whether or not it includes some sentient portions along with the non-sentient portions, is whatever it is. On the other hand, however, a universe which includes sentient beings is a universe in which some of its component parts may be concerned about their own future, and, by extension, about the future of their immediate environment, and, by extension, about the future of the universe as a whole. How this could be construed as an "advantage," however, is not immediately clear to me. But enough with flights of fancy.
Thank you, George, for the excellent Pennington quote!
Cheers,
jcns
Author George F. R. Ellis replied on Jul. 31, 2012 @ 19:30 GMT
Well to reply to this properly would be a very long article, maybe a book .. instead, seeing we are in this area, I'll just give you two illustrations of top-down effects in the brain.
Illustration 1: How does reading work? Here’s a remarkable thing.
• Yu cn red this evn thogh words are mispelt,
• and this thuogh lwtters are wrong,
• And this though words missing.
How can it be we can make sense of garbled text in this way? One might think the brain would come to a grinding halt when confronted with such incomplete or grammatically incorrect text. But the brain does not work in a mechanistic way, first reading the letters, then assembling them into words, then assembling sentences. Instead our brains search for meaning all the time, predicting what should be seen and interpreting what we see based on our expectations in the current context.
Actually words by themselves may not make sense without their context. Consider:
• The horses ran across the plane,
• The plane landed rather fast,
• I used the plane to smooth the wood.
- what `plane’ means differs in each case, and is understood from the context. Even the nature of a word (noun or verb) can depend on context:
• Her wound hurt as she wound the clock
This example shows you can’t reliably tell from spelling how to pronouce words in English, because not only the meaning, but even pronunciation depends on context.
The underlying key point is that we are all driven by a search for meaning: this is one of the most fundamental aspects of human nature, as profoundly recorded by Viktor Frankl in his book Man's Search for Meaning. Understanding this helps us appreciate that reading is an ongoing holistic process: the brain predicts what should be seen, fills in what is missing, and interprets what is seen on the basis of what is already known and understood. And this is what happens when we learn to read, inspired by the search for understanding. One learns the rules of grammar and punctuation and spelling too of course; but such technical learning takes place as the process of meaning making unfolds. It is driven top down by our predictions on the basis of our understandings, based in meaning..
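A crude sketch of that predictive filling-in (purely illustrative; real reading is of course nothing like this simple): given a context, a stored expectation is imposed on a degraded word, and the bottom-up signal is used only as a check.

# A toy "expectation" table standing in for the reader's prior knowledge of context.
expected = {
    "smooth the wood with a": "plane",
    "the horses ran across the": "plain",
}

def read_word(context, garbled):
    # Prefer the top-down prediction when it is compatible with the degraded input.
    prediction = expected.get(context)
    if prediction and len(garbled) == len(prediction) and \
            all(g == p or g == "_" for g, p in zip(garbled, prediction)):
        return prediction
    return garbled        # no usable prediction: fall back on the bottom-up signal

print(read_word("smooth the wood with a", "pl_ne"))       # -> "plane"
print(read_word("the horses ran across the", "pl_in"))    # -> "plain"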
Illustration 2: Vision works the same way, as demonstrated by Dale Purves in his book "Brains: How They Seem to Work". The core of his argument is as follows (from the abstract of his article on visual illusions):
"The evolution of biological systems that generate behaviorally useful visual percepts has inevitably been guided by many demands. Among these are: 1) the limited resolution of photoreceptor mosaics (thus the input signal is inherently noisy); 2) the limited number of neurons available at higher processing levels (thus the information in retinal images must be abstracted in some way); and 3) the demands of metabolic efficiency (thus both wiring and signaling strategies are sharply constrained). The overarching obstacle in the evolution of vision, however, was recognized several centuries ago by George Berkeley, who pointed out that the information in images cannot be mapped unambiguously back onto real-world sources (Berkeley, 1975). In contemporary terms, information about the size, distance and orientation of objects in space are inevitably conflated in the retinal image. In consequence, the patterns of light in retinal stimuli cannot be related to their generative sources in the world by any logical operation on images as such. Nonetheless, to be successful, visually guided behavior must deal appropriately with the physical sources of light stimuli, a quandary referred to as the "inverse optics problem". "
The resolution is top-down shaping of vision by the cortex, based in prediction of what we ought to see. Visual illusions are evidence that this is the way the visual system solves this problem.
Intriguing, isn't it?
George
Author George F. R. Ellis replied on Aug. 1, 2012 @ 12:41 GMT
And just for completeness here is one from the latest
neuroscience literature[\link]:
Cognitive functions of the posterior parietal cortex: top-down and bottom-up attentional control
Sarah Shomstein*
Department of Psychology, George Washington University, Washington, DC, USA
Although much less is known about human parietal cortex than that of homologous monkey cortex, recent studies, employing neuroimaging, and neuropsychological methods, have begun to elucidate increasingly fine-grained functional and structural distinctions. This review is focused on recent neuroimaging and neuropsychological studies elucidating the cognitive roles of dorsal and ventral regions of parietal cortex in top-down and bottom-up attentional orienting, and on the interaction between the two attentional allocation mechanisms. Evidence is reviewed arguing that regions along the dorsal areas of the parietal cortex, including the superior parietal lobule (SPL) are involved in top-down attentional orienting, while ventral regions including the temporo-parietal junction (TPJ) are involved in bottom-up attentional orienting.
Author George F. R. Ellis replied on Aug. 1, 2012 @ 12:43 GMT
Oh dear botched that up but the link works!
J. C. N. Smith replied on Aug. 1, 2012 @ 13:15 GMT
George,
Indeed, it is intriguing.
You wrote, "The underlying key point is that we are all driven by a search for meaning: this is one of the most fundamental aspects of human nature . . . ."
Agreed. Relating this to my comments about the nature of sentience as a property of certain complex systems such as humans, this search for meaning apparently becomes possible only when the complexity of a sentient being has reached some critical tipping point. Or, alternatively, perhaps not a single critical tipping point as much as a more gradual transition across a broader spectrum separating humans from so-called "lower," i.e., less capable, sentient beings such as chimpanzees, for example? At what point does "the search for meaning" kick in? And could this be the point at which two curves showing the relative importance of bottom-up vs. top-down causation meet and cross?
Regardless of the exact point at which the search for meaning kicks in, it seems to be the case that once it does kick in, it has, thus far at least, offered an evolutionary advantage to creatures having this trait. (This unfortunately might be undone or at least seriously set back, of course, by an injudicious application of powerful tools made possible by the search for meaning as embodied in applied science.)
Intriguing indeed.
jcns
Author George F. R. Ellis replied on Aug. 1, 2012 @ 20:29 GMT
For completeness, here is a hot-off-the-press set of talks on cultural neuroscience: top-down effects from society to the individual and the wiring of her brain. As I said at the start of the essay, the effect is obvious there: "Culture is now seen as an important macro-level phenomenon that affects a whole range of psychological processes." The question is whether it is also important in physics; I claim it is there in many places when you look for it.
Avtar Singh wrote on Jul. 30, 2012 @ 22:32 GMT
Dear George:
Thanks for replying to my post on Free Will. I have responded to your comments on my paper under my posting, “From Absurd to Elegant Universe”.
Please let me know if I addressed all your comments/questions satisfactorily.
Regards
Avtar
Domenico Oricchio wrote on Jul. 30, 2012 @ 23:30 GMT
Thank you for your interesting article; I have been thinking about it for some days now.
I think that top-down causation can be a strong experimental-theoretical instrument for analyzing quantum effects in a macroscopic structure; I am thinking of amplifying the small-scale effects by using cooperative effects as an instrument (a cooperative microscope).
It is only an idea (I am not speaking as an expert), but I think it may be possible to measure the halo of the strong force in a single stable heavy atomic nucleus (a narrow nuclear layer) using neutron beams of different energies.
Alternatively, I think it may be possible to use superfluid liquid helium to measure the strong force in an indirect way, by measuring the large-scale effect on some macroscopic quantity; if the mathematical model is correct, then the macroscopic measurement must be correct: the model must include the large scale effect, and the little scale effect.
Saluti
Domenico
Author George F. R. Ellis replied on Jul. 31, 2012 @ 04:17 GMT
Dear Domenico
I agree with your idea that in superfluidity and similar quantum phenomena, in general "the model must include the large scale effect, and the little scale effect" (this is what Laughlin wrote about in his Nobel lecture). What I am not sure about is the strong force examples you give. One might expect that if this was so, then the formulae for superfluidity would involve parameters related to the strong force, and I don't think that is the case (but I am not an expert).
Worth thinking about.
Saluti,
George
Marcoen J.T.F. Cabbolet wrote on Aug. 2, 2012 @ 12:09 GMT
Dear George Ellis,
I agree with your remark on page 1 that "the foundational assumption that all causation is bottom up is wrong, even in the case of physics".
In my PhD project I have developed a formal axiomatic system that is potentially applicable as a foundational framework for physics, under the condition that there is a matter-antimatter gravitational repulsion.
Seven non-logical axioms of this system describe what happens in the individual processes that take place at supersmall scale; in each of these processes a choice is then made (thus at the level of elementary particles).
As part of the research I have developed a physicalist approach to the mind-body problem from the perspective of this framework; this yielded a mechanism for mental causation which demonstrates that observers have a free will in this universe, that is, in the universe governed by these principles.
The point is that choices in the elementary processes are then imposed by the choice made at macroscopic scale by an observer. So this is an example of top-down causation; in my dissertation this is formalized in an expression.
This discussion is not mentioned in my essay (topic 1336): the essay focuses mainly on the initial considerations in the development of my theory.
Best regards, Marcoen Cabbolet
Peter Jackson wrote on Aug. 8, 2012 @ 16:43 GMT
Hi George.
You commented on my essay on quantum boundary conditions, but it's easy to miss posts here, so in response to your Point 4 ("You have to take the properties of the boundary into account as well. You regard it as a macro entity, i.e. you don't try to describe its constitution detail.") I paste my response of 28/7:
"Interesting view. I know you're currently thinking in a different area, but my essay is actually ALL about the constitution of the quantum boundaries of 'space time geometries' (frames) and how the real interactions there (with non-point particles and temporal evolution) produce all the classical macro-scale effects we term Relativity.
I'm a little surprised and disappointed that this did not emerge for you. I hoped you might try to falsify the ontology, as we have had no success doing so to date.
Have you actually read it all yet?"
etc.
I do hope you've now had a chance to do so, as it deals precisely with what you suggested; and, you are of course correct, the result is of massive import, and cause and effect is a two-way street. I look forward to your views.
Thanks, and Best wishes
Peter
Author George F. R. Ellis replied on Aug. 12, 2012 @ 19:01 GMT
Hi Peter
I posted a comment over there
George
Alan Lowey wrote on Aug. 10, 2012 @ 15:54 GMT
Dear George,
A truly excellent essay, which I seemed to miss the first time round. Your fundamental thoughts are most welcome in the competition and I learnt a great deal from your essay. I was pleased to read that you are open-minded enough to consider non-ordinary matter, unlike most essay authors. Yes, both bottom-up and top-down causation need to be appreciated by all.
P.S. I have rebalanced the Public Rating score which you most certainly deserve.
Alan
Author George F. R. Ellis replied on Aug. 12, 2012 @ 18:59 GMT
Thanks for the positive comment, Alan.
George
Yuri Danoyan wrote on Aug. 13, 2012 @ 09:45 GMT
George
Top-down causation reminds me of causa finalis (Aristotle's final cause). It looks like teleological causation.
Author George F. R. Ellis wrote on Aug. 13, 2012 @ 12:12 GMT
Yuri
Sometimes it is, for example when I type these symbols on my computer keyboard and electrons flow to make the same symbols appear on your screen; and sometimes it is not, for example when a particular crystal structure leads to the existence of Cooper pairs and hence superfluidity. I have given many examples of both in my essay and the papers it refers to.
Avtar Singh wrote on Aug. 13, 2012 @ 18:56 GMT
Hi Steve and George:
Thanks for your comment on Determinism.
Your Question: “What is for you a free will dimension, physically speaking,..... scalars , vectors,proportions, causes,.... ???”
Answer:
Free Will in a physical theory is not a spatial or time-like dimension but a Degree of Freedom that allows spontaneous conversion of mass to energy, or vice versa, without any external condition or cause. Such a Degree of Freedom is necessary to allow the equivalence of mass and energy and to integrate the missing physics of spontaneous decay and birth of particles from the Zero-point state of the so-called vacuum or dark energy, wherein mass, space, and time are fully dilated, as described in my paper “From Absurd to Elegant Universe”.
Since this Zero-point state is the most fundamental state of the universe from which particles are born and into which the matter decays over time, the physics of all these phenomena are fundamentals that must be included in any universal theory to avoid any singularities and paradoxes such as those experienced by the current theories – general relativity and quantum mechanics.
I see a lot of questions and discussions going on in this forum regarding paradoxes that should not arise if the above physics is integrated as shown in my paper. I would greatly appreciate review and comments on my paper from all the participants in this forum so as not to miss the important insights regarding the missing physics that could resolve the ills of physics and cosmology today and avoid unnecessary as well as irrelevant questions that are nothing but artifacts of the missing physics. The universe is a lot simpler to understand than portrayed by current incomplete theories.
Regards
Avtar
Steve Dufourny replied on Aug. 13, 2012 @ 19:54 GMT
Hello all,
Mr Singh,
Thank you for your answer. But you know, free will is a result of evolution. Let's take the brain: we have synapses and messages, and so causes. In fact even a free will has a cause, here the entangled spheres aged billions of years. Brains are results of evolution, and free will is a behaviour. Lamarck and Darwin would agree, because there is a cause behind the mass/energy/information equilibrium.
So free will is an effect of a cause; that is evident. Now, when free will converges towards pure determinism, it is there that it becomes very relevant, because pure creativity can be deterministic. The rational convergences appear. If free will is not universally coherent, then there is a problem. We cannot say that a free will has no cause.
The degrees of freedom, as you say, must always be deterministically coherent at all 3D scales, fractalized with sense and reason and even wisdom.
Your zero-point state seems to follow the same logic as a BEC of our mind. You know, the number 1 is the secret, the main central sphere. The quantum number becomes a key for finite groups, so the volumes are very relevant. It is all spiritual.
ps: The space-time dilation in a pure Lorentzian approach is deterministic. Maxwell would agree, in my humble opinion. :)
Regards
Avtar Singh replied on Aug. 14, 2012 @ 17:05 GMT
Hi Steve, George, and Friends:
Thanks for your reply and comments.
Free will that you are referring to is nothing but biological consciousness emanating from brain, which, I agree, is the result of evolution and causative.
The free will that I am describing in my post and paper is not biological but universal or cosmic spontaneity (non-causative), as evidenced by the well-observed spontaneous decay and birth of particles from the Zero-point state (the so-called vacuum). Other physical evidence of such universal spontaneity, free will, or consciousness is the well-established free-willed (self-existent) universal laws of conservation of mass-energy-momentum-space-time, wave-particle complementarity, and the equivalence principle, wherein the physical processes are spontaneous (eternal) and non-causative. Brains and biological consciousness evolved billions of years later than the fundamental, eternal, and free-willed source, the Zero-point state, that governs the wholesome universe.
What is missing from physics and cosmology today is this degree of freedom, which would allow a mechanistic conversion of mass to energy and space to time, and hence a complete implementation of the equivalence principle in the current theories. The missing physics leads to singularities (general relativity) and paradoxes such as dark energy, dark matter, quantum gravity, quantum time, the measurement paradox, unknown and unverifiable particles, multi-dimensions, multiverses, etc. For example, when the mass of a galaxy or universe is confined to a point-like volume, a singularity is experienced in general relativity because no spontaneous mass-to-energy conversion and subsequent evaporation is allowed. Once this is allowed, as shown in my paper, the singularity goes away. As a second example, the accelerated expansion of the universe is not predicted by general relativity because of the missing physics wherein mass evaporates into the relativistic kinetic energy that provides the observed accelerated expansion. This provision naturally provides the mechanistic physics of expansion, rather than the currently used fudge factor, Einstein's "blunder", the cosmological constant.
The point I would like to bring to the attention of the scientists in this forum (as described in my paper “From Absurd to Elegant Universe”) is that the fundamental reality of the universe is the Zero-point state of the mass-energy-momentum-space-time continuum, and the fundamental dynamic process that governs the manifested universe is the spontaneous (free-willed) birth and decay of particles. Neither particles/strings nor space-time nor biological evolution are fundamental in themselves; what is fundamental is the overall state of the wholesome continuum. There is a lot of focused discussion in this forum on the artifacts (the inconsistencies and paradoxes) of the missing physics, but a lack of focus on the missing most fundamental state and processes that govern the universe at its core. As shown in my paper, once the missing physics is properly included in current theories, the artifact questions and inconsistencies disappear along with the artifact paradoxes listed above, leading to a coherent and simple/elegant universe.
We must cure the disease (missing fundamental physics) and not focus on merely eliminating symptoms (artifact assumptions, inconsistencies, paradoxes, mysterious phenomena etc.). The castle (universal TOE) cannot be built upon missing fundamental foundations. We must not get lost in trees (artifacts) so as not to lose the vision of the forest (fundamental universal reality).
Best Regards
Avtar
Author George F. R. Ellis replied on Aug. 15, 2012 @ 03:55 GMT
Hi Avtar
"Free will that I am describing in my post and paper is not biological but universal or cosmic spontaneity (non-causative) as evidenced by the well-observed spontaneous decay and birth of particles from Zero-point state (So-called Vacuum)."
The properties of the vacuum are well known and a standard part of physics. They are subject to quantum indeterminacy. To call that "free will" is stretching things: it is not free will in the usual sense.
Regards
george
Avtar Singh replied on Aug. 15, 2012 @ 21:42 GMT
Hi George:
I agree with your description of Free Will as the Cosmic Spontaneity and not biological consciousness. My reply was addressed to Steve, who mentioned biological or brain-induced consciousness or free will in his reply post.
Since the definition of Free Will as understood by different people has a lot of stigma attached to it, let me try to rephrase free will as regards the vacuum, and please let me know if you agree. The Zero-point state represents the relativistic state of the universe wherein mass-space-time have fully dilated to zero. This state represents the state of the self-existent, non-causative, hence free-willed laws of the universe, without any manifestation of matter-space-time. Hence, it can also be defined as the state of the Cosmic Free Will (as opposed to the individual or personal free will in the usual sense) of the self-existent and eternal universal laws.
Your comments will be appreciated.
Regards
Avtar
Steve Dufourny replied on Aug. 16, 2012 @ 15:58 GMT
Hi to both of you,
Mr Singh,
You are welcome,
here is my point of view.
Causality is more than we can imagine, in fact.
Universality, for me, is the reason of being. It is evident that a certain consciousness must be correlated with this free will. Free will is less than the universal consciousness. Free will can be chaotic; universality is harmonious in its pure generality of evolution, optimization, spherization.
I believe that free will can converge with this consciousness. That is the most important thing, in my humble opinion.
The Universe is rational and purely deterministic; causality is at all scales in 3D. Free will is a result of evolution correlated with our brains; the stimuli are numerous, like genetics, like education, like information, or this or that. In fact, free will is still more interesting when consciousness is its sister and wisdom its brother. It is simple, in fact, this universality.
Free will is like a pure instinct, but we evolve and so we improve, we optimize, we catalyze with wisdom. It is the only universal way of optimization, spherization. We simply improve the mass with the help of light....
Best Regards
Avtar Singh replied on Aug. 20, 2012 @ 22:19 GMT
Hi Steve:
Thanks for your thoughts and views. You can define Free Will in your own way, as you like. What I have been referring to is the Cosmic Free Will, or Cosmic Consciousness, that is above and beyond time and evolution. A will that is constrained in time and in the evolution of brain and culture, and that could be chaotic, is not truly "FREE" but rather a constrained will bounded in time and body/brain.
But, if that is your definition, I have no problem with it.
My purpose in bringing cosmic consciousness into this forum is to raise awareness among the scientists here of the crucial missing physics in the current theories, without which no Theory of Everything is possible. What I have noticed is that the most popular definition of Free Will is widely understood to be related to bodily evolution, because of the dominant effect of biological evolution in science and our lives.
While there is a lot of discussion in this forum on what assumptions are wrong, there is a lack of emphasis or awareness of what is critically missing from physics and causing the CRISIS today. My posted paper “From Absurd to Elegant Universe” tries to address what is missing to resolve the crisis, rather than pointing only to wrong assumptions.
Best Regards
Avtar
Steve Dufourny Jedi replied on Oct. 6, 2012 @ 16:36 GMT
Hello Mr Singh, it is a message of wisdom.
I believe that the crisis can be solved by centralizing the competences, but if and only if universal determinism is the torch of research and study. The rational and fundamental sciences have the solutions, so why?
Is it a problem of hormones and unconsciousness due to this paper governing our lives? The equilibria can be reached if we act globally in a pure spherization of all spheres. The high social spheres can also be optimized with wisdom.
I see the universal consciousness, complex and simple. If we take this infinite light above our walls, without motion, and if we consider that this infinite consciousness has thus created a physical sphere with spheres of light in motion becoming mass due to their intrinsic codes, then this universal sphere is like a project. Our physical consciousness evolves, like babies of this physical 3D sphere and its intrinsic quantum spheres and cosmological spheres.
I agree indeed, this universal consciousness is lacking in the high spheres. Perhaps it is due to our young age at this universal scale: 13.7 to 15 billion years is still young. I believe that the hour is serious and that it is time to act in a pure universal way, globally speaking. The Earth is near an addition of several possible chaotic exponentials. I ask myself how this is possible, knowing our potential for resolving major problems. I don't understand human nature. It is bizarre. We have the solutions and we do not put them into practice. How is it possible?
You know, Mr Singh, I believe simply that the most important thing is to be universal and deterministic and fundamental, respecting our universal physical laws. We have so many tools around us. The global solutions exist.
Consciousness indeed is an important parameter. It is even essential when we play with this entire universal energy, this entropy present in all things at its paradoxical maximum. Just a part is sufficient. This entropy is even infinite when we unify both systems, the physical sphere and its spheres and the pure infinite light above our walls. The aether is this entropy. Free will plus this consciousness give harmonious parameters of spherization instead of chaotic parameters.
Thanking you Mr Singh
Regards
George Ellis wrote on Aug. 14, 2012 @ 03:52 GMT
The issue of time has come up repeatedly in this discussion, even though it's not the essay topic. I've put up a paper on the arXiv today, arXiv:1208.2611v1 [gr-qc], considerably strengthening my position about time as stated in my FQXi essay some years ago. I just point this out for those interested; but discussion should take place somewhere else, else this thread will get out of hand!
Here's the abstract:
Space time and the passage of time
George F. R. Ellis, Rituparno Goswami
(Submitted on 13 Aug 2012)
This paper examines the various arguments that have been put forward suggesting either that time does not exist, or that it exists but its flow is not real. I argue that (i) time both exists and flows; (ii) an Evolving Block Universe (`EBU') model of spacetime adequately captures this feature, emphasizing the key differences between the past, present, and future; (iii) the associated surfaces of constant time are uniquely geometrically and physically determined in any realistic spacetime model based in General Relativity Theory; (iv) such a model is needed in order to capture the essential aspects of what is happening in circumstances where initial data does not uniquely determine the evolution of spacetime structure because quantum uncertainty plays a key role in that development. Assuming that the functioning of the mind is based in the physical brain, evidence from the way that the mind apprehends the flow of time prefers this evolving time model over those where there is no flow of time.
Thomas Howard Ray replied on Aug. 14, 2012 @ 11:57 GMT
George, with your permission, I think I can address at least one of your points above without mentioning the "t" word.
The distinction you make between world lines and surfaces defines the difference, does it not, between what can be described as top down causation, and what is laterally distributed causality?
Tom
Daryl Janzen replied on Aug. 14, 2012 @ 16:41 GMT
Dear George:
I would be very honoured if you read and commented on my essay. There, I've presented an argument that ties in closely with your points (i) -- (iii), from the perspective of cosmology---the role of which, as you've previously written, is the "background for all the rest of physics and science", while "it is inevitable that... specific philosophical choices will to some degree shape the nature of cosmological theory, particularly when it moves beyond the purely descriptive to an explanatory role---which move is central to its impressive progress."
I hold that space-time is an evolving block, bounded by the cosmic present---but a difference between your EBU model and mine is that while you consider the EBU to be real, whereby the recombination epoch should still exist in no less real a state as five minutes ago, or even the present time that you are reading this comment (possibilism), I consider it to be an ideal mapping of the events that occur in an enduring three-dimensional universe, which is all that really exists (presentism). In section 3 of my essay, I've described how I think this view needs to be reconciled with special relativity theory; therefore, I've argued for a different physical description of simultaneity than what was given by Einstein, which is instead consistent with your point (iii)---since, as I see it, it's Einstein's interpretation of the relativity of simultaneity that leads to the requirement of a block universe.
Since the problem of the description of time in relativity theory is central to it, I would gladly receive any comments relating to this aspect of my essay.
Best regards,
Daryl
Author George F. R. Ellis replied on Aug. 15, 2012 @ 03:43 GMT
Tom,
You say "The distinction you make between world lines and surfaces defines the difference, does it not, between what can be described as top down causation, and what is laterally distributed causality?"
Yes indeed. The first is both top-down and bottom up; this is described by the six time evolution equations of general relativity theory, similarly in the case of Maxwell's equations. The second is effective on spacelike surfaces; these are described by the four constraint equations of general relativity theory (two in the case of Maxwell's equations). These constraints are true now because they were initially true (the initial data must satisfy them) and they are conserved by the time evolution equations. Thus there is no instantaneous spatial *action* now: there are spatial relations that are true because they were set up that way and then the time evolution equations keep them so.
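To make the Maxwell case concrete (a standard vacuum decomposition, given here only as an illustration, not a quotation from the essay): the two constraint equations hold on each spacelike surface, while the evolution equations propagate the fields in time,
\[
\nabla\cdot\mathbf{E}=0,\qquad \nabla\cdot\mathbf{B}=0 \qquad\text{(constraints)},
\]
\[
\partial_t\mathbf{E}=c^2\,\nabla\times\mathbf{B},\qquad \partial_t\mathbf{B}=-\nabla\times\mathbf{E} \qquad\text{(evolution)};
\]
taking the divergence of the evolution equations gives \(\partial_t(\nabla\cdot\mathbf{E})=\partial_t(\nabla\cdot\mathbf{B})=0\), so constraints imposed on the initial data are preserved by the dynamics, exactly as described above.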
Daryl, I agree with your statements that there are preferred spatial sections in cosmology (see my response on your thread). However I don't think simultaneity is particularly important. Homogeneity is - and the homogeneous surfaces in an expanding cosmology are locally rest spaces for the fundamental observers, but are not globally simultaneous as defined by radar. But the latter fact has no observational or physical consequences.
George
Author George F. R. Ellis replied on Aug. 15, 2012 @ 03:50 GMT
Tom, I wasn't thinking straight in that previous answer. Top-down and bottom up causation occurs between different averaging scales both in time evolution equations, and in spatial relations, and hence also in the constraint equations.
George
T H Ray replied on Aug. 16, 2012 @ 01:51 GMT
George, thanks. I think we're of the same mind here. It would seem necessary to describe action over manifolds in a network of laterally distributed links, while worldlines necessarily evolve orthogonal to the surface state. I think that deeply, such interconnectivity might lead to a rigorous general model of the relation between continuous functions and discrete measures.
Tom
J. C. N. Smith replied on Aug. 16, 2012 @ 13:33 GMT
George,
Thank you for the alert to your new paper on the nature of time. Fully concur with your request that discussions take place elsewhere. Any suggestions as to where that "elsewhere" should be? You can rest assured that the FQXi "time mafia" and others will be reading your paper with great interest and eager to discuss it in an appropriate venue.
jcns
Daryl Janzen replied on Aug. 16, 2012 @ 17:55 GMT
George:
When you say, "homogeneous surfaces in an expanding cosmology are locally rest spaces for the fundamental observers, but are not globally simultaneous as defined by radar", you use an operational definition of "simultaneity", according to which synchronous events that occur on clocks that have been synchronised by radar are called "simultaneous". In my essay, I've used the word "simultaneous" to mean the sets of events that take place on surfaces of constant cosmic time. I therefore make a clear distinction between simultaneity and synchronicity in my essay, and explain how that distinction agrees with intuition and special relativity theory.
In FLRW cosmology, a particular separation between space and time is made a priori in setting up the kinematical background geometry, along with the requirement that synchronous-and-simultaneous slices (defined by that separation) should be both isotropic (according to observation) and homogeneous (so that they're isotropic at every point, in accordance with the cosmological principle). Since the RW scale-factor doesn't necessarily have to satisfy Friedmann's equations a priori, the standard cosmological model is not purely general relativistic, as it only becomes general relativistic when the metric is subsequently required to satisfy Einstein's equations---the eventual result of which tells us that the maximally symmetric surfaces should be filled with matter in the form of a perfect fluid, and that they must expand according to the description that's given by Friedmann's equations.
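(For definiteness, and only as a reminder of the standard forms rather than anything new: the Robertson-Walker line element, in units with c = 1, is
\[
ds^2=-dt^2+a^2(t)\!\left[\frac{dr^2}{1-kr^2}+r^2\,d\Omega^2\right],
\]
and only once this split between space and time is in place does requiring Einstein's equations with a perfect-fluid source yield the Friedmann equation
\[
\left(\frac{\dot a}{a}\right)^2=\frac{8\pi G}{3}\rho-\frac{k}{a^2}+\frac{\Lambda}{3}
\]
governing the scale factor \(a(t)\).)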
You claim in your paper that the argument from special relativity for a block universe is irrelevant; but the model in which space is flat and a(t)=1 is an (elementary) FLRW model, and although it contains no matter, the kinematical description still has to be consistent with that of the more general models, which essentially results from the separation between space and time that's given a priori in the background metric. How can the unique congruence of fundamental worldlines be claimed instead to be defined by matter, when the dynamical equations of FLRW cosmology are derived subsequent to the kinematical restrictions on the background geometry? In order to argue effectively for an EBU, it is imperative---for logical consistency in physically interpreting the special case---to reconcile the elementary FLRW model with special relativity theory. This is what I've done in section 3 of my essay, the upshot being that the surfaces of constant cosmic time which I take to define simultaneity should clearly not necessarily have to be synchronous, which is one of the basic assumptions of FLRW cosmology. I would very much like it if we could continue discussing this over on my site, where I've already posted a response to the comment you left for me.
Daryl
J. C. N. Smith replied on Aug. 18, 2012 @ 16:56 GMT
Daryl,
With all due respect, George specifically stated above that he does *not* want to discuss his paper on time here! See above his post on 14 August which reads as follows:
"The issue of time has come up repeatedly in this discussion, even though it's not the essay topic. I've put up a paper on the archive today arXiv:1208.2611v1 [gr-qc], considerably strengthening my position about time as stated in my FQXI essay some years ago. I just point this out for those interested; but *discussion should take place somewhere else*, else this therad will grow out of hand!."* [emphasis added]
I'm hoping he'll be able to suggest another, more suitable venue for that discussion.
jcns
Edwin Eugene Klingman wrote on Aug. 15, 2012 @ 05:21 GMT
Dear George Ellis,
I commented above that in another reply you said that "reality is unclear" at the particle level because of uncertainty, wave-particle duality, and entanglement. Thus any new understanding of these aspects of reality might have some effect on the conception of 'the bottom' (although equivalence classes might not change). For this reason I invite you to read my current essay, The Nature of the Wave Function. I know that you are probably as overwhelmed by the flood of essays as I am, nevertheless, I think you might find my essay interesting and relevant to "the bottom" and I would very much appreciate your feedback.
Thanks,
Edwin Eugene Klingman
Member Benjamin F. Dribus wrote on Aug. 23, 2012 @ 02:21 GMT
Dear George,
I am very interested to know your views on the relationship between causality and time. To be sure we understand each other, let me say that I lean toward the view that time is a way of talking about causal relations. This is part of what I call the causal metric hypothesis, as I describe in my essay: On the Foundational Assumptions of Modern Physics. Rafael Sorkin and the causal set theorists have a similar view, but with a number of important differences.
From my perspective, your essay seems to imply something quite radical, so radical that the simplest version of it is more complicated than the idea of multiple time dimensions. I mention this as an unlikely possibility near the end of my essay, but your treatment makes the idea sound quite reasonable.
Let me be more precise. As you well know, causality is sometimes regarded, at the classical level, as a binary relation on the set of spacetime events. By definition, such a relation is exclusively bottom-up; the relationships between two subsets of the universe are reducible to relations between individual events. In this view, the arrow of time corresponds to the order of events with respect to this relation. Multiple independent relations could be interpreted as multiple time dimensions in an obvious way.
What you seem to be claiming is that causality in fact involves binary relations on the power set of the set of spacetime events; i.e., that subsets involving multiple events influence each other in irreducible ways. In this view, it seems as though time might be understood as one-dimensional at the level of power sets (provided only one power-set relation is involved), but much more complicated at the level of spacetime itself.
One other point of comparison I would like to make is that a degree of holism already appears at the quantum level even if one restricts to binary relations involving only pairs of events, since the phases associated with transitions a priori depend on the entire universes involved (in practice, this would be somewhat restricted; the causal set theorists play around with axioms to this effect, but I don’t go into these details). This makes me wonder if complicated power-set relations are really necessary at the classical level. Most of your examples are classical, so it seems that you think the answer is “yes.”
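A toy way to state the distinction (an illustrative sketch only, not Dribus's formalism; all names here are invented): an event-level causal structure is a binary relation on events, whereas irreducible top-down influence would be a relation declared directly between subsets of events that cannot be recovered from the event-level links.

# Illustrative sketch: event-level causal relation vs. a relation on subsets
# of events (elements of the power set) that is not induced by pairwise links.

events = {"a", "b", "c", "d"}

# Bottom-up picture: causality as a binary relation on individual events.
event_relation = {("a", "b")}          # only a -> b at the event level

def induced(rel, xs, ys):
    # The reducible notion: X precedes Y iff every x in X precedes
    # some y in Y according to the event-level relation.
    return all(any((x, y) in rel for y in ys) for x in xs)

# Top-down picture: a constraint posited directly between whole subsets,
# over and above whatever pairwise links exist.
subset_relation = {(frozenset({"a", "c"}), frozenset({"b", "d"}))}

X, Y = frozenset({"a", "c"}), frozenset({"b", "d"})
print(induced(event_relation, X, Y))   # False: not reducible to event links
print((X, Y) in subset_relation)       # True: the whole-to-whole link is primitive

The only point of the sketch is that the second relation carries information the first cannot encode, which is one way of reading the "power set" formulation above.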
I thoroughly enjoyed your thought-provoking essay. I’d be grateful for any remarks you might make on these issues.
Ben Dribus
Edwin Eugene Klingman replied on Aug. 26, 2012 @ 18:33 GMT
Dear Ben Dribus,
I assume that Carey Ralph Carlson's essay on causal set theory gives a reasonable introduction to causal set theory and thus is helpful in interpreting your essay. My sense is that it is a mathematician's theory, or a physicist 'gone native'. As I understand it, you begin with time (as an ordered binary relation) and no space. Thus, to handle George's two-way causal flow you appear to need multiple time dimensions (or equivalent?)-- not a solution that would appeal to most physicists.
Another non-physical mathematical interpretation involves quantum phases depending on "the entire universes involved". For a different physical understanding I refer you to my essay, The Nature of the Wave Function, which derives finite extent wave functions from a classical field and explains how these relate to probability amplitudes and superposition of [infinite] Fourier components. In such an approach there is no "quantum wave function of the universe", only local waves.
My previous essays treated the universe as based on one physical substance (and *nothing else*) and assumes this substance (the primordial field) can evolve only through self-interaction. This leads to a scale-independent solution (hence, per Nottale, motion-invariant, ie, time-invariant) with no meaningful physical interpretation of time until the original perfect symmetry breaks. In this sense I begin with space and no time versus your assumption of time and no space.
Although it's difficult to summarize this approach in a comment, the point I'd like to make in response to your above comment is that the essential nature of the primordial field (which turns out to be gravity) is to support self-interaction (since there is initially absolutely nothing else to interact with) and this (evolving as it has into the world as we know it) is at the root of the ability of our universe to support top-down as well as bottom-up causality.
I suspect that you're rather committed to your causal metric approach but if you'd like a different take on this problem, I refer you to my previous essays, here and here.
Although this comment addresses your specific comment, I hope that George also is interested in one fundamental explanation of the two-way causal nature of reality.
Edwin Eugene Klingman
Member Benjamin F. Dribus replied on Aug. 26, 2012 @ 21:05 GMT
Edwin,
Thanks for the response to my remark. I don't want to clutter George's thread, but the discussion is directly relevant to his essay, so I don't think he'll mind if I reply here.
I do not think the issues regarding time and causality raised by top-down causation are specific to approaches based solely on causal structures, nor to approaches involving configuration spaces. I mentioned the causal approach in this context not because it is my own, but because it simplifies the issue I was trying to get at, by removing independent structures that might clash with the causal structure, independent notions of locality, and so on. I mentioned configuration spaces because they seem to introduce top-down causality at the quantum level without requiring any radical new interpretation of the classical relationship between causality and time.
The issue can be stated in a simple setting that has nothing to do with the origin of the universe or the microscopic structure of spacetime. Assume special relativity as a large-scale, low energy approximation. We call causally-related events timelike-separated for reasons that are obvious to every physics undergraduate. What, if any, corresponding time-related statement do we then make about larger subsets that are causally related in an irreducible way?
If time in relativity is taken to represent a refinement of the causal order, then top-down causation clearly does require a radical new interpretation of time. If time merely corresponds to the lowest-level part of a power-set-relation, then this correspondence clearly endows the lowest-level part with unique significance.
Either way, I am interested to know what George would say about the relationship between causality and time in a top-down paradigm.
Edwin Eugene Klingman replied on Aug. 26, 2012 @ 22:24 GMT
Dear Ben,
As indicated by my mention of Carey Carlson's essay, I'm a neophyte to causal sets, with little knowledge of it or intuition for it. I suspected it was over-simplifying to say that you start with 'time and no space' since you've elsewhere commented that "the causal metric hypothesis includes the assumption that what we call time is just a way of talking about causality, and what we call causality is just a way of talking about binary relations on sets." This seems to jive with "If time merely corresponds to the lowest-level part of a power-set-relation, then this correspondence clearly endows the lowest-level part with unique significance."
George mentions the brain in his essay, but does not directly mention consciousness. I suppose a materialist view supports a view of 'top down' causation that involves key strokes on a computer and other design tasks yet he does say that "The mind is not a physical entity, but it is certainly causally effective." As an exercise one can probably apply causal sets to the mind, but I believe that a more comprehensive perspective is required. Although these questions won't be settled anytime soon, I simply thought I'd point to my earlier essays that directly address these problems as I see them.
I too am interested to know what George would say about the relationship between causality and time in a top-down paradigm, and will not take any more of his blog space with my own views.
Edwin Eugene Klingman
Author George F. R. Ellis replied on Aug. 28, 2012 @ 06:16 GMT
Dear Ben and Edwin,
thanks for these comments which are quite complex in their implications, and I can't do full justice to them at present. My view on the nature of time is set out in my paper here. I think that is compatible with top-down causation, which takes place at each instant in a local domain around each world line at all times.
A key point I make in my essay is that top down effects don't occur via some mysterious non-physical downward force, they occur by higher level physical relations *setting constraints* on lower level interactions, which not only can change the nature of lower level entities (as in the case of the chameleon particles that might be the nature of dark matter), they can even lead to the very existence of such entities (e.g. phonons or Cooper pairs). This does not require multiple dimensions of time. So it is indeed a two-way causal flow which enables abstract entities to be causally effective (as in the case of digital computers) but does not violate normal physics. It is a largely unrecognised aspect of normal physics. The key issue you are both raising might be that in coarse graining physics one also needs a coarse graining of time to get the effective higher level laws. This certainly needs thinking about and I am not aware of much work on this.
Two further key points I make are that (i) constraints are conserved by the dynamics of time evolution, indeed on some views effectively generate time evolution, so this is all compatible with how time works, and (ii) new information can arise by processes of adaptive selection; the outcome is not uniquely determined by the initial data because of noise and quantum uncertainty at lower levels. This is a top-down process because selection criteria are higher level entities. This is a core feature of how the brain can work in a rational way that transcends the lower level physics, without violating it.
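A minimal sketch of that adaptive-selection point (purely illustrative; the numbers and names are invented, not taken from the essay): lower-level noise generates many micro-variants, and a selection criterion defined at the higher level, not the initial micro-data, determines which of them survive.

import random

def generate_variants(initial_state, n=1000, noise=2.0):
    # Bottom level: micro-dynamics plus noise; the outcome is not fixed
    # by the initial state alone.
    return [initial_state + random.gauss(0.0, noise) for _ in range(n)]

def meets_higher_level_criterion(variant, target=3.0, tolerance=0.5):
    # Top level: a macro-scale selection criterion (e.g. "fits the
    # environment", "solves the problem the organism faces").
    return abs(variant - target) < tolerance

variants = generate_variants(initial_state=0.0)
selected = [v for v in variants if meets_higher_level_criterion(v)]

# The survivors cluster around the higher-level target, not around the
# initial micro-state: the selection criterion has done real causal work.
if selected:
    print(len(selected), sum(selected) / len(selected))

Run repeatedly, the selected set differs each time because of the noise, but its character is always set by the higher-level criterion rather than by the initial data.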
Edwin, you say "in such an approach there is no "quantum wave function of the universe", only local waves." I fully agree, that is what I say in my quantum paper
here You carry on
``My previous essays treated the universe as based on one physical substance (and *nothing else*) and assumes this substance (the primordial field) can evolve only through self-interaction. This leads to a scale-independent solution (hence, per Nottale, motion-invariant, ie, time-invariant) with no meaningful physical interpretation of time until the original perfect symmetry breaks" Well as long as the symmetry breaks, time does indeed emerge. I believe it's difficult for time to emerge from a timeless substrate, inter alia because of difficulties in then getting the same arrow of time everywhere.
That's all I have time for now,
George
T H Ray replied on Aug. 28, 2012 @ 11:34 GMT
George,
You wrote in reply to Edwin & Ben: "The key issue you are both raising might be that in coarse graining physics one also needs a coarse graining of time to get the effective higher level laws. This certainly needs thinking about and I am not aware of much work on this."
I know Edwin eschews multiple dimensions; however, mathematical expressions of higher level laws, even in higher dimensions, do not forbid nonlocal causality in a finite space. That is, a closed logical judgment (mathematics) is in one-to-one correspondence with a local physical result in the experimenter's measure space.
This dichotomy -- between the local measure space of infinite range and the nonlocal domain of finite range -- led me to realize that Joy Christian's proposal using dichotomous variables eliminates the local-global distinction. That makes it fully relativistic ("all physics is local") and angle preserving in its application of topological orientability.
Point is, that the general relativity interpretation of a universe finite in time and unbounded in space suffers no loss of generality as a universe finite in space and unbounded in time. This latter interpretation, though, fully embraces Minkowski space-time dynamics without ever having to refer to time as a physical phenomenon. Top-down causation is therefore continuous and locally real; continuous measurement functions are constrained by space-time topology (generalized geometry). I think this is consistent with your evolving block universe of spacetime evolution with no preferred surfaces.
Best,
Tom
Daryl Janzen replied on Aug. 29, 2012 @ 17:53 GMT
George: ``The present: The ever-changing surface S(τ) separating the future and past - the ‘present’ - at the time τ is the surface {τ = constant} determined by the integral (20) along a family of fundamental world lines starting at the beginning of space time... But is this well defined, given that there are no preferred world-lines in the flat spacetime of special relativity?... [argument wrt GRT]... therefore there are preferred timelike lines everywhere in any realistic spacetime model... The special relativity argument does not apply.''
Tom: ``... I think this is consistent with your evolving block universe of spacetime evolution with no preferred surfaces.''
Dear George:
I'm just trying to understand your position. In your conception of an evolving block universe, don't you think there are preferred surfaces S(τ) as well as preferred timelike lines? Are you then arguing that these are well-defined in a realistic spacetime model, as opposed to special relativity, because of the existence of matter? Are you perhaps thinking of this in terms of top-down causation, whereby the matter that there is, which exists *in time*, is actually the cause of the ever-changing preferred surface S(τ)? i.e., that the matter that exists is also the cause of *existence*, so described?
Daryl
Edwin Eugene Klingman replied on Aug. 29, 2012 @ 20:41 GMT
Dear George Ellis,
You are being asked some very good questions. I hope you're enjoying the process as much as I am. I wish to thank you for pointing out above that one must "get the same arrow of time everywhere", making me realize that my simple summary did not make this point clear. The scale invariance of my solution [prior to symmetry breaking] is [according to Nottale] equivalent to motion invariance and this effectively means that the 'shape' of the solution does not change with scale *or* the passage of time. I improperly characterized this as 'space and no time'. In actuality there can exist global [cosmic] time but it is not discernible when symmetry is unbroken. After symmetry breaks, the gravito-magnetic circulation leads to vortices -- the first cyclic phenomena -- and essentially introduces "local clocks" to space(time). It is conceivable to me that the left-handed nature of these 'clocks' is related to the one-way flow of time.
Thanks again for pointing out the need for the arrow 'everywhere'.
Edwin Eugene Klingman
T H Ray replied on Aug. 29, 2012 @ 21:46 GMT
Wait, Daryl. How is an ever-changing preferred surface not a non-preferred surface?
Tom
Daryl Janzen replied on Aug. 29, 2012 @ 23:30 GMT
Dear Tom:
A three-dimensional space that evolves as time flows is described relativistically by defining a particular foliation of space-time, as describing the ``associated surfaces of constant time [that] are uniquely geometrically and physically determined in any realistic spacetime model''. The foliation is a set of preferred surfaces---but really each one describes the geometry of the evolving three-dimensional space at a particular value of cosmic time.
Potentially more important than ``changing'' is the fact that the surface is ``existing'', in a real flowing sense. In this physical system, an ``observer'' can move through this evolving space as time progresses; but then, according to relativity theory, ``space'' at any value of ``time'' won't be the same surface for this ``observer'', but a different spacelike hypersurface of the partially complete evolving block space-time that emerges. This is the case even when there are no forces acting on the ``observer''. Relativity then begs the question why the one foliation should be ``preferred'' over any other.
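(To state the special-relativistic point in one line, as a reminder rather than anything new: for an observer moving with speed \(v\) along \(x\), the events simultaneous with the origin in that observer's frame satisfy
\[
t'=\gamma\!\left(t-\frac{v\,x}{c^2}\right)=0,\qquad\text{i.e. } t=\frac{v\,x}{c^2},
\]
a hypersurface tilted relative to the \(t=\text{const}\) slices; this is why relativity alone does not single out one foliation as preferred.)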
I think this is the general idea that George has in mind. That was my first question. But then I was curious whether George also thinks this question is answered by associating the bundle of fundamental worldlines with matter, and then saying that these fundamental particles of matter actually *cause* the flow of time, so-defined, and whether this could be thought of as a form of top-down causation.
Best,
Daryl
Anton W.M. Biermans wrote on Aug. 27, 2012 @ 03:01 GMT
George,
In your reply you don't point out what is logically wrong with my reasoning:
''If we understand something only if we can explain it as the effect of some cause, and understand this cause only if we can explain it as the effect of a preceding cause, then this chain of cause-and-effect either goes on ad infinitum, or it ends at some primordial cause which, as it cannot be reduced to a preceding cause, cannot be understood by definition.''
You circumvent its irrefutable logic by asking me to explain how I go about my life, which has less to do with causality than with reason. Anyhow, I am not very interested in causality at the macroscopic scale. If the antics of the moth can cause a hurricane but it depends on an infinity of other events whether the party is canceled or not, that is, if the moth can only in retrospect be accused of causing the hurricane, then it cannot be its cause at all. As far as I'm concerned, causality means that A causes B to happen with 100% certainty: to me, 'approximately causal' is a contradiction in terms.
The point of my essay is that if we live in a universe which creates itself out of nothing, without any outside interference, that is, without any cause, then in such universe fundamental particles have to create themselves, each other. In that case particles and particle properties must be as much the product as the source, the effect as cause of their interactions, of forces between them.
If in a self-creating universe particles create, cause each other, then they explain each other in a circular way. Here we can take any element of an explanation, any link of the chain of reasoning without proof, use it to explain the next link and so on, to follow the circle back to the assumption we started with, which this time is explained by the foregoing reasoning, that is, if our reasoning is sound and our assumptions are valid. If we have more confidence in a theory as it is more consistent and it is more consistent as it relates more phenomena, makes more facts explain each other and needs less additional axioms, less more or less arbitrary assumptions to link one step to the next, then any good theory has a tautological character, fitting a self-creating, self-explaining universe. The circle of reasoning ought to work equally well in the reverse direction.
In other words, I don't say that events aren't related, only that we ultimately cannot say, at least at quantum level, what is cause of what, what precedes what in an absolute sense as to be able to establish what precedes what requires that we can look at the universe from outside of it, which is impossible.
Causality ultimately leads nowhere: if, for example, we invent the Higgs particle to cause other particles to have mass, then we need another particle to give the Higgs its properties, a particle which in turn owes its properties to another particle, and so on and on.
As I argue in my essay, we'll never be able to unify forces, get rid of the infinities and contradictions of present physics as long as we cling to causality.
Anton
Author George F. R. Ellis replied on Aug. 28, 2012 @ 06:21 GMT
Dear Anton
"As far as I'm concerned, causality means that A causes B to happen with 100% certainty" Well that's not this universe. Please read Feynman on quantum physics.
"I am not very interested in causality at macroscopic scale". But that is what I am trying to explain.
"The point of my essay is that if we live in a universe which creates itself out of nothing, without any outside interference, that is, without any cause, then in such universe fundamental particles have to create themselves, each other." But the word "create" has no meaning of there are no causes.
George
Yuri Danoyan replied on Aug. 28, 2012 @ 07:15 GMT
"we live in a universe which creates itself out of nothing,"
George
We live in a universe that was born from a previous universe
See my essay http://fqxi.org/community/forum/topic/1413
Author George F. R. Ellis replied on Aug. 28, 2012 @ 19:14 GMT
Yuri
"We live in a universe that was born from a previous universe"
- so how did that previous universe come about?
Actually this has nothing to do with the topic of this thread. Your quote "we live in a universe which creates itself out of nothing," was not my statement, it was made by Anton. If you disagree please take it up with Anton on his thread.
Yuri Danoyan replied on Aug. 28, 2012 @ 19:37 GMT
The previous universe also came from a previous one.
See my essay http://fqxi.org/community/forum/topic/1413
Read my correspondence with Steven Weinberg.
Author George F. R. Ellis replied on Aug. 28, 2012 @ 21:15 GMT
So it's previous universes all the way down. Are you claiming any of this is testable? Is this supposed to be science, or do you claim science does not need observations? How many universes back do you claim to prove exist, by some kind of observation - and what is the nature of the observation?
Actually Penrose got there first: see his book Cycles of Time.
I don't plan to read your correspondence with Weinberg, despite your demand that I do so.
Yuri Danoyan replied on Aug. 28, 2012 @ 22:02 GMT
I ask you two questions about testability:
1. Can you test whether the Planck units of length and time are meaningful?
2. Can you test whether c and G vary or not?
Author George F. R. Ellis replied on Aug. 29, 2012 @ 07:50 GMT
1. No.
2. For c, no, unless you can determine some method of measuring length that is independent of the speed of light at a fundamental level. For G, yes.
But this has nothing to do with my essay. Please take it up in more appropriate places.
Anton W.M. Biermans wrote on Aug. 27, 2012 @ 08:05 GMT
George,
I see that your reply to my first post on your thread has disappeared. For the readers who want to understand my above reply to it, I again post your own reply to my comment.
Author George F. R. Ellis replied on Jul. 23, 2012 @ 15:22 GMT
Anton
"Causality therefore ultimately cannot explain anything." If so please explain to me how you go about your daily life. If you are unable to cause any changes about you in your daily existence, then you don't exist as a person (and you certainly won't be able to get a job).
I explained carefully at the start of my paper that there are always numerous causes in action, and we get a useful concept of "the cause" by taking all except a few for granted. This produces a valid local theory of causation. You don't have to solve problems of ultimate causation to understand local physical effects (e.g. heating water causes it to boil). Your complaint seems to be that if you can't explain the entire universe you can't explain such local phenomena. The whole practice of science disagrees with you.
George
J. C. N. Smith wrote on Aug. 28, 2012 @ 19:23 GMT
Dear George,
Just a quick note to thank you for recommending Arthur Eddington's marvelous book 'The Nature of the Physical World.' I'm reading it now and enjoying it immensely. Having also just recently read Poincare's 'The Value of Science,' dating from 1913, it's fascinating to observe the evolution of thinking on many topics still of keen interest and still very much in a state of flux even today. It seems very much in keeping with the theme of this essay competition to observe the flow and, dare I say, "crystallization" (or lack thereof) of thinking on these topics over the past century.
Fwiw, I'm personally convinced that we're currently living through and participating in what Thomas S. Kuhn would describe as a "crisis state" in physics. Would you agree? And if so, do you think that this is generally recognized and/or accepted in the wider physics community? I don't read or hear others talking in these terms, but I believe the evidence for it is abundantly clear; it's virtually a classic case, in my view. Exciting (and occasionally frustrating) times to witness.
Regardless, thank you again for the book recommendation.
jcns
Author George F. R. Ellis replied on Aug. 28, 2012 @ 21:07 GMT
Hi jcns
glad you are enjoying it. He was a great pioneer in astrophysics and cosmology, with a wonderful power of explanation. His book on the internal constitution of stars is still great reading, even though it was written before nuclear physics was understood. Physicists of his epoch did not deride philosophy, they realised its role as an underpinning to physical thought and took it seriously.
Yes I do think there is a crisis in physics - but not in all of it! One can get a very wrong impression of physics if you only read some of the over-hyped theoretical physics stuff, much of which seems in danger of losing touch with reality (for some people, models are more real than reality). But a vast amount of physics is absolutely solid, relating theory to marvellous experiments in materials science/solid state physics, nanophysics, quantum optics, biophysics, and so on - Nature Physics is full of the stuff, much of it very exciting. It is on the theoretical side, and in particular in relation to cosmology, where more and more extravagant hypotheses are being proposed with very little concern for usual constraints and/or for testability. "Phantom matter" and dark energy theories with p/rho < -1 are examples of the first; multiverses and theories of creation of the universe out of nothing are examples of the second. But physics has a great capacity for self-correction, and I think the more extravagant ideas will fade away and turn out to be ephemeral, as these ideas are tested and evaluated by the physics community in the long term, who hopefully will start to take philosophical issues seriously again. And I think the idea of top-down causation will gain traction and not fade away, even though it has so little support in the physics community at present. Ernst Mach and Dennis Sciama were early proponents of the idea, even if they did not call it such; present theories of the origin of the arrow of time are also of this kind; and it is starting to gain traction in some areas of astronomy, under the name "environmental effects". The exciting part is that it may help understand foundational quantum physics issues. Watch this space - but with a bit of patience!
By the way, you quote Kuhn - have you read any Imre Lakatos? He has a more developed view of how changes of scientific research programs take place.
George
J. C. N. Smith replied on Aug. 29, 2012 @ 00:33 GMT
Hi George,
I regret to say that I have not yet had the pleasure of reading Lakatos; thank you for pointing me toward his work. I see that several of his works are available for purchase on the internet. Could you recommend a good, not-too-technical entry point for making his acquaintance?
I've long admired Kuhn's 'The Structure of Scientific Revolutions,' and see evidence of his "crisis state" in some aspects of physics. Lee Smolin touched on some of this in 'The Trouble With Physics.' Speaking of which, I've heard from a reliable source that Smolin plans to publish at least one new book on the nature of time later this year. I hope so.
Thank you for helping broaden my horizons.
jcns
Author George F. R. Ellis replied on Aug. 29, 2012 @ 06:48 GMT
Hi jcns
His major relevant book is The Methodology of Scientific Research Programmes, but the Wikipedia article on him is a good start.
The key point is that he recognises a scientific theory as having a hard core, the central hypotheses of the theory, surrounded by a belt of auxiliary hypotheses that mediate between the core and actual data. These have to do with the experimental apparatus, sources of noise, subsidiary variables, etc. When the data don't agree with the theory, you alter the auxiliary hypotheses, not the hard core. For example in cosmology, you change your theory of galaxy evolution rather than your cosmological model. Apart from emotional issues and psychological investment in theories, it is this auxiliary structure that makes it so hard to persuade people their theory is wrong: you can always tweak some auxiliary parameter to fit the data (add another epicycle for example). The theory eventually becomes so baroque that it is no longer a satisfactory explanation. But different people differ as to when that occurs: that's when mature judgement comes in.
Yes Smolin has a book on time in the works (broadly supporting my view).
best, George
J. C. N. Smith replied on Aug. 29, 2012 @ 13:20 GMT
Hi George,
Thank you for the recommendation. Sounds like an interesting approach. I've already ordered a copy (how did we survive before the internet?), and will position it near the top of my "to read" pile.
Thank you also for the "sneak preview" of Smolin's upcoming book. If it broadly supports your view, I suspect it may also broadly support my view. I like Deutsch's comment: "The way we converge with each other is to converge upon the truth." (The Beginning of Infinity, p. 257.)
Cheers,
jcns
James Lee Hoover wrote on Aug. 28, 2012 @ 22:58 GMT
George,
How do you explain the bottom-up fixation? Do you think it is a cultural thing or universal? What about same-level mode as efficient and circular, the way some of your colleagues characterize it?
I can see that the fixation you describe could explain thinking regarding many issues in physics including the nature of gravity, which I deal with.
Jim
Author George F. R. Ellis replied on Aug. 29, 2012 @ 07:43 GMT
Hi James
I think there are two things at work. Firstly physicists recognise that all matter is controlled at the bottom level by the forces between particles; hence physics underlies all (e.g. Dirac stated this in relation to how physics underlies chemistry). There seems to be no room for any other kind of causation. I respond to that claim in the later part of my essay: essentially the context determines how the fundamental interactions work out; they offer opportunities and constraints but do not by themselves determine the outcome.
Secondly, this bottom-up view is then taken as an underlying principle of faith by hard core reductionists, who simply ignore the contextual effects that in fact occur: for example claiming that biology is controlled bottom up by genes alone, thereby ignoring all the discoveries of epigenetics, which prove this false. But such reductionism is always a cheat, because it is always only partial. Example: Francis Crick famously wrote "You, your joys and your sorrows, your memories and your ambitions, your sense of personal identity and free will, are in fact no more than the behavior of a vast assembly of nerve cells and their associated molecules". But nerve cells and molecules are made of electrons plus protons and neutrons, which are themselves made of quarks .. so why not "You, your joys and your sorrows, your memories and your ambitions, your sense of personal identity and free will, are in fact no more than the behavior of a vast assembly of quarks and electrons"? And these themselves are possibly vibrations of superstrings. Why does he stop where he does? - because that's the causal level he understands best! -- he's not a particle physicist. If he assumes that the level of cells and molecules is real, it's an arbitrary assumption unless *all* levels are real - which is my position. It's the only one that makes sense.
So in the end it's an ideological faith of hardcore reductionists: it's philosophically and/or emotionally driven. That's a further point: scientists like to claim what they do is purely rational. Any impartial study of academia will show this is not the case: emotions and associated rivalries drive a large part of what happens, even as regards physicists. The outcome in terms of mature scientific theories is of course free of these emotions, it is indeed impartial and strictly testable. But they do not arise out of an emotion free environment.
George
T H Ray replied on Aug. 29, 2012 @ 09:22 GMT
"If he (Crick) assumes that the level of cells and molecules is real, it's an arbitrary assumption unless *all* levels are real - which is my position. It's the only one that makes sense."
George, that is a beautifully compact statement of complex system self organization. If consciousness were not non-zero, what could we possibly mean by the term "life?"
Tom
Author George F. R. Ellis replied on Aug. 30, 2012 @ 13:07 GMT
By the way I did not intend the above to imply that I am immune to the emotional and philosophical pressures I mention: we all are subject to them, because we are all human. I believe that what I state in my essay and in the responses above is technically correct: it reflects the true nature of causation. But, just like those who take an opposite view, I also am driven by a passionate belief. In my case, it is that the view I take is the way that opens up most depth of meaning and understanding, being the bedrock from which a truly deep view of humanity can ultimately emerge, rather than a view that in the end denies some of the depth of humanity because it reduces us to mere machines. So yes I too am driven by a philosophical agenda, as we all are. This is irrelevant to the validity or otherwise of the position I put here. The question is whether it is scientifically valid or not: and I believe it is, as I have tried to argue in my essay and in my responses here.
Anton Lorenz Vrba wrote on Aug. 30, 2012 @ 22:45 GMT
Hi George,
A thought stimulating essay worthy of submission. However, there is one small point I do not fully agree with; it does not detract from the overall essence, but it does raise an interesting question.
As an example for top-down causation changing the nature of constituent entities you wrote: "....neutrons bound in nucleus have a half life of billions of years but they decay in 11 ½ minutes when free." implying, naively expressed, that an atomic nucleus is a marble-like collection of protons and neutrons. I envisage each atomic nucleus as a unique homogeneous entity of energy and charge. Only for our better understanding or reasoning have we deconstructed the atomic nucleus top-down into a number of protons and neutrons. Later we construct atomic models bottom-up, immediately creating the problem of why the whole thing does not fly apart under the strains of the Coulomb forces, which consequently we reason away by defining the strong force.
Is the strong force now merely a bottom-up artefact? - Food for thought
Greetings
Anton @ ( .../topic/1458 )
Author George F. R. Ellis replied on Aug. 31, 2012 @ 07:10 GMT
Hi Anton
I think the strong force is indeed real: otherwise we would not have nuclei.
"I envisage each atomic nuclei as a unique homogeneous entity of energy and charge. Only, for our better understanding or reasoning have we deconstructed the atomic nucleus top-down into a number of protons and neutrons." Well they can exist in their own. But when they join together in a nucleus they lose their identity: which is one of my major points, we don't have a situation of immutable lower level objects joining together unchanged to forma higher level entity: their existential nature changes according to context. That's what a purely reductionist account misses.
greetings
George
Paul O'Hara wrote on Sep. 1, 2012 @ 15:46 GMT
George,
This is a great essay. I enjoyed reading it. Many of the ideas remind me of the theory of emergent probability formulated by Bernard Lonergan in his book Insight: A Study of Human Understanding, especially chapters IV and VIII. You might want to look at it some time.
Paul
Author George F. R. Ellis replied on Sep. 1, 2012 @ 16:19 GMT
Thanks Paul.
I have been put in touch with Lonergan's writing from time to time by several admirers, but never got into his work properly. I'll try to get round to it.
George
Robert H McEachern wrote on Sep. 1, 2012 @ 18:14 GMT
George,
Although I agree with your overall point, I note that your "Hypothesis", as stated, is self-contradictory:
"Hypothesis: bottom up emergence by itself is strictly limited in terms of the complexity it can give rise to. Emergence of genuine complexity is characterized by a reversal of information flow from bottom up to top down."
The "reversal of information flow", is well known to occur in many instances, as you note. Indeed, it is so well known, that we have a special word for it - "feedback".
In the context of your essay, if "bottom up emergence" is "strictly limited", then "feedback" processes would have never "emerged" in the first place. "Feedback" is the mechanism by which bottom up processes add top down ones, to their repertoire. It is the cause, the mechanism for emergence. The existence of bottom up processes is necessary to the existence of top down ones.
Author George F. R. Ellis wrote on Sep. 1, 2012 @ 21:58 GMT
Hi Robert
Thanks for that.
You say "The 'reversal of information flow', is well known to occur in many instances, as you note. Indeed, it is so well known, that we have a special word for it - 'feedback'." Well feedback is indeed one type of top-down causation, but it is not the only type that can occur, please see
here for a discussion of the four other types that are possible (a very important one is adaptive selection, for this is the process whereby new information is garnered: feedback control cannot lead to that result).
Then you say "The existence of bottom up processes is necessary to the existence of top down ones. " Yes I agree. But once they emerge, top down processes do indeed exist and have causal powers.
George
Robert H McEachern replied on Sep. 2, 2012 @ 02:17 GMT
George,
Your comment that: "feedback ...is not the only type... a very important one is adaptive selection... feedback control cannot lead to that result.", implies that adaptive selection is an example of "top down causation", but not "feedback."
Others employ a much broader definition of "feedback" than you imply. Almost all modern communications signals employ a form of feedback, known in the literature as "decision directed feedback", which is what you are calling "adaptive selection". Instead of simply feeding back an output into the input, they exploit a priori knowledge, to feed back what the emitter must have "intended to send", rather than what was actually received. They determine what was "intended to be sent", by adaptively selecting their "best guess" from an a priori known list of allowable possibilities; a limited "alphabet".
As you say, such processes do indeed exist and have causal powers. Indeed, processes like Decision Directed Feedback are a major causal power for why an HDTV picture is so much cleaner than older TVs.
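(A toy sketch of the idea, in Python, purely illustrative and with invented names and numbers - not any real receiver's code: the decoder "slices" each noisy sample to the nearest symbol of the a priori known alphabet and feeds that decision, rather than the raw sample, back to refine its estimate of the channel gain.)

import random

ALPHABET = [-3.0, -1.0, 1.0, 3.0]  # the a priori known list of allowable symbols

def slice_to_alphabet(sample, gain_estimate):
    """Best guess of the intended symbol, given the current gain estimate."""
    corrected = sample / gain_estimate
    return min(ALPHABET, key=lambda s: abs(s - corrected))

def receive(samples, mu=0.05):
    gain_estimate = 1.0  # initial guess of the unknown channel gain
    decisions = []
    for y in samples:
        d = slice_to_alphabet(y, gain_estimate)  # decision from the known alphabet
        error = y - gain_estimate * d            # error relative to the *decision*
        gain_estimate += mu * error * d          # decision-directed (LMS-style) update
        decisions.append(d)
    return decisions, gain_estimate

if __name__ == "__main__":
    random.seed(0)
    true_gain = 1.7
    sent = [random.choice(ALPHABET) for _ in range(500)]
    received = [true_gain * s + random.gauss(0, 0.1) for s in sent]
    decisions, g = receive(received)
    print("estimated channel gain ~", round(g, 2), "(true value 1.7)")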
Author George F. R. Ellis wrote on Sep. 2, 2012 @ 06:56 GMT
Hi Robert
well that's very interesting, thanks for that; yes indeed it seems as if it is a form of what I call adaptive selection. What I have classed as feedback control is cybernetic feedback control as per Wiener, alias homeostasis, which occurs all over the place in biology. The key dynamical feature is a preset goal. If I'm not now allowed to call this feedback control, then I need a new name for it, because it should be differentiated from adaptive selection for the reason I mention: one reliably attains set goals; the other attains a final state that is not uniquely implied by the initial data, and thereby can accumulate new information.
I'll learn more about Decision Directed Feedback. One of the problems of interdisciplinary work is that the same idea is given different names in different domains, making it hard to talk to people from different disciplines.
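(To make the distinction concrete, here is a minimal sketch in Python - invented numbers, purely illustrative, not from the essay: the feedback controller reliably attains its preset goal, while adaptive selection simply retains whichever randomly generated variant best meets a selection criterion, so its end state is not implied by the initial data.)

import random

def feedback_control(goal, state=0.0, gain=0.3, steps=50):
    """Cybernetic feedback control: the preset goal is reliably attained."""
    for _ in range(steps):
        state += gain * (goal - state)  # correct the error relative to the fixed goal
    return state

def adaptive_selection(criterion, pool_size=30, rounds=50):
    """Adaptive selection: keep whichever random variant best meets the criterion.
    The end state is not implied by the initial data; it depends on which variants arise."""
    best = random.gauss(0, 1)
    for _ in range(rounds):
        variants = [best + random.gauss(0, 0.5) for _ in range(pool_size)]
        best = max(variants, key=criterion)  # the higher-level criterion does the selecting
    return best

if __name__ == "__main__":
    print("feedback control, goal 37.0 ->", round(feedback_control(37.0), 3))
    # criterion: prefer values near 5 (a stand-in for 'what works' in the environment)
    print("adaptive selection ->", round(adaptive_selection(lambda x: -abs(x - 5.0)), 3))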
Author George F. R. Ellis wrote on Sep. 2, 2012 @ 07:04 GMT
Addendum:
the really complex forms of behaviour result from my 4th and 5th categories of top-down causation:
- TD4, when goals of a cybernetic feedback control system are determined by adaptive selection, and
- TD5: when adaptive selection goals are themselves selected by a process of adaptive selection.
The latter is where intelligence comes in.
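(A toy sketch of TD4, in Python, with invented names and numbers: candidate setpoints for an ordinary feedback loop are evaluated against an environmental payoff, and the best-performing one is retained as the loop's goal - adaptive selection operating on the goals of a feedback controller.)

import random

def run_controller(goal, state=20.0, gain=0.4, steps=40):
    """Ordinary goal-seeking feedback: drives the state towards the given setpoint."""
    for _ in range(steps):
        state += gain * (goal - state)
    return state

def payoff(temperature):
    # hypothetical environmental criterion: comfort peaks at 22 degrees
    return -abs(temperature - 22.0)

if __name__ == "__main__":
    random.seed(1)
    candidate_goals = [random.uniform(10, 35) for _ in range(20)]
    # adaptive selection over goals: keep the setpoint whose outcome scores best
    best_goal = max(candidate_goals, key=lambda g: payoff(run_controller(g)))
    print("selected setpoint ~", round(best_goal, 1), "degrees")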
Don Limuti wrote on Sep. 3, 2012 @ 18:51 GMT
George,
Yep, I've changed my mind again. This is the best essay. Another top down phenomenon.
And you have made this the most interesting thread I have seen on FQXi. It deserves an award all by itself.
Don Limuti
PS: Check out: http://www.digitalwavetheory.com/DWT/44_The_Arrow_of_Time.html
Author George F. R. Ellis replied on Sep. 4, 2012 @ 06:45 GMT
Many thanks Don. Appreciated.
George
Frank Martin DiMeglio wrote on Sep. 3, 2012 @ 23:53 GMT
Hi George. You will see many of your ideas relate to mine (in different/linked ways) in my essay -- soon to be posted.
FUNDAMENTAL gravitational and inertial equivalency and balancing fundamentally sits at the heart of physics, and it demonstrates/proves F=ma fundamentally as well. George, do you agree with this statement?
Gravity sits at the heart of fundamental/general unification in physics. My essay proves this. Do you agree with this?
I would appreciate your rating and comments on my essay -- soon to be posted. Thanks. You are remarkably silent given all that I have said in this thread. Yours is a necessary essay and topic/subject. My essay is foundational on waking AND dream physics -- and the link between the two is proven.
Thank you for your essay.
Author George F. R. Ellis wrote on Sep. 4, 2012 @ 06:42 GMT
Hi Frank
you stated "If the self did not represent, form, and experience a comprehensive approximztion of experience in general by combining conscious and unconscious expereince, we would then be incapable of growth and of becoming other than we are. " I agree. The question is how physics allows this to happen. Modular hierarchical structures with both bottom up and top down causation is a key part of the answer.
George
Jeff Baugher wrote on Sep. 4, 2012 @ 22:49 GMT
Prof. Ellis,
Interesting essay. I think I may agree with you, but am not sure. In my theory (recent sketch here and essay here), General Relativity can be rewritten so as to have a causal background (i.e. curvature doesn't mean action at a distance). Please feel free to comment if we are speaking of the same thing in causal backgrounds that are top down (least complex to most).
Regards,
Jeff
Author George F. R. Ellis replied on Sep. 6, 2012 @ 05:11 GMT
Thanks for that. I am also not sure if your paper really relates to my view.
Let me ask you the following: does your approach (a) somehow embody Mach's principle? (b) somehow relate to the arrow of time?
If yes, then yes!
George Ellis
Joel Rice wrote on Sep. 5, 2012 @ 13:22 GMT
There might be another way to look at this - structurally rather than in terms of causation. If allowed associations of particles are determined by (non associative) algebra, that would change the way one looks at bound states as determined by 'forces' - not that one gets rid of photons, but that photons have to be consistent with the associations demanded by algebra. The issue being that we would not expect pure algebra to know anything about coupling constants or 'fine tuning'. If associations are more fundamental, then forces have to be consistent with what is demanded structurally .. for example e(uud).
Algebra does not seem to take a position on reductionism or teleology, or demand that causation be bottom up. But if we say that algebra requires that the universe produce hydrogen - that looks 'top down' or teleological - then the 'constants' must be compatible with the future existence of stable atoms, even if the early universe is too hot. One might say that the universe Must cool off or else it can not produce what algebra presumably demands. And if it applies to simple associations like Hydrogen, one might expect that DNA is just a more elaborate association. Perhaps all stable-neutral associations are given apriori. That would give us a very Top Down view of the world, but rather intractable, given the complexity and subtlety.
Author George F. R. Ellis replied on Sep. 6, 2012 @ 05:16 GMT
Well for me the issue is whether algebra can sensibly describe a modular hierarchical structure. If so, it might work; if not, then not. As you say, it's all in the structure.
We really need graph theory. I am a novice in that area.
George
Cristinel Stoica wrote on Sep. 5, 2012 @ 13:45 GMT
Dear Professor Ellis,
I found your papers about top-down causation very interesting, and the present essay inspiring. I think there are strong parallels between how causation may "propagate" from "top" toward "down", and the relation between quantum measurements and reality. I hope I will return here with details about this. IMHO, there are two main factors that shaped the "standard" perception of causation. First, we perceive the things as simultaneously present in the "now", and the correlations between various events make us believe that there is a causal connection from past to future, from parts to whole, from simple to complex. This was reinforced by the success of the second factor: the fact that when solving equations describing the time evolution, we usually start with initial conditions at a time t, and are able to develop the solution for subsequent times. In addition, the effects appear as propagating in a local manner. Correlations appear to us as cause-effect connections, because of their succession in time. They appear to us as bottom-up causation, because the interactions are local. The local constraints are given by the equations, and are correct. But the more global aspects of causation are unfairly much less researched, and much less understood.
I dare to hope that you would take a glance at my essay, which is not connected to these aspects, but to the properties of singularities in general relativity.
Best wishes,
Cristi Stoica
Author George F. R. Ellis replied on Sep. 6, 2012 @ 05:21 GMT
Dear Cristi
" I think there are strong parallels between how causation may "propagate" from "top" toward "down", and the relation between quantum measurements and reality."
yes indeed. I think this is an area that will eventually be illuminated by this approach.
And yes a key to it all relates to constraints: this is how top down action takes place, in physical terms. But as I state, they have very powerful properties: they can create, modify, and delete lower level entities. That is a key reason why a purely bottom up approach won't work.
Ah, singularities in GR: haven't worked on those for a long time. Will try to get time to look.
George
Jeffrey Nicholls wrote on Sep. 6, 2012 @ 00:31 GMT
Dear Prof Ellis,
I enjoyed your article and it gives me heart for my own project.
The next question (if the Universe started as something very simple) is how do we get a "top" to cause down from, how does a simple system bootstrap itself into a complex one, i.e. how does entropy (and corresponding information) increase?
Your approach opens the way to an answer, I see: that random (symmetric) processes sometimes become concatenated into more complex processes which are capable of controlling their own foundations to ensure their own continued existence. This "algorithm" may work at all levels of complexity, and so is able to take us from a very simple unitary system to systems of unlimited complexity.
Thank you,
Jeffrey (/1435)
Author George F. R. Ellis replied on Sep. 6, 2012 @ 05:27 GMT
Dear Jeffrey
"that random (symmetric) processes sometimes become concatenated into more complex processes which are capable of controlling their own foundations to ensure their own continued existence" - nicely stated. Yes. But that is possible because the possibility space for such processes includes structures that have properties (e.g. crystal symmetries, molecular folding) that enables such top-down causation to happen. And this not only allows their own continued existence: it allows their building up of higher levels of complexity. This is possible via adaptive selection, choosing the higher level entities that work from those that don't.
George
Member Ian Durham wrote on Sep. 6, 2012 @ 02:36 GMT
Hi George,
Nice essay! I was expecting to find a complete refutation of reductionism but was pleasantly surprised to discover that we may actually agree on a few things. Notably, I prefer your definition of causality and have toyed with something similar myself. In particular, it provides for the possibility that entanglement is actually a causal phenomenon, just not in the way we normally expect. We seem to be stuck in this relativistic paradigm that insists that causality be defined by special relativistic limitations. Of course, if space and time are emergent (as I believe they are), then this would seem to be a poor way to define causality. Hence, I find a less theory-dependent definition such as yours to be more palatable.
Regarding boundary conditions, you make some excellent points and, indeed, the ultimate example of top-down causation also happens to be the ultimate boundary condition: the universe itself (even within the context of the multiverse - indeed, one could argue that physical laws are constrained by the universe for, if they were different, it wouldn't be the same universe). On the other hand, you say at one point that no real system is truly isolated (and, in principle, I tend to agree), but what about the universe as a whole? If there is no multiverse (an open question), then the universe truly is an isolated system.
With that said, I have some comments:
1. I think the computing example with Word and Photoshop is a bit oversimplified.
2. You mentioned "uncaused changes" at one point. Are you saying there is no such thing as randomness or do you accept that random outcomes can still be causal (just not deterministic)?
3. Regarding logic, one could argue that physics partly emerges from logic itself in which case it would not be particularly unusual to have higher-level logic dictating physical processes since, in some sense, logic may be even more fundamental and universal.
4. Why can't multiple processes/paths lead to the same conclusion? I fail to see why this is a bad thing. (This question/comment refers to point D on pp. 5-6).
5. You suggest that interactions are necessarily higher-level phenomena from particles themselves, but one could fairly easily argue that quantum field theory says that they are, in some sense, *more* fundamental than particles (or, at the very least, *as* fundamental).
6. I'm still unconvinced by argument 6e on p. 7.
Finally, while I agree that there is most definitely some top-down causality in the universe, I tend to think that, in general, it tends to "drift" upward, if you will, i.e. if you were to model causal flow as a process, it would be like a random walk with drift with the drift going from the simple to the more complex.
Cheers,
Ian Durham
Author George F. R. Ellis replied on Sep. 6, 2012 @ 09:20 GMT
Dear Ian,
thanks for that. Multiple issues to deal with!
"the ultimate example of top-down causation also happens to be the ultimate boundary condition: the universe itself (even within the context of the multiverse - indeed, one could argue that physical laws are constrained by the universe for, if they were different, it wouldn't be the same universe). On the other hand, you say at one point that no real system is truly isolated (and, in principle, I tend to agree), but what about the universe as a whole? If there is no multiverse (an open question), then the universe truly is an isolated system."
Well one is there starting to deal with issues of existence of the laws of physics that underlie how things behave in our universe. If one extends the hierarchy from one of scale to one of causation (as one needs to do on the life sciences side) then the laws of physics causally lie above the largest scales of the universe (alternatively, whatever laws govern multiverses lie above the physical existence of multiverses).
These laws - which are not themselves physical entities - somehow intrude down on the physical levels (see e.g. Roger Penrose's writings on the large, the small, and the complex). In this sense the physical universe is not closed: it is controlled by a causally higher level of non-physical entities such as laws of physics, with their mysterious basis in mathematics.
This takes us far from my essay, so I won't pursue it. I'll answer your other issues in another posting.
George
Author George F. R. Ellis replied on Sep. 6, 2012 @ 09:38 GMT
continued:
"1. I think the computing example with Word and Photoshop is a bit oversimplified."
- not sure why. It was Turing's genius to see that any application could be implemented by the same hardware, by changing its operating context; in this case, by loading different high level software. That software then determines the (data) ==> (output) relation.
"2. You...
view entire post
continued:
"1. I think the computing example with Word and Photoshop is a bit oversimplified."
- not sure why. It was Turing's genius to see that any application could be implemented by the same hardware, by changing its operating context; in this case, by loading different high level software. That software then determines the (data) ==> (output) relation.
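(As a toy illustration of that point - invented names, nothing here is from the essay: one and the same minimal "machine" executes whichever program is loaded into it, and it is the loaded program, not the machine, that fixes the (data) ==> (output) relation.)

def machine(program, data):
    """The unchanging low-level substrate: it just executes whatever instructions it is given."""
    acc = data
    for op, arg in program:
        if op == "add":
            acc = acc + arg
        elif op == "mul":
            acc = acc * arg
        elif op == "neg":
            acc = -acc
    return acc

# two different pieces of "software" loaded into the same machine
word_like_program = [("add", 10), ("mul", 2)]         # data -> (data + 10) * 2
photoshop_like_program = [("neg", None), ("add", 3)]  # data -> -data + 3

if __name__ == "__main__":
    data = 7
    print(machine(word_like_program, data))       # prints 34
    print(machine(photoshop_like_program, data))  # prints -4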
"2. You mentioned "uncaused changes" at one point. Are you saying there is no such thing as randomness or do you accept that random outcomes can still be causal (just not deterministic)?"
- quantum physics tells us - if we believe the standard view - there is stuff out there which is neither causal nor deterministic. I find it strange that this fundamental discovery is still not accepted (at least implicitly) by many physicists today.
3. Regarding logic, one could argue that physics partly emerges from logic itself in which case it would not be particularly unusual to have higher-level logic dictating physical processes since, in some sense, logic may be even more fundamental and universal.
- I completely agree.
"4. Why can't multiple processes/paths lead to the same conclusion? I fail to see why this is a bad thing. (This question/comment refers to point D on pp. 5-6)."
- I'm not saying it's a bad thing: on the contrary, it's very positive because that is what underlies emergence of higher level entities that are independent of their lower level representations.
"5. You suggest that interactions are necessarily higher-level phenomena from particles themselves, but one could fairly easily argue that quantum field theory says that they are, in some sense, *more* fundamental than particles (or, at the very least, *as* fundamental)."
- well they arise from interactions of effective particles: however those arise.
"6. I'm still unconvinced by argument 6e on p. 7."
- I agree it's debatable. But there is lots of evidence of the importance of random processes in microbiology and in brain microprocesses. It is a hypothesis that this might relate to quantum uncertainty. Needs development and testing.
"Finally, while I agree that there is most definitely some top-down causality in the universe, I tend to think that, in general, it tends to "drift" upward, if you will, i.e. if you were to model causal flow as a process, it would be like a random walk with drift with the drift going from the simple to the more complex."
- it starts off as a random walk. But then it is crucial that some outcomes of that random process get selected and others get rejected. It is that selection process (locally going against the grain of entropy growth) that underlies the growth of true complexity, and lifts causation from physical to biological. In physical terms, it is a non-unitary process, and it is not random: it is directed by the selection criteria. In biological terms, it is where useful information originates.
George
Member Ian Durham replied on Sep. 6, 2012 @ 15:18 GMT
George,
Thanks for the reply. Regarding the computing example, on the one hand I see your point (and Turing's), but the reason I thought it was a bit oversimplified is because the program that one chooses to run is ultimately constrained by the underlying physics of the machine you're running it on. In quantum computing, for example, D-Wave's system (which Lidar's group at USC has shown has coherence times consistent with it being truly quantum) can really only run certain types of tasks (e.g. it happens to be best suited to machine-learning tasks). This is precisely because it is an adiabatic quantum computer. The adiabatic aspect of its implementation limits what it can do.
Sorry for leading us off-topic with the comments about universes, but it is something intriguing to consider at any rate.
Cheers,
Ian
Author George F. R. Ellis replied on Sep. 6, 2012 @ 16:48 GMT
Hi Ian
You say "the program that one chooses to run is ultimately constrained by the underlying physics of the machine you're running it on." This is a crucial claim you are making, and it's not true of Universal Turing machines - that is the whole point of Turing's discovery of this concept (unless you are talking about how long it will take to complete the job - that is indeed physically dependent).
Just for the record - when Turing developed his idea, "computers" were usually *people* who performed a specified task on data and then passed a slip of paper on to the next person down the line. That's the implementation context he had most in mind! (it was common in astronomy round the turn of the last century) The point of an algorithm is it does not matter how it is implemented, by people or electronics, the result is the same: and it can be any algorithm whatever, as long as it is well defined.
So if there are limits to what an adiabatic quantum computer can do, so much the worse for them: there are non-quantum computers that can do better. You can run *any* algorithm on a classical digital computer. Whether it will halt or not is another issue, but that's got nothing to do with the choice of physical implementation.
Cheers
George
Author George F. R. Ellis replied on Sep. 6, 2012 @ 16:55 GMT
Addendum:
"Despite its simplicity, a Turing machine can be adapted to simulate the logic of any computer algorithm .... The model of computation that Turing called his "universal machine"—"U" for short—is considered by some (cf Davis (2000)) to have been the fundamental theoretical breakthrough that led to the notion of the Stored-program computer."
From Wikipedia.
George
Member Ian Durham replied on Sep. 6, 2012 @ 19:17 GMT
Hi George,
Oh yes, a universal Turing machine should be able to run *any* program since all such programs boil down to a finite set of logic functions. What I meant was, they won't all run the same programs equally well. In the case of an adiabatic quantum computer (and other quantum computers) this is at least in part a direct result of the underlying physical implementation. So it's a matter of efficiency which, while not a flat-out restriction, is still an effect produced by the underlying physics.
Ian
Author George F. R. Ellis replied on Sep. 7, 2012 @ 09:37 GMT
Sure I agree on that. Efficiency is determined by both the underlying physics, and how it is deployed (design issues enter here); possibility is not.
George
Anton W.M. Biermans wrote on Sep. 6, 2012 @ 03:16 GMT
Dear George,
In your reply you say "the word "create" has no meaning if there are no causes"
As I argued, if when there is a cause, to rationally understand it, we must be able to reduce it to a previous cause, then this chain of cause-and-effect goes on ad infinitum, or it stops at some primordial cause which, as it cannot be reduced to a preceding cause, cannot be understood by definition, then nothing has any meaning since we cannot find its ultimate cause.
You still haven't pointed out what is wrong with this reasoning.
The problem is that a universe which has a cause, by definition has been created by some outside interference and violates the conservation law according to which what comes out of nothing must add to nothing.
If "the word "create" has no meaning if there are no causes" means that according to you a bigbang universe must have a cause, that is, has been created by some outside intervention, then this universe cannot be understood even in principle, so I'm afraid that the bigbang hypothesis a fairy tale.
In contrast, as I argue here (or, in this study), a self-creating universe has no cause, nor does it violate conservation laws, as it has no physical reality as a whole: since in this universe particles create, cause one another, it can be understood rationally.
Anton
Author George F. R. Ellis replied on Sep. 6, 2012 @ 05:50 GMT
Hi, you are giving exactly the same arguments over again. And my reply remains the same.
"As I argued, if when there is a cause, to rationally understand it, we must be able to reduce it a previous cause, then this chain of cause-and-effect goes on ad infinitum, or it stops at some primordial cause which, as it cannot be reduced to a preceding cause, cannot be understood by definition, then nothing has any meaning since we cannot find its ultimate cause. You still haven't pointed out what is wrong with this reasoning." As I stated before, you don't have to understand ultimate meaning in order to understand local meaning. For example, we can carry out this discussion without knowing if God exists or if random chance underlies all.
I am simply not debating ultimate causation in this paper. Please see "Is There “Ultimate Stuff” and Are There “Ultimate Reasons”?" by David Rousseau and Julie Rousseau for that debate, which is not the topic of my essay. If you are not willing to look at how causation works in local situations such as daily life, my essay is obviously of no interest to you and you should debate with them.
"I'm afraid that the bigbang hypothesis a fairy tale." Ok so present day cosmology goes out the window.
"In contrast, as I argue here (or, in this study), a self-creating universe has no cause and nor does it violate conservation laws as it has no physical reality as a whole: since in this universe particles create, cause one another, it can be understood rationally." So how do particles come into being that can create themselves? If that has any meaning, it has nothing to do with this essay. take it up with David Rousseau and Julie Rousseau.
George
Anton W.M. Biermans wrote on Sep. 6, 2012 @ 03:27 GMT
I see that the links in the above reply don't work: for the essay, see "Einsteins' error", for the study, see www.quantumgravity.nl
Anton
There is nothing new under the sun wrote on Sep. 6, 2012 @ 19:31 GMT
You start your amusing essay with the outdated thinking of Dirac: "chemistry is just an application of quantum physics." But as the Nobel laureate in physics P.W. Anderson wrote in his famous paper "More is different", published in Science, "Chemistry is not applied physics and biology is not applied chemistry."
You then present your belief that bottom-up causation is wrong and promise us that you find many examples of top-down causation. You write that "There is nothing new in all this: it's just that we don't usually talk about this as top-down effects."
The problem here is not only that there is nothing new, but that you only provide examples of bottom-up causation. No need to review all your examples, but I will comment on the arrow of time in cosmology and the Caldeira-Leggett model in quantum physics. As is well-known, the cosmological arrow of time can be derived by applying the usual cosmological approximations to the arrow of time at macroscopic scale. There is nothing fundamental in an approximation of the fundamental microscopic description. The same criticism applies to the Caldeira-Leggett model. This is a well-known approximated model which is derived from the microscopic description (check the section "microscopic derivations" of the same book that you cite). Again there is nothing fundamental in an approximation of the fundamental microscopic description.
The conclusion here is that your top-down causation hypothesis is nothing but the bottom-up causation in disguise.
Author George F. R. Ellis replied on Sep. 6, 2012 @ 20:49 GMT
"As the Nobel laureate in physics P.W. Anderson wrote in his famous paper "More is different", published in Science, "Chemistry is not applied physics and biology is not applied chemistry." Well yes, that's reference [16] in my essay. Guess you failed to notice I'd referred to it.
"You then present your belief that bottom-up causation is wrong". Incorrect. I did not say its wrong, just that...
view entire post
"As the Nobel laureate in physics P.W. Anderson wrote in his famous paper "More is different", published in Science, "Chemistry is not applied physics and biology is not applied chemistry." Well yes, that's reference [16] in my essay. Guess you failed to notice I'd referred to it.
"You then present your belief that bottom-up causation is wrong". Incorrect. I did not say its wrong, just that it's not the whole story. Of course it occurs.
"You only provide examples of bottom-up causation". Incorrect. You choose to ignore all but two of the examples I give.
"The cosmological arrow of time can be derived by applying the usual cosmological approximations to the arrow of time at macroscopic scale. There is nothing fundamental in an approximation of the fundamental microscopic description." Actually the causation is the other way round. It's the *macroscopic* arrow of time that derives from conditions at the cosmological scale. At the microscopic scale that there is no preferred arrow of time, and approximating those interactions at the micro scale won't give you an arrow of time when there is none there to begin with. You have to get it from large scale properties of the distribution of matter in the distant past.
"The Caldeira-Leggett model.. is a well-known approximated model which is derived from the microscopic description." Well yes of course it's well known: I referred to it by name, with references. The question is how introduction of the "counter term" is justified. One of the standard physics texts phrases it "this term is added in for convenience". But you can't just add in a term for convenience: you have to derive it from the interactions in the problem. All the bottom up interactions are already covered by the other three terms in the Lagrangian. You have to add it in to account for the affects that are *not* derivable in a bottom up way from those interactions alone.
"The conclusion here is that your top-down causation hypothesis is nothing but the bottom-up causation in disguise." Well I choose to side with Nobel prize winner Bob Laughlin's analysis (reference [12]) rather than yours.
T H Ray replied on Sep. 6, 2012 @ 21:24 GMT
The O.P. didn't read the same essay as I. I saw a complex system model in which top down causation is linked to laterally distributed causality. Self organized order with feedback.
Tom
Author George F. R. Ellis replied on Sep. 7, 2012 @ 05:02 GMT
Hi Tom
Agreed. It's a world away from the way fundamental physicists usually think, because they are unfamiliar with all that literature and with that way of thinking, so they find it difficult to relate to this viewpoint. The problem is that their restricted view, which excludes these effects, is supposed by them to encapsulate all forms of causation that occur in the real world. Not true.
Condensed matter physicists such as Anderson and Laughlin understand the crucial causal connections, which is why they take a broader view than this anonymous commentator (and win Nobel prizes in consequence). But their ideas are crucial to fundamental physics too: vide the key role Anderson's ideas on broken symmetries played in the development of the Higgs mechanism.
George
Thomas Howard Ray replied on Sep. 7, 2012 @ 15:36 GMT
George, I hope you find time to get involved with one of the several institutes devoted to complex system research, such as NECSI or SFI. I think you'd find the highly interdisciplinary climate very comfortable.
Tom
There is nothing new under the sun replied on Sep. 7, 2012 @ 19:18 GMT
I introduced the quote from the Science paper, because this important quote cannot be found in your essay.
You have truncated part of what I wrote and then missed my point. *All* the examples that you believe show that bottom-up causation "is wrong" are compatible with ordinary bottom-up causation and invalidate your hypothesis.
The macroscopic arrow of time can be obtained from the microscopic description. The cosmological description is a coarse-grained approximation to the microscopic description. This is all well-known and explained in many excellent textbooks although ignored by some cosmologists.
The "counter term" in the Caldeira-Leggett model is an ordinary renormalization term. In the same textbook that you use as reference, the Caldeira-Leggett model is introduced in the section on quantum Brownian motion. As everyone knows quantum Brownian motion is compatible with ordinary bottom-up causation. You would also check the section "microscopic derivations" of the cited textbook before continuing posting such incorrect thoughts.
Author George F. R. Ellis wrote on Sep. 7, 2012 @ 22:40 GMT
"I introduced the quote from the Science paper, because this important quote cannot be found in your essay." There are numerous important quotes I could not include because of the length limits on the essay. I have no obligation to include any particular one that you prefer.
"*All* the examples that you believe show that bottom-up causation "is wrong" are compatible with ordinary bottom-up...
view entire post
"I introduced the quote from the Science paper, because this important quote cannot be found in your essay." There are numerous important quotes I could not include because of the length limits on the essay. I have no obligation to include any particular one that you prefer.
"*All* the examples that you believe show that bottom-up causation "is wrong" are compatible with ordinary bottom-up causation and invalidate your hypothesis." I deny this claim of yours. In particular it does not apply for example to the way that abstract algorithms control computerised robots. There is no way you can derive those algorithms from the underlying physics in a bottom up way. The relevant variables are not coarse grained versions of lower level variables, or derivable from them in any other way. It also does not apply to the physiology of the heart, as explained by Denis Noble in his writings on physiology, or to epigenetics, as explained by Gilbert and Epel.
"The macroscopic arrow of time can be obtained from the microscopic description. The cosmological description is a coarse-grained approximation to the microscopic description. This is all well-known and explained in many excellent textbooks although ignored by some cosmologists." Of course the cosmological description is a coarse grained approximation to the microscopic description; see my GR10 lectures from 1984 for a clear description of how this works. This feature is incapable of explaining the arrow of time, as there is no arrow of time in the micro level physical laws. This was known already to Loschmidt and Boltzmann (the key point is that Boltzmann's derivation of the H-theorem works equally well for both directions of time). It is for this reason that authors such as Roger Penrose and Sean Carroll relate the arrow of time to a global low entropy state in the early universe. That is a macro state that has to be described at a macro level of description.
"The "counter term" in the Caldeira-Leggett model is an ordinary renormalization term. In the same textbook that you use as reference, the Caldeira-Leggett model is introduced in the section on quantum Brownian motion. As everyone knows quantum Brownian motion is compatible with ordinary bottom-up causation." I will reconsider this when I have the chance, the issue being whether renormalisation can be regarded as representing a purely bottom up effect or not. It is conceivable this review could lead me to change my opinion in this particular case. But your claims of a purely bottom up explanation won't work for example in the case of superfluidity, as is carefully explained by Robert Laughlin in his Nobel lecture.
M. V. Vasilyeva wrote on Sep. 8, 2012 @ 08:48 GMT
Dear George, you argue that causation is top down with those who argue it is bottom up. But even with feedback loops, either way, it is still a linear, one-dimensional view on things. Have you considered that it may be neither? (I know from your essay that you considered it could be both).
What if causation is a multidimensional, convoluted, worse yet, fractal thingie that defies all methodologies trying to trace it with a finger like a crack on the wall?
It so happens that all things in life and in physics are interconnected; nothing is ever a single, naked point in spacetime. Everything has many different causes, large and small, near and far, that converge into making that particular thing manifest. And it, in turn, also causes so many other things manifest, large and small, near and far -- after colliding, merging and parting ways with so many other things and causes, large and small, near and far..
Ah?
Author George F. R. Ellis replied on Sep. 8, 2012 @ 10:02 GMT
I have sympathy with this view: "What if causation is a multidimentional, convoluted, worse yet, fractal thingie that defies all methodologies trying to trace it with a finger like a crack on the wall?" Yes the web of interactions in the real world is very complex. But it's not fractal: there seems always to be a lower level where in small enough domains the essential causal interactions are linear. Their outcome depends on context, but nevertheless we can understand the elements of causation by looking at such small local systems.
It is when you put them together to get really complex interaction networks that things get really complex: but even then there are identifiable hierarchical structures and network motifs that let us understand much of what is going on. It is in this context that we can reliably identify both bottom up and top down elements of causation.
So I am not as pessimistic as you.
George Ellis
M. V. Vasilyeva replied on Sep. 8, 2012 @ 15:53 GMT
lol good for you, George!
But I seriously think that information is fractal (and information is related to causation). I base it on my observation of the real world, when I was trying to understand some strange phenomena. To me it appeared that an event had many consequences, large and small, near and far, just as I said above. And I saw those consequences (I am a visual type) as paisleys of various sizes making up a flowery pattern on a fabric. And the interesting thing I saw was this: before an event arrives (a big paisley), there are many small paisleys (and of course a few medium size ones) that arrive before it, in a way, announcing the arrival of the main event. They do it many times, at various times. And they run in streaks, like it befits a fractal thing proper. Likewise, after the main event, there are many "aftershocks", large and small, running in streaks. Then a streak changes as if madam info got tired of her tune.
I hope I made sense. This view of events as paisleys on repeated patterns, large and small, made me think that info is indeed fractal. And information, you must admit, is related to causation.
Also, when we think of causation, we tend to oversimplify and consider real only the obvious things, like, 'heating up water causes it to boil'. Usually, what's left out are things like, why exactly did mom put the kettle on the stove. Was it because she wanted some tea? Or because she expected a company? Maybe a habit; she always does it around that time. Even when a person thinks that he or she had a clear, well defined intent, in reality it is virtually impossible to trace the "intended action" to something concrete. As long as an action appears reasonable in a given context, one can always find a reasonable explanation. The trouble is, reasonable explanations are rarely right. Worse yet, they rob us of our illusion of free will.
Author George F. R. Ellis replied on Sep. 8, 2012 @ 16:19 GMT
I find the first part of your response a bit mysterious: it may be to do with the mind rather than physics.
I very much like the second part about the kettle: it raises the real deep issues. The thing that's puzzling is how these deep questions relate to physics: how can we take the physics seriously, not underestimating it in any way, but also not trivialising the deep issues? -- in particular, not trivialising life, consciousness, and the issue of free will. That's the real challenge that underlies this whole discussion.
You may have missed the post in this thread by J. C. N. Smith on Jul. 20. It's worth reading.
George Ellis
M. V. Vasilyeva replied on Sep. 8, 2012 @ 20:03 GMT
Yes, thank you, that was a very good reading.
About my mysterious response, I was just in a playful mood last night which continued into this morning (hope you don't mind). What I find mysterious is that I made both posts above _before_ I followed your lead and read "Patterns in the Fabric of Nature" by Steven Weinstein, which brings up the complexity of non-locality of causation and the question of free will. Coincidence?
It is tempting to see my largely unconscious actions (2 posts above) as precursors to the event about to unfold (reading Steve's essay), as if the arrow of time can be reversed. Since I do not believe in the reversal of the arrow of time, I see this sort of coincidences as the proof of the fractal nature of information (and causality), which I tried to convey above. How else can this be explained?
Ted Erikson wrote on Sep. 8, 2012 @ 17:16 GMT
GE:
Yours was an interesting and informative essay. As a newcomer to the FQXi community, I feel that few of the "community" grade, or even look at, my essay, which approaches the problem very realistically, based on an internal view. Might you look at it, comment if so inclined, and grade it?
To Seek Unknown Shores
http://fqxi.org/community/forum/topic/1409
Thank you
TE
Author George F. R. Ellis replied on Sep. 9, 2012 @ 15:29 GMT
Ted, it's a much more personal essay than is customary - that makes it interesting but also somewhat distant from present-day fundamental physics. However I am glad to see you are getting quite a few responses.
George
Pentcho Valev wrote on Sep. 9, 2012 @ 11:32 GMT
George Ellis,
Your essay is empty - no wonder it gets top community rating. It is tautological - you just define as "top-down causation" influences that go from what you see (or mankind has defined) as a higher hierarchical level to what you see (or mankind has defined) as a lower hierarchical level.
Pentcho Valev pvalev@yahoo.com
Author George F. R. Ellis replied on Sep. 9, 2012 @ 15:11 GMT
Dear Pentcho Valev
Your comments would carry much more weight if you chose to omit the insults and sarcasm.
You claim "It is tautological - you just define as "top-down causation" influences that go from what you see (or mankind has defined) as higher hierarchical level to what you see (or mankind has defined) as lower hierarchical level".
There is indeed an issue here: how does one define higher and lower levels? That has to be done depending on context: it is quite different in the physical sciences, the life sciences, and the artificial sciences. It can be sensibly done, as is shown in various articles referred to in my essay (indeed one cannot understand complex systems without categorising such levels: see for example Tanenbaum's book on digital computers for that particular case).
Given that context, this is a sound definition. The scientific issue is whether there are instances of existence of such effects. My claim is that there are indeed many examples showing this is the case, across the sciences. There is much evidence to support this claim.
George Ellis
Pentcho Valev replied on Sep. 9, 2012 @ 15:41 GMT
Dear George,
Roughly speaking, in complex systems anything affects anything, so defining "higher levels" and "lower levels" and then finding instances where "higher levels" cause changes in "lower levels" is not very profound.
Pentcho Valev
Author George F. R. Ellis replied on Sep. 9, 2012 @ 17:56 GMT
Dear Pentcho Valev,
"Roughly speaking, in complex systems anything affects anything" Yes that certainly is roughly speaking and unenlightening.
In biology and in computer systems, the key feature leading to real complexity is the existence of Modular Hierarchical Structures, where each word is important: see particularly the books by H A Simon and by Grady Booch that I refer to for an enlightening discussion. Indeed you can't understand biology without taking this kind of structure into account (with cells being the key modules on which it all hinges), see Campbell and Reece for example, nor can you understand digital computers without taking it into account, see Tanenbaum. Example: for the hierarchical structure of genetic modules, see the book "Modularity in Evolution and Development" by Schlosser and Wagner. Then there are various interlocking hierarchies in the brain, even though there is no specific "top", see for example Leergaard, Hilgetag and Sporns, "Mapping the connectome: multi-level analysis of brain connectivity" (Frontiers in Neuroinformatics, Volume 6 Article 14, May 2012).
So there certainly are hierarchical systems out there. It is then a key issue as to how causation works in such systems. That is what my essay tries to address. Again I particularly point to Booch's book as clearly characterising the core features (such as inheritance and information hiding) that are needed for complexity to emerge.
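To make those two features concrete, here is a minimal toy sketch (purely illustrative - the class names and numbers are invented, and nothing here is taken from Booch's book or from the essay): the higher level coordinates modules only through their interfaces, the internal state stays hidden, and inheritance lets a specialised module reuse the same interface.

```python
class Cell:                              # a module with hidden internal state
    def __init__(self):
        self._atp = 100                  # internal detail, invisible to higher levels

    def do_work(self, amount):           # the only interface the higher level uses
        used = min(self._atp, amount)
        self._atp -= used
        return used

class MuscleCell(Cell):                  # inheritance: a specialised module, same interface
    def do_work(self, amount):
        return super().do_work(2 * amount)

def organ(cells, demand):                # the higher level sets the context (the demand)
    return sum(c.do_work(demand) for c in cells)

print(organ([Cell(), MuscleCell()], 10)) # the higher level never touches _atp directly
```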
You do not find it profound. Fine, point me to something that is indeed profound and clearly illuminates how genuine complexity emerges on the basis of the underlying physics (and please don't just give me the statistical properties of complex networks: even if they are useful in some ways, they simply fail to get at the essence of what is going on. Uri Alon's book on network motifs gets much closer to the real dynamics).
George Ellis
Jonathan Kerr wrote on Sep. 9, 2012 @ 12:49 GMT
Dear George Ellis,
As a relativist who has written about the structure of spacetime, how do you get causality out of block time? You don't mention block time in your essay, but it would seem from your bio that you think Minkowski spacetime is right, and that leads unavoidably to block time.
I've argued in my essay that if block time is right, and motion through time is an illusion, then the laws of physics, and crucial principles such as cause and effect, would have to exist within the illusion, because they require motion through time.
I've also argued that the two levels of time we seem to find, block time and motion through time, can't possibly co-exist as simply part of the nature of the time dimension, as many assume. Given the intrinsic unpredictability of quantum events, the two levels disagree over whether the future already exists. This means only one of these levels can be real - I've examined both possibilities.
Block time is one of these two possibilities. But it implies that you can't assume the spacetime interpretation of SR is right, and also have causality. This is one of many problems with block time that people often ignore. Because the time dimension is taken to be different from the other dimensions in Minkowski spacetime, people sometimes assume that the nature of the dimension somehow solves the problem. I've tried to show that this can't be the case. Do you have a way to get causality out of block time?
Best wishes,
Jonathan Kerr
Author George F. R. Ellis replied on Sep. 9, 2012 @ 14:51 GMT
Dear Jonathan Kerr,
that is not the topic of this essay or thread. However I have just put an extensive paper on the arXiv, "Space time and the passage of time" (http://xxx.lanl.gov/abs/1208.2611), looking at the issue in detail. In brief: there is no problem if one has an Emerging Block Universe (EBU).
George Ellis
Jonathan Kerr wrote on Sep. 9, 2012 @ 19:05 GMT
Dear George,
The issue is unavoidably relevant to your essay. Your essay discusses causality, which you admit in your arXiv paper can only exist if major adjustments are made to the standard view of spacetime. And spacetime is one of your fields. I'm sure you'll see the need to explain how the subject of your essay works in relation to this, as it's entirely dependent on it - and we're meant to be looking at the foundations.
In your arXiv paper you've argued very well for a flow of time. I call it motion through time, but we agree absolutely that it must exist. You've shown Barbour and others to be wrong in the idea that motion through time is an illusion, as I have in other ways. And you've argued, as I have, that standard block time is wrong. That means we both think the Rietdijk-Putnam argument, which is a rigorous proof that a fixed future comes out of spacetime, is wrong. Perhaps you'd explain why you think it's wrong; I've done that by suggesting the flaw is in the assumption that simultaneity across a distance has meaning (beyond the light cone).
You've also come up with an excellent way of hooking the quantum randomness up to the large-scale world in a Schrödinger type way, but using it to show that the future is unfixed. That makes a neat distillation of the issue of quantum uncertainty versus the block-time fixed future, which is a key part of my argument as well.
So we agree on a lot. We only disagree on whether the spacetime interpretation of SR can be kept, given this need to adjust the block time picture - we agree that SR itself is right. You're one of a number of people who (comparatively recently) seem to have come round to the idea that time must flow, who've then tried to bring that idea into block time, while keeping spacetime much as it is. This risks being 'cake and eat it' - you might have to choose. Ideas such as the 'crystallising block universe' are similar to your EBU, or emerging block universe. Basically, the idea is that the block sets as time moves through it.
I can show a major weakness in this approach, and I see it as an attempt to 'fix-up' the spacetime interpretation, when it simply may be the wrong interpretation. The crux of the Rietdijk-Putnam argument is that an event can be in the past to one observer, but in the future to another. If that idea is wrong, block time falls apart completely. I've argued that it's wrong (because it depends on long range simultaneity).
But in your language, or that of the EBU, this means that to one observer an event has already been frozen into the block, while to another observer it hasn't. The question of which events have gone into the fixed part of the block is observer-dependent. As you probably see, this greatly weakens that whole approach. How then are we to deal with the physics of how an event gets frozen into the block? Some say recently that the collapse of the wave function may be the process of an event going from future to past. But again, if this is entirely observer-dependent, and depends on how one is moving, then it doesn't look like a physical process. And that problem is exactly what led to standard block time in the first place, along with illusions and all.
I'd appreciate your thoughts on this, and any comments on my essay, thank you.
Best wishes, Jonathan
Jonathan Kerr wrote on Sep. 9, 2012 @ 21:57 GMT
PS. I see the 'crystallising block universe' and the 'emerging (originally evolving) block universe' come from the same place. The question is the same about all of them - how they get round the problem that which events have been crystallised is observer-dependent (frame-dependent), and therefore seems to reside in the observer's perception if spacetime is right, just as with standard block time. JK
Author George F. R. Ellis replied on Sep. 10, 2012 @ 07:12 GMT
Dear Jonathan
"I'm sure you'll see the need to explain how the subject of your essay works in relation to this, as it's entirely dependent on it - and we're meant to be looking at the foundations." It works by taking into account what I say in my arXiv paper on the passage of time.
"The crux of the Rietdijk-Putnam argument is that an event can be in the past to one observer, but in the future to another. If that idea is wrong, block time falls apart completely." But I argue in my paper that what observers think is past or future on other world lines does not matter. What matters is (a) what happens on their own worldlines, which must have a proper temporal ordering, and (b) what happens in terms of interactions between events on different worldlines, which are mediated by timelike and null curves. Spacelike surfaces and instantaneity do not enter into it. Finally (c) there are preferred worldlines in spacetime, as I explain, so in fact the Lorentz symmetry of the theory is broken in realistic solutions of the equations.
In any case for the purpose of the present essay, what I need is that causality works in local situations as described by ordinary physics, with a well-defined flow of time as embodied in the standard equations of physics such as Newton's laws and Maxwell's equations and the Schroedinger equation, see the Feynman Lectures on Physics. All the evidence for ordinary physics shows this is true, indeed physics as we know it would not exist if it were not so. That is all I need for my essay.
George Ellis
Jonathan Kerr replied on Sep. 10, 2012 @ 09:22 GMT
Hello George,
Thank you for your reply. I didn't mean there was any need to justify your essay, I believe in causality too. I was just trying to show the relevance of the block time issue, and of bringing it into the discussion on this page.
You say that in the EBU "what observers think is past or future on other world lines does not matter". I'm sure it doesn't in many ways, but the point I've made is that it matters if we're trying to pinpoint the crystallisation of an event, and look at what could cause it in that context. It's all very well saying that an event goes into the fixed past because of the collapse of the wave function. But if to some observers it already has, while to other observers moving differently it hasn't yet been crystallised, then we end up with what looks like a perception-based thing, and exactly the same setup that led to the problems of standard block time in the first place.
I think it's very good that you've argued for a flow of time, or motion through time, as strongly as you have. I'll look at your work further, and will almost certainly refer to it in mine. People are coming round to the idea that block time is wrong, partly because it seems that something has to give if we're to get to quantum gravity. To me, a slight tweak to spacetime is not enough, and the problems with it do not wash out so easily, as I've shown. I think we need to face up to the fact that we simply have the wrong interpretation in front of us. Spacetime looks right, and it has helped us simplify a lot of theories. There's a great reluctance to reverse out of the cul-de-sac. But it's an interpretation, and if one gets conceptual problems with an interpretation, then one probably needs a new one.
I'd appreciate any thoughts you might have on my essay,
Best wishes, Jonathan
Author George F. R. Ellis replied on Sep. 10, 2012 @ 18:50 GMT
Hi Jonathan
I basically agree with you, and with what is in your essay. Block time is fine if it has a future boundary that keeps changing - that resolves the puzzles you point out in your essay. And spin foam models seem to be like this, in essence, so quantum gravity is not incompatible with this scheme.
best wishes
George Ellis
Jonathan Kerr replied on Sep. 10, 2012 @ 20:33 GMT
Hello George,
Thank you. It seems very premature to say that the puzzles I point out in my essay have been resolved. Because motion through time is not an illusion (in your view and mine), it needs a physical mechanism to explain it. And that mechanism should fit the clues well - it should show why motion through time is slowed down in certain situations, and why the equations that describe how it is slowed down apply.
And because a physical mechanism is needed, it can't be one that seems to point at a purely perception-based effect, as in what led to standard block time, and as in the weakness I've mentioned about the EBU picture. The observer must be somehow partly incorporated into the picture, but not in a way that seems to rule out anything other than the perception of the observer.
That's what standard block time does, and as you say, it simply doesn't work. But we may well find that more is needed than a repair job on the spacetime interpretation, to make it do the opposite of what it used to do. If the Rietdijk-Putnam argument is wrong, then spacetime in all forms may fail to fit the clues, as the same problems might keep on cropping up. To me it seems very likely that a new interpretation is needed, and that frequent failure to separate SR from spacetime has held us back.
Best wishes, Jonathan
Georgina Parry replied on Sep. 11, 2012 @ 21:08 GMT
Dear Jonathan Kerr,
In my essay thread, "What basic physical assumptions are wrong?" (Georgina Parry), there is a high-resolution version of diagram 1, which shows an explanatory framework for physics that addresses the problem you have just mentioned.
It isn't a block universe model, but there is sequential iteration of the material aspect according to the existing relations and various constraints of physics and biology, e.g. conservation of energy, the Pauli exclusion principle, minimisation of potential energy, the inverse square law, relationships of volume and surface area, effects of concentration gradients, natural selection. The visible universe (including that which is visible using technology) is not that material one but a fabrication generated from received data that was emitted or reflected from the material universe. This structure allows the universe of atoms and material things, going about their actions simultaneously, to coexist without contradiction with Einsteinian relativity.
It is an unusual structure because observer fabrications must exist wholly within the material reality even though they are different from it and show something different from what exists independently and simultaneously. For analogy: as Terry Pratchett's Disc World and Tolkien's Middle Earth exist within our world but are at the same time not our world.
The time that is shown by observation of the distant clock is different from the number or hand position that -is on the clock- independently, and the objects that are seen are also amalgamations of data formed by the observer, not independent things - overcoming many paradoxes.
Eckard Blumschein wrote on Sep. 9, 2012 @ 23:21 GMT
Dear George Ellis,
Since you are calling yourself a relativist, I doubt that you are open for what the topic of the contest demands: "Questioning the Foundations". Doesn't this include Cantor's naive set theory, Einstein's special theory of relativity, Lorentz covariance, a priori existing Parmenidean spacetime, etc. too?
While I was initially fascinated by the many reasonable views you expressed, my doubts about the correctness of relativity grew each time I read an essay or another paper of yours. I nonetheless accepted that there are many arguments in support of relativity.
The situation suddenly changed after I tried, and managed, to understand an experiment by Norbert Feist. See Fig. 5 of my recent essay. I fear I cannot expect any factual reply other than silence from anybody who fears loss of reputation or who is simply lazy.
Do not get me wrong. I do not exclude the possibility that some relativity-related theories are about as useful as approximations as Georg Cantor's naive set theory, which according to Ebbinghaus/Lessing was based on an obvious error.
Please do not feel hurt. You might understand my motivation and uncompromising rudeness when I tell you that some years ago a Hendrik van Hees blamed me for damaging the reputation of Otto-von-Guericke-University Magdeburg after I suggested that the ear performs a cosine rather than Fourier transformation. Fortunately my boss declared the matter undecidable because utterly foundational. MP3 works.
Respectfully,
Eckard Blumschein
Author George F. R. Ellis replied on Sep. 10, 2012 @ 07:28 GMT
Dear Eckard Blumschein
" Since you are calling yourself a relativist, I doubt that you are open for what the topic of the contest demands: "Questioning the Foundations". Doesn't this include Cantor's naive set theory, Einstein's special theory of relativity, Lorentz covariance, a priori existing Parmenidean spacetime, etc. too? "
I chose to deal with a particular topic that I regard of importance, and you state you doubt that I am up to what the competition is about because I did not deal with a whole set of different topics. What is the point of this gratuitous insult?
"Situation has suddenly changed after I tried and managed to understand an experiment by Norbert Feist. See Fig. 5 of my recent essay. I fear I cannot expect an other factual reply than silence from anybody who has to fear loss of reputation or who is simply lazy." This kind of accusation does not motivate me to read your essay or respond to it. On the contrary.
"You might understand my motivation and uncompromising rudeness ..." I have no intention of responding to any further such rudeness. It is incompatible with my attempts to have cordial collegial discussions on the topic of my essay.
yours sincerely
George Ellis
Eckard Blumschein replied on Sep. 11, 2012 @ 07:32 GMT
Dear George Ellis,
While H. v. H. soon apologised for his rudeness, it took some effort to force him, by factual arguments, to admit that he was wrong. I doubt that the question "Which of Our Basic Physical Assumptions are Wrong?" can really be answered in a cordial discussion among colleagues.
I see what you are calling bottom-up causation as the principle of superposition of influences. If you could really question it, then I would be surprised. I will read your essay again. Maybe I overlooked something.
Yours sincerely,
Eckard Blumschein
Author George F. R. Ellis replied on Sep. 11, 2012 @ 10:24 GMT
"I doubt that the question "Which of Our Basic Physical Assumptions are Wrong?" can be really answered in a cordial discussion among colleagues."
- what an extraordinary statement. I hope to never be in an institute where this is true. My own personal colleagues are able to behave in a collegial fashion. It is the hallmark of civilised discussion that you don't have to be rude to your opponent if you disagree with her/him.
"I see what you are calling bottom up causation the principle of superposition of influences. If you could really question it, then it then I was surprised. I will read you essay again. Maybe I overlooked something." Superposition is a linear interaction. Most real system in the universe are not linear. Yes they are based in linear interactions at the bottom level, where superposition holds, but these are put together in structures and complex interaction networks that result in non-linear behaviour at the higher level. These structures then act down in a non-linear ways on their component entities to allow them also to behave in a non-linear way. Example: state vector preparation (a non-unitary process). Example: superconductivity, where the lower level entities (Cooper pairs) only exist because of the context provided by specific crystal structure.
Many other examples are given in my more technical article on which this essay is based, see here.
George Ellis
Author George F. R. Ellis replied on Sep. 11, 2012 @ 12:24 GMT
Just for information, re the previous interchange: this is the proposal for the nature of physical reality that I make in the technical article I referred to:
1. Combinatorial structure: Physical reality is made of linearly behaving components combined in non-linear ways.
2. Emergence: Higher level behaviour emerges from this lower level structure.
3. Contextuality: The way the lower level elements behave depends on the context in which they are embedded.
4. Quantum foundations: Quantum theory is the universal foundation of what happens, through applying locally to the lower level (very small scale) entities at all times and places.
5. Quantum limitations: The essential linearity of quantum theory cannot be assumed to necessarily hold at higher (larger scale) levels: it will be true only if it can be shown to emerge from the specific combination of lower level elements.
Eckard Blumschein replied on Sep. 11, 2012 @ 13:48 GMT
"wherever equivalence classes of entities play a key role, ... this is an indication that top-down causation is at play."
The real numbers are defined as equivalence classes. OK, perhaps "functional" equivalence classes are meant, and I do not know what functional means in this context.
Are standing waves a convincing example of top-down causation? As an EE, I would like to distinguish between waves that are really bouncing back and forth in a cavity and the abstract mathematical model which extends endlessly within the abstract fictitious time scale from minus infinity to plus infinity and ignores the trifle that a real "standing" wave always has a beginning and an end, cf. my Fig. 1.
Let me go on searching for something truly basic that elucidates at least one of the various enigmas and paradoxes I listed at the beginning of my essay.
Eckard
Eckard Blumschein replied on Sep. 13, 2012 @ 17:13 GMT
Dear Professor George Ellis,
I reaffirm that my criticism only addresses Georg Cantor's naive set theory and what I consider to be implications of a conclusion from the Michelson-Morley experiment that has been revealed as wrong. I respect your work and will go on trying to understand it. Unfortunately you seem to consider it not worth showing where my essay is wrong. I envy you for facing rich criticism that gives you good opportunities to explain what you meant by top-down. While I am a bit familiar with the interplay of caudal-to-cortical and top-down propagation of information in the case of auditory perception, I never saw this as a reversal of the direction of the causal chain. Your essay seems to be too ingenious for an old EE like me.
With respect and sincere apologies,
Dr. E. Blumschein
Dear Daryl,
Thank you for giving me a hint. I did not forget that someone somewhere asked me for something. This was embarrassing to me. Where do you expect my reply?
Regards,
Eckard
Joel Rice wrote on Sep. 10, 2012 @ 14:13 GMT
You mentioned that the issue was whether algebra can deal with a Modular Hierarchical structure, and the need for graph theory. Just wondering if there are top-down aspects to a supernova - a nice physical example to chew on, especially for being so 'event-like' - and anthropic questions? If so, it seems like there is more going on than modular hierarchy and graph theory, like why elements can't build up without kicking leptons out of the neighborhood.
Author George F. R. Ellis replied on Sep. 10, 2012 @ 18:34 GMT
Hi Joel
well the supernova stuff you mention is mainly due to local reactions in the core, but the context is set by the star as a whole, which generates huge temperatures at the centre due to its global structure and the resulting gravitational field. Hence that's a top-down effect from the star as a whole to the nuclear reaction rates at the centre.
Leptons escaping is possible because the lepton density outside is much smaller than inside (a version of Olbers' paradox for leptons): it could not happen if the star were immersed in a high-density lepton sea, just as the sun could not shine if it were immersed in radiation at the temperature of its surface. Such "non-interference" effects can be thought of as a causal relation: the relation is that a possible interaction does not take place! Isolated systems can only occur because the universe does not interfere with local systems; this will not be the case in some possible universes (e.g. ones that are always immersed in dense gravitational radiation).
George
There is nothing new under the sun wrote on Sep. 11, 2012 @ 23:00 GMT
This is a matter of preference. You cite Dirac's incorrect phrase. I cite the modern correction by P. W. Anderson.
The Science paper does not confound reductionism with causality and shows how the emergence of new properties at higher levels is compatible with the ordinary bottom-up causation of physics. Murray Gell-Mann, another Nobel laureate who is now working in complexity at SFI, has an entire book devoted to such issues. The physiology of the heart is also compatible with ordinary bottom-up causation.
Anyone reading this amusing essay should look at Weinberg's proof of Boltzmann's H-theorem (p.150 of volume 1 of "The Quantum Theory of Fields"). This modern proof of entropy increase is formulated in the language of quantum field theory and avoids approximations, such as the Born approximation or time-symmetry invariance, which are used in ordinary statistical physics proofs. Cosmology is unneeded in the proof of the H-theorem.
You name two cosmologists. Their work is incorrect. One of them gave a talk in Santa Cruz promoting the idea that cosmology is the cause of the second law of thermodynamics. One expert in the audience said:
"Finally, the magnitude of the entropy of the universe as a function of time is a very interesting problem for cosmology, but to suggest that a law of physics depends on it is sheer nonsense. Xxxxxxx's statement that the second law owes its existence to cosmology is one of the [dumbest] remarks I heard in any of our physics colloquia, apart from [Rosenblum & Kuttner]'s earlier remarks about consciousness in quantum mechanics. I am astounded that physicists in the audience always listen politely to such nonsense. Afterwards, I had dinner with some graduate students who readily understood my objections, but Xxxxxxx remained adamant."
You write "It is conceivable this review could lead me to change my opinion". My goal was to expose some elementary facts ignored in your amusing essay.
Anonymous wrote on Sep. 12, 2012 @ 04:46 GMT
1: "Murray Gell-Mann, another Nobel laureate who is now working in complexity at SFI, has an entire book devoted to such issues." That book is about adaptive selection, which is a form of top-down causation, as has been very clearly demonstrated in many writings since the seminal paper on the topic by Donald Campbell; see for example the book The Re-emergence of Emergence edited by Clayton and...
view entire post
1: "Murray Gell-Mann, another Nobel laureate who is now working in complexity at SFI, has an entire book devoted to such issues." That book is about adaptive selection, which is a form of top-down causation, as has been very clearly demonstrated in many writings since the seminal paper on the topic by Donald Campbell; see for example the book The Re-emergence of Emergence edited by Clayton and Davies.
2. "The physiology of the heart is also compatible with ordinary bottom-up causation." This dogmatic statement is simply wrong. Denis Noble is a world authority on the physiology of the heart, and there is no reason what ver to believe you know more about it than he does.
3. "Cosmology is unneeded in the proof of the H-theorem." Indeed. And the result is a theorem which does not resolve the problem of the arrow of time because it continues to hold when you reverse the direction of time. Extraordinary that you call time-symmetry invariance an approximation, when it is at the heart of all fundamental interactions except the weak interaction. Are you trying to say that the cause of the arrow of time is the very small weak time asymmetry of the weak interaction? If so please explain how this works. No one who has looked at it seriously believes this.
4. "You name two cosmologists. Their work is incorrect." I am totally unimpressed by someone hiding behind a cloak of anonymity making such a statement. What standing do you have to dismiss two of the deepest thinkers in cosmology? You then quote some unnamed other person who is supposed to an expert on the subject. The tone of the response shows they, like you, are simply unable to engage with the core issue: it is not that the second law owes its existence to cosmology, it is that the statistical derivation of the second law predicts equally that entropy will increase both to the past and the future because it applies whichever arrow of time you choose. The same applies to any derivation from quantum field theory, unless it claims to derive the arrow of time for all physics from the weak interaction, which is totally implausible. If they and you can't even admit the problem exists, then you have nothing useful to say on the arrow of time.
5. I note with interest how you fail to respond in any way to my remarks on the issue of superconductivity as discussed in Laughlin's Nobel lecture. It shows conclusively your sneering comments are simply wrong, and I do not believe for one minute that you understand it better than he does. No wonder you ignore it.
By the way as regards the pseudonym you hide behind ("There is nothing new under the sun"): this is of course also incorrect. Amongst the thousands of examples I could cite, one will do: digital computers and the internet.
Author George F. R. Ellis replied on Sep. 12, 2012 @ 04:49 GMT
That previous post was me. I thought I was logged in.
George Ellis
George Ellis replied on Sep. 12, 2012 @ 17:30 GMT
Ok I have looked at the Weinberg derivation (pages 150-151 in his book). I agree it's good to have a derivation that depends only on unitarity. Unitary transformations however are time reversible. There is therefore nothing in the dynamics that can choose one time direction as against the other as far as any dynamical development is concerned, just as there is no intrinsic difference between the particles alpha and beta.
Consequently just as in the case of the Boltzmann derivation of the H-theorem, the H-Theorem (3.6.20) will hold for both directions of time (just reverse the direction of time and relabel alpha to beta: the derivation goes through as before). This is the point which is explained very clearly by Penrose in his various books as regards Boltzmann's derivation. Weinberg's derivation of the H-theorem does not determine a preferred direction of time from the underlying unitary dynamics. It can't do so, as there is no preferred direction of time in that dynamic.
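For readers who want to see the point numerically, here is a small sketch (my own illustration of the standard argument, not Weinberg's calculation): unitarity makes the transition probabilities doubly stochastic, and the time-reversed transition matrix is simply the transpose, which is doubly stochastic as well, so the H-functional is driven down whichever direction the transitions are taken in.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
U, _ = np.linalg.qr(A)                # a random unitary matrix
P = np.abs(U) ** 2                    # transition probabilities; doubly stochastic by unitarity

p = rng.random(n)
p /= p.sum()                          # an arbitrary coarse-grained probability distribution

def H(q):                             # Boltzmann-style H functional (negative Shannon entropy)
    return float(np.sum(q * np.log(q)))

# Both the "forward" (P) and the "time-reversed" (P.T) transition matrices lower H,
# so unitarity by itself picks out no arrow of time.
print(H(p), H(P @ p), H(P.T @ p))
```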
Author George F. R. Ellis replied on Sep. 13, 2012 @ 06:56 GMT
The point of the above is that it is impossible to derive the arrow of time - one of the most important aspects of macro physics and biology - from microphysics alone. Both statistical physics and quantum field theory give you a beautiful H-theorem: and the derivation applies equally in both directions of time (this applies for example to Weinberg's derivation of the H-theorem: see my last comment). Supposing you break this symmetry somehow by random fluctuations: you have no guarantee the direction of time will be the same everywhere. We do not see opposing arrows of time in the real universe. Bottom up causation alone is incapable of giving an explanation of one of the most important features of everyday physics.
Consequently, as pointed out by Wheeler, Feynman, Sciama, Davies, Zeh, Carroll, Penrose, and many others, one needs some global boundary condition to determine a consistent arrow of time in local physics: a top-down effect from the cosmological scale to everyday scales. This is what the Santa Cruz "experts" have not understood, even though the issue has been known since the time of Boltzmann and Loschmidt.
This global coordination is plausibly provided by a macro-scale low entropy condition of the early universe, see the writings of Carroll (From Eternity to Here: The Quest for the Ultimate Theory of Time) and Penrose (Cycles of Time). For readers who have not encountered this debate, a useful summary by Sean Carroll is here; and here he gives extensive quotes from Feynman on the issue. For my own summary of this and other top-down effects in cosmology, see here.
George
Georgina Parry replied on Sep. 13, 2012 @ 09:36 GMT
Dear George,
these are just some things, pertaining to your last post, that I think are worth considering, not necessarily requiring an answer now. What is the real universe, in your opinion? Impossible is a very strong word, are you sure?
There is a high resolution, horizontal version of diagram 1 in my essay thread. A reality interface can be a 'simple' light sensitive material such as cine film, which changes in chemical structure when exposed. It does not have to be the sensory system of a sentient organism or a complex artificial detection device. Respectfully, Georgina
Author George F. R. Ellis replied on Sep. 13, 2012 @ 15:54 GMT
Hi Georgina
well the word "impossible" here relates to the relation between two theories that are supposed to describe the real universe on different scales.
If you have a theory T1 that describes it on a small scale L1, you can in principle coarse-grain to get a theory T2 that applies on a larger scale L2. This is what happens for example with the kinetic theory of gases. Now if the theory T1 is subject to a symmetry S1, for example time-reversal invariance, then unless the averaging procedure explicitly breaks that symmetry, for example with an averaging procedure that changes with time, the theory T2 must necessarily exhibit the same symmetry. That is what underlies the discussion above: we believe that fundamental physics is time symmetric (more accurately: it has a PCT symmetry, as Sean Carroll explains), so any derived coarse grained theory must also be time symmetric. There is no option about this: that logic is where the word "impossible" comes from.
However the time asymmetry can come not from the dynamic equations of the theory but from the boundary conditions. That is what Wheeler, Feynman, and many others have explored. Then it is a top-down effect from the context (the environment) to the solutions.
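Schematically (a sketch of the logic only, in notation introduced here for the purpose): write $R$ for time reversal, $\phi_t$ for the micro dynamics with $R\,\phi_t\,R^{-1}=\phi_{-t}$, and $C$ for the coarse-graining map; suppose the coarse-grained dynamics closes, $\Phi_t(Cm)=C\,\phi_t(m)$, and that $C$ commutes with $R$. Then
$$\Phi_{-t}(R\,Cm)=\Phi_{-t}(C\,Rm)=C\,\phi_{-t}(Rm)=C\,R\,\phi_t(m)=R\,C\,\phi_t(m)=R\,\Phi_t(Cm),$$
so T2 inherits the time symmetry of T1; any asymmetry has to come in either through a coarse-graining that fails to commute with $R$, or through the boundary conditions.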
I agree completely about the reality interface. It can also be a plant leaf (chlorophyll acts the way you describe). Indeed any photo detector will do, such as a CCD in a camera, where a free electron is emitted at a detection event. Indeed it is free electron emission that also underlies the detection events you mention. You might find interesting my discussion of detectors in sections 6.2.2 and 6.2.3 of this paper (which also discusses the averaging procedure I mentioned above in some detail).
George
Author George F. R. Ellis wrote on Sep. 12, 2012 @ 12:08 GMT
For those of you who are interested in the relation of this topic to the brain, Karl Friston's article "A theory of cortical responses" is excellent. He emphasizes the key role of hierarchical structuring in the brain. His "forward connections" are what I call bottom-up, and his "backward connections" are what I call top-down (the difference in nomenclature is obviously immaterial). He makes quite clear that a mix of bottom-up and top-down causation is key to how the brain works; backward connections mediate contextual effects and coordinate processing channels.
This is how the theme works out in a genuinely complex case. It is obviously compatible with the underlying physics, because it does indeed work. As in the case of digital computers, which can run any algorithm whatever, in the case of the brain the underlying physics enables us to think, but does not constrain what we are able to think about.
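For concreteness, here is a very small numerical caricature of that mix (my own sketch, far cruder than Friston's scheme; the sizes, weights and learning rates are arbitrary): the higher level sends a top-down prediction of the lower-level activity, the lower level sends the prediction error back up, and both the representation and the top-down model are adjusted until they agree.

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(scale=0.1, size=(8, 4))    # top-down (backward) generative weights
x_low = rng.normal(size=8)                # lower-level activity, e.g. sensory input
x_high = np.zeros(4)                      # higher-level representation

for _ in range(500):
    prediction = W @ x_high               # backward connection: top-down prediction
    error = x_low - prediction            # forward connection: bottom-up prediction error
    x_high += 0.1 * (W.T @ error)         # the higher level adjusts to explain the error
    W += 0.01 * np.outer(error, x_high)   # the top-down model itself is slowly learned

print(float(np.mean(error ** 2)))         # the residual error shrinks as the two directions interact
```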
George
Thomas Howard Ray replied on Sep. 12, 2012 @ 18:06 GMT
" As in the case of digital computers, which can run any algorithm whatever, in the case of the brain the underlying physics enables us to think, but does not constrain what we are able to think about."
Right on, George. Androids may dream of electric sheep, but only a human brain can dream of an android dreaming of electric sheep.
The capacity for infinite regress cannot be programmed into a finite state machine.
Tom
Author George F. R. Ellis replied on Sep. 12, 2012 @ 19:13 GMT
Thanks Tom.
And you may like this one:
Natural Selection and Multi-Level Causation, by Maximiliano Martínez and Andrés Moya, see section 3 for how downward causation is key to adaptive selection, the topic of Murray Gell-Mann's book, and hence to evolution.
George
Anonymous replied on Sep. 13, 2012 @ 11:25 GMT
Hi George,
Indeed, I do appreciate the Martinez-Moya reference.
I have thought for some time that brain science is the next great frontier of knowledge, because my wildest conjecture is that the brainscape perfectly mirrors an isolated cosmoscape in a simply connected network. Not to be too sci-fi on the subject -- as you say, " ... as pointed out by Wheeler, Feynman, Sciama, Davies, Zeh, Carroll, Penrose, and many others, one needs some global boundary condition to determine a consistent arrow of time in local physics:a top-down effect from the cosmological scale to everyday scales." (In fact, that's what my essay in this competition is about.)
The introduction of multi-level causation to biological evolution, multi-scale variety (Bar-Yam) to all systems, IGUS (Gell-Mann & Hartle) to information theory, local arrows of time to cosmology (Ellis, et al) ... and more ... have persuaded me that a continuum of complex multi-scale connections reflect a deep truth of how the universe works.
Martinez and Moya allow, "By highlighting the mutual co-determination between levels of organization in the process of natural selection we have recovered and articulated a multilevel perspective that is absent from previous discussions." And they quote Hitchcock 2003, "The goal of a philosophical account of causation should not be to capture the causal relation, but rather to capture the many ways in which the events of the world can be bound together."
Bar-Yam puts it this way: "In considering the requirements of multi-scale variety more generally, we can state that for a system to be effective, it must be able to coordinate the right number of components to serve each task, while allowing the independence of other sets of components to perform their respective tasks without binding the actions of one such set to another." [Y. Bar-Yam, "Multiscale Variety in Complex Systems." Complexity vol 9, no 4, pp 37-45 2004]. In other words, distributed control -- lateral information -- increases variety. Increased variety increases the coordination strength of the network.
George, may your tribe increase. :-)
Best,
Tom
Thomas Howard Ray replied on Sep. 13, 2012 @ 11:27 GMT
My post above. Can't seem to stay logged in.
Author George F. R. Ellis replied on Sep. 13, 2012 @ 16:07 GMT
Hi Tom
great, we agree on how living systems work: "a continuum of complex multi-scale connections reflect a deep truth of how the universe works."
My point then is that this extends to quantum physics as well: phenomena such as superconductivity are also dependent on such effects. It's a novel view to physics, or more correctly to some physicists, but not to biology.
be well,
George
Yuri Danoyan wrote on Sep. 12, 2012 @ 12:28 GMT
Dear George
I would like to show an interesting story where the top-down approach proves useful:
http://vixra.org/abs/0907.0022
I mean the trick with the inversion of the dark green column.
Andrew H. Norton wrote on Sep. 13, 2012 @ 12:40 GMT
Dear Prof Ellis
I enjoyed reading your essay and was prompted to also read your other papers that you cite. In one of those you mention Wheeler’s delayed choice experiment as "a case of top down causation from the apparatus to the very nature of the particle/wave at the time it passed through the slits." In this instance, top down causation sounds a lot like retro-causation through the role of future boundary conditions (as modeling measurement processes) in selecting what actually comes about.
In this regard, I think you would find interesting Couder and Fort's bouncing-droplet quantum analogues of single-particle diffraction and interference (refs [18], [21], [23] in my essay). Retro-causality also plays a second role in this system: the analogue of de Broglie's pilot wave is a standing wave (not a phase wave); in other words the particle is the source of a semi-retarded plus semi-advanced radiation field.
I have some ideas on the classical to quantum cut that I tried to explain in my essay, again related to retro-causality.
cheers
Andrew
Author George F. R. Ellis replied on Sep. 13, 2012 @ 15:10 GMT
Dear Andrew
your paper is very interesting, and I like its seriousness and originality.
I agree on the possibility of retro-causality: please see this paper for a view that is a bit similar to yours in that regard. I also like the multi-scales in your model, which accords with what I am trying to do here. However of course quantum theory has to do much more than just model an electron: it will be very interesting to see how you take it further.
Best wishes
George Ellis
Yuri Danoyan wrote on Sep. 13, 2012 @ 13:08 GMT
Dr. Ellis
Does The Crystallizing Block Universe mean one single cycle of the Universe?
Author George F. R. Ellis replied on Sep. 13, 2012 @ 16:11 GMT
Not necessarily: it can have multiple cycles provided the transition between them is non-singular. No one has yet given a cyclic theory without singularities of one kind or another.
Frank Martin DiMeglio wrote on Sep. 14, 2012 @ 04:36 GMT
Hi George. Thank you for keeping an open mind in regard to my ideas.
You wrote that you agree with my statement: "If the self did not represent, form, and experience a comprehensive approximation of experience in general by combining conscious and unconscious experience, we would then be incapable of growth and of becoming other than we are." DREAMS PROVE ALL OF THIS: that the self represents, forms, and experiences a comprehensive approximation of experience in general by combining conscious and unconscious experience... AND that dreams are demonstrative of our growth and becoming other than we are. I proved/showed the linked physics (of dreams with waking experience) in my essay as well.
Then you wrote: "The question is how physics allows this to happen. Modular hierarchical structures with both bottom up and top down causation is a key part of the answer." My essay shows exactly how the physics of dreams allows this to happen.
You asked about my statement: "FUNDAMENTAL gravitational and inertial equivalency and balancing fundamentally sits at the heart of physics." I also demonstrate fundamentally balanced and averaged acceleration with this, including balanced attraction and repulsion and instantaneity in the physics of dreams. Electromagnetism, gravity, acceleration, and inertia are in fundamental equilibrium and balance in dreams. I showed this clearly. This demonstrates/proves F=ma fundamentally as well. (Force/energy as F, actually.)
Gravity sits at the heart of fundamental/general unification in physics. Gravity is at the heart of our feeling, vision, and touch. Thoughts and emotions are differentiated feelings. Gravity felt at the VISIBLE ground, so no gravity at the very top of the head where INVISIBLE space enjoins. Inertia and gravity in balance, so VISION begins INVISIBLY inside the eye/body; as gravity (seen and felt) is fundamental to fundamentally stabilized distance in/of space.
Can you please rate and review my essay George? I would appreciate it.
Author George F. R. Ellis wrote on Sep. 14, 2012 @ 05:44 GMT
Hi Frank
dreams may well be important as to how the mind works; people like Freud and Mark Solms have investigated this. But dreams can't be significant for how physics operates: it's the other way round; in the end physics underlies dreams somehow, because physics underlies the brain.
Which physics? You claim "Gravity is at the heart of our feeling, vision, and touch." I side with biophysics in saying it is electromagnetism that plays this role. In fact the principle of equivalence supports this: our bodies function adequately for extended periods in free fall, where there is no effective gravitational force. So gravity can't underlie mind functioning.
I'll put some comment over there.
George
Edwin Eugene Klingman replied on Sep. 14, 2012 @ 17:22 GMT
Hi George,
You say: "...our bodies function adequately for extended periods in free fall, where there is no effective gravitational force. So gravity can't underlie mind functioning."
Unless you are excluding the gravitomagnetic component of gravity, this is not necessarily true.
Edwin Eugene Klingman
Author George F. R. Ellis replied on Sep. 14, 2012 @ 19:21 GMT
Hi Eugene
It's a nice idea, but it won't work because those effects are so weak. For this to work, they'd have to be detected by physical systems on Earth. The most expensive and complex gravitational wave detectors have so far failed to detect the gravitomagnetic component of gravity, see Kip Thorne's discussion of the nature of these effects in two papers at http://xxx.lanl.gov/abs/1208.3038 and http://xxx.lanl.gov/abs/1208.3034. If those detectors can't detect them, then certainly our brains can't.
George
Edwin Eugene Klingman replied on Sep. 14, 2012 @ 20:44 GMT
Hi George,
That is true if Martin Tajmar's measurements are false, but no one has yet shown this to be the case. If instead his measurements are correct, then the coherence factor (kappa) is as I describe in my essay, The Nature of the Wave Function, with potential effects I have described in my earlier essays. The fact that everyone has decided to ignore his results is par for the course, but proves nothing. Yet if he is correct, it is the most revolutionary discovery of recent times. My own position is to treat it as correct and investigate the consequences, which are many. In fact, there are other highly ranked essays here that propose something similar, ignoring what has already potentially been discovered.
I don't expect to convince you, simply to record the fact that measurements exist that suggest an alternative.
Best,
Edwin Eugene Klingman
Author George F. R. Ellis replied on Sep. 15, 2012 @ 07:41 GMT
Edwin, Martin Tajmar's measurements have been disowned by his co-investigator. But in any case they are solar system measurements, which do not relate to what happens here on earth, where the effect is not discernible.
Frank, we simply have very different views of reality and causation. We will have to agree to disagree.
George.
Edwin Eugene Klingman replied on Sep. 15, 2012 @ 17:40 GMT
George,
You stated "Edwin, Martin Tajmar's measurement's have been disowned by his co-investigator. But in any case they are solar system measurements, which do not relate to what happens here on earth, where the effect is not discernible."
Not sure what you're talking about George. I haven't heard this of his co-investigator, and his measurements were most **definitely** performed on Earth, *not* elsewhere in the solar system. I think you're confused. There is a link to his experiment in my essay -- it is performed in a lab on Earth's surface.
Edwin Eugene Klingman
Author George F. R. Ellis replied on Sep. 16, 2012 @ 07:06 GMT
Edwin,
yes you are right, I'd remembered it wrong. It was indeed a laboratory experiment. But it remains the case that the experimental relativity community does not believe it. The problem is that gravity is such a weak force, and generating the gravitomagnetic effect requires large masses moving at high velocity; if they were there, they'd have other, much more easily measurable effects. But I agree it's a nice idea and you have developed it well. It is a good idea.
Frank I have replied on your essay page. I simply don't agree that dreams can be the basis of any scientific theory of the way things are.
George
Edwin Eugene Klingman replied on Sep. 16, 2012 @ 18:43 GMT
George I appreciate your last comment on my thread and would like to make one final comment.
Please note that the source term is mass current density. Thus any quantum effect will depend upon mass density. For example Michael Goodband's particle model is a rotating black hole at the Planck scale which "drags space-time" [that is, produces a C-field]. Such a particle would have an electron mass and a radius of 10^-57 meters which is more than enough to result in a wave of the type I propose. My own model is far less dense, with a radius of ~10^-19 meters and does require the stronger C-field that Tajmar claims to have measured.
Thus the wave function model is mass density dependent and therefore particle model dependent. Also, since mass density is "an ill-defined concept" in general relativity, relativists may be less inclined to credit it. Finally, the mass density of atoms and molecules, and hence solar objects, is very low compared to elementary particles, so Gravity Probe B measurements are as expected, even if a coherence coefficient does exist.
Thanks again for your consideration,
Edwin Eugene Klingman
Frederico Pfrimer wrote on Sep. 14, 2012 @ 17:18 GMT
Dear George,
I am fascinated because the ideas you propose in your essay are essentially some insights I got some years ago. They were from a philosophical or spiritual point of view, and you brought the same ideas into the language of physics. I'm a physicist too, and I know how hard it is to propose anything on this subject. But the connection you established with computer programming is perfect for it. Actually, I believe that once we completely understand computation, programming and their relation with physics, your ideas will be definitely proved right and also deeply clarified.
I think looking at this problem as a problem of language can give us many insights. Continuing or repeating what you say, follow my line: The only way to describe a high level program is using a high level language. A high level program can be implemented in a low level language, but it is not described by it. That is, there are several different implementations of a high level program in a low level language; and each contains more information than the high level one; so, you cannot say that you are describing the same program in the low level language.
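A toy illustration of that point (mine, purely to make the wording concrete): the single high-level specification "sum the list" is realised by several distinct low-level procedures, each carrying details - order of traversal, use of recursion - that the high-level description neither mentions nor fixes.

```python
xs = [3, 1, 4, 1, 5, 9]

def sum_forward(xs):                  # low-level realisation 1: left-to-right accumulation
    total = 0
    for x in xs:
        total += x
    return total

def sum_backward(xs):                 # low-level realisation 2: right-to-left accumulation
    total = 0
    for x in reversed(xs):
        total += x
    return total

def sum_recursive(xs):                # low-level realisation 3: recursion instead of a loop
    return 0 if not xs else xs[0] + sum_recursive(xs[1:])

# All three satisfy the one high-level description "sum the list":
assert sum_forward(xs) == sum_backward(xs) == sum_recursive(xs) == sum(xs)
```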
To say that top-down causation happens is simply to say that physics can "run high level programs". I believe quantum computation is just like assembly programming, so it is the lowest level possible. Therefore, a high level quantum programming language might be the missing ingredient for understanding this. It would make your analogy formally valid in physics!
I believe our mind and any other spiritual element would exist in a high level layer and could only be described by high level languages. And then, as they exist, they would be able to provoke top-down causation in the lowest layer: physics. The interaction between mind and matter would be something of this form. That's why it is not described by current physics: actually our mathematics does not really support top-down causation. For sure it is the missing ingredient for a revolution in physics!
In my essay, "The Final Theory and the Language of Physics", I discuss the relation between language and theory, and try to elucidate the nature of a physical theory. Please, give me some feedback. A computer language is just like a framework for writing physical theories. A high level language would be like a high level framework, with high level concepts and mathematics, and understanding top-down causation will at some time require this.
Best Regards
Frederico
Author George F. R. Ellis replied on Sep. 14, 2012 @ 19:30 GMT
Hi Frederico,
you say "The only way to describe a high level program is using a high level language. A high level program can be implemented in a low level language, but it is not described by it. That is, there are several different implementations of a high level program in a high level language; and it contains more information than the high level one; so, you cannot say that you are describing the same program in the low level language. " Exactly, nicely put.
"I believe our mind [...] would exist in high level layer and could only be described by high level languages. And then, as they exist, they would be able to provoke top-down causation in the lowest layer: physics. The interaction between mind and matter would be something of this form." Exactly. " That’s why it is not described by current physics: actually our mathematics does not really support top-down causation. " Well it's beginning to do so, see the papers By Walker and Davies I cited some way above, and particularly Karl Friston's article "A theory of cortical responses" (I gave the link a while back). We're getting there. Any help in sorting this out appreciated.
I'll take a look at your essay.
George
Frederico Pfrimer replied on Sep. 15, 2012 @ 04:29 GMT
I didn’t know about these results. It is good to know we are making progress in this direction. I’m trying to understand these papers, but they are not at all simple on a first reading. I might work on this subject in the future; it is something that interests me. But first there are simpler things I would like to clarify. Our understanding of complex things may always be bounded by the clarity of our understanding of simple things. Progress can still be made, but it is much harder than when the simple things are already clearly understood.
All the best!
Author George F. R. Ellis replied on Sep. 24, 2012 @ 19:23 GMT
I have now added the following comment on your thread:
Dear Frederico
your essay and associated paper are thought-provoking and deep. It will take time to assimilate them. My main comment for the present refers to this statement of yours:
"I have means to say that the main wrong assumption of physics is not
a physical assumption, but a millenary logical assumption: the principle of excluded middle .. This principle says that a proposition is either true or false, in other words, either the proposition or its negation is true" I think that you might be saying that the truth or falsity of a proposition may depend on its context. That is very close to the concept of contextual effects that I discuss in my essay.
George Ellis
There is nothing new under the sun wrote on Sep. 14, 2012 @ 20:00 GMT
The works of Gell-Mann and Anderson show that emergence is compatible with ordinary bottom-up causation. Anderson writes on the first page of the Science paper: "The elementary entities of science X obey the laws of science Y". He gives a table of X and Y. The first row says that the elementary entities of "solid state or many-body physics" obey the laws of "elementary particle physics", the second row says that the elementary entities of "chemistry" obey the laws of "many-body physics", and so on. At the end of the table Anderson writes "But this hierarchy does not imply that science X is "just applied Y."" Gell-Mann offers a similar analysis in his famous book. Gell-Mann and Anderson criticize reductionism but defend the ordinary bottom-up causation. You confuse reductionism with bottom-up causation.
Denis Noble is not a world authority on the molecular basis behind the physiology of the heart. This molecular basis obeys the laws of physics and chemistry. This is well-known. Superconductivity is a collective phenomenon for a large number of entities that obey the laws of particle physics. This is explained by Anderson in his Science paper. You confuse reductionism with bottom-up causation.
The ideas of your "deepest thinkers in cosmology" about the second law are considered nonsense by all the experts. The reception at the Santa Cruz talk was one example. The proof of the increasing entropy due to Ludwig Boltzmann (who later committed suicide, being surrounded by people not dissimilar to your "deepest thinkers in cosmology" who were unable to appreciate the depth and validity of his key insights into thermodynamics) shows that the rest of the Universe and its history is irrelevant, because of the locality of the laws of Nature.
Time-symmetry is an approximation to other fundamental symmetries of our universe. You consider this "Extraordinary" because you are not familiar with modern physics. What I find extraordinary is that you didn't know the Weinberg derivation before writing your amusing essay. The H-theorem (3.6.20) holds for both directions of time and Weinberg writes about the theorem: "so we may conclude that the entropy always increases". This is very easy to check. Your misunderstanding of the H-theorem is typical of the "deepest thinkers in cosmology".
You repeat a wrong argument by Penrose, but the Weinberg derivation does not assume "time-reversal invariance, which would tell us that |M_{\beta\alpha}|^2 is unchanged if we interchange \alpha and \beta". This general derivation from quantum field theory does not need cosmological speculations.
Your spurious citations to personal blogs and promotional websites of the "deepest thinkers in cosmology" do not change the conclusion: All the examples found in your amusing essay are compatible with ordinary bottom-up causation.
Author George F. R. Ellis replied on Sep. 15, 2012 @ 10:07 GMT
I have replied just below to this comment: the reply somehow got displaced.
Here is the kind of research you dismiss out of hand. Must be amazing to live life with the ability to deny the validity of what so many other highly competent researchers are doing. I suppose you feel safer with your blinkers on. You should take note of the Feynman quote I gave above (Jul. 21, 2012). Or do you look down on Feynman too?
Experimental application of top-down control analysis to metabolic systems.
(PMID:8438233)
Quant PA
Department of Biochemistry, University of Cambridge, UK.
Trends in Biochemical Sciences [1993, 18(1):26-30]
DOI: 10.1016/0968-0004(93)90084-Z
Abstract Metabolic control analysis (MCA) has provided the language and framework for quantitative study of control over flux, or over metabolites, by individual enzymes of a pathway. By contrast, top-down control analysis (TDCA) yields an immediate overview of the control structure of the whole system of interest, giving information about the control exercised by large sections of complex pathways. Unlike MCA, TDCA does not rely on the use of specific inhibitors or genetic manipulation to determine control coefficients. The method and an application of TDCA to ketogenesis are described.
Author George F. R. Ellis replied on Sep. 16, 2012 @ 14:06 GMT
In case you miss the post below, here is a key element I point out there. In view of your statements, I think it needs spelling out.
You state
"The H-theorem (3.6.20) holds for both directions of time and Weinberg writes about the theorem: "so we may conclude that the entropy always increases". This is very easy to check. Your misunderstanding of the H-theorem is typical of the "deepest thinkers in cosmology". "
So the key point is, if entropy always increases, in which direction of time does it increase? Weinberg's derivation has no answer. I'll explain step by step.
Choose a time coordinate t. The theorem as developed by Weinberg, according to you, says that
dS/dt > 0. (1)
Now choose the opposite direction of time:
t' = -t. (2)
As you admit, "The H-theorem (3.6.20) holds for both directions of time" (I show why in my post of Sep. 12, 2012 @ 17:30). Hence it holds also for t'. Therefore the Theorem as developed by Weinberg also says
dS/dt' > 0. (3)
Is (1) true or (3) true, or are both true, or is neither true?
Weinberg's derivation, like Boltzmann's, says both are true. It does not pick out the preferred direction of time which underlies the 2nd law of macroscopic physics.
So in which direction of time does entropy increase? Weinberg's equation (3.6.20) does not provide the answer. It can't explain the most elementary fact about everyday physics.
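To spell the point out in the most elementary terms (a minimal sketch, assuming only that the entropy S of an isolated system is a differentiable function of the chosen time coordinate): under the relabelling t' = -t the chain rule gives
dS/dt' = (dS/dt)(dt/dt') = - dS/dt,
so statement (3), dS/dt' > 0, is just the statement dS/dt < 0. Hence (1) and (3) contradict one another unless dS/dt = 0. A time-symmetric derivation that licenses both can only be describing the approach to equilibrium in whichever direction we choose to label "the future"; it cannot by itself select that direction. The selection has to come from somewhere else, namely from special low-entropy conditions in the past, which is a cosmological boundary condition - a top-down constraint on the local physics.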
So where is the misunderstanding in this elementary line of reasoning? Your sardonic comments are in tatters if you can't reply convincingly.
Maybe if you look at this carefully you'll at last understand what Wheeler, Feynman, Sciama, Davies, Zeh, Penrose, Carroll, and others were on about.
Author George F. R. Ellis replied on Sep. 19, 2012 @ 05:37 GMT
So come on. Which is it?
* Do you have a counter argument showing I'm wrong? If so what is it? Where is the mistake in this elementary logic?
or
* Do you have the stature to concede you and your Santa Cruz experts are simply wrong? - you did not grasp this elementary logic?
or
* will you lurk in the shadows, unable to answer and unable to admit you were wrong? -- proving you don't have the capacity to admit that you are wrong, nor the stature required to apologise for the insulting nature of your comments.
If you give no reply, you choose the last option. Wheeler, Feynman, Sciama, Davies, Zeh, Penrose, Carroll, and others including myself are vindicated, and your condescending comments are discredited.
Saibal Mitra wrote on Sep. 14, 2012 @ 22:48 GMT
Dear George,
I found your essay to be the most interesting one here. I do think that the views expressed in your essay fit better in the Many Worlds interpretation of quantum mechanics, though. One can then argue that the sector an observer is located in is not a precisely defined branch, rather it is in some superposition of states that are in the same functional equivalence class (as defined on page 9 of your essay).
Such a superposition is a complicated entangled state of the system and the environment; such a state defines the computational state of the algorithm that the brain is executing. I explain this in my essay.
This then means that the top-down causation is in principle visible from the microstate of the system as it exists at any given time (although you will in general have a superposition of systems in different macrostates).
Author George F. R. Ellis wrote on Sep. 14, 2012 @ 22:51 GMT
You simply don't understand that at no point have I in any way denied that the laws of physics and chemistry apply at the lower levels. Of course they do. The point is that they do not by themselves determine what happens. That is determined by top down causation, as is abundantly clear for example in the case of the computer, which you conveniently continue to ignore.
It is also clear in the case of superconductivity. Yes indeed it is a "collective phenomenon for a large number of entities that obey the laws of particle physics". It is precisely the collective nature of the phenomenon which means it cannot be accounted for except in terms of the entities that make it a collective phenomenon - that is, the specific crystal structure, which is a higher level of structuration (i.e. at a larger scale of description) than the Cooper pairs. Indeed those pairs do not exist without that higher level structure. That is why superconductivity depends for its existence on that specific higher level structure, and is why you cannot deduce superconductivity in a bottom up way from the behaviour of electrons and protons, as Laughlin points out. I continue to believe he understands it better than you do.
"Denis Noble is not a world authority on the molecular basis behind the physiology of the heart. This molecular basis obeys the laws of physics and chemistry" Of course it does, no one has ever denied that: that is a foundation he and I believe in. But by themselves they do not create a heart nor determine its functioning. Denis Noble is a world expert on the physiology of the heart, and it is when you study that physiology that you find it cannot be understood except in terms of the interplay of bottom up and top down causation, which determines which specific molecular interactions take place where and at what time. Bottom up physics alone cannot explain how a heart comes into being, nor what its design is, nor its regular functioning. Please take the trouble to read what Noble has written on this, instead of denying his understanding of the physics and biology involved.
The very small weak interaction time asymmetry makes no difference to the arrow of time issue, which is the point I was making. "The ideas of your "deepest thinkers in cosmology" about the second law are considered nonsense by all the experts." Who appointed these nameless people as experts in what? I have agreed with you that the Weinberg derivation is a nice one. You still don't get the point. You say "The H-theorem (3.6.20) holds for both directions of time and Weinberg writes about the theorem: "so we may conclude that the entropy always increases" ". Increase in which direction of time? The second law has no content until this question is answered - and Weinberg's derivation can't give an answer to that question, as the first part of the quote above makes clear (that derivation unhelpfully says it increases in both directions of time). "This general derivation from quantum field theory does not need cosmological speculations" --- and it does not solve the arrow of time problem. The nameless "experts" you rely on do not trump Wheeler and Feynman, who understood that the problem is real and is not solved by quantum field theory per se.
"You confuse reductionism with bottom-up causation." Your continued repetition of this phrase shows you simply have not paid attention to the nature of my argument, nor the vary large number of examples confirming it, such as the whole subject of epigenetics. Here is a challenge for you. Explain to me in a purely bottom up way how state vector preparation is possible, as for example in the Stern Gerlach experiment. Quantum physics is unitary, as we all know: how does the non-unitary behaviour of state vector preparation emerge in a purely bottom up way from that unitary dynamics? You won't be able to explain this action without invoking the effect of the apparatus on the particles - which is a form of top down action form the apparatus to the particles.
Why don't you change your pseudonym to something that is not so blatantly false? If you get that so wrong, why should we believe anything you say? And you don't do yourself any favours by the sneering tone you adopt. It just comes across as arrogant.
Author George F. R. Ellis replied on Sep. 14, 2012 @ 22:55 GMT
This response was to the previous post. It somehow got displaced.
Author George F. R. Ellis wrote on Sep. 15, 2012 @ 05:59 GMT
Dear Saibal
that is an interesting perspective. Personally I think quantum theory is incomplete and that we need to find the mechanism that determines state vector projection; but I'll consider your proposal too. But I have never been able to understand what mechanism leads to splitting of the wave function, or what determines when it happens. Also as I understand it, this proposal can't account for the Born rule in any simple way. Deutsch's concept of uncountable infinities of fungible particles is hardly credible.
George
Saibal Mitra replied on Sep. 17, 2012 @ 04:40 GMT
Dear George,
The Born rule in this setting, where you have a system that is perfectly entangled with the environment, can be derived from the symmetries of such a state. Zurek has given a derivation here.
My personal idea on (effective) wavefunction splitting would be to first define the observer as some algorithm that can be in various computational states, e.g. a neural network that, given the coordinates of some points, can recognize certain shapes, such as the points forming a square or a circle, or else recognizes nothing. Then, if we were to give a microscopic description in terms of the electrons etc., the generic state of this system would be some superposition, and you can then collect together the terms that correspond to "circle", "square", and "nothing".
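Schematically (just to fix notation, not a worked out model): write the joint state of the network plus its environment as |Psi> = sum_i c_i |n_i>|E_i>, and gather the terms according to which computational state the network configuration |n_i> realises, so that |Psi> = |Psi_circle> + |Psi_square> + |Psi_nothing>, with e.g. |Psi_circle> = the sum of c_i |n_i>|E_i> over the "circle" configurations. Each branch is itself an entangled system-environment state, and it is these functionally defined branches, not any single microstate, that play the role of the computational state of the observer.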
Author George F. R. Ellis replied on Sep. 17, 2012 @ 11:30 GMT
Hi Saibal
I really like Zurek's work on environmental decoherence, particularly because it is indeed a form of top-down action from the environment to the system. It embodies one of the key forms of top-down action: namely adaptive selection. However I've never thought of it as being a form of the many-worlds view. I'll have to look at it again.
A key point to remember here is that any proposal to deal with the measurement problem must deal with individual cases: dealing with statistics is not enough. Statistical results only exist if individual events occur.
George
David Rousseau wrote on Sep. 16, 2012 @ 14:48 GMT
Dear George,
Your essay casts a valuable light from Physics on the complex way in which causal interactions play out in systems. These complex causal networks make reductionistic interpretations inadequate. Although 'top-down' processes have been recognised in biology and social science (as you point out), this idea cannot find a secure footing in the paradigm until physicists take it on board. Such a foundational understanding is much needed for progress both within and beyond physics. For example, as you rightly point out, deep problems in philosophy of mind hinge on such conceptions of causation.
As you suggested on our essay page, there is a significant overlap between the 'metaphysical drift' of your essay and ours, even though you and we target different problems in foundational knowledge. I think that we could contribute to the position that you are developing, in terms of conceptual clarifications we are developing in our work in Systems Philosophy, which we touch on in our essay. We are working on articulating conceptual understandings for terms such as 'existing thing', 'physical thing', 'concrete thing', 'abstract thing', 'property', 'causal power' and so on, in a way that is broadly consistent with their usage in metaphysical debates and (critically) mutually consistent. Formalizing the definitions you give in your essay in this way would make your point even stronger and clearer, and remove possible misinterpretations of your argument, such as assigning causal powers to 'patterns' rather than to the systems that realize them. This would enable important further distinctions to be made between the things you identify as "existing" and having causal consequences yet being "non-physical". For example, minds would be what they are whatever sense we make of them, but arguably computer programs exist only in terms of the sense we make of them. Such clarifications might be important in the future development of your argument.
Meanwhile, congratulations on writing a clear essay about a perspective that will be important for how our fundamental understandings will develop. We're glad to see it doing so well in the rankings already, and will add our own positive rating!
Best wishes,
David
Author George F. R. Ellis replied on Sep. 16, 2012 @ 15:16 GMT
Dear David
Many thanks for that, I'm glad to know about your project.
" Formalizing the definitions you give in your essay in this way would make your point even stronger and clearer, and remove possible misinterpretations of your argument, such as assigning causal powers to 'patterns' rather than to the systems that realize them. This would enable important further distinctions to be made between the things you identify as "existing" and having causal consequences yet being "non-physical"." Yes indeed. Any help in such clarification will be welcome.
George
Anthony DiCarlo wrote on Sep. 16, 2012 @ 17:41 GMT
Hi George,
Thank you for reading and commenting on my essay. It was most appreciated. I have read your essay many times and feel that many of your ideas correlate well with those I have also contemplated.
You state:
"A key assumption underlying most present day physical thought is the idea that causation is bottom up all the way: particle physics underlies nuclear physics, nuclear physics underlies atomic physics, atomic physics underlies chemistry, and so on."
Yes! However, recall that as we became the ultimate reductionists, we stumbled on the idea that the degrees of freedom for obtaining information at the tiniest level are dual to the physics at the grandiose level, i.e., AdS/CFT. Information we measure at the microscopic scale is the same as the dual information measured on the grandiose scale. It may be that the informational physics at the center of these two measures is where "life" does its measuring. At this central point in the "life measure" we get top/down and bottom/up in fairly equal proportion. It is a kind of life information model having "Feynman-Wheeler 1/2 amplitude (retarded and advanced) summed information", with the causal being the measurable actions and the non-causal being our thoughts that came before the action. The causal actions are subject to being measurable information, while the non-causal actions are the "result" of thoughts that come prior to the action. Currently thoughts are not subject to being measurable information, and this may be why the advanced wave is not subject to any dielectric effects in the space between emission and reception, a tachyonic "think space", allowing Feynman-Wheeler to accurately predict radiation reaction.
In all physics, from Maxwell's equations to Einstein's equations, we get converging and diverging wave solutions, and BOTH can and should be used in the proper manner for obtaining total information (think what would happen if Dirac had thrown away his positive solution... no positron prediction). We tend to throw away the solution that we just can't put our "causal measurement finger" on, but it may be that we are throwing away the top/down or bottom/up information that must be used to accommodate all the information obtained from things we do "measure" ... possibly clearing up all the misconceptions in entanglement, etc., as Feynman-Wheeler actually addressed quite well when proper boundaries are applied.
In your statement:"Hypothesis: bottom up emergence by itself is strictly limited in terms of the complexity it can give rise to. Emergence of genuine complexity is characterised by a reversal of information flow from bottom up to top down"
If the physical characteristics of "life" are what emerges in the top/down approach, we will likely also require information from the bottom/up to explain in utmost detail how it evolved. Could "life information" be a superposition of both top/down and bottom/up, with thoughts and measurable actions comprising the top/down and bottom/up information, respectively?
Have a great day,
Tony DiCarlo
Author George F. R. Ellis replied on Sep. 16, 2012 @ 19:16 GMT
Hi Anthony
Thanks for that.
"Could "life information" be a superposition of both top/down and bottom/up, and, thoughts and measureable actions compromise the top/down and bottom/up information, respectively?"
Yes indeed. May essay kind of takes the bottom up for granted. Maybe I need to emphasize that it occurs too!
George
Author George F. R. Ellis wrote on Sep. 17, 2012 @ 18:15 GMT
Regarding “There is nothing new under the sun”:
Readers of this thread will have noticed I am under persistent attack by an anonymous theoretical physicist who hides behind this ludicrously false pseudonym (counterexample: the internet). He repeatedly claims I have not given one single valid example of top-down causation that cannot be explained by bottom up causation alone. I’m going to do a summary response to his claims here, not because I believe there is any chance he will actually listen to what I am saying and comprehend it, but so that he cannot mislead those of you who have not followed the details of my responses to him.
He gets this result by ignoring or denying inter alia the following examples I have cited:
• The way the human brain functions, as evidenced for example by Chris Frith in Making up the Mind, Dale Purves in Brains: How They Seem to Work, Eric Kandel in The Age of Insight, and Karl Friston in A Theory of Cortical Responses.
• The physiology of the heart, as described by Denis Noble, FRS, in his book The Music of Life and his article A Theory of Biological Relativity. To counter these writings, the anonymous commentator insinuates [Sept. 14 @ 20:00 GMT] that Noble does not understand the molecular basis behind the physiology of the heart: a truly pathetic claim.
Here is Noble’s citation record, which attests both to his standing and his understanding.
• Patricia A Quant: “Experimental application of top-down control analysis to metabolic systems.” Department of Biochemistry, University of Cambridge, UK. Trends in Biochemical Sciences [1993, 18(1):26-30].
• The way digital computers work, as briefly mentioned in my essay, and developed in more detail in my talk at the recent Manchester Turing meeting.
• The arrow of time issue that has preoccupied many great physicists since the time of Boltzmann. He believes that Weinberg’s quantum field theory derivation of the H-theorem solves this problem, even though (as he himself admits) that derivation is time symmetric: just as with Boltzmann’s derivation, the H-theorem (3.6.20) that Weinberg gives in his book The Quantum Theory of Fields I holds equally for both directions of time (just reverse the direction of time and relabel alpha and beta: the derivation goes through as before). This result cannot therefore solve the arrow of time problem in a bottom up way [see my post of Sept 16 @ 14:06 GMT], no matter how emphatically he and his mentors deny this basic fact. They are apparently ignorant, for example, of the Wheeler and Feynman absorber theory of radiation, which would be unnecessary if quantum field theory by itself solved the problem.
• The fact that the mechanism of superconductivity cannot be derived in a purely bottom up way, as emphasized by Nobel Prize winner Bob Laughlin; see the Appendix to my essay for Laughlin’s statement in this regard. The reason is that the existence of the Cooper pairs necessary for superconductivity is contingent on the nature of the ion lattice, which is at a higher level of description than that of the pairs; they would not exist without this emergent structure.
• The fact that state vector preparation, as for example in the Stern Gerlach experiment, cannot be explained in a purely bottom up way, because it is non-unitary; see here for an analysis and many other examples.
It only requires one of these examples to be true for his whole dismissive thesis to fall apart. But they are all true.
He apparently believes I am denying the validity of the bottom level physics. This is of course incorrect: what I say is based on an uncompromising stand that the lower level physics is indeed valid, as is quite clear in my paper on quantum physics. The key issue is what determines which specific aspect of the underlying physics is deployed when and where; and that is where top-down effects from the context come in, embodied in constraints on what happens at the lower levels. This is for example extremely clear in the case of epigenetics: see Gilbert and Epel’s excellent book Ecological Developmental Biology.
What I am pointing out in my essay is that physics does not by itself determine what happens in the real world; see also my Nature article. Physics per se cannot account for the existence of either a teapot or a Jumbo jet airliner, for example. You need a somewhat larger causal scheme to understand where they come from. Please see the quote from David Deutsch posted by J C N Smith on my thread on July 20 @ 13:06 GMT for a great comment in this regard. Another example is particle collisions at the LHC at CERN: these are the result of top-down action from abstract thoughts in the minds of the experimenters to the particle physics level. Without these thoughts, there would be no such collisions.
The postings by this anonymous commentator are a textbook example of the enormous arrogance that infects part of the theoretical physics community, who live in intellectual silos disconnected from the rest of physics, let alone the rest of science, and then look down on those outside these silos in the belief that they themselves are superior to all around (readers of this thread may find relevant my comments on fundamentalism in academia; see sections 2-4 of this paper).
Please do not delete his postings: sociologists of science will have a field day analysing them in years to come. The casual insults that are taken to be a normal part of scientific discourse, replacing rational argument, are classic. The idea of respecting those you disagree with - the basis of civilised discourse - is non-existent, as is the idea that one might have to revise one's own ideas in the face of counter evidence. Extraordinary that he is willing to present this as the public face of theoretical physics.
George Ellis
Anonymous replied on Sep. 18, 2012 @ 12:27 GMT
Hi George,
Sociological implications aside, your withering rebuke of the O.P. has much value for the scientific implications of a fully relativistic theory at multiple scales. That " ... existence of the Cooper pairs necessary for superconductivity is contingent on the nature of the ion lattice, which is at a higher level of description than that of the pairs ..." conveys the physical reality of uncollapsed potential; i.e., the information exchange between particles in the dynamic Cooper state has the particles conspiring to maintain zero angular momentum -- which IMO is fully translatable to higher levels of organization as pure unitary wave function. E.g., conceivably able to deal with questions of large scale phenomena, such as posed by Tanmay Vachaspati "What does an observer who falls into the collapsing object experience?" and Vesselin Petkov, "Can gravity be quantized?"
Point is, the distribution of causality at all levels of organization blurs the distinction between particles -- the "bottom" of the hierarchy -- and systems of particles interacting with other systems to create top down causality.
Back in May, I wrote a short piece that I never submitted or posted anywhere, "A fermionic condensate test of Bell's Inequality & local realism" that agrees with Lucien Hardy's statement, "I anticipate that quantum gravity will be a theory having indefinite causal structure whereas quantum theory has definite causal structure." I will attach it to a post on my own essay site ("The Perfect First Question"). I hope you get a chance to read it, as well as my essay.
George, your forum has become quite a clearinghouse for state of the art research in interdisciplinary science! I think it represents the best of what I perceive that FQXi is about.
All best,
Tom
Thomas Howard Ray replied on Sep. 18, 2012 @ 12:30 GMT
Author George F. R. Ellis replied on Sep. 18, 2012 @ 15:05 GMT
Hi Tom
thanks for that. Your article is very interesting: it will take me time to digest. I am a slow thinker, but quite thorough when I get there!
"Point is, the distribution of causality at all levels of organization blurs the distinction between particles -- the "bottom" of the hierarchy -- and systems of particles interacting with other systems to create top down causality." Yes indeed, but it is subtle. The ions create the lattice with electrons imbedded and this then creates the possibility of existence of phonons, Cooper pairs, and so on (if its properties are just right). So an inevitable conjecture is, is the existence of the electrons and protons also an outcome of some top-down effect? Maybe those who hold that only fields are real and particles are just excitations of fields are already saying that, but I've always had difficulty getting my head round that one - particularly because particles seem so solid and durable at the macro level.
The thing is that physics has so many aspects, and each aspect can be described in so many different ways (as Roger Penrose points out in The Road to Reality), and each of us is an expert in some corner of the thing - but seeing how it fits together coherently is the difficult part. Yes, it's easy to learn specific formalisms and calculation tricks and apply them to some part of the whole. That does not necessarily give enlightenment.
Cheers
George
Author George F. R. Ellis wrote on Sep. 18, 2012 @ 05:57 GMT
Correction: "Extraordinary that he is willing to present this as his public face" should read "Extraordinary that he is willing to present this as the public face of theoretical physics". He himself, being anonymous, has no face at all: maybe that's why he feels safe making these derogatory remarks.
Roger Penrose has been one of the most able and creative thinkers in mathematical physics for the past 50 years, inter alia transforming general relativity theory. For someone with no discernible academic record of any kind to denigrate him in this way is outrageous. The senior physicists who have mentored this anonymous commentator have truly failed him by letting him think this behaviour is acceptable - and they have badly let down theoretical physics too. Do you really want to present the subject in this extraordinarily negative light? Is this the atmosphere you want to encourage? It is actually possible to do better.
George Ellis
Yuri Danoyan wrote on Sep. 18, 2012 @ 14:15 GMT
What is your attitude to Gerard 't Hooft's "Discreteness and Determinism in Superstrings" (arXiv:1207.3612)?
Author George F. R. Ellis replied on Sep. 18, 2012 @ 15:25 GMT
Yuri,
I think he is an original and interesting thinker, and this is certainly worth pursuing. However I think cellular automata are rather limited in what they can do, despite Wolfram's propaganda. Yes, I know they are Turing equivalent - but not in any practical way.
The usual concept of cellular automata, as I understand it, relies on symmetry: no neighbour is distinguished from any other. That's precisely what is *not* the case when top-down constraints are in place, e.g. the wiring in a computer channels causation at the lower levels in precisely specified ways between the components. Effective potentials also break lower level symmetry in a similar way. It is this symmetry breaking that creates possibilities of higher level complexity (Anderson points out the key role of symmetry breaking in emergence). So cellular automata are not the way I'd go - or at least not simple versions of that idea. Additionally I'm not a great fan of string theory, so I don't find the combination of the two ideas compelling.
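To illustrate the contrast I have in mind, here is a toy sketch (purely illustrative, my own): a one-dimensional automaton in which every cell is updated by the same rule over an undistinguished neighbourhood, versus the same local rule applied through a fixed wiring table, so that which cells influence which is imposed from above:

# Toy sketch: a symmetric cellular automaton update versus a top-down "wired" update.

def ca_step(state, rule):
    # Uniform update: every cell sees its left and right neighbours in the same way.
    n = len(state)
    return [rule(state[(i - 1) % n], state[i], state[(i + 1) % n]) for i in range(n)]

def wired_step(state, rule, wiring):
    # Same local rule, but each cell's inputs are fixed by a wiring table
    # imposed from above, breaking the neighbourhood symmetry.
    return [rule(state[a], state[i], state[b]) for i, (a, b) in enumerate(wiring)]

xor_rule = lambda left, centre, right: (left + centre + right) % 2

state = [0, 1, 0, 0, 1, 0, 1, 0]
wiring = [(3, 5), (0, 7), (6, 1), (2, 4), (7, 0), (1, 6), (5, 3), (4, 2)]

print(ca_step(state, xor_rule))            # neighbour-symmetric update
print(wired_step(state, xor_rule, wiring)) # same rule, channelled by the wiring

The lower level rule is identical in both cases; what differs is the constraint structure within which it operates, and that structure is not something the rule itself generates.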
Actually what I do find very intriguing is 't Hooft's work on conformal gravity - but that's another story.
George
Yuri Danoyan replied on Oct. 13, 2012 @ 03:20 GMT
Is the quantum gravity problem a genuine problem or a pseudoproblem?
Viraj Fernando wrote on Sep. 18, 2012 @ 15:07 GMT
Dear Dr. Ellis,
I have read your essay with interest. As you have pointed out there are hierarchies of structures which influence the structure at lower levels.
In my view, interactions occur within a given structure only with a limited independence. In every interaction there is an overriding-interaction forming an organic linking with the next higher level of the hierarchy. This will be seen from the following. Why we have not realized this in regard to the Lorentz transformation is because hitherto it has always been given a kinematic interpretation. (In my essay I explain and derive it from dynamic principles).
IS LORENTZ TRANSFORMATION A MANIFESTATION OF A TOP DOWN CAUSATION?
We live in a state of indoctrinated amnesia in regard to earth’s gravitation, when we apply Newtonian mechanics or SRT to problems of motions of particles. But occasionally we wake up to the fact that earth’s gravitational field matters. But it never comes to mind that earth’s gravitational field is enveloped within the Sun’s field.
So the question is, is there a top down causation on the motions of particles on earth by the Sun’s gravitational field?
As we know Earth and all its part ‘fall’ towards the Sun as it orbits. When a particle is set in motion at velocity v relative to earth, it not only has its own intrinsic energy Mc2 (which was falling towards the Sun even at rest), now it has an additional quantity of energy p’c = Mvc, and it too has inertia Mv/c. This p’ too must orbit with the earth as well as ‘fall’ towards the Sun along with the particle on which it acts.
This means that p’ must create out of itself a component, that renders itself to co-move with the earth at the orbital velocity u. So the component momentum is (Mv/c).u. (energy Mvc.u/c). Then what is left of p’c for motion relative to earth is only p”c = Mvc(1-u/c).
Just like the earth when it orbits at the velocity equal to the sq rt of the gravitational potential, it develops a centrifugal force equal and opposite to sun’s attraction, so the energy p’c in motion (along with the particle) is equipped now with a centrifugal force opposing sun’s attraction.
The momentum that is available for motion relative to earth is
p” = Mv(1-u/c) of which the velocity v’ = v(1-u/c) and hence the displacement x’ = v’t = vt(1-u/c).
How this displacement x’ turns out to be x’ = gamma (x –ut) is explained in my essay: http://fqxi.org/community/forum/topic/1549
I also attach the MS Word Doc version, since the diagrams have not come out well in the pdf version above
You would find that within the immediate space, interactions occur seemingly independently, as if there were no interference from the next envelope of the hierarchy. Or so we assume. But there is 'top down causation': a top down interaction is always occurring, forming an organic link with the background energy field.
I have shown in my essay the parallel between Lorentz transformation and the impossibility of the perpetuum mobile in terms of this top down interaction. It is in regard to this connection Einstein wrote: . “By and by I despaired of the possibility of discovering the true laws by means of constructive efforts based on known facts. The longer and the more despairingly I tried, the more I came to the conviction that only the discovery of a universal formal principle could lead to assured results. The example I saw before me was thermodynamics. The general principle was there given in the theorem: laws of nature are such that it is impossible to construct a perpetuum mobile” ( p.53).
Best regards,
Viraj
attachments:
6_A_TREATISE_ON_FOUNDATIONAL_PROBLEMS_OF_PHYSICS2.doc
Author George F. R. Ellis replied on Sep. 19, 2012 @ 05:52 GMT
Hi Viraj
yes I agree: "So the question is, is there a top down causation on the motions of particles on earth by the Sun’s gravitational field?" - indeed we live in that environment and it has some effect. But it is a small effect, because of the equivalence principle: the Earth and all on it fall together freely in the Sun's gravitational field, so we feel the effect of that field only through the tidal force due to the Sun. This is mediated by its free gravitational field: the Weyl tensor it generates here on Earth.
"But there is ‘top down causation’ always a top down interaction occurring to form an organic link with the background energy field." - in principle yes; but it will not have an local discernible effect if it is a uniform gravitational field. Only inhomogeneity will be effective (that is the equivalence principle).
George Ellis
Author George F. R. Ellis wrote on Sep. 19, 2012 @ 10:52 GMT
Addendum:
It is the electric part of the Weyl tensor that represents tidal forces. If the magnetic part were non-zero, that would generate the kind of GEM effect that Edwin Eugene Klingman considers in his essay, a different form of top-down effect, from rapidly moving massive objects to the local environment. However the magnetic Weyl tensor components are probably very much smaller than the electric ones: there are no rapidly moving (in relativistic terms) massive objects in the solar system vicinity.
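For definiteness (in the usual 1+3 covariant notation, with the caveat that sign and index conventions vary between texts): relative to an observer with 4-velocity u^a, the Weyl tensor C_{abcd} splits into an electric part E_{ab} = C_{acbd} u^c u^d and a magnetic part H_{ab}. In vacuum the relative acceleration of neighbouring freely falling particles with separation xi^a obeys the geodesic deviation equation
d^2 xi^a / dtau^2 = - E^a_b xi^b,
so it is E_{ab} that encodes the tidal distortion, while H_{ab} is associated with frame-dragging and gravitational-wave type effects.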
George
Jose P. Koshy wrote on Sep. 19, 2012 @ 13:24 GMT
Dear George,
I read your essay. It is well written and your argument is very clear. However, I would like to point out that the statement "The concepts that are useful at one level are simply inapplicable at other levels" is an assumption. It may not be true. It is that assumption that leads to the question of whether causation should run from bottom to top or the reverse.
Bottom-up and top-down causation are equally possible, and you have pointed out examples. This may indicate that the system is in equilibrium and the ongoing process is a reversible one. The universe may be in a state of equilibrium at any instant, and the expansion may be a reversible process.
Another point where I disagree is with the statement "random events take place at the micro level". If what happens at the micro level is random, then surely top-down causation will not take place. But we find that in a given situation the events happen in a pre-determined way. Otherwise, when we type A, we could expect any character to appear on the computer screen. The whole of computer programming becomes possible just because the events are deterministic.
Author George F. R. Ellis replied on Sep. 19, 2012 @ 18:33 GMT
Dear Jose
You say "the statement "The concepts that are useful at one level are simply inapplicable at other levels" is an assumption. It may not be true." Agreed. There are a few concepts that remain valid at higher levels: energy and momentum for example. But in most cases the relevant variables are very different at different levels. I look at this in some detail in my paper
here .
"Another point where I disagree is with the statement "random events take place at the micro level". If what happens at microlevel is random, then surely top-down causation will not take place." Well random events are what happens at the bottom level, whether we like it or not, inter alia because of the validity of quantum theory. Additionally there are classical statistical fluctuations at the lower levels, and this plays quite a role in biology, as is now becoming evident. This actually facilitates adaptive selection (a key form of top-down causation) because it provides an ensemble of items or behaviors that can be selected from to attain some higher level goal.
The remarkable thing, as you point out, is that reliable behaviour can emerge at higher levels from this unsteady lower level foundation. Basically both engineering and biology have learned to construct robust devices in the face of this fluctuating lower level behaviour. That's largely an effect of large numbers, combined with the stability of macro structures in energetic terms, plus the fact that classical structures do indeed emerge from the underlying quantum dynamics. It's the power of coarse-graining: lower level details are usually irrelevant as far as higher level structures are concerned.
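A trivial numerical sketch of the large-numbers point (illustrative only): each micro element below is maximally noisy, yet the coarse-grained macro average is highly reliable, and becomes more so as the number of elements grows.

# Toy sketch: noisy micro behaviour, reliable macro behaviour.
import random

def macro_average(n_micro, trials=200):
    # Each micro element fires 0 or 1 at random; the macro variable is their average.
    outcomes = [sum(random.randint(0, 1) for _ in range(n_micro)) / n_micro
                for _ in range(trials)]
    mean = sum(outcomes) / trials
    spread = (sum((x - mean) ** 2 for x in outcomes) / trials) ** 0.5
    return mean, spread

for n in (10, 1000, 10000):
    mean, spread = macro_average(n)
    print(n, round(mean, 3), round(spread, 4))  # the spread shrinks roughly as 1/sqrt(n)

The lower level details (which particular elements fired) make no difference to the higher level variable, which is why robust macro behaviour can sit on top of a fluctuating micro foundation.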
George Ellis
Viraj Fernando wrote on Sep. 19, 2012 @ 15:41 GMT
Dear Dr. Ellis,
Thank you for your response. I would like to mention that Physicists who are ‘imbued with their mother’s milk’ into believing in Einstein’s principle of relativity, are blinkered not to see the top down structure, between the motion of a particle relative to earth and earth’s GRAVITATIONAL MOTION round the sun.
One of the great damages that happened to physics is due to Poincare’s rejection of Galileo’s principle of relativity, whose BASIS is that an object moving relative to earth shares also a MOTION IN COMMON with the earth.
Galileo: “Then let the beginning of our reflections be the consideration that whatever motions comes to be attributed to the earth must necessarily remain IMPERCEPTIBLE to us and as if non-existent, so long as we look only at terrestrial objects; for as inhabitants of the earth, WE CONSEQUENTLY PARTICIPATE IN THE SAME MOTION” (p. 114).
As an analogy he wrote: “The cause of all these correspondences of effects is the fact that the ship’s motion is common to all the things contained in it” (p. 187).
Poincare culled off the common motion with the local reference frame from Galileo’s principle of relativity, and MISCONSTRUED that all IFR are equivalent, and that laws of physics are the same in all IFR. And Einstein accepted it uncritically in the formulation of SRT.
The absence of COMMON MOTION in our frame of thinking is why we cannot conceive the top down effect of Earth’s gravitational motion round the sun on a motion of a particle relative to earth.
What Galileo stated is that the top down effect is imperceptible. The reason why they thought it imperceptible is that at the low velocities of the objects Galileo and Newton were dealing with, the effect is extremely minute, since the MATHEMATICS DETERMINING THIS EFFECT IS NON-LINEAR. It is at high velocities that the non-linearity comes out of its relative dormancy and begins to manifest its effects exponentially. So although both Galileo and Newton made mention of the COMMON MOTION, physics developed as if this were of no consequence, because of the relative dormancy of the mathematics determining the top down effect.
But when experiments with very fast moving particles began at the turn of the 20th century, new phenomena arose which seemed to contradict classical views. See Lorentz's 1904 paper, which led to the formulation of the empirical equation that came to be known as the “Lorentz transformation”. He begins the paper: “The problem of determining the (top down) influence exerted on electric and optical phenomena by a translation, …. IN VIRTUE OF EARTH’S ANNUAL MOTION ….”.
Thus Lorentz came very close to giving the top down interpretation to the “Lorentz transformation”. If a determined and a consistent effort was continued to get to basics in terms of dynamics, then the problem would have been solved. But unfortunately Einstein turned it into a kinematic postulate of SRT and the matter has got buried under this theory ever since.
There is one question I would like to ask you in regard to SRT’s position on the LT’s. SRT never discovered or predicted LT. It took hold of Lorentz’ empirically developed equation and turned it into a postulate. And when subsequent experimental results confirm LT (“space co-ordinate”), the credit goes to SRT. But if SRT’s interpretation about LT is correct, then along with the confirmation of the ‘space co-ordinate’, the ‘time co-ordinate’ too must be experimentally confirmed. The question I ask you is that has there been even a single experiment which has confirmed, that when x’ = gamma(x –ut) the corresponding time is t’ = gamma(1- ux/c2)gamma?
I have pursued the Top Down influence of Earth’s gravitational motion on the relative motions of objects on earth in my essay: http://fqxi.org/community/forum/topic/1549
I have followed Einstein’s efforts (besides relativity) to develop what he called the ‘Right Way’ by extending the approach of thermodynamics to whole of physics. I have explained why Einstein could not succeed in spite of his convictions – This is because he left out the possibility of top down effect of earth’s motion from his frame of thinking.
Once the top down effect is properly taken into account (as I show in my essay), the general equation of motion of a particle derived from first principles becomes:
x’ = gamma[vt(1 –u/c)] and t = t,
I have shown that a) when v tends to c, the equation turns to LT and b) when v is very much less than c, x = vt.
Hence this general equation holds for all velocities (from very low to near light velocities). By this, the schism in physics whereby Newtonian physics is valid only for low velocities and SRT is valid only for near light velocities disappears.
I am attaching the MS Doc version of my essay because the diagrams pdf version in this website have not come out properly. I request you to comment on my essay.
Best regards,
Viraj
attachments:
8_A_TREATISE_ON_FOUNDATIONAL_PROBLEMS_OF_PHYSICS2.doc
Author George F. R. Ellis replied on Sep. 19, 2012 @ 18:45 GMT
Dear Viraj
"The question I ask you is that has there been even a single experiment which has confirmed, that when x’ = gamma(x –ut) the corresponding time is t’ = gamma(1- ux/c2)gamma? " Yes - the decay of cosmic ray particles. This is discussed in most standard texts on special relativity, for example Flat and Curved Spacetimes (Ellis and Williams).
Special relativity is an extraordinarily well verified theory, within its domain of applicability; apart from predicting nuclear energy and nucleosynthesis, all those collider experiments at places like SLAC and CERN verify it millions of times over each time they do a run. I don't think there is much mileage in trying to show it is a wrong theory. It's not something I'd spend time on.
George
Viraj Fernando replied on Sep. 19, 2012 @ 21:56 GMT
Dear Dr. Ellis,
1. I am bringing to your attention that in my essay, following EINSTEIN'S TRAIL in the search for the parallel between the perpetuum mobile in TD and the Lorentz transformation, I show that LT is THE TOP DOWN EFFECT of earth’s motion on a relative motion of a particle on earth. By this I am giving you the greatest gift to confirm your essay on “Top Down Causation”. But perhaps because of your dogmatic acceptance of SRT (in spite of Einstein's own misgivings about it) you refuse even to read my essay and consider whether it could be the case. You say: “I don't think there is much mileage in trying to show it is a wrong theory. It's not something I'd spend time on.”
http://fqxi.org/community/forum/topic/1549
2. "The question I asked you is that has there been even a single experiment which has confirmed, that when x’ = gamma(x –ut) the corresponding time is
t’ = gamma(1- ux/c2)t?
(I correct my typo in my earlier post)
And your answer: “Yes - THE DECAY OF COSMIC RAY PARTICLES. This is discussed in most standard texts on special relativity, for example Flat and Curved Spacetimes (Ellis and Williams)”.
I AM SORRY DR. ELLIS, YOU ARE MAKING A VERY GRAVE ERROR. YOU ARE CONFUSING BETWEEN THE SO-CALLED TIME DILATION EQUATION (1) AND LORENTZ TIME TRANSFORMATION EQUATION (2)
t’ = t/(1 – v2/c2)^(1/2)    (1)
t’ = gamma(1 – ux/c2)t    (2)
You have said “all those collider experiments at places like SLAC and CERN verify it millions of times over each time they do a run”, I agree with you subject to what is stated below.
In those “millions of times” of verifications, what was verified was
a) that displacement is given by x’ = gamma(x –ut) where U IS THE VELOCITY OF ORBIT OF THE EARTH and gamma determined by u. Hence the gamma-factor (for all experiments conducted on earth) is a constant. Gamma = 1.000000005.
b) And in the experiments to verify the decay time of a muon at CERN, it confirmed the ‘time dilation equation’ t’ = t/(1 – v2/c2)^(1/2). In this, v is the velocity of the particle in the gamma-factor, and in this equation gamma is a variable. In the CERN experiment v = 0.99c and gamma = 7.088, and when moving in a cosmic ray (as in the Feynman example below), v = 0.9c and gamma = 2.294.
To quote from Feynman: “For example, before we have any idea at all about what makes the meson disintegrate, we can still predict that when it is moving at nine-tenths of the speed of light the apparent time that it lasts is (2.2x10^-6)/sqrt[1 - (9/10)^2] sec; and that our prediction works …” (Vol I, Ch 15-7).
For SRT to be correct on its fundamental contention on the Lorentz transformations, when the muon decays after moving through a displacement given by x’ = gamma(x –ut) the corresponding time has to be given by (2)
t’ = gamma(1- ux/c2)t.
But this time is given by (1).
Here is the fundamental contention of SRT in Einstein’s own words: :.. “The insight which is fundamental for special theory of relativity is this: The assumptions 1)[constancy of the velocity of light] and 2) [principle of relativity] are compatible if relations of a new type (‘Lorentz transformation’) are postulated for the conversion of co-ordinates and THE TIME.”(1, p. 55).
I WILL ASK THE QUESTION AGAIN. CAN YOU GIVE EVEN ONE EXPERIMENT THAT HAS CONFIRMED THE EQUATION t’ = gamma(1- ux/c2)t?
(I will follow this with another post giving a brief introduction to the content of my essay).
Best regards,
Viraj
Anonymous wrote on Sep. 20, 2012 @ 05:15 GMT
Regarding the issue of the arrow of time and the H-theorem:
When I wrote my essay, I assumed that any competent present day physicist would be aware of the basic issues arising as to the time reversibility of fundamental physics and the arrow of time. It has become painfully obvious through this thread that this is not the case. Yes, of course it is PCT invariance rather than just T invariance that underlies present day particle physics: that makes no difference whatever to Weinberg's derivation of the H-theorem, which is based on unitarity. Unitary transformations are T-invariant. Weinberg's derivation of the H-theorem consequently does not solve the arrow of time issue in a purely bottom up way (my posting of Sep. 16, 2012 @ 14:06 GMT explains this in painful detail). If Weinberg had introduced some element related to collapse of the wave function into his argument, the situation would be different: but he does not do so.
For those of you who want a concise analysis of the issue by someone other than Penrose or Carroll, who it turns out are regarded with total disdain by some Californian physicists,
here is a clear presentation of the issue by Craig Callender. This carefully explains, in the proper historical context, why some kind of cosmological condition is necessary to resolve the arrow of time issue, as has been realised by many great physicists including Einstein, Feynman, and Schroedinger. If you take the trouble to analyse it properly, you will see that bottom up effects alone are not able to resolve the arrow of time issue.
George Ellis
James Lee Hoover wrote on Sep. 20, 2012 @ 05:26 GMT
George,
You say,
"One of the basic assumptions implicit in the way physics is usually done is that all causation flows in a bottom up fashion, from micro to macro scales. However this is wrong in many cases in biology, and in particular in the way the brain functions."
My essay cites empirical evidence such as the trapping of anti-matter in space and the perceived weightlessness in thousands of sightings of UFOs in our atmosphere as good hard evidence. Macro and micro studies try to address this mystery. Is it an exception to your thoughts?
Jim
Author George F. R. Ellis replied on Sep. 20, 2012 @ 05:41 GMT
Dear James
I'm afraid I don't take UFO sightings seriously as evidence about fundamental physics.
George
Author George F. R. Ellis wrote on Sep. 20, 2012 @ 05:28 GMT
Dammit the system logged me out. That was me.
And
here is the correct link, I hope. This websystem should allow one to look at the posting in its final form before putting it up: then these errors could be avoided.
George
Viraj Fernando wrote on Sep. 20, 2012 @ 07:07 GMT
Dear Dr. Ellis,
1. First of all I must let you know that I am not a 'special relativity' denier, in the sense of rejecting a) the principle of constancy of the velocity of light, b) the validity of the displacement equation of the Lorentz transformation, c) the slowing down of the internal processes of a particle when in motion, d) the transverse Doppler effect (TDE) of light, e) that matter particles cannot move at the velocity c, etc.
I am glad to say that I not only accept these empirical facts, which you call the "tightly integrated package", but have also derived them by extending the principles of TD (thermodynamics) into the whole of physics, as Einstein intended.
"By and by I despaired of the possibility of discovering the true laws by means of constructive efforts based on known facts. The longer and the more despairingly I tried, the more I came to the conviction that only the discovery of a universal formal principle could lead to assured results. The example I saw before me was thermodynamics. The general principle was there given in the theorem: laws of nature are such that it is impossible to construct a perpetuum mobile" (Einstein's Autobiography, p. 53).
By extending the principles of TD (as Einstein intended), I have: a) proved how the velocity of light remains constant in a given medium; b) shown how the TDE occurs; c) with the TDE, shown how the null result of the MMX comes about; d) shown extremely accurately how an atomic clock in a GPS orbit loses 7.213 ns/day; e) using the same algorithm used to calculate the above time delay, proved why a matter particle cannot move at velocity c.
So if you like, I have provided a "tightly integrated dynamic foundation" for the "tightly integrated package", which has so far been a collection of ad hoc kinematic assertions, thus fulfilling Einstein's dream of having a theory of principles in place of the makeshift constructive theory he created provisionally.
I am not a relativity denier in the sense of rejecting the 'package'. I don't throw the baby out with the bath water. But you must admit that the LT time equation falls into the category of 'bath water'. It is not an item in the 'tightly integrated package'.
2. You wrote: “yes I agree that that specific equation per se has not been verified but time dilation has, which is its core element”.
No, it is more than that. The millions of experiments you mention, which have proved the LT space equation right, have at the same time proved the LT time equation to be false.
If the ‘specific equation’ has not been 'verified', and there is the other equation which is the core element, does it not mean that the whole contention around the ‘specific equation’ is false?
I am glad that you have the honesty and courage to effectively admit that no experiment has proved the fundamental contention of SRT, which is: "The insight which is fundamental for the special theory of relativity is this: The assumptions 1) [constancy of the velocity of light] and 2) [principle of relativity] are compatible if relations of a new type ('Lorentz transformation') are postulated for the conversion of co-ordinates and the TIME." (1, p. 55).
You wrote: “I don't have to have a test of that one specific equation in order to test the theory as a whole”.
But according to Einstein this equation is a FUNDAMENTAL premise for SRT. It is this time equation that transcends the contradiction between his other two postulates as you can see from the above quote from Einstein. So does not the theory fall apart on this account?
But the 'integrated package' remains with the "Right Way" - the TD interpretation!!
3. You wrote: “I call it putting my attention to items that are likely to lead to progress”.
It is progress towards what Einstein indicated as the "Right Way" that I am drawing your attention to. Einstein wrote: "If, then, it is true that the axiomatic basis of theoretical physics cannot be extracted from experience but must be freely invented (fictitiously), can we ever hope to find the right way? Nay more, has the right way any existence outside our illusions? ……" We need to note that in answering the above question, Einstein firmly asserted that the right way will be based on the simplest of mathematical ideas: "… without hesitation that there is, in my opinion, a right way, and that we are capable of finding it (in the future) … Our experience hitherto justifies us in believing that nature is a realization of the simplest conceivable mathematical ideas. (thus quite in contrast to the abstruse mathematical formalisms of SRT and GRT) I am convinced that we can (i.e. WILL be able to) discover, by means of purely mathematical constructions, the concepts and laws connecting them with phenomena" (Philosopher-Scientist, p. 398).
Best regards,
Viraj
Author George F. R. Ellis wrote on Sep. 21, 2012 @ 09:08 GMT
Another example:
Here is a study of top down effects (the role of environment on galaxy evolution) in astronomy, from a seminar here today.
Title: The MASSIV Survey
Abstract:
The MASSIV survey is composed of 84 star-forming galaxies at 0.9 < z < 1.8 selected from the VVDS. I will present its selection and focus on the main results of this survey: the kinematic diversity, the discovery of inverse metallicity gradients, the evolution of scaling laws, and the role of environment on galaxy evolution as deduced from the study of the merger rate from MASSIV. These results will be put in relation to other integral field surveys at larger (e.g. LSD/AMAZE, SINS or OSIRIS) and lower redshifts (e.g. GIRAFFE).
Author George F. R. Ellis wrote on Sep. 22, 2012 @ 04:55 GMT
Some of the essays in this competition relate to the relation between models and reality: a key feature of the way science works. Those who want to think about this in depth may find
this article on models in science useful.
Section 5.2 of that article is related to my essay, because when we consider the hierarchy of structure and causation as discussed in my essay, we are actually using many different models, involving different representation/coarse graining scales, to represent the same physical reality. The issue is how they relate to each other. In general relativity, this leads to the issue of coarse graining and backreaction; in general it leads to the issue of what relations exist between these different models of the same system - that is, bottom up and top down relations between them.
George
Author George F. R. Ellis wrote on Sep. 22, 2012 @ 05:00 GMT
The further essay
here , discussing inter-theory relations, also takes up the same theme in a useful way.
George
Fred Diether wrote on Sep. 23, 2012 @ 05:35 GMT
Hi George,
Very nice essay. I am wondering if top-down / bottom-up causation is a duality? Could one exist without the other? I think that is what you are saying or the point you are trying to make.
Best,
Fred
Edwin Eugene Klingman replied on Sep. 23, 2012 @ 05:43 GMT
Fred,
What an excellent observation!
Edwin Eugene Klingman
Fred Diether replied on Sep. 23, 2012 @ 05:56 GMT
Thank you Edwin,
I suppose it is the same or similar to the argument that classical-quantum is a duality. Hopefully a good discussion point here.
Best,
Fred
PS. I am going to try to answer your questions about my essay tomorrow. Sorry it has taken so long; I will explain why over there.
Edwin Eugene Klingman replied on Sep. 23, 2012 @ 06:37 GMT
Dear Pentcho,
I agree that the definitions of "top" and "down" are both fuzzy and arbitrary. To this extent it may be a triviality. But I think it's deeper than that.
Edwin Eugene Klingman
PS. Some people get tired of your cutting and pasting, but I think that you have a knack for getting to relevant points of view. Keep it up.
Author George F. R. Ellis replied on Sep. 23, 2012 @ 08:28 GMT
Dear Fred,
that is an interesting question.
There are two kinds of duality in physics. One is such cases as the AdS/CFT duality and the duality between the Lagrangian and Hamiltonian formalisms, where they are different descriptions of the same physics. The other is the kind of duality that occurs in quantum physics as expressed in Dirac's bra and ket notation, based in the duality between vectors and co-vectors, where these are complementary aspects of the physics that work together to give the final outcome.
I suggest that the case of bottom up and top down causation is a duality of the second kind. This dual working allows interlevel feedback loops, which are the underlying enabling factor for the emergence of true complexity in biology.
George
Author George F. R. Ellis replied on Sep. 23, 2012 @ 08:55 GMT
Dear Eugene
I have carefully explained in my essay that I do not claim there is an identifiable topmost or bottommost layer: my comments relate to relations between any two neighbouring levels (and hence to any further levels related via neighbouring levels).
Is the idea of levels vacuous or arbitrary, as suggested by Pentcho Valev? Well please tell me what is wrong with Tables 1 and 2 in my essay. The very existence of different academic subjects such as physics, chemistry, and biology is testimony to the reality of the hierarchy. I give a much fuller exposition of its nature
here . In my reply to Pentcho on Sep. 9, 2012 @ 17:56 GMT, I give references that show the reality of the hierarchy in many different contexts.
I agree there are complications in determining its nature in very complex contexts; there is a whole literature on how to determine it in complex networks. Remember I am only claiming existence of local hierarchies; and there is abundant evidence there are a great many such local hierarchies in physics, biology, computers, and the brain. Please see for example the book by Campbell and Reece on biology and the book by Tanenbaum on computers (the references are in my essay).
A note to readers of this thread: I am no longer prepared to stand for the personal insults that Pentcho Valev insists on including in his postings to my thread. The administrator of this competition, who decides whether to accept or reject requests for deletions of posts, agrees with me that the tone of his postings has been intolerable.
If further such personal attacks are made in postings to my thread by anyone at all, I may well cease reading what is posted here and stop answering all postings. Why on earth should I put up with this kind of behaviour?
George Ellis
Author George F. R. Ellis replied on Sep. 23, 2012 @ 11:34 GMT
The reference was
this GE
Thomas Howard Ray replied on Sep. 23, 2012 @ 11:53 GMT
Hi George,
You replied to Fred, "I suggest that the case of bottom up and top down causation is a duality... (that) ... allows interlevel feedback loops, which are the underlying enabling factor for the emergence of true complexity in biology."
Indeed. I have not been able to formulate a more general definition of "organization" -- whether biological or inorganic -- than "order with feedback." In fact, how could causality be more clearly implied, independent of an infinite regress?
You mentioned in a post to the anonymous respondent the barrier of state preparation in the Stern-Gerlach experiment. If you're not familiar with it, you might be interested in
Leslie Lamport's treatment of the experiment. As he notes: "No real experiment, having finite precision, can demonstrate the presence or absence of continuity, which is defined in terms of limits."
Order with feedback is a demonstrably self-limiting process.
Best,
Tom
Author George F. R. Ellis replied on Sep. 23, 2012 @ 12:17 GMT
Hi Tom
thanks for that, that is an interesting paper.
You quote "No real experiment, having finite precision, can demonstrate the presence or absence of continuity, which is defined in terms of limits." I agree. In fact I have a closely related strong position, based on a statement by David Hilbert:
(a) no physics theory or proof that relies on infinity in an *essential* way describes the real world;
one implication is that spacetime must be quantised at a small enough scale (which is supported by many other arguments);
(b) the claimed existence of infinities of anything whatever in any physical theory is not a scientific statement, as there is no way that this claim could ever be observationally or experimentally proved.
However I did not see any close link of Lamport's paper to state vector preparation. Did you mean another paper? Or is it related to the fact that state vector preparation is never perfect? - even if so, it's still non-unitary (consider a wire polarizer).
Best,
George
Author George F. R. Ellis replied on Sep. 23, 2012 @ 12:45 GMT
So I perused Lamport's very interesting set of papers, and would like to comment on #31: "On-the-fly Garbage Collection: an Exercise in Cooperation" (with Edsger Dijkstra et al.)
The point here is that garbage collection is a crucial part of computing, and is an example of the top down process of adaptive selection: that is of selecting a set of elements to be deleted, leaving behind the ones that are meaningful. This depends on a selection criterion, which (in my terms) lives at a higher level of abstraction than the elements to be selected. That is made explicit here: " starting from the roots, all reachable nodes are marked; upon completion of this marking cycle all unmarked nodes can be concluded to be garbage, and are appended to the free list". The selection criterion is reachability.
This is the key process by which meaningful information is garnered: you delete the stuff that is not meaningful, thereby creating a smaller set of stuff which has meaning. This is related, for example, to deleting emails and unwanted files on your computer, as well as to the fact that this process (it's essentially clearing memory) is where entropy is generated - because it's an irreversible process (assuming that deleted stuff is gone). Just like biology (deleted animals are past history) -- and like state vector preparation.
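To make the selection criterion explicit, here is a minimal mark-and-sweep sketch in Python (a toy illustration of my own, not the on-the-fly Dijkstra-Lamport algorithm itself): the higher level criterion, reachability from the roots, determines which lower level nodes survive and which are irreversibly deleted.

# Toy mark-and-sweep: reachability from the roots is the higher-level
# selection criterion deciding which nodes survive.
heap = {
    'A': ['B'],   # each node lists the nodes it references
    'B': ['C'],
    'C': [],
    'D': ['E'],   # D and E are not reachable from the roots
    'E': [],
}
roots = ['A']

def mark(roots, heap):
    reached, stack = set(), list(roots)
    while stack:
        node = stack.pop()
        if node not in reached:
            reached.add(node)
            stack.extend(heap[node])
    return reached

def sweep(heap, reached):
    # everything unmarked is garbage and is deleted (irreversibly)
    return {node: refs for node, refs in heap.items() if node in reached}

live = sweep(heap, mark(roots, heap))
print(sorted(live))   # ['A', 'B', 'C'] -- D and E have been collected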
Best,
George
Author George F. R. Ellis replied on Sep. 23, 2012 @ 14:13 GMT
Ok I'm very slow. Now I see your relation of Buridan's Ass to selection, which is of course a binary decision.
Yes it's very nice. The idea of quantisation of the decision process is needed: it's based in discrete rather than continuous variables - which fits in well with Hilbert's maxim: “the infinite is nowhere to be found in reality, no matter what experiences, observations, and knowledge are appealed to.” There is no continuum of any variables in the real world. Another example of the disjunction between mathematical models and reality.
George
Thomas Howard Ray replied on Sep. 23, 2012 @ 15:08 GMT
George, it was highly interesting to see you go through the same process of comprehending Lamport's formulation of Buridan's principle as I did! Whereas you did it in minutes, however, it took me years. (So you're not that slow, after all.) I had long worked with the mathematical ramifications of Buridan's Ass before stumbling, a couple of years ago, across Lamport's then-unpublished paper ("Buridan's Principle"), written in 1984. I found it deeply subtle. I suggested to Leslie last year that "Foundations of Physics" was a suitable venue, and after some months of refereeing, it was published in April.
I hold now, as strongly as ever, that this physical principle impacts every continuous measurement function at every scale. When Leslie and I corresponded by email last February, he indicated that he thinks the reference he added at a referee's suggestion -- no. 7, by Busch et al. -- is a "really profound analysis" of the Stern-Gerlach experiment. I bought the book and read the chapter a couple of times, though I am not sure I agree that the analysis is as profound as Lamport's, perhaps because I am looking at things from a classical viewpoint rather than that of computer design.
In another communication, he attached a PDF of a then-unpublished book by
David Kinniment that I just learned has now been published posthumously (sadly, Prof. emeritus Kinniment passed away in May), as *Synchronization and Arbitration in Digital Systems* which came to me as *He Who Hesitates is Lost.* You might be interested in that one, too.
All best,
Tom
Thomas Howard Ray replied on Sep. 23, 2012 @ 15:12 GMT
Correction: I misunderstood the information on the site I linked. *He Who Hesitates is Lost* is a separate work.
Edwin Eugene Klingman replied on Sep. 23, 2012 @ 19:04 GMT
Dear George,
When I commented above that "the definitions of 'top' and 'down' are both fuzzy and arbitrary. To this extent it may be a triviality. But I think it's deeper than that", I was not referring specifically to 'your' definitions, which I had not bothered to look up, but to the generic definitions of 'top' and 'bottom'. In fact, my mind was still on Fred's excellent question and my response to him. There was no criticism of your essay intended, none at all.
And the rest of my remark to Pentcho was due to the fact that I had recently read a comment of his on another thread that contained quotes I found very interesting. I did not realize that he had been insulting you, although I do know that he pushes his own view with minimum tact. Again, there was no criticism of you implied.
These side issues distract from what was, I thought, an excellent question by Fred. I find dualities both fascinating and deep.
Edwin Eugene Klingman
James Putnam replied on Sep. 23, 2012 @ 19:35 GMT
George F. R. Ellis,
Edwin's message of today drew my attention to your message from which I quote this:
"If further such personal attacks are made in postings to my thread by anyone at all, I may well cease reading what is posted here and stop answering all postings. Why on earth should I put up with this kind of behaviour?"
I hope that you decide to ignore and remove offending messages instead. Earlier this morning I was thinking about your forum and felt motivated to write a thank you message. I haven't participated in the discussion here that I recall, but my thought earlier was that I appreciate that you submitted an essay and also that you give generously of your time to participate in discussions with other authors. Your essay has been at the top or very near the top in community votes. It is the participation of the professionals that makes this contest work. Anyway, this is a thank you from someone whose ideas you would, I believe, strongly disapprove of and prefer not to be bothered with. :)
James
Domenico Oricchio replied on Sep. 23, 2012 @ 20:00 GMT
I wish to thank George F. R. Ellis for the immense quality of the author's thread, which I, and the community, always read with pleasure.
There is a sea of knowledge here, and a complete and deep quality to the answers, which all of us appreciate.
I cannot prevent a person from attacking Ellis, so I ask a great favour on behalf of the community: I do not consider it important to read the attacks on Ellis, so he may simply erase them.
Regards
Domenico
Author George F. R. Ellis wrote on Sep. 23, 2012 @ 08:05 GMT
“There is nothing new under the sun”: one open question and two failed challenges.
The one respondent on this thread who has seriously challenged the scientific content of my essay is an anonymous physicist operating under the pseudonym "There is nothing new under the sun", claiming I have not given a single genuine instance of top-down causation that could not be explained in a purely bottom up way. I have responded to him in various posts, particularly one on Sep. 14, 2012 @ 22:51 GMT, and in a summary response on Sep. 17, 2012 @ 18:15 GMT I gave a series of counter-examples to his claims.
I have conceded that one point in my argument can be queried. He has failed two challenges I have set him.
The point that remains open is the validity of my arguments regarding the Caldeira-Leggett model, which I claim is a top-down effect; he claims it can be explained in a purely bottom up way via the renormalisation group. I still believe that my argument, set out in detail
here , is valid, but I have to look into the link with the renormalisation group when I have time. That study might make me withdraw my claim about the Caldeira-Leggett model, or it might lead me to claim that renormalisation group descriptions, like superconductivity theory, embody an essential top-down element. This is work in progress; but I acknowledge that there is a legitimate query to be answered.
The first challenge he has failed is as follows: on Sep. 14, 2012 @ 22:51 GMT, I wrote the following: “Here is a challenge for you. Explain to me in a purely bottom up way how state vector preparation is possible, as for example in the Stern Gerlach experiment. Quantum physics is unitary, as we all know: how does the non-unitary behaviour of state vector preparation emerge in a purely bottom up way from that unitary dynamics? You won't be able to explain this action without invoking the effect of the apparatus on the particles - which is a form of top down action from the apparatus to the particles.” He has not responded to this in any way. He has failed that challenge.
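For readers following the thread, the non-unitarity at stake can be seen in a few lines of linear algebra. This is only a schematic sketch treating the Stern Gerlach filter as an idealised projection, not a model of the real apparatus:

import numpy as np

# State preparation as projection: a spin-z 'up' filter acting on a
# spin-x eigenstate. A projector is not unitary: it does not preserve norms.
up     = np.array([1.0, 0.0])
x_plus = np.array([1.0, 1.0]) / np.sqrt(2.0)

P = np.outer(up, up)                    # projector |up><up|
out = P @ x_plus

print(np.linalg.norm(x_plus))           # 1.0
print(np.linalg.norm(out))              # ~0.707: the norm is not preserved
print(np.allclose(P.T @ P, np.eye(2)))  # False: P is not unitary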
The second challenge he has failed is as regards the arrow of time. He has strongly insisted in various posts that Weinberg’s quantum field theory derivation of the H-Theorem resolves the arrow of time issue in a purely bottom up way, because it shows that entropy always increases. As well as referring him to
other sources that support my view, I have responded to this claim with a step by step demonstration that this is not the case: see my posting of Sep. 16, 2012 @ 14:06 GMT, which definitively shows the arrow of time issue cannot be resolved in a purely bottom up way, because Weinberg’s derivation -- just like Boltzmann’s -- works in both directions of time.
In my follow-up posting on Sep. 19, 2012 @ 05:37 GMT, I said the following:
“ So come on. Which is it?
* Do you have a counter argument showing I'm wrong? If so what is it? Where is the mistake in this elementary logic?
or
* Do you have the stature to concede you and your Santa Cruz experts are simply wrong? - you did not grasp this elementary logic?
or
* will you lurk in the shadows, unable to answer and unable to admit you were wrong? -- proving you don't have the capacity to admit that you are wrong, nor the stature required to apologise for the insulting nature of your comments.
If you give no reply, you choose the last option. Wheeler, Feynman, Sciama, Davies, Zeh, Penrose, Carroll, and others including myself are vindicated, and your condescending comments are discredited.”
He has chosen the third course. He has failed that challenge as well.
I only need one example to prove that top-down processes do indeed occur in physics, just as they do in many other contexts such as in digital computers and in
the human brain and in
evolutionary theory . My case (elaborated
here ) stands undefeated.
George Ellis
Author George F. R. Ellis replied on Sep. 23, 2012 @ 11:56 GMT
Trying again!! This linking system does not seem to work.
Yuri Danoyan wrote on Sep. 23, 2012 @ 15:07 GMT
Dear George
Did you get my e-mail?
Yuri
Author George F. R. Ellis replied on Sep. 23, 2012 @ 16:51 GMT
Yuri Danoyan replied on Sep. 23, 2012 @ 18:52 GMT
O.K.
I am sending it just now again.
Dear Dr Ellis,
First of all I would like to remind you of a quote from the famous neurophysiologist Warren McCulloch, known for his work on the foundations of certain brain theories and his contribution to the cybernetics movement.
In the last century he wrote:
"As I see what we need first and foremost is not correct theory, but some theory to start from, whereby we may hope to ask a question so that we will get an answer, if only to the effect that our notion was entirely erroneous. Most of the time we never even get around to asking the question in such a form that it can have an answer." (Discussion with John von Neumann, John von Neumann Collected Works, Volume 5, p. 319)
It was about the mind-body relationship and brain function.
My question is the following: do you think this is applicable to modern physics?
I put forward 3 questions:
1) 4D space-time?
2) Gravity as a fundamental force?
3) 3 fundamental dimensional constants (G, c, h)?
For my attempts to get answers, see my essay:
http://fqxi.org/community/forum/topic/1413
Sincerely
Yuri Danoyan
Author George F. R. Ellis replied on Sep. 23, 2012 @ 21:22 GMT
Dear Yuri
this is off the topic of my thread, but still:
1) 4D space-time? -- yes!
2) Gravity as a fundamental force? -- of course: but it's not a force like other forces, it's an expression of spacetime curvature, because of the principle of equivalence. It's the gravitational field (the Weyl tensor) that is more fundamental.
3) 3 fundamental dimensional constants (G, c, h)? -- well, it's the dimensionless constants that really count. The "Living Review" by J-P Uzan is great on the topic: see
here. I'll try to get to your essay.
George
Yuri Danoyan replied on Sep. 24, 2012 @ 16:03 GMT
I think that due to lack of time you cannot read all the essays. I am sending you only the cosmological conclusion from my essay.
As a cosmologist you can assess at a glance.
Appendix 1: Cosmological picture of one cycle
        Big Bang   Present   Big Crunch
c:      10^30      10^10     10^-10
G:      10^12      10^-8     10^-28
h:      10^-27     10^-27    10^-27
alpha:  10^-3      1/137     1
e:      0.1        e         11-12
Author George F. R. Ellis replied on Sep. 24, 2012 @ 18:32 GMT
Yuri
I have read your essay and still do not understand the set of numbers you give above. It is completely unclear what they refer to. Nevertheless I have two comments:
1. Your theory seems mainly numerological. I can't see what the underlying theory is that is supposed to lead to those numbers. Is it based in M theory, or general relativity, or loop quantum gravity, or what?
2. Your proposal is, I think, a form of cyclic universe. But no one has yet provided an unproblematic mechanism for a bounce between cycles, despite many attempts to do so. I did not see any mechanism presented in your essay that would resolve this problem (which is one I once spent many years thinking about).
George
Yuri Danoyan replied on Sep. 24, 2012 @ 19:23 GMT
Once again, why are G and c not fundamental?
Because in the same space-time they vary synchronously, but in Planck units of length and time they have different dependencies, and therefore neither of them is true.
Author George F. R. Ellis replied on Sep. 24, 2012 @ 20:26 GMT
Yuri
You don't provide a coherent theory, just a set of numerological statements. Additionally those are dimensional statements, and so entirely based in the choice of units. You can get any other result by changing the units, so they have no physical meaning.
That's as much response as I'm going to give: this subject is not the topic of this thread.
George
Yuri Danoyan replied on Sep. 24, 2012 @ 20:44 GMT
My approach is close to John Moffat's proposal of a variable speed of light approach to cosmological problems, which posits that G/c is constant through time, but G and c separately have not been. Moreover, the speed of light c may have been much higher during early moments of the Big Bang.
Do you not think John is a serious scholar?
Then another thing ..
Author George F. R. Ellis replied on Sep. 24, 2012 @ 21:33 GMT
John Moffat is a serious scholar, but he got the varying speed of light effect wrong. What he proposed was not a physical effect, it was just a change of coordinates. It can be eliminated by a change to more suitable coordinates.
See
here for a detailed analysis.
Yuri Danoyan replied on Sep. 24, 2012 @ 22:09 GMT
Quote from http://arxiv.org/pdf/gr-qc/0305099v2.pdf
"By contrast, the theories of Bekenstein [46], Clayton and Moffat [47], and Bassett et al. [48] are genuine bimetric theories...."
O.K.
In my approach, over the duration of cosmological time:
Appendix 2: Cosmological values of mass
        Big Bang   Present   Big Crunch
Mp:     10^-24     10^-24    10^-24
Me:     10^-28     10^-28    10^-28
Mpl:    10^-4      10^-4     10^-4
Mhbl:   10^16      10^16     10^16
See scale invariance: http://en.wikipedia.org/wiki/Scale_invariance
The scaling law has not been abolished.
Yuri Danoyan replied on Sep. 24, 2012 @ 22:16 GMT
See also http://fqxi.org/community/forum/topic/1554
Scaling Laws in Particle Physics and Astrophysics
Yuri Danoyan replied on Sep. 24, 2012 @ 22:32 GMT
I also used George Gamow's idea:
G. Gamow, Phys. Rev. Lett. 19, 759 (1967): e² ~ t.
Author George F. R. Ellis replied on Sep. 25, 2012 @ 06:40 GMT
Moffat's later bimetric theory was OK, it was his first varying speed of light theory that was wrong. I did not see in your essay that you are proposing a bimetric theory.
There has been a huge amount of work on the possibility of varying constants since Gamov. Please see for example J P Uzan et al
here and links therein: there are many constraints on such theories. You'll need to tie in to this literature in order to be taken seriously nowadays.
That is my final comment on your essay on this thread.
Viraj Fernando wrote on Sep. 23, 2012 @ 16:28 GMT
Dear Fred and George,
SECOND LAW OF THERMODYNAMICS - A 'TOP DOWN CAUSATION'.
Fred Diether wrote:“I am wondering if top-down / bottom-up causation is a duality? Could one exist without the other? I think that is what you are saying or the point you are trying to make”.
The 'Top Down' concept is not something marginal, as the author of the essay seems to think. (For instance, he thinks the top down causation of the Sun on the earth manifests in marginal effects like the tides. Well, then lunar tides would have to be considered as 'Bottom Up'!!!) The 'top down' concept is far, far deeper. It is one of the basic principles in Nature.
Nature's processes are a hierarchy of self-similar structures. (Sergey Fedosin brings this out in his essay.) If they are a 'hierarchy', how are the hierarchic dominance and the organic links between two adjacent levels established?
Here is Newton for you: "And thus Nature will be very conformable to herself and very simple, performing all the great Motions of heavenly Bodies, by the Attraction of Gravity, which intercedes those Bodies, and almost all the small ones of their Particles by some other attractive and repelling Powers which intercede the Particles. …… To tell us that every Species of Things is endow'd with an occult specifick Quality (of Gravity and of magnetick and electrick Attractions and of fermentations) by which it acts and produces Effects, is to TELL US NOTHING: But to derive TWO OR THREE GENERAL PRINCIPLES of Motion from Phaenomena, and afterwards to tell us how Properties and Actions of all corporeal Things follow from those manifest Principles, would be a VERY GREAT STEP IN PHILOSOPHY…." (Query 31)
One of those GENERAL PRINCIPLES: The process below forms an organic link with the next higher level in the hierarchy. Or looked at it the other way, the two processes form an interface between the two levels by usurping a fraction of energy from the lower level.
The second law of thermodynamics comes into effect by way of this process of interfacing of the two levels of energy.
Let us look at Carnot's ideal engine, where not all the heat energy (Q = S1T1) generated gets converted into work. It is found that a fraction S1T2 gets 'lost', and what is available for conversion to work is S1(T1 − T2), where T2 is the temperature of the background field. This is why the perpetuum mobile of the second kind is impossible.
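(To make the bookkeeping explicit, here is a small worked sketch in Python; the values of S1, T1 and T2 are arbitrary illustrative figures, not numbers claimed anywhere in this discussion.)

# Worked example of the Carnot bookkeeping described above
# (arbitrary illustrative figures).
S1 = 2.0      # entropy transferred, in J/K
T1 = 500.0    # source temperature, in K
T2 = 300.0    # background temperature, in K

Q_in   = S1 * T1          # heat generated: 1000 J
Q_lost = S1 * T2          # fraction 'lost' to the background: 600 J
work   = S1 * (T1 - T2)   # available for conversion to work: 400 J

print(Q_in, Q_lost, work, work / Q_in)   # efficiency = 1 - T2/T1 = 0.4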
Einstein understood that there is an analogical connection between the perpetuum mobile and the Lorentz transformation. (See my essay: http://fqxi.org/community/forum/topic/1549 )
“The universal principle of the special theory of relativity is contained in the postulate: The laws of physics are invariant with respect to Lorentz transformations, ….. This is a restricting principle for natural laws, comparable to the restricting principle of the non-existence of the perpetuum mobile which underlie thermodynamics” (1, p.57).
Well if there is "an analogical connection", there has to be a GENERAL PRINCIPLE underlying both processes. Hence Einstein wrote: "By and by I despaired of the possibility of discovering the true laws by means of constructive efforts based on known facts. The longer and the more despairingly I tried, the more I came to the conviction that only the discovery of a universal formal principle could lead to assured results. The example I saw before me was thermodynamics. The general principle was there given in the theorem: laws of nature are such that it is impossible to construct a perpetuum mobile" (1, p. 53).
So what is this GENERAL PRINCIPLE? In general terms, the fraction of energy Q usurped to form the organic link with the background is given by the product of the extensive component Ea of the energy in action and the intensive component Ib of the energy of the background. Thus the fraction of energy forming the organic link with the background is
Q = Ea × Ib.
When this general principle is applied to the motion of a particle relative to the background velocity field of the earth's orbital motion, a similar fraction of energy will be required to form the interface. Lorentz opens his 1904 paper (which is on the 'Lorentz transformation') recognising such a process: "The problem of determining the influence exerted on electrical and optical phenomena ..... in virtue of the Earth's annual motion...."
But the problem was how to account for the gamma-factor. See my paper to find out how the gamma-factor comes into being in the equations: http://fqxi.org/community/forum/topic/1549
Best regards,
Viraj
Author George F. R. Ellis wrote on Sep. 23, 2012 @ 21:31 GMT
Viraj
I like that statement: "One of those GENERAL PRINCIPLES: The process below forms an organic link with the next higher level in the hierarchy. Or looked at it the other way, the two processes form an interface between the two levels by usurping a fraction of energy from the lower level. The second law of thermodynamics comes into effect by way of this process of interfacing of the two levels of energy. "
I'm not sure about the application to the particle motion. I take it to deal more with systems of particles rather than an individual particle.
George