CATEGORY:
Questioning the Foundations Essay Contest (2012)
TOPIC:
Underlying Assumptions in Physics: The Relationship Between Evidence and Theory by Emily Christine Adlam
Author Emily Christine Adlam wrote on Aug. 30, 2012 @ 12:41 GMT
Essay Abstract
Traditionally, scientific enquiry has presupposed a relatively simple relationship between our empirical evidence and the facts of reality. For example, we presume that our memories and records are approximately correct representations of the actual course of past history, and we presume that most of the events which actually occur are events made either certain or probable by underlying physical laws. Such assumptions seem to be necessary if ordinary scientific methods of formulating deterministic or probabilistic laws are ever to get off the ground. But developments in the fields of both statistical mechanics and quantum mechanics have begun to give us specific reasons to question these assumptions, since both theories explicitly undermine our beliefs about the causal links between records of the past and actual past events. In light of such considerations, questions about the reliability of memories and records cannot be relegated to philosophical scepticism, but must be taken seriously as part of contemporary science and as indicators of possible new directions for the development of science.
Author Bio
Emily Adlam is reading Physics and Philosophy at the University of Oxford.
Download Essay PDF File
Pentcho Valev wrote on Aug. 31, 2012 @ 05:02 GMT
Clausius' famous principle "ENTROPY ALWAYS INCREASES" (which, according to A. Eddington, holds "the supreme position among the laws of Nature") was deduced in 1865 in the way presented by Jos Uffink on p. 37 in his "Bluff your Way in the Second Law of Thermodynamics":
http://philsci-archive.pitt.edu/archive/00000313/
Jos Uffink, Bluff your Way in the Second Law of Thermodynamics, p. 37: "Hence we obtain: THE ENTROPY PRINCIPLE (Clausius' version) For every nicht umkehrbar [irreversible] process in an adiabatically isolated system which begins and ends in an equilibrium state, the entropy of the final state is greater than or equal to that of the initial state. For every umkehrbar [reversible] process in an adiabatical system, the entropy of the final state is equal to that of the initial state."
Clearly Clausius' deduction is based on three premises:
PREMISE 1: The entropy is a state function.
PREMISE 2: Clausius' inequality (formula 10 on p. 33) is correct.
PREMISE 3: Any irreversible process can be closed by a reversible process to become a cycle.
All three premises are unproven; PREMISE 3 is almost obviously false:
http://philsci-archive.pitt.edu/archive/00000313/
Jos Uffink, p.39: "A more important objection, it seems to me, is that Clausius bases his conclusion that the entropy increases in a nicht umkehrbar [irreversible] process on the assumption that such a process can be closed by an umkehrbar [reversible] process to become a cycle. This is essential for the definition of the entropy difference between the initial and final states. But the assumption is far from obvious for a system more complex than an ideal gas, or for states far from equilibrium, or for processes other than the simple exchange of heat and work. Thus, the generalisation to all transformations occurring in Nature is somewhat rash."
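For reference, a minimal sketch (in standard textbook notation, not taken from Uffink's paper) of how the three premises combine in Clausius' deduction: take an irreversible process $A \to B$ in an adiabatically isolated system, so $\delta Q = 0$ along it, and close it with a reversible process $B \to A$ (PREMISE 3). Applying Clausius' inequality (PREMISE 2) to the resulting cycle,
\[ \oint \frac{\delta Q}{T} \le 0 \quad\Longrightarrow\quad \int_A^B \frac{\delta Q}{T} + \int_B^A \frac{\delta Q_{\mathrm{rev}}}{T} \le 0, \]
where the first integral vanishes because the irreversible leg is adiabatic. Since entropy is a state function with $dS = \delta Q_{\mathrm{rev}}/T$ along the reversible leg (PREMISE 1), the second integral equals $S(A) - S(B)$, and hence $S(B) \ge S(A)$.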
Pentcho Valev pvalev@yahoo.com
Pentcho Valev replied on Sep. 1, 2012 @ 06:55 GMT
Fifty years ago the following challenges to the second law of thermodynamics would have produced a frenzy in the scientific community. Nowadays scientists couldn't care less:
http://arxiv.org/abs/1203.0161
Self-Charged Graphene Battery Harvests Electricity from Thermal Energy of the Environment, Zihan Xu et al: "Moreover, the thermal velocity of ions can be maintained by the external environment, which means it is unlimited. However, little study has been reported on converting the ionic thermal energy into electricity. Here we present a graphene device with asymmetric electrodes configuration to capture such ionic thermal energy and convert it into electricity. (...) To exclude the possibility of chemical reaction, we performed control experiments... (...) In conclusion, we could not find any evidences that support the opinion that the induced voltage came from chemical reaction. The mechanism for electricity generation by graphene in solution is a pure physical process..."
http://arxiv.org/ftp/arxiv/papers/1207/1207.6599.pdf
"We have studied the Si devices to generate electricity from thermal motion of ions in aqueous electrolyte solutions at room temperature. (...) However,, this finding does not agree with the second law of thermodynamics, which limits the utilization of the random thermal motion of ions to be spontaneously collected to produce 10 electricity. We cannot explain why either this experiment or the previous experiment of graphene did not agree with the traditional theory. More research will be required to fully understand this phenomenon."
http://physics.aps.org/synopsis-for/10.1103/PhysRevLett.108.097403
"Physicists have known for decades that, in principle, a semiconductor device can emit more light power than it consumes electrically. Experiments published in Physical Review Letters finally demonstrate this in practice, though at a small scale. (...) Decreasing the input power to 30 picowatts, the team detected nearly 70 picowatts of emitted light. The extra energy comes from lattice vibrations, so the device should be cooled slightly, as occurs in thermoelectric coolers. These initial results provide too little light for most applications. However, heating the light emitters increases their output power and efficiency, meaning they are like thermodynamic heat engines..."
http://www.dailytech.com/An+Incredible+Discovery+Graphene+Transistors+SelfCool/article21285.htm
"Overcoming technical challenges, the University of Illinois team used an atomic force microscope tip as a temperature probe to make the first nanometer-scale temperature measurements of a working graphene transistor. What they found was that the resistive heating ("waste heat") effect in graphene was weaker than its thermo-electric cooling effect at times. (...) Further, as the heat is converted back into electricity by the device, graphene transistors may have a two-fold power efficiency gain, both in ditching energetically expensive fans and by recycling heat losses into usable electricity. Professor King describes, "In silicon and most materials, the electronic heating is much larger than the self-cooling. However, we found that in these graphene transistors, there are regions where the thermoelectric cooling can be larger than the resistive heating, which allows these devices to cool themselves."
Pentcho Valev pvalev@yahoo.com
Emily replied on Sep. 4, 2012 @ 07:13 GMT
Thank you for your comments! It's certainly true that both the concept of entropy and the second law of thermodynamics are plagued with difficulties both in derivation and interpretation, and this essay is not intended to be a defence of either. My discussion of 'the entropy of the universe' is mainly a way of pointing to the appearance of temporal asymmetry - according to our memories and records there seems to be a kind of directedness in the way that events come about, yet that direction doesn't seem to come from the underlying microdynamics and therefore needs to be accounted for in terms of a further assumption about the initial conditions. It's convenient to frame that assumption in terms of the initial low entropy of the universe, but the argument isn't dependent on specific assumptions about the nature of entropy and/or the status of the second law.
Pentcho Valev replied on Sep. 4, 2012 @ 14:45 GMT
Emily,
You wrote: "It's convenient to frame that assumption in terms of the initial low entropy of the universe, but the argument isn't dependent on specific assumptions about the nature of entropy and/or the status of the second law."
But "initial low entropy" already presupposes some "specific assumptions about the nature of entropy and/or the status of the second law". By the way, at the end of his paper, Uffink in fact rejects the law of entropy increase:
http://philsci-archive.pitt.edu/archive/00000313/
Jos Uffink, Bluff your Way in the Second Law of Thermodynamics, p. 94: "This summary leads to the question whether it is fruitful to see irreversibility or time-asymmetry as the essence of the second law. Is it not more straightforward, in view of the unargued statements of Kelvin, the bold claims of Clausius and the strained attempts of Planck, to give up this idea? I believe that Ehrenfest-Afanassjewa was right in her verdict that the discussion about the arrow of time as expressed in the second law of the thermodynamics is actually a RED HERRING."
Pentcho Valev pvalev@yahoo.com
Emily replied on Sep. 7, 2012 @ 08:58 GMT
I agree that great care needs to be taken with the second law - in particular, I would reinforce that it shouldn't be taken as a 'fundamental' law (whatever that means!) but rather as a statistical generalisation which holds as a matter of high probability.
I invoked entropy in this essay mainly as a simple way of pointing out the asymmetry that exists between our predictions and retrodictions - the problem is that if we accept the time-symmetry of the microdynamical laws we seem to have no good reason not to believe that the 'entropy' (using this concept as a way of formalising closeness to thermal equilibrium, without presupposing any substantive claims about irreversibility or the second law) should increase in both the past and future direction. Only the past hypothesis gives us an adequate basis for retrodicting lower 'entropy' states in our past, yet our principal reason for accepting the past hypothesis is our belief that entropy was in fact lower in the past - hence we seem to have a circularity in our reasoning, and I don't think that alternative definitions of entropy or revisions of the status of the second law will do enough to make the problem go away.
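Schematically (my formulation of the standard point, not a quotation from the essay): if the microdynamics is time-reversal invariant and the usual statistical-mechanical measure is conditioned on the present non-equilibrium macrostate $M(t)$ alone, then
\[ P\bigl(S(t+\tau) > S(t)\,\big|\,M(t)\bigr) \;\approx\; P\bigl(S(t-\tau) > S(t)\,\big|\,M(t)\bigr) \;\approx\; 1, \]
so retrodiction favours higher entropy in the past just as prediction favours higher entropy in the future; the Past Hypothesis breaks the symmetry only by adding a further conditioning proposition about the universe's initial state.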
Jose P. Koshy wrote on Aug. 31, 2012 @ 12:12 GMT
Entropy and quantum mechanics are purely mathematical. In my opinion, applying such mathematical concepts to explain any physical system is incorrect. This leads to ideas that question our belief regarding the 'relationship between our empirical evidence and the facts of reality'. A clear demarcation between physics and mathematics would remove all such non-classical concepts.
We require a physical definition of entropy. When the universe expands, the stars contract. This can be regarded as the entropy of the stars decreasing while the entropy of the universe increases. This may be a reversible process. As long as the expansion continues, past will be past and future, future. If it starts contracting, it will be only a cyclical change and not a going back into the past.
Author Emily Christine Adlam replied on Sep. 4, 2012 @ 07:18 GMT
Thank you for this comment! I agree that it's certainly important to make a distinction between actual physics and the mathematics we use to formalize that physics, and it's true that the physical definition of entropy is problematic. I also agree that we would not start 'going back into the past' if the universe were to start contracting - the difficulty about the past that I wished to point to was merely that, based on standard dynamical theories and without additional assumptions like the past hypothesis, retrodiction would suggest that 'entropy' (whatever that means physically) was probably higher in the past than it is now, and moreover that it's difficult to see how our memories/records can directly testify against this.
Member George F. R. Ellis wrote on Sep. 3, 2012 @ 15:10 GMT
Dear Emily Adlam,
I appreciate your skeptical view of the nature of evidence, and how we must be cautious in this regard. However I do believe you are giving much too much credence to the Boltzmann Brain argument.
You talk about the histories stored in our memories, but there is much more to history than that: it is for example also stored in the geological record, as well as in the radiation reaching us from the end of the hot big bang era nearly 14 billion years ago. You state "it is vastly more likely that the essay you are currently reading was produced by a random convergence of molecules than that at some time in the past another conscious person existed and wrote the essay." Do you really believe that? This depends on a whole series of extremely unlikely assumptions: inter alia equilibrium that lasted for almost an eternity of time, when there is no evidence this ever happened; the assumption that complex structures such as the brain can emerge from random fluctuations, when we've never even seen an amoeba emerge in this way, let alone a fly or a frog. It's a fun speculation, but I can't see how one can take it as very likely. The most that we can realistically expect to emerge from fluctuations are particles, or conceivably atoms.
I fully agree with your criticism of the Everett interpretation, but then you kind of recant from that criticism. Why not assume there is indeed a collapse mechanism, we just don't know how it works? You say "the Everett interpretation might still, after all, be true." Well yes, but it might be false; the collapse hypothesis might be true. I can't see why that statement carries the day.
It is good to have the reminder "transcendental considerations are frequently an important part of our scientific practice, but also they have their limits: even when we choose to rule out certain possibilities for practical reasons, we should view our avoidance of them not as rigid prohibitions but as useful heuristic guides for the present." Wise advice. This kind of meta analysis is very welcome: it is so often missing in physics writing.
George Ellis
Member Ian Durham replied on Sep. 4, 2012 @ 01:07 GMT
Hi Emily and George,
I seem to be in the strange position of having to disagree with both of you. On the one hand, I agree with George that it seems a bit absurd to assume that the appearance of your essay from random fluctuations must be much more probable than its having been written by a conscious person. On the other hand, I agree with Emily that random fluctuations can nevertheless give rise to complex phenomena. My reason for both stances is the same: "highly improbable" is not the same thing as "impossible."
This seems to be a common trait among humans in regard to our interpretation of statistical phenomena: an event that occurs is a priori assumed to be highly likely by dint of the fact that it has occurred in the first place. In light of additional information, the event's relative likelihood may be revised downward, but the fact of the matter is that it is initially assumed to be high simply because it happened. But even highly unlikely events still happen. My neighbor has been struck by lightning. If he grew up ignorant and isolated, he might be led to conclude that *everyone* gets struck by lightning which is absurd.
In short, an equally valid interpretation is that our memories are perfectly valid and correct but that the universe simply evolved in a highly unlikely (but not impossible) way. For example, suppose there is a spectrum of *possible* universes (these would be *actual* universes in an Everettian interpretation). Even if only a single universe occurs, nothing says it absolutely must be one of the more likely candidates. That's the point of a random process.
Also, one other point I wanted to make: entropy depends on how you define it. It is entirely possible to define it in such a way that a low entropy in the early universe is not unexpected.
Regardless of my aforementioned gripes, it was a nice essay. I found it to be well-written and carefully considered.
Ian Durham
Author Emily Christine Adlam replied on Sep. 4, 2012 @ 07:51 GMT
Thank you both for your replies!
First, I'm sorry if I gave the impression that the difficulty with evidence is primarily concerned with memory. The same arguments apply to any kind of record of the past which is accessible in the present, including geological records and so on - it's still more likely that such records were produced by spontaneous fluctuations than that the past they purport to record actually happened (at least on a certain view of 'likelihood').
I don't actually believe that the present state is just a fluctuation (I'm not sure it would be psychologically possible to believe that) but the difficulty is that it's hard to give a good justification for my conviction that the past actually happened as I remember it, given that all my reasons for believing that are based on memories and their veridicality is precisely the point at issue.
We certainly can assume that there's some unknown collapse mechanism in QM, but again, there's a difficulty with justifying the claim. If the Everett interpretation is the simplest way of accounting for our present evidence, why not believe it? The logical difficulties that I point to in accepting the Everett interpretation don't seem to be the right kind of considerations to justify substantive physical claims about a collapse mechanism - surely we need more direct empirical evidence for something like that.
Finally, I'd reinforce that the argument concerning entropy is really just a way of flagging up facts about temporal asymmetry, and isn't dependent on any particular definition of entropy. The problem is simply that in order to get retrodictions which match our memories/records of the past we apparently need to make some substantive assumptions about the nature of the initial state of the universe, and we don't seem to have an adequate justification for those assumptions, given that the theory suggests it's more likely those memories/records were formed in some alternative way. Perhaps the conclusion to draw is that we need to reassess our understanding of probability and typicality in these sorts of theories.
Stefan Weckbach replied on Sep. 6, 2012 @ 06:39 GMT
Dear Emily,
I read your essay and I realized that you thought *deeply* about the consistency of our theories with our human experience of time, constancy, space and retrodiction. It was a joy to read your lines of reasoning!
May I comment that all your questions about the real physical circumstances could be answered by introducing my concept of "physical retrodiction". How this works is outlined in my own essay. You don't need to assume Many Worlds or a universally valid wave function. The wave function only "collapses" because every measurement is both an initial state and a final state. These states get rendered permanently consistent with each other via entanglement - and this is the reason why it *seems* to us that something like a wave function collapses. It's only our biased classical, mechanical view that induces the reasoning about a collapse.
Thank you again for your very exciting essay!
Stefan Weckbach
Yuri Danoyan wrote on Sep. 3, 2012 @ 21:32 GMT
Dear Emily Adlam,
What do you think about victimization of the second law of thermodynamics?
see http://fqxi.org/community/forum/topic/1413
Yuri Danoyan replied on Sep. 6, 2012 @ 13:31 GMT
Emily
You are ignoring my post.
Why?
Emily replied on Sep. 6, 2012 @ 14:31 GMT
Hello! I'm sorry that I took some time to reply, I have been busy.
I'm sorry, but I don't entirely understand your question - what do you mean by 'victimization' with regard to the second law?
Yuri Danoyan replied on Sep. 6, 2012 @ 21:00 GMT
I drew attention to a quote from Dirac in my essay:
"It seems very likely that sometime in the future there will be an improved quantum mechanics, which will include a return to the causation and which justify the view of Einstein. But such a return to the causality may be possible only at the cost of failure of some other fundamental ideas, which we now accept undoubtedly. If we are going to restore causality, we shall have to pay for it and now we can only guess what idea must be sacrificed.” P.A.M. Dirac. Directions in Physics
I mean to sacrifice the second law of thermodynamics.
Victimization of the second law...
Author Emily Christine Adlam replied on Sep. 7, 2012 @ 09:16 GMT
I think the second law of thermodynamics certainly has to be 'sacrificed' in the sense that we no longer view it as fundamental and universally true - we take it to be a statistical generalisation which holds with a high degree of probability. The reasons for that need to be derived from the underlying theories which govern the constituents of the relevant systems, particularly quantum mechanics. In particular, I think it's unlikely that the second law is the source of temporal asymmetry, since it's true (insofar as it is) in virtue of microdynamical laws which are apparently themselves temporally symmetric.
Yuri Danoyan replied on Sep. 11, 2012 @ 01:42 GMT
Don't you see a contradiction?
Temporally asymmetric symmetry... or symmetric asymmetry.
Yuri Danoyan replied on Oct. 2, 2012 @ 16:23 GMT
Please don't forget to impartially evaluate my essay.
Robert H McEachern wrote on Sep. 4, 2012 @ 03:13 GMT
Emily,
In your abstract, you state that:
"scientific enquiry has presupposed a relatively simple relationship between our empirical evidence and the facts of reality... Such assumptions seem to be necessary... But developments ... give us specific reasons to question these assumptions."
The simple relationship between evidence and reality need not be questioned. Physical theories merely produce numerical predictions that either agree or disagree with observations. They do nothing else. In particular, they provide no evidence, either for or against, all the metaphysical "interpretations" that have been attached to the theories. The theories can do little more than "fit curves to data", and they can only even do that in cases where the data has an extremely low information content - that is what makes the data "predictable" by the theory in the first place.
Since the "interpretations of the theory" invariably have a higher information content than the theories themselves, the "interpretations" cannot possibly be contained within the theories themselves; they have simply been made-up and slapped-on. Hence, while experiments may confirm that the theory "fits" the data, they cannot provide any evidence that the "interpretation" fits the theory.
Rather than questioning the "reliability of memories", physicists need to question the "meaning" and "significance" that they have attached to them.
Author Emily Christine Adlam replied on Sep. 4, 2012 @ 08:02 GMT
While I would certainly agree that we need to be careful about differentiating between the content of a theory and its interpretation, I'd argue that theories can't be divorced entirely from 'interpretation' without rendering them incapable of making any predictions that can be compared to observation - at the very least, we need some specification of which mathematical features of the theory are meant to correspond to particular features of our evidence.
Saying that physicists need to question the 'significance' of memory is a nice way of putting the point I want to make - that memories are just a form of data, and perhaps we need to stop interpreting that data quite so literally.
Robert H McEachern replied on Sep. 4, 2012 @ 13:18 GMT
Your statement that "we need some specification of which mathematical features of the theory are meant to correspond to particular features of our evidence" is exactly on target. It is discussed extensively in my own essay. Unfortunately, as I indicated there, making such a correspondence provides no evidence that it is correct.
The problem is that the "meaning" of a high information content signal cannot be deduced from any observation of the signal, for the simple reason that "high information content" is synonymous with the fact that the signal itself is devoid of meaning. In effect, the signal is nothing more than a "serial number", whose "meaning" can only be deduced by "looking it up" within the memory of an entity that knows, a priori, the correspondence between the serial number and its "meaning's" address in memory.
Physicists do indeed need to "stop interpreting that data quite so literally." Complex entities respond to data observations "symbolically" as well as "physically." Physical responses behave as though data measurements are "real numbers", but symbolic responses behave as though they are "serial numbers." The entire information content of physical behaviors can easily be represented by short sequences of symbols, known as "equations." The vastly larger information content of the initial conditions, in the memory of a complex observer, cannot. For such observers, it is the initial conditions, not the equations, that determine all "interesting" behaviors, because that is the only thing that gives any meaning to observed "serial numbers."
S Halayka wrote on Sep. 4, 2012 @ 15:54 GMT
I believe that the act of contemplating the possibility of past intervention in human affairs by demons (aliens, gods, God, whatever the label shall be, just as long as they evolved over a great period of time like we did) is no less scientific than the act of contemplating the possibility of the many worlds scenario.
I say this wholeheartedly, because even if one were to somehow logically disprove many worlds here, there is still the possibility that there is another world in which this logic was proven false because it was based on some incomplete information. I also say this wholeheartedly, because the simplest thoughts about the origin of life point directly to the laws of thermodynamics themselves -- life is special, but not that special.
Anyway, who knows? Perhaps one day we will be able to communicate with aliens, as well as be able to hop between the branches of the many worlds. Until then, my bet is on aliens first, and possibly last.
Member Benjamin F. Dribus wrote on Sep. 6, 2012 @ 03:35 GMT
Dear Emily,
You write exceptionally well. You give a balanced and mature analysis that reveals a strong grasp of the issues you address, without being carried away by any particular argument. I have a few thoughts for you to consider.
1. Of course you are correct that classical microdynamics is time-symmetric, but we know beyond reasonable doubt that classical statistical mechanics is not fundamental. A general mechanism that produces time-like asymmetry across a broad range of “fundamental” physical theories is asymmetry in configuration space. My own favorite version is causal configuration space, as described in my essay:
On the Foundational Assumptions of Modern Physics
The idea is that different possible universes are related to each other in ways that make time-like asymmetry inevitable. I say "possible" here because I don't believe one has to be a committed Everettian to make use of configuration spaces and Feynman's sum over histories method. I also explain in the essay precisely what I mean by "time-like" in this context.
Julian Barbour’s essay in this contest mentions a different type of configuration-space asymmetry, shape space asymmetry, which is relevant under different assumptions. He can explain his approach better than I can.
2. As you point out, there is a self-referential difficulty associated with doubting one’s own memory; you mention this by remarking that your essay is “more likely” of random origin than produced by a conscious person. To your credit, the essay itself is strong evidence against this supposition, but more seriously, I believe that the pragmatic assumption you mentioned is necessary, if only as a last resort. There is no incompatibility between pragmatism and idealism in this regard, unless one is certain that one can never do better than the pragmatic assumption. You can continue to do science and seek better foundations at the same time.
3. On the subject of decoherence, I will mention that Jorge Pullin and Rodolfo Gambini have an essay in this contest that attempts to refine the decoherence approach to the measurement problem. I will also repeat that ascribing some degree of reality to the various histories in Feynman’s sum doesn’t necessarily imply full-blown Everettianism; in particular the relationship between observers and the configuration space admits several possible interpretations.
4. Einstein’s objection to nonlocality need not have been totally wrongheaded, even if it was misapplied in the case of quantum theory. In particular, it relies on assumptions about the structure of spacetime. I discuss this point in my essay, as well.
Thanks for the great read! Take care,
Ben Dribus
Emily replied on Sep. 7, 2012 @ 09:25 GMT
Thank you very much for your comments!
1) I've read your essay and Dr Barbour's with interest - if anything, I would say that the points I've raised here give reason to take these sorts of speculations seriously, since the problems with classical statistical mechanics would make it unsatisfactory even if we didn't have other good reasons to view it as non-fundamental.
2) I certainly wouldn't advocate giving up all or even most of the pragmatic assumptions that we need to get physics started. However, I do think we should keep in mind that they are assumptions, and be willing to question them (judiciously) in circumstances where that becomes appropriate, such as our current predicament with regard to statistical mechanics and quantum mechanics.
3) Bringing in alternative interpretations of quantum mechanics certainly complicates the issue here, but I think the problem of probability remains pressing for any interpretation which ascribes reality to more than one outcome of a measurement, since it's then no longer possible to make the straightforward pragmatic assumption that the (single) course of events that actually happens is one rendered highly probable by the theory.
4) I agree - I think there's a prevailing idea that Einstein disliked nonlocality mainly because it disagreed with his own theory of relativity, and that's doing him an injustice, because he clearly had good independent philosophical reasons for opposing it. I think he's right to worry that if we were to get rid of locality altogether we'd simply end up with chaos; but what quantum mechanics demonstrates is that we can sometimes weaken underlying assumptions like locality without completely undermining the practice of physics.
John Merryman wrote on Sep. 8, 2012 @ 03:33 GMT
Emily,
One way to resolve the Everett hypothesis is to eliminate the external timeline of events and allow the process to proceed atemporally. Sound impossible? How can you have process without time? It emerges from the process, but it's dynamic, not dimensional. It's not the past proceeding into the future, but the future becoming the past. Not the earth traveling a narrative dimension from yesterday to tomorrow, but tomorrow becoming yesterday because the earth rotates. As an effect of action, time then becomes the collapse of probabilities into actualities. Duration is not external to the present, but is the state of the present between measured events. It is only when we consider time in retrospect that it emerges as narrative. Yet that past is receding, rather than the present moving.
As an effect of action, time is similar to temperature: time is a rate of change, while temperature is a level of activity. When we change the level of activity, such as in gravity fields, or at significant speed, this affects the rate of change. Which is why clock rates vary - not because they travel alternate time vectors.
This way, the past is determined, but the future is probabilistic, since the lightcone of input is not complete until the event happens.
Ted Erikson wrote on Sep. 8, 2012 @ 17:25 GMT
EA:
Very interesting and informative essay as philosophy. As a newcomer to the FQXi community, I feel that few of the "community" grade, or even look at, my essay, which approaches the problem very realistically, based on an internal philosophical view. Might you look at it, comment if so inclined, and grade it?
To Seek Unknown Shores
http://fqxi.org/community/forum/topic/1409
Thank you
TE
Peter Jackson wrote on Sep. 13, 2012 @ 17:56 GMT
Emily
Have you considered entropy with respect to a cyclic universe model? Perhaps consider a larger model of an AGN accreting and re-ionizing all the matter in the disk as quasar jets (or any other you may prefer). To me this would demand a re-evaluation of the assumption or concept of entropy. Do you agree?
I also wonder, considering the evidence, if it really is the case that;
"we have in fact been able to construct a coherent and successful quantum theory which violates locality, and its laws certainly seem susceptible to empirical test." Do the 'empirical tests' really tell us that or is it just our interpretation, as I suspect?
A well written essay no less, and an easier read than some. That possibly includes mine, which I do hope you'll read anyway. It does add some theatre to a very intense mechanistic analysis which addresses some of the questions you raise and offers some logical mechanistic solutions. I'd value your thoughts.
Many thanks, and well done.
Peter
Author Emily Christine Adlam replied on Sep. 16, 2012 @ 09:47 GMT
Addressing the problem of entropy in a cyclic universe is interesting. I don't think the concept of entropy should be taken too seriously - if we regard the Second Law merely as a statistical generalisation, as modern statistical mechanics seems to indicate, then we should presumably regard the concept of entropy as a useful way of talking about the statistical facts rather than anything particularly fundamental, so in a cyclic universe we might well find that other ways of talking about the facts are more productive.
Indeed, the possibility of cyclic time seems to be another reason we might want to ask questions about the nature of our evidence - in particular, our beliefs about the distinction between past and future, beliefs which play an important role in determining our attitude to scientific evidence.
I agree that quantum theory in its simplest formulation doesn't necessarily violate locality - the mathematics alone can't imply something like that, so we need to add in some 'interpretation.' My point was merely that it's possible to construct a coherent theory (i.e. quantum theory together with one of the interpretations which do imply that locality is violated) where locality does not always hold, and therefore the practice of science is still possible even in the absence of strict locality assumptions.
Peter Jackson replied on Sep. 16, 2012 @ 17:15 GMT
Emily,
Thanks. I agree, but suggest that if a consistent interpretation exists that DOES allow local reality and derives the effects of classical physics, then it would be a unifying theory. The test may be its effectiveness in resolving anomalies.
I suggest that because I seem to have chanced across such an ontological construction, built from many epistemological elements, to bridge the divide. I hope you may do a careful read of my essay assembling those parts with dynamic logic foundation, and let me know where it is I went wrong.
Many thanks.
Peter
Jonathan Kerr wrote on Sep. 30, 2012 @ 10:02 GMT
Hello Emily,
It seems very strange to argue for the possible unreliability of evidence/memory by pointing out problems in fields where our knowledge is very limited. Problems will always arise in fields like that anyway. Cosmology is sometimes portrayed as a field within which we have a good understanding, but it isn't. We found out in the '90s how little we know.
You can't point out that some things about entropy don't make sense, and then say that this means our memories may be deceiving us somehow. Penrose pointed out some absolutely major problems with entropy in cosmology 30 years ago, and basically said that no-one except him seemed to see these problems. But our idea of entropy may be flawed, or the concept may be limited, or our understanding of it may be incomplete. There are all kinds of unknowns surrounding these questions - our cosmology may be partly wrong, and it's certainly incomplete. That's science: there are things that don't add up, and you have to try to solve them. You may be right to re-examine the relationship we have with evidence and memory, but you can't present problems like those with entropy as reasons to draw this or that conclusion in your argument.
Best wishes, Jonathan
Jonathan Kerr wrote on Sep. 30, 2012 @ 22:31 GMT
Just to put that in a wider context, and explain why to me it seems premature to question the reality of the information we have because of the entropy problem - many areas of physics have had problems that at first seemed impossible to deal with. Some people seem to run away from the unsolved puzzles, or try to diminish their importance.
I'm not saying you do that, but I do think we should take these puzzles on, and be prepared to say 'this is baffling, we don't know what's going on'. In the past, those who have been prepared to look right into the cracks in our picture - like Einstein - have found the best clues waiting there, while others spend their time papering over them. I don't think you do that, but it seems to me there's nothing wrong with being baffled; puzzles like interpreting QM have an interesting way of ruling out a large range of solutions, leaving us little or nothing that seems to work. That means it's a good puzzle, and that's why we struggle with it! But when a solution appears, it often seems less weird than it looked beforehand. Anyway, that's how I see it.
But also, when you look at the entropy problem, don't forget the possibility that motion through time exists, as George and I both think. (See my essay for evidence and reasoning that suggests it does.) If it did exist somehow, there would surely be some missing pieces of the puzzle still to be found, and that area looks connected with the bit of the puzzle you're looking at. So as I said, there are many unknowns in that area.
I'd like to see more about the mechanism that you think might be making our data contain unsolvable puzzles, if it's there I want to know about it.
Anyway, good luck,
Best wishes, Jonathan
Emily replied on Oct. 1, 2012 @ 01:00 GMT
I certainly agree with you that our knowledge is limited in these fields, and indeed, I'd suggest that the problems I point out are one symptom of that fact. My intention is not to use the difficulty with entropy to argue that it must be the case that our memories are deceiving us: rather, the argument is comparable to a reductio ad absurdum, to the effect that our usual scientific practice, applied to the evidence we have, leads us to a theory which apparently tells us we shouldn't rely on that evidence in the first place. As you rightly say, 'there are things which don't add up, you have to try and solve them,' and my suggestion is merely that given this problematic relationship with evidence, perhaps one direction of investigation is to look more carefully at the assumptions we are making about evidence in the construction of our theories. I certainly don't want to suggest this is a problem which is impossible to solve - using your metaphor, I think these difficulties with evidence are among the 'cracks in our picture,' into which we ought to look, as Einstein did, in order to find fruitful directions for future progress. Indeed, I'd say that what Einstein did was very similar to what I'm advocating: by relaxing certain assumptions once thought necessary to the practice of science (in his case, about the nature of space and time), it becomes possible to see issues in a new light and open up new avenues for scientific theorising.
Jonathan Kerr wrote on Oct. 1, 2012 @ 11:44 GMT
Hello Emily,
Thank you for your reply. I agree that if we have to question evidence or assumptions, it's often better to question assumptions.
I just reread your essay - to me you show the limitations of certain theories very well, by showing what happens when they're applied outside their domains of validity. One reductio ad absurdum you set out is that according to statistical mechanics your essay was more likely to have arisen by chance than to have been written. I think one more positive side of your work is that it might contribute to defining the boundaries, as we probably need a more exact understanding of what that sort of physics can and can't describe. To me, having shown these limitations of that sort of physics, it's a pity if you then take the theories as being actually applicable in those domains, as you sometimes seem to.
I'm trying to understand your idea about memory - you say:
"For although the claim that we are at a local minimum of entropy is inconsistent with the history stored in our memories, it is not inconsistent with the existence of those memories - they, together with the order we perceive around us, could have been created by a spontaneous fluctuation rather than by the events they apparently report."
So a random fluctuation might have caused all of our memories, and the order we perceive around us, to come into existence? I just don't understand how so much order, and consistent order - our memories tend to agree on things - could arise from a random fluctuation, whatever one thinks about the way in which memories are stored. Wouldn't a random fluctuation be likely to create something very much more... random? Douglas Adams once described a planet covered entirely with luxury hotels and casinos that had all been 'carved out of the rock by the natural processes of wind and erosion'. To me, your idea looks a bit like that. Surely the order we find around us is more likely to have appeared in the kind of way we think it did, but with gaps in our knowledge about it.
And where you talk about applying probabilities to the history of the universe - to do that you tend to need to know everything. It seems to me that because we don't, we're not in a position to do that.
I think what I may be seeing underneath your essay is a different version of something I find in many places nowadays - the implicit assumption that we now have all the pieces of the jigsaw in front of us, and only need to arrange them correctly. No-one would actually say that, but people nevertheless think and write as if it were the case. Many people are 'shuffling the principles' at present, making basic principles that were thought to be fundamental become emergent, and vice versa. I've had a discussion with Ben Dribus about this - personally, I think rearranging what we have will not be enough. In my essay, I remind people that there must be missing pieces, and that we need to allow for them, and try to guess - from the clues we do have - what the clues we don't have might look like.
And this principle of 'allowing for unknowns' would be very helpful in your discussion about QM and the Everett interpretation. It seems likely to me, and to many, that the real interpretation of QM is something different from all of our 5 or 6 present alternatives, all of which have their own problems. If so, feeding in what we have now and assuming it's everything will only give nonsense out, which is I think what you do, and what you get. I know you intend to show that many of these avenues of thought simply don't work, and I think you're right, you show that very well - to me only your suggestion about why they don't work is wrong.
Anyway, best wishes, Jonathan
Jonathan Kerr wrote on Oct. 1, 2012 @ 16:38 GMT
Sorry Emily, just to correct a mistake in the first line above, you were talking about questioning assumptions about evidence, not questioning assumptions.
The crucial point I forgot to make about QM is that the "problematic relationship with evidence" which you claim exists arises largely from a theory that has no clear interpretation, and has simply not been understood. We've also had trouble applying statistical mechanics, and yet you assume we can rely on our understanding of these two theories when you form that initial premiss. JK
Sergey G Fedosin wrote on Oct. 4, 2012 @ 05:46 GMT
If you do not understand why your rating dropped: as I found, ratings in the contest are calculated in the following way. Suppose your rating is R1 and N1 is the number of people who have rated you. Then you have R1·N1 points in total. If someone then gives you R points, you have R1·N1 + R points and N1 + 1 raters, so your new rating is R2 = (R1·N1 + R)/(N1 + 1). From here, if you want R2 > R1 there must be (R1·N1 + R)/(N1 + 1) > R1, or R1·N1 + R > R1·N1 + R1, or R > R1. In other words, to increase anyone's rating you must give him more points than his rating was at the moment you rated him. So the contest uses special rules for ratings, and this is why some participants misunderstand what has happened to their ratings. Moreover, since community ratings are hidden, some participants are not sure how to increase the ratings of others and simply give them the maximum of 10 points. But in that case the scale of points from 1 to 10 does not work, and some essays end up overestimated while others drop down. In my opinion this is a bad problem with the contest rating process. I hope the FQXi community will change it.
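To illustrate the arithmetic above, a minimal sketch in Python (my own illustration, not part of the original post):

def updated_rating(current_rating, num_raters, new_score):
    # Average of all points received so far plus the new vote.
    return (current_rating * num_raters + new_score) / (num_raters + 1)

# Example: an essay currently rated 6.0 by 10 raters.
print(updated_rating(6.0, 10, 5))   # 5.909... a vote below 6.0 lowers the rating
print(updated_rating(6.0, 10, 7))   # 6.090... only a vote above 6.0 raises it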
Sergey Fedosin
Richard William Kingsley-Nixey wrote on Oct. 5, 2012 @ 18:05 GMT
Emily
I don't agree with all your propositions, but scoring is not about that. I do however strongly recommend you read Peter Jackson's essay very carefully; his mechanism knits neatly with my figures.
Well done - you deserve to be in the top 35.
Rich