CATEGORY: Blog
TOPIC: Is the Second Law a Meta-Theorem?

John Merryman wrote on Aug. 11, 2011 @ 16:43 GMT
Does this apply to past information, as well as future information? In other words can we truly explain reality and the laws and constants governing it, or are they lost in a haze of infinite input?

The working assumption seems to be that we will eventually be able to explain reality with a self-consistent set of principles, arising from an initial state, but does that assumption contain presumptions not as evident as some would assume?

Maybe there are not multi-verses to explain the constants of this one, but just far more influences than we assume.

The old saying that the more you know, the more you know you don't know, might even apply to physics.


Oscar Dahlsten replied on Aug. 12, 2011 @ 10:29 GMT
Thanks for the question.

Think of the coin tossing scenario. It is like that in general, i.e. that you have some information initially about a particular system, and then let it interact with lots of other things (in this case your arm etc.). Then the initial knowledge you had about the coin being heads or tails is typically rendered useless for predicting the state of the coin. (You can still learn the coin's state however by doing a new measurement.) Our result essentially generalises that statement to a wide range of theories, not just classical theory and quantum theory.
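The coin scenario can be mocked up in a few lines of Python (a toy illustration only, not the paper's framework): a bit whose initial value is known interacts reversibly with an unknown environment bit, after which the initial knowledge no longer predicts the bit's state.

```python
import random

def simulate(trials=10_000):
    """Toy model: a known coin (bit) interacts with an unknown environment.
    The joint dynamics are deterministic and reversible (an XOR), but an
    observer who only knew the coin's initial state loses predictive power
    once the coin couples to the environment."""
    correct = 0
    for _ in range(trials):
        coin = 0                    # observer knows: initially heads (0)
        env = random.randint(0, 1)  # environment bit, unknown to the observer
        coin ^= env                 # reversible interaction with the environment
        prediction = 0              # best guess using only the initial knowledge
        correct += (prediction == coin)
    return correct / trials

print(simulate())  # ~0.5: the initial data is now useless for prediction
```

A fresh measurement of the coin would of course reveal its state again, matching the parenthetical remark above.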

Thinking about this in a cosmological sense is very interesting and we will hopefully have something more concrete to say about that at some point.


John Merryman replied on Aug. 12, 2011 @ 14:17 GMT

It's not so much a macrocosmic question as an open versus closed set question.

It seems to me that there is a networked, root/trunk/branch process, where information/structure is constantly organizing and dispersing, such that the coin toss is only one particular, linear action, proceeding from one state to another, without fully taking into account the non-linear effects generated that create other states.

This seems to relate to a point I make about time: while the effect is a progression from past events to future ones, the process is a constant change of configuration, so that it is the events going future to past. The present doesn't travel from yesterday to tomorrow. Tomorrow becomes yesterday because the earth rotates.

Since we cannot know all input into any event prior to its occurrence, as it might be arriving from opposite directions at the speed of light, the total cause of any event remains in the future, until it occurs, then the resulting event, the effect of this input, recedes into the past.

So the input is indeterministic, but the results are deterministic. The state of the coin is always in the present.

We should consider time as an emergent effect of motion, rather than imposing a formal dimension on it, whether in quantum or classical systems.


Eckard Blumschein replied on Dec. 27, 2012 @ 17:31 GMT

Is your idea of back-propagation summarized in this sentence of yours? "Since we cannot know all input into any event prior to its occurrence, as it might be arriving from opposite directions at the speed of light, the total cause of any event remains in the future, until it occurs, then the resulting event, the effect of this input, recedes into the past."

I too see the possibility that a future influence might be unseen because it can reach us as fast as light. However, isn't it also always impossible to know and consider ALL of the virtually infinitely many influences that are surrounding the modeled structure?

How did you and Strominger come to the silly idea of seeking causes in the future?

To me this seems to be pretty understandable. Ordinary people say e.g. the children are our future. What they are meaning is a bit more complicated but of course not a reason to question the common sense on causality. Strictly speaking the future is a mental construct. It does not exist in advance in reality.

By the way, could you please elaborate more on the open/clopen/closed question?

You should know, I do not consider the present something extended between past and future.



T H Ray wrote on Aug. 11, 2011 @ 17:00 GMT
Timely subject. Gregory Chaitin showed the coin-toss probability exists even in arithmetic -- he calls it "maximal unknowability." So our conventional numerical models may not remain useful for physical theories much longer, at least in the straightforward manner we assume.



James Putnam wrote on Aug. 11, 2011 @ 21:35 GMT
Oscar Dahlsten,

I don't know if Oscar Dahlsten will respond to my messages. Some bloggers in the past have chosen not to respond. So, this message is open to anyone, including Tom, with whom I respectfully disagree about some things but whose thoughts I value, along with those of others who are clearly professionals.

"It is in this set of theories that we ask whether this data-becoming-useless version of the second law holds universally. We find that we have to add some additional restrictions on the set of theories to make the question well-defined, but given those restrictions, the answer is yes: it holds in all such theories.

There are several versions of the second law and it remains to be shown that this claim can be made for all of them. Moreover one should try to make even more minimal assumptions."

I don't see how this claim can be applied to thermodynamic entropy as defined by Clausius. While it is true that energy lost to the environment increases the number of energy states that can be occupied in that environment, Clausius's entropy, which is the beginning of the concept of entropy, has not been shown to result in data becoming useless. It results in a precise value of something that has not yet been explained. Even Boltzmann skipped past it to offer a new definition that is not the same thing. The only connection between the two versions is Boltzmann's constant, which actually loses its own usefulness in his definition of entropy. Corrections to this are welcome.



Oscar Dahlsten replied on Aug. 12, 2011 @ 10:38 GMT
Thanks for the question James.

Our result does not a priori deal with this instance, as we have not talked about energy and what energy is in the general setting, but rather about data associated with clicks in detectors. There is some case for optimism, as people have linked the more information-theoretic argument we are dealing with here to energy-related versions of the second law in the setting of quantum theory. But you are right, it is not clear.



James Putnam replied on Aug. 12, 2011 @ 18:04 GMT
Oscar Dahlsten,

Thank you for responding. I am very interested in the subject.



Pentcho Valev replied on Aug. 14, 2011 @ 19:12 GMT
James Putnam wrote: "I don't see how this claim can be applied to thermodynamic entropy as defined by Clausius."

The principle "entropy always increases" was DEDUCED by Clausius, but the deduction is invalid (this can be shown in a few pages), so the principle, in Clausius's interpretation at least, is just utter nonsense. Jos Uffink calls it a "red herring" at the end of the following...



John Merryman wrote on Aug. 12, 2011 @ 02:58 GMT
What would a completely entropic state look like anyway?

Presumably it would have some level of energy fluctuating around a mean.

How can we be completely sure that isn't exactly what we currently have, with these intergalactic filaments and quantum fluctuations? Yes, there are galaxies pulling in mass and radiating energy, but it seems quite likely the energy radiating away is about equal to the mass falling in, so that's a wash.

Entropy only applies to a closed system, otherwise what is lost is made up by what enters from adjacent systems.

It seems the intellectual necessity of imposing limits creates problems of input, as it provides a framework of organization.


Florin Moldoveanu wrote on Aug. 12, 2011 @ 05:07 GMT
Dear Oscar,

This is a very interesting research area and your post whetted my appetite to understand more of what you accomplished in the paper. However, I feel that the post was too short and a bit too dumbed down. There are many ways one can define entropy: Shannon, Renyi, Havrda-Charvat; so your point about the universality of the second law is to be expected. What I would be interested to find out more about, though, is how you manage to define purity. (I want to be lazy and avoid spending the time to understand your preprint.) Therefore I would be very interested in an overview of your purity result, not as high-level as the post.




Oscar Dahlsten replied on Aug. 12, 2011 @ 10:46 GMT
Thanks for the question Florin,

I am glad you find it interesting. It is not easy to satisfy everyone with regards to the level of detail. The paper by the way comes with an extended introduction listing the main results, so you do not need to read the whole thing to get the key points.

In the framework of generalised probabilistic theories a 'state' lists the probabilities of measurement outcomes for all possible measurements. Different theories allow different states. All probabilistic mixtures (meaning with some probability I have state 1, with another state 2, etc.) of states are allowed. Any state which is not a mixture of other states is termed 'pure'. We show there is a natural way to quantify this such that pure states have purity 1 and the maximally mixed state (the least pure state) has purity 0. Other states lie somewhere in between. This quantification of purity is much like a generalisation of the quantum purity Tr(rho^2), which has the operational meaning of the (largest) likelihood that two identical measurements on two copies of the state give the same outcome.
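For the quantum case mentioned at the end, the purity Tr(rho^2) is easy to check numerically; the rescaling to the interval [0, 1] shown below is an illustrative choice, not necessarily the paper's normalisation.

```python
import numpy as np

def purity(rho):
    """Quantum purity Tr(rho^2): 1 for a pure state, 1/d for the
    maximally mixed state in dimension d."""
    return float(np.trace(rho @ rho).real)

pure = np.array([[1.0, 0.0], [0.0, 0.0]])  # |0><0|, a pure qubit state
mixed = np.eye(2) / 2                      # maximally mixed qubit state

print(purity(pure))   # → 1.0
print(purity(mixed))  # → 0.5

# Illustrative rescaling so the maximally mixed state gets purity 0:
d = 2
print((purity(mixed) - 1 / d) / (1 - 1 / d))  # → 0.0
```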



Lawrence B. Crowell wrote on Aug. 12, 2011 @ 17:51 GMT
A particle, string or qubit that falls onto a black hole becomes entangled with the black hole. The nature of this entanglement is unknown to the exterior observer. Hence the entropy of a black hole can be regarded as due to an entanglement of qubits. The constants of physics G, c, ħ, and k are unified in this way with the Bekenstein bound

S = kA/4L_p^2, L_p = sqrt{Għ/c^3}.

If the area of a black hole is A = 4πR^2, and R = GM/c^2, the radius is then N units of L_p. The entropy is then

S = Nπk.

This then corresponds to the equipartition theorem for N states on a sphere of two dimensions, where the two dimensions are the event horizon.

In general we have Stochastic Local Operations and Classical Communication (SLOCC) in entanglement and the teleportation of states. Two states are SLOCC related by a teleportation if they can be inter-converted to each other in a reversible manner with some probability of success. This uses group theory, where the group G_{SLOCC} for this process is an N-partite system of qubits with some group GL(2,C). The states further transform as a (2,2,...,2).

G_{SLOCC} = SL(2,C)_1(x)SL(2,C)_2(x) ... (x)SL(2,C)_N

where the composite state

|ψ_{12...N}> = SL(2,C)_1(x)SL(2,C)_2(x) ... (x)SL(2,C)_N|φ_{12...N}>

So this is an N-partite quantum information system where the entanglements are determined by the group element G_{SLOCC} and polynomials of this group. This is the moduli space for black holes composed of Q-bits and the U-duality group.

For a 2-qubit system this construction is apparent. You have a state of the form sum_{ij}a_{ij}|i,j> for i and j running from 0 to 1. The elements a_{ij} transform as (2,2) of the G_{SLOCC}. The invariant element is the determinant of these matrices, so det(a_{ij}) is transformed under the G_{SLOCC} into

det(a_{ij}) - -> det(a'_{ij}) = det(U_{i'i}a_{ij}U'*_{j'j}) = det(a_{ij})

with the obvious result on the determinant of a product that the transformation elements have unit determinant. The entanglement entropy is given by this measure so S_{ij} = 4|det(a_{ij})|^2. For multipartite systems the same rule generally applies, but the matrix interpretation is different. For an N-partite system the entanglement entropy is given by a 2x2x...x2 (N times) set of elements. This then leads to the entangled states |00> + |11> and |01> + |10> (without normalization) for singlet and triplet entangled states.
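The 2-qubit measure S = 4|det(a_{ij})|^2 described above can be evaluated directly; a short sketch (using the standard Bell and product states as test cases):

```python
import numpy as np

def tangle(a):
    """2-qubit entanglement measure 4|det(a_ij)|^2 (the squared
    concurrence), invariant under SL(2,C) x SL(2,C) as discussed."""
    return 4 * abs(np.linalg.det(a)) ** 2

bell = np.array([[1.0, 0.0], [0.0, 1.0]]) / np.sqrt(2)  # (|00> + |11>)/sqrt(2)
product = np.array([[1.0, 0.0], [0.0, 0.0]])            # |00>, unentangled

print(round(tangle(bell), 6))     # → 1.0 (maximally entangled)
print(round(tangle(product), 6))  # → 0.0 (no entanglement)
```

Local SL(2,C) transformations have unit determinant, which is why the measure is invariant, as noted above.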

For the 3- and 4-partite entanglements the determinant is replaced by a hyperdeterminant. These entanglements are isomorphic to a real-valued algebra for the moduli space of BPS and extremal black holes. For further reading on this, see Four-qubit entanglement from string theory, L. Borsten, D. Dahanayake, M. J. Duff, A. Marrani, W. Rubens.

Cheers LC


Steve Dufourny wrote on Aug. 12, 2011 @ 19:13 GMT
Entropy increases indeed; at the same time this entropy (this universal maximum energy, if you prefer) is in all things at this same maximum, paradoxical but so fascinating.

A good tool... some words can summarise this entropy: "we could nourish our planet with one water drop, dear scientists, and even for a long time."

That is the entropy, the energy; after that it is just a transformation into work respecting different steps of E. The most important thing is to understand that everything possesses this entropic maximum, which increases further due to evolution and the increase of mass. There is in fact the same entropy in a water drop as in a planet or a galaxy or a flower, or a particle... after that it is just a series of interactions that we can class, like a taxonomic classification, towards this maximum. The entropy is not complicated in fact; it is even very spiritual and universal. But of course that is another story. We are on a scientific platform and not a religious platform, of course. The codes are fascinating, that said!

Meditate on that, dear all... all possesses the same maximum entropy... but after all, is it necessary to use all the energy of this universe? Of course not; just a very small part is sufficient... consciousness of course becomes like a sister of this entropy. The universal sphere, this entropy, then builds simply and evolves towards a perfect equilibrium between all gravitational spherical systems.

Entropy... this entropy builds simply, but it is a secret of course.



James Putnam replied on Aug. 12, 2011 @ 19:36 GMT

"Entropy increases indeed, the same time this entropy(this universal maximum energy if you prefer)is in all at this same maximum,paradoxal but so fascinating."

No I do not prefer that. What I prefer is a clear definition of a property. Thermodynamic entropy certainly is not 'maximum energy'. What is your definitional basis for that conclusion?



Steve Dufourny replied on Aug. 12, 2011 @ 23:13 GMT
Buy a book of thermodynamics, James, please; after that we shall speak, perhaps. If I have offended you, forget me, thanks. And also if you are on this patriotic team, forget me still more.

If you do not understand what entropy is, then don't insist. My answer was clear, and any global rationalist understands it, like relativity; apparently you understand neither the principle of entropy of a system, with its different steps and pure thermodynamical correlations, nor relativity. Of course I say that because I have seen your posts for several years: just a philosophy, that's all. Don't insist then. Entropy is God, if you prefer, poor thinker. And now let me laugh, please, because nobody here understands these two important things, entropy and relativity; how could you understand my theory of spherization? A time for all, James. The entropy is that, in fact; reread my words, and you shall perhaps perceive that the closer you go towards the Planck scale, the more the energy increases; at this limit, the Planck scale, this wall separating the unknown, you have this maximum indeed; that is the real sense of entropy, you know.

If now you prefer a thermodynamical view, about heat: OK, let's go, it is the same. In fact you must understand that entropy increases; it is fundamental. Of course I speak about the universal definition of entropy. You can insert irreversibility and adiabatic processes; you shall see the transfer of Q. Now of course every system possesses its limits DUE TO OUR YOUNG AGE AT THE UNIVERSAL SCALE, but apparently you do not understand what I mean. That means that entropy can have many forms. What I say is very simple: we have an enormous entropy around us, and we utilise it in different forms and works. We transform the energy, then the motions (rotating spinning spheres). But for a real universal understanding of this entropy, you must understand its distribution and even its cooling, paradoxical since the BB. The entropy is positive and increases... E=(c²o²s²)m... S, entropy, is universal and spiritual.



James Putnam replied on Aug. 12, 2011 @ 23:27 GMT

"Buy a book of thermo James please, aftetr we shall speak perhaps, ..."

Don't give me an evasive response. When I ask for a definition of a property, within the science of theoretical physics, it should be accompanied by a mathematical representation. I don't get diverted from the subject. The subject is my claim that: "Thermodynamic entropy certainly is not 'maximum energy'." What is your definitional basis for the claim that entropy is 'maximum energy'?" I don't need a book. I need a direct answer.



Steve Dufourny wrote on Aug. 12, 2011 @ 23:56 GMT








Author Frank Martin DiMeglio wrote on Aug. 14, 2011 @ 20:26 GMT
F=ma can be used to fundamentally demonstrate quantum gravity and instantaneity in conjunction with balanced attraction and repulsion and equivalent inertia/gravitational force/energy (both at half strength/force).

This controls for (and averages) motion/mobility in keeping with space that is equally (and both) larger and smaller. The law of inertia ultimately and fundamentally requires instantaneity. This space is also equally (and both) invisible and visible. And this is also consistent with instantaneity and quantum gravity.

Everything above is the space of dream experience.


Author Frank Martin DiMeglio wrote on Aug. 14, 2011 @ 20:36 GMT
Short of combining and including opposites such as gravity, inertia, attraction, repulsion, visible space, invisible space, etc. there will never be an intelligible, fundamental, and complete understanding of physics.

The union of gravity and electromagnetism gives intelligibility and stability/constancy to the incorporation of quantum gravity therewith.


Author Frank Martin DiMeglio wrote on Aug. 14, 2011 @ 20:42 GMT
Gravity cannot be ultimately and fundamentally/completely understood apart from the fact that it is a contact force.

Also, never forget that the purpose of vision is to advise of the consequences of touch in/with time -- as Bishop Berkeley so wisely pointed out.


Author Frank Martin DiMeglio wrote on Aug. 14, 2011 @ 20:53 GMT
Instantaneity ultimately involves contact force as well -- a fundamental union of physics makes this clear.


Georgina Parry wrote on Aug. 15, 2011 @ 11:38 GMT
Dear Oscar,

Why does the entropy of the qubit go up when the global state is known? I think that perhaps the coin is confusing me, because a qubit is not at all like a coin, just as a photon is not like a sock, even though I have had long discussions with Florin about socks and how they can be washed in different ways. Which now all makes sense to me.

Because you are not talking about coins in space-time but quantum information obtained from a space without temporal spread. Which corresponds to space where there is no time dimension but still passage of time through sequential change from iteration to iteration of the Object universe. That's just change, not entropy, though. The new arrangement is just different, not less in energy content or potential information content, or more disorganized or unknowable; the information is there if the measurement is made.

You also said re entropy "This phenomenon is ubiquitous in nature, and happens both in quantum theory and classical theory." I don't accept the cold dark future of the universe due to increasing entropy. That is not what is observed happening in nature. What is observed is that there is erosion and there is deposition, that there are areas of high pressure and areas of low pressure, giving currents. Rock cycles, water cycles, carbon cycles, nitrogen cycles; stars die leaving clouds of dust and particles, and stars are born from dust and particles. Galaxies form and galaxies are destroyed, etc.

Nor do I accept that the broken jug is a valid demonstration of increasing entropy. The jug and the Earth ought to be considered as a single system rather than as separate independent parts. It falls to the ground and its fragments become part of the main Earth's surface. Not more disordered but merely rearranged in space. It is only more disorganized if one regards the jug in isolation from its environment, rather than as just a part of the environment that is being rearranged. It is no more entropy than the deposition of silt to build a delta or beach, which might later on be eroded again.

Please, as well as my initial question, could you set me right and explain why entropy is a good/useful way of thinking about/describing information becoming outdated by the spatial changes that have occurred?


Georgina Parry replied on Aug. 15, 2011 @ 11:56 GMT
Dear Oscar,

I have just noticed and read your reply to my former post. Thank you very much. I think it is very interesting that the observer is becoming a part of/one with the system under consideration.

You said "It seems that the above interpretation is in line with the view you are taking in your post." Yes I agree.

It is interesting. Thank you too for the references. I do not know if I will understand them but I will take a look. As you can see from my next post I really need to understand the basic postulates/assumptions first.


Oscar Dahlsten replied on Aug. 15, 2011 @ 13:20 GMT
Dear Georgina,

Thanks for your questions.

"Why does the entropy of the qubit go up when the global state is known?"

In quantum theory one can have scenarios where the global state is known (has zero entropy associated with it) but the states of the subsystems are not (and therefore have high entropy). Such states are said to be entangled. Entanglement is one of the ways in which a subsystem can have a high entropy associated with it. (But having a known global state does not imply entanglement nor high subsystem entropy)
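This first point can be checked numerically: for a Bell state the global state is pure (zero entropy) while the reduced state of either qubit is maximally mixed (one bit of entropy). A small sketch using the von Neumann entropy:

```python
import numpy as np

def entropy(r):
    """Von Neumann entropy in bits, -Tr(r log2 r), via eigenvalues."""
    evals = np.linalg.eigvalsh(r)
    evals = evals[evals > 1e-12]  # discard numerically zero eigenvalues
    return max(0.0, float(-np.sum(evals * np.log2(evals))))

psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)  # (|00> + |11>)/sqrt(2)
rho = np.outer(psi, psi.conj())                    # global density matrix

# Partial trace over the second qubit gives the subsystem state.
rho_A = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

print(round(entropy(rho), 6))    # → 0.0 (global state fully known)
print(round(entropy(rho_A), 6))  # → 1.0 (subsystem maximally mixed)
```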

"I think that perhaps the coin is confusing me because a qubit is not at all like a coin just as a photon is not like a sock". OK, so I am not familiar with the sock argument, but yes, there are differences between qubits and coins. However, information theory statements, like how much data one can encode in a system, often turn out to be the same both for coins and qubits. This is essentially because for both of them one can only distinguish perfectly between two possible preparations (Heads or Tails for a coin, |0> or |1> for a qubit).

Our result fits in more with the paradigm that classical bits like coins and qubits are essentially the same, rather than different.

I think of entropy as a quantification of one's ignorance rather than disorder.

To talk about entropy one chooses a set of events and assigns probabilities to all of them (this step is subjective and depends on one's knowledge). The less one knows the flatter the distribution is. Entropy quantifies how flat (and wide, as more possible events mean more ignorance).
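This "flatness" picture corresponds to the Shannon entropy; a minimal illustration:

```python
import math

def H(p):
    """Shannon entropy in bits: 0 for a certain outcome, log2(n) for
    a flat distribution over n outcomes."""
    return sum(-x * math.log2(x) for x in p if x > 0)

print(H([1.0, 0.0]))  # → 0.0 (full knowledge of the outcome)
print(H([0.5, 0.5]))  # → 1.0 (complete ignorance about a coin)
print(H([0.25] * 4))  # → 2.0 (flatter and wider: more ignorance)
```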

You are right about e.g. the cup breaking. In principle it is just nature evolving deterministically from one state to another. But if there is an observer who is limited to seeing just the cup, they will not be able to predict its arrangement, as that also depends on the environment. It is like that with the coin tossing too: in principle it is deterministic, but we are talking about a setting where the observer does not know everything initially, only e.g. the state of the coin.




Georgina Parry replied on Aug. 15, 2011 @ 21:32 GMT
Dear Oscar,

thank you so much for your very clear explanation. (By the way, I was referring to the "Bertlmann's socks" paper (Bell, 1981), which I did not know about until Florin told me about it. We were talking about polarized photons, Alice and Bob and faster-than-light transmission of information, as I recall, but via experiments with socks, which just amuses me for some reason.)

I have never encountered entropy as ignorance before. It does make sense to me that as we progress further into the imagined future that kind of ignorance-entropy would increase as there will be cumulative changes involving many variables so predictions become less likely to be accurate. Just as a short term weather forecast is more likely to be accurate than a long term one.

I can see a kind of problem with having the observer's knowledge/ignorance so closely woven into the description of the system. It becomes a description of the subjective awareness of something rather than the something itself. Almost like relativity in space-time. Which is not a description of what is, but of how an observer experiences what was. Or even the elephant as described by one of the blind men.


James Putnam wrote on Aug. 16, 2011 @ 16:33 GMT
Oscar Dahlsten,

"Say now that instead you take a quantum coin, a qubit, and have it interact with some larger system. Then they tend to become entangled. This means that the state is known globally but the state on the qubit alone is not known well. Thus the entropy of the qubit alone goes up. The effect is in a sense the same: although one knew something about the qubit's initial state, that data became useless for predicting its state at a later time.

Data becoming useless like this is one version of the second law of thermodynamics: entropy of a system tends to increase."

Could you please say more about your use here of the word 'system'? I ask because the preceding paragraph seems to me to set up a situation that is reduced to the point where the system is not involved in the increased uncertainty or decrease of usable information. In other words, when taking only the qubit into consideration, why does this tell us anything about the 'informational (my choice of word) entropy' of the system?



Oscar Dahlsten replied on Aug. 17, 2011 @ 11:39 GMT
Dear James,

Let me try to clarify what we mean by 'system'. A system is taken to be something that gives outcomes when measurements are performed on it. The state of a system is defined to be the probabilities one assigns to all possible outcomes of measurements on that system. One associates an entropy with the system by taking a function of those probabilities that puts a number to how little we know about it. For example the state of a coin is p(heads), p(tails).

We consider several systems combined. One or more are called the subsystem (e.g. the coin or the qubit in the example above). We let the state of the joint system evolve, e.g. corresponding to the coin being tossed. So the other subsystems are involved in the total interaction. Then we look at the state of the subsystem, the coin, and how its state changes. What tends to happen is that the coin becomes maximally correlated with the rest of the systems. Then the state of the coin alone, i.e. as assigned by someone without access to the other systems, is completely random, and thus has a maximal entropy.



James Putnam replied on Aug. 17, 2011 @ 14:50 GMT
Dear Oscar,

Thank you for that very helpful reply. I wanted to know clearly how the word system was being used, and, you cleared that up for me. I find the subject of your blog to be very interesting. I didn't want to allow any of my own preconceptions to interfere with understanding your words.



Sridattadev wrote on Aug. 24, 2011 @ 18:49 GMT
Dear Oscar,

There is no entropy in Singularity or Absolute equality. If we are a relative observer, the state of the coin matters, as we are looking at only one side of the coin at a given time. But if we become an absolute observer, then we will be seeing both sides of the coin at all times, and hence the toss and state of the coin become irrelevant and hence no entropy.

God does not throw dice -- Albert Einstein

even if God or Singularity or Universal I does throw dice, I will always see what I want for I is on all the sides of the dice.



attachments: UniversalLifeCycle.doc


DiMeglio wrote on Aug. 24, 2011 @ 19:24 GMT
The integrated and interactive extensiveness of thought and sensory experience must be shown.


James Putnam wrote on Aug. 24, 2011 @ 20:18 GMT
The second law has been shoved aside by theoretical physicists. I am not saying that today's substitutions are themselves incorrect, I am saying that they are not about the second law of thermodynamics as it was first formulated. The first formulation does matter a great deal. The mathematical representation of the second law is Clausius' definition of thermodynamic entropy. It is not about statistics. It is not about microscopic motions of gas particles that may be incorrectly said to temporarily violate the second law. It was not about relativity theory. It was about macroscopic properties in an ideal setting that give mathematical results as Clausius formulated them. The abandonment of the original mathematical representation of the second law of thermodynamics is a theoretical error that cannot be permitted if theoretical physicists consider their own fundamentals to be primary. Perhaps they do not.



Pentcho Valev wrote on Dec. 24, 2012 @ 10:15 GMT
Time to Refute the Second Law of Thermodynamics

A reasonable argument advanced by an intelligent design advocate:

Granville Sewell: "If an increase in order is extremely improbable when a system is closed, it is still extremely improbable when the system is open, unless something is entering which makes it not extremely improbable. (...) Order can increase in an open system, not...


Pentcho Valev replied on Dec. 26, 2012 @ 09:26 GMT
Maxwell introduced his "demon" in 1867 in a letter to Tait:

"...the hot system has got hotter and the cold colder and yet no work has been done, only the intelligence of a very observant and neat-fingered being has been employed."

Clearly, Maxwell's thesis amounts to the following:

In the presence of "a very observant and neat-fingered being", the second law of thermodynamics CAN be violated.

Now consider Kelvin's (original) version of the second law:

"It is impossible, by means of inanimate material agency, to derive mechanical effect from any portion of matter by cooling it below the temperature of the coldest of the surrounding objects."

Kelvin's statement does not entail that the presence of an ANIMATE agency would drastically change the situation, but still, by analogy with the Maxwell demon's case, one could advance the following hypothesis:

In the presence of an animate agency (e.g. an ordinary human being) the second law CAN be violated.

I am going to prove this hypothesis in a paper entitled "Maxwell's demons all over the place".
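The sorting step of Maxwell's thought experiment can be sketched as a toy simulation (my own illustration, not from the original post; the distribution and threshold are made up, and the toy deliberately books no cost for the demon's measurements, which is precisely where modern analyses of the demon locate the second law's escape):

```python
import random

random.seed(1)

# Toy gas: particle kinetic energies drawn from a simple proxy distribution.
energies = [random.gauss(0, 1) ** 2 for _ in range(10000)]
mean_all = sum(energies) / len(energies)

# The "demon" opens the trapdoor only for fast particles, collecting them
# in chamber A and leaving the slow ones in chamber B.
threshold = mean_all
chamber_a = [e for e in energies if e > threshold]   # "hot" side
chamber_b = [e for e in energies if e <= threshold]  # "cold" side

temp_a = sum(chamber_a) / len(chamber_a)
temp_b = sum(chamber_b) / len(chamber_b)

# A temperature difference has appeared with no work booked -- exactly
# Maxwell's "the hot system has got hotter and the cold colder".
print(temp_a > mean_all > temp_b)  # True
```

The toy reproduces the sorting, not a violation: accounting for the demon's measurement and memory erasure (the Landauer/Bennett analysis) is what restores the second law in the standard treatment.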

Pentcho Valev


Eckard Blumschein replied on Dec. 27, 2012 @ 09:49 GMT

Isn't Maxwell's neat-fingered intelligent demon synonymous with a perpetuum mobile? I cannot imagine moving a finger having any effect without doing work. Maxwell's demon reminds me of Laplace's demon.

I do not understand why you suddenly got silent in the discussion about Maxwell's mistake and its consequences for emission theories and jumped to a different issue. You seem to be attracted to anything people like you do not understand, and you seem to shy away from any result that demands a courageous correction of your own view.

Our discussion here should have convincing results.



doug wrote on Dec. 29, 2012 @ 14:13 GMT
Quantum dynamics approaches Classical dynamics as the rate of matter (of traveling particles) approaches stillness (velocity = zero). At velocity = c, matter becomes Space.


doug wrote on Dec. 29, 2012 @ 15:29 GMT
Final Statement - CIG Theory - to expand on my post above :

Quantum dynamics approaches Classical dynamics as the rate of matter (of traveling particles) approaches pure stillness (velocity = zero). At zero velocity, the two (quantum & classical theory) are indistinguishable. At velocity = c, Matter becomes Space. In between is everything else [Dark Matter & Dark Energy & other variations...


T H Ray wrote on Dec. 29, 2012 @ 19:44 GMT
"My Latest Thoughts: Why do large things move slower than small things?"

They don't.

Your persistence is admirable. Your physics, however, is demonstrably wrong. Galileo disproved your latest thought, which is based on Aristotle's philosophy, not on modern science.

As to, "Quantum dynamics approaches Classical dynamics as the rate of matter (of traveling particles)...


doug replied on Jan. 6, 2013 @ 01:48 GMT

Thank You, Thank You, Thank You.

Someone is actually reading what I write. OK Then, before I reply, one question you must answer for me, as you are a physicist.

It's my balloon question. We heat up a cold balloon & it goes from x amount of Space to 3x. Where did the new Space come from? [please don't tell me the particles are moving farther away from each other; I want to...


doug replied on Jan. 6, 2013 @ 03:05 GMT

Maybe it's better to do this:

I. Assume that a given piece of matter turns into space as it moves, and turns into space according to:

0.02762u = 25.7MeV = 14,952,942.08 pm cubed of space

(Mass) (Energy) (Space)

II. And that matter turns into space at the percentages associated with time dilation (Lorentz transformation) as that massive particle moves from zero velocity to rate "c"

[In other words, the above equivalency holds at "c"; for rates less than "c", use the transformation]

III. Let's assume the above.

IV. What questions and/or complications and/or solutions arise from this assumption? [my latest find was why large things are large and small things are small]

Maybe that's a better way to approach CIG. Let's just assume the above holds. What world does it predict?

1. This is where the space for an expanding universe comes from.




If the above "assumption" describes the world better than current theory, then perhaps it's worth taking a closer look at CIG.

Why not make this a challenge to viewers?

You can look at CIG & my prior posts to fill in the list, or, to avoid bias, just consider the above assumption (whether you believe it or not), and write a program, or by other means, describe the world we live in.
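As a sanity check on the arithmetic in item I only (not an endorsement of the theory), the mass-to-energy figure can be verified against the standard atomic mass unit conversion, and a sample Lorentz factor from item II computed; the choice of v = 0.6c below is an arbitrary example:

```python
import math

U_TO_MEV = 931.494  # standard conversion: 1 u = 931.494 MeV/c^2

# Item I claims 0.02762 u corresponds to 25.7 MeV.
mass_u = 0.02762
energy_mev = mass_u * U_TO_MEV
print(round(energy_mev, 1))  # 25.7 -> the u-to-MeV step checks out

# Item II invokes the Lorentz factor; for an example speed v = 0.6c:
beta = 0.6
gamma = 1 / math.sqrt(1 - beta ** 2)
print(gamma)  # 1.25 (approximately)
```

Only the unit conversion is checked here; the claimed equivalence between that energy and a cubic-picometre volume of space is the theory's own assumption and is not verifiable from standard physics.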




John Merryman. wrote on Dec. 30, 2012 @ 02:52 GMT

This is on the bottom. For some reason that thread wouldn't load a reply button.

"Isn't the question how to deal with the pebble zero between positive and negative numbers identical with the question of a somehow extended point-like state of present between past and future? Of course, an ideal point has no extension. Euclid defined it as something that has no...


Paul Reed replied on Dec. 30, 2012 @ 09:58 GMT
Indeed, the reply-to-this-thread link is missing. So John, this is in respect of your last post in the thread you started 11/8 16.43.


Clarification first, though you probably meant what follows. In respect of any given existential event, it does not only occur, but in doing so also creates physically existent ‘effects’ as the result of an interaction as it occurs with certain...


Paul Reed replied on Dec. 30, 2012 @ 10:18 GMT

This is in response to your last post in that thread.

It is not a “desire to have something to hold”. It is the fact that there is something, and it has a generic form. And therefore, that is the start point for an analysis, not metaphysical conceptualisations.

Mathematics is just a representational device, but like narrative or graphics, it has to correspond with what is ‘there’, otherwise it is just philosophy, but looks scientific because numbers are involved. So taking ‘a blank sheet of paper’, or a ‘non specific point in space’ or a ‘circle’, etc, is only the correct procedure if there is existential reality which relates to these concepts.

My ‘obsession’, as you call it, relates to one such simple fact about the physical existence we are investigating, and that is that it is existential sequence. Points, zeros, positives and negatives are irrelevant so long as they are seen purely as scale, i.e. not reified. The system must reflect existence, not the other way around, and existence, for example, does not involve nothing or infinity, which is what you said earlier, and I have said many times before. And just for clarification, John, when we say nothing, we really mean nothing of something in a particular context. This is the same as my point that change/difference does not physically exist; they are the result of comparisons of what does.



Eckard Blumschein replied on Dec. 30, 2012 @ 17:03 GMT

Existence is not trivial even in mathematics. The square root of two does not exist within the rational numbers. The square root of minus two only exists within the complex numbers. I agree, in principle, with Paul that existence in reality must be distinguished from mere pictures, plans, expectations, etc. In that respect, Georgina's question was important: what does reality mean in physics?

I strongly disagree with those like Paul and you who are not aware of a trifle: the notion "the present" must not be understood as something existing in reality. It rather denotes an unspecified, and in that sense irrelevant, distance from the extension-less point between past and future. This does not mean the notion of the present is useless. It is merely not suited for exact physics.

You wrote: "The problem is that when we consider "elapsed" time, ie. the record of what we have previously measured, we view it as a static, cause and effect sequence." I do not see your problem. Looking back upon my life, i.e. on my age, I see this measure quite independent of cause and effect and definitely not as a sequence. Do you question what Shannon pointed out, that the past cannot be changed and is known in principle, while the future can be influenced but is not known for sure? In that sense, the causal network has been frozen in the past.

You spoke of moving from future to past. I cannot see what could be the moved item. To me past and future are properties like plus and minus. Negative items do not really exist.



John Merryman wrote on Jan. 9, 2013 @ 19:30 GMT

The issue of digital vs analog was the subject of the prior contest. The relationship of being vs. doing has been debated since the ancient Greeks. It's not as though there is much new to the debate. You keep repeating the same simple assertion; That reality must be a sequential series of still frames, with no other explanation of how they occur. You are certainly welcome to that view of reality. We all have to have some framework in order to process information and that is how the mind actually functions, whatever the reality in which it exists. I do not agree reality is fundamentally discrete. I see both distinction and connection as two sides of a dichotomy. In order for the distinct parts to be anything beyond isolated points, they have to interact. You may not see it that way, but I'm only offering my views and you are free to differ.


Paul Reed replied on Jan. 10, 2013 @ 06:57 GMT

I know, and so was time, but neither was solved.

“You keep repeating the same simple assertion; That reality must be a sequential series of still frames, with no other explanation of how they occur”

I repeat a small number of facts (not assertions) about how, generically, the form of existence we are able to investigate must occur. How this actually manifests is for...


John Merryman replied on Jan. 10, 2013 @ 18:15 GMT

"whatever 'they' actually are must be innately capable of interaction, otherwise there would never be any."

Exactly. Being and doing.

