Thanks, Juan, for a careful reading and interesting points. Lots to parse here.
>I do not see why an appeal to randomness would be considered less fundamental. Of course, many phenomena in our universe are not random and can be explained in terms of "this" or "that", but there is no objective reason to expect that everything in the universe has a cause.
Agreed -- see my response to Flavio above. Some things don't need an explanation. But the Second Law does need an explanation, for several reasons. 1) It supplies many other subsidiary explanations, so it's not devoid of content. 2) It can't be fundamental in its own right, because it only applies to macrostates, not microstates. 3) It can't be explained from our time-symmetric dynamical laws, or randomness.
> I do not find any reason to believe that the Universe is deterministic.
I probably agree with you here, but the real question is whether it's time-symmetric. Even our indeterministic theories predict time-symmetric micro-phenomena.
>Does energy conservation follow from Noether's theorem? Or is it just that the Lagrangian formalism is only valid for non-dissipative systems, and thus has the conservation law hidden in its symmetries? Moreover, all treatises on Noether's theorem I know conflate conservation of energy (diE/dt=0) with invariance of energy (dE/dt=0).
Well, when you do it correctly in classical field theory, you get constraints on the stress-energy tensor, which is really the right way to go. "E" is a bit of a fiction, certainly in GR.
>The equal a priori probabilities postulate is not associated with the second law.
I'm skeptical. Could you point me to a stat mech argument for the 2nd law that doesn't implicitly assume it at some stage?
>The postulate is needed in equilibrium statistical mechanics to get thermodynamic properties for systems at equilibrium, and is routinely used for the description of reversible processes. In fact, the postulate is not valid outside equilibrium and thus is not valid for studying the irreversible evolution of a system towards equilibrium.
Applying the word "valid" to the EAPPP seems like a category error. Of course, it's never *truly* valid: at any given time the actual system is in one microstate, with 100% certainty, and all others with 0%. The "a priori" means that you use it as a Bayesian prior, when you have no other information. And as you note, if you did this, you would predict the system to be in an equilibrium macrostate. (Which it might not be, certainly, but that would be your best bet given no other knowledge.) And if you *knew* it wasn't in equilibrium, or knew anything else at all, you'd update your prior. But usually that just means applying the EAPPP to all possible states compatible with your updated knowledge. That's how you get to the Second Law from the EAPPP. Knowledge always trumps randomness.
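A toy calculation (my own illustration, not anything from Juan's post) shows how the EAPPP plus Bayesian conditioning does the work: take N hypothetical two-state subsystems, put the uniform prior over the 2^N microstates, and the near-equilibrium macrostates dominate; conditioning on partial knowledge just means re-applying the uniform prior over the compatible states.

```python
from math import comb

# Hypothetical toy system (my illustration): N two-state subsystems,
# macrostate = number of "up" states k. The EAPPP puts weight 1/2^N on
# each microstate, so a macrostate's prior probability is C(N, k) / 2^N.
N = 100
total = 2 ** N

def macro_prob(k):
    return comb(N, k) / total

# With no other knowledge, near-equilibrium macrostates (k near N/2)
# dominate the prior:
p_near_eq = sum(macro_prob(k) for k in range(45, 56))
print(f"P(45 <= k <= 55) = {p_near_eq:.3f}")

# Updating on knowledge just means re-applying the uniform prior over the
# compatible microstates. If we *know* the system is far from equilibrium
# (say k <= 20), the highest-entropy compatible macrostate still dominates:
compatible = sum(comb(N, k) for k in range(0, 21))
p_k20_given_low = comb(N, 20) / compatible
print(f"P(k = 20 | k <= 20) = {p_k20_given_low:.3f}")
```

The second calculation is the Second-Law logic in miniature: conditioned on a low-entropy macrostate, the system is overwhelmingly likely to be found at (and to move toward) the highest-entropy state compatible with that knowledge.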
>The "past hypothesis" not only do not explain the second law, but shows a basic missunderstanding about the second law. The second law is not about features of the initial state. Indeed the time-asymmetry encoded in the second Law cannot come about from time-symmetric dynamical laws. We need time-asymmetric dynamical laws.
That was certainly what Eddington thought -- but that challenge has been open for a century with no answer in sight. (If there are time-asymmetric dynamical laws, what are they?) By now, the question has been settled by computer simulations that show entropy increasing (from low initial boundary constraints!) using explicitly time-symmetric dynamics. In computer simulations, there is no possibility of hidden dynamics we don't know about.
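For concreteness, here is a minimal sketch of the kind of simulation I mean (my own toy example, a 1-D gas of free particles bouncing elastically off walls): the microdynamics is exactly time-symmetric, yet the coarse-grained entropy climbs from a low-entropy initial constraint, and flipping all velocities retraces the whole history.

```python
import numpy as np

rng = np.random.default_rng(0)
N, L, BINS, T = 2000, 1.0, 20, 5.0

# Low-entropy initial boundary condition: every particle starts in the
# leftmost 5% of a 1-D box; velocities drawn from a time-symmetric
# (zero-mean Gaussian) distribution.
x0 = rng.uniform(0.0, 0.05, N)
v0 = rng.normal(0.0, 1.0, N)

def fold(y):
    """Exact free flight with elastic walls: reflect back into [0, L]."""
    y = np.mod(y, 2 * L)
    return np.where(y > L, 2 * L - y, y)

def coarse_entropy(x):
    """Shannon entropy of the coarse-grained (binned) particle density."""
    counts, _ = np.histogram(x, bins=BINS, range=(0, L))
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

u = x0 + v0 * T              # unfolded trajectory endpoint
xT = fold(u)                 # actual positions at time T
print(coarse_entropy(x0))    # ~0: low-entropy initial macrostate
print(coarse_entropy(xT))    # near log(BINS) ~ 3.0: close to equilibrium

# The microdynamics is exactly time-symmetric: flip the (post-reflection)
# velocities at time T, evolve for T again, and the system retraces its
# history back to the low-entropy state.
vT = np.where(np.mod(u, 2 * L) > L, -v0, v0)   # velocity after reflections
x_back = fold(xT - vT * T)
print(np.allclose(x_back, x0))                 # True: history retraced
```

The entropy increase here comes entirely from the low-entropy initial constraint plus coarse graining; there is no time-asymmetric rule anywhere in the dynamics.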
>"The Second Law tells us that entropy always increases". Not true. That is only a superficial and misguided formulation of the law. The second law says that the production of entropy is non-zero. The secondf law is not dS>0. The second law is diS >=0. And this is the classic formulation, where thermal fluctuations are ignored.
Agreed! (But in our universe, at any reasonable coarse graining, it does increase.) Also agreed about the fluctuation issue; I talk about this in the 'anthropic' section.
> All the subsequent attempts to explain that superficial and misguided formulation of the law are invalid as well. Assigning a low entropy to the initial instant of the Universe does not explain anything, and the incompatibility between the second law of thermodynamics and mechanics (time-reversible) remains.
See the computer example above. Some of the best work on this topic has been done by Larry Schulman. He puts a low entropy *final* condition on systems and shows that entropy *decreases* in computer simulations. He also used initial and final boundaries and showed that entropy went up and then down again. There's no incompatibility whatsoever; all the asymmetries come from the boundaries.
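A sketch in the spirit of Schulman's result (my own toy dynamics, not his actual models): impose the low-entropy slab as a *final* condition, and the same reversible free flight, read forward in time, shows entropy *decreasing* toward that boundary.

```python
import numpy as np

rng = np.random.default_rng(1)
N, L, BINS, T = 2000, 1.0, 20, 5.0

def fold(y):
    """Exact free flight with elastic walls: reflect back into [0, L]."""
    y = np.mod(y, 2 * L)
    return np.where(y > L, 2 * L - y, y)

def coarse_entropy(x):
    counts, _ = np.histogram(x, bins=BINS, range=(0, L))
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

# Low-entropy *final* boundary: at t = T every particle sits in one narrow
# slab. The compatible history is just the reversible free flight run
# backward from that condition: x(t) = fold(x_final - v_final * (T - t)).
x_final = rng.uniform(0.45, 0.50, N)
v_final = rng.normal(0.0, 1.0, N)

S_start = coarse_entropy(fold(x_final - v_final * T))   # entropy at t = 0
S_end = coarse_entropy(x_final)                          # entropy at t = T
print(S_start, S_end)   # high at t = 0, ~0 at the final boundary
```

Exactly the same dynamics that "explained" entropy increase under an initial constraint now yields entropy decrease under a final one; the asymmetry lives in the boundary, not the laws.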
> Effectively, we solve the Liouville equation (or its quantum von Neumann analog), set an initial state of very low entropy, and the evolution predicted by the equations continues to contradict the second law and observations.
I don't understand what your point is here.
>"Boundary explanations" do not explain anything.
Obviously, I disagree.
>Noticing that the system evolved from A to B because it was first at A and later was found at B is vacuous of content.
True... But those boundaries can still explain what happens in between. And if you don't impose anything at B, the boundary at A can also be used to explain asymmetries, if A is "special" or essentially different from how the system ends up at B. Furthermore, consider the case I'm most interested in: *partial* boundary constraints (say, half the Cauchy parameters), imposed on both ends. Once the system is solved, these partial boundaries then explain the unconstrained parameters at the beginning and the end, and all the parameters in the middle, too. So boundaries can absolutely be used to explain things, when combined with some way to solve the system.
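A hypothetical toy example of what I mean (my choice of system, a single harmonic oscillator): fix only the positions, i.e. half the Cauchy data, at both ends. Solving the dynamics then determines, and in that sense explains, the unconstrained velocities at both ends and the whole trajectory in between.

```python
import numpy as np

# Toy example (my own, not from the discussion): a harmonic oscillator
# x'' = -w^2 x has two free parameters. Instead of giving both x(0) and
# v(0) (full Cauchy data at one end), give half the data at EACH end:
# x(0) = x0 and x(T) = xT. (Requires sin(w*T) != 0, as here.)
w, T = 1.0, 2.0
x0, xT = 1.0, 0.5

# General solution x(t) = A cos(wt) + B sin(wt); impose the two
# partial boundary conditions to fix A and B.
A = x0
B = (xT - x0 * np.cos(w * T)) / np.sin(w * T)

def x(t):
    return A * np.cos(w * t) + B * np.sin(w * t)

def v(t):
    return -A * w * np.sin(w * t) + B * w * np.cos(w * t)

print(f"v(0) = {v(0.0):.4f}, v(T) = {v(T):.4f}")   # fixed by the boundaries
print(f"x(0) = {x(0.0):.4f}, x(T) = {x(T):.4f}")   # recovers the constraints
```

Neither endpoint alone determines the history, but the two partial constraints together do, once the system is solved as a whole.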
> Moreover, this kind of boundary 'explanation' often hides another serious misunderstanding of the second law; if all that was needed to explain that the system evolved irreversibly as A --> B was that it was initially at A, then the second law would not be needed. The first law would be enough.
I think you're perhaps mixing up macro- and micro- concepts here. There are no irreversible events at a micro scale. (Or so most physicists believe; maybe we're all wrong.)
>Initial states and boundaries are already used in the laws of mechanics and electrodynamics, but those laws cannot describe irreversibility. And that is the reason why thermodynamics and the second law were born as a separate field of physics.
Yes, it was born separate, but Boltzmann (and others) figured out how to reunite them. The key difference is that when you zoom out to the macrostate, tossing away some of the data as unknown, then apparently irreversible (macro-) processes enter the story (assuming you have a special low-entropy boundary condition, so that the Second Law is in play). But if you know everything, even that apparent irreversibility goes away.
>The arrow of time, the irreversibility of the second law, has a dynamical origin: resonances. There is a broad literature on the topic.
There is indeed a very broad literature, and the vast bulk of physicists are perfectly happy with the boundary-based account, even if they're not willing to treat boundaries as fundamental in their own right. If special time-asymmetric resonances were needed, then why would entropy increase in computer simulations that lacked them? Don't you think you're already using Second-Law-style logic when you try to infer a time-asymmetry from a resonance? (Classical chaos is time-symmetric, too, but you can get an arrow from it if you impose a low entropy initial boundary.)
In general, I'd note that we have lots of time-asymmetric intuitions, and they're all too easy to slip into our analysis without properly seeing where they come in (as happened to Boltzmann himself). The discovery of fundamental time symmetry has been a big surprise to those intuitions, but a surprise that we should take very seriously.
All the Best,
Ken