"What I have said at the start about information content is standard stuff..." It is standard stuff in Algorithmic Information Theory, but it has little to do with Shannon's Information Theory.
As Shannon states in the first sentence of his paper, "A Mathematical Theory of Communication", he is concerned with:
"various methods of modulation... which exchange bandwidth for signal-to-noise ratio... in a general theory of communication."
In other words, a trade-off (exchange) can be made that increases bandwidth in order to offset the effect (on information capacity) of a decreased signal-to-noise ratio, and vice versa. That is what he is interested in characterizing.
On page 43, where he gives his famous Capacity theorem (Theorem 17), he states: "This means that by sufficiently involved encoding systems we can transmit binary digits at the rate (C) bits per second, with arbitrarily small frequency of errors."
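Theorem 17 is the familiar Shannon–Hartley formula, C = B·log2(1 + S/N). A minimal sketch (the formula is standard; the numeric values below are illustrative, not from Shannon's text) showing the "exchange of bandwidth for signal-to-noise ratio" quoted above:

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity in bits per second:
    C = B * log2(1 + S/N), with S/N as a linear power ratio."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# The trade-off: a wide, noisy channel and a narrow, clean one
# can have exactly the same capacity.
wide_noisy   = channel_capacity(2.0, 3.0)    # 2 Hz at S/N = 3  -> 4 bits/s
narrow_clean = channel_capacity(1.0, 15.0)   # 1 Hz at S/N = 15 -> 4 bits/s
print(wide_noisy, narrow_clean)
```

Below the rate C, Theorem 17 guarantees the error rate can be driven arbitrarily low; above it, it cannot.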
Thus, unlike any other "flavor" of "Information theory", Shannon is primarily concerned with the circumstances under which one can correctly (without errors) recover the value of each individual bit encoded into a continuous, noisy waveform. It is meaningless to talk about the error-free value of an unrecoverable fraction of a bit, precisely because a fraction of a bit has no "correct" value.
The significance of this is that it pertains directly to "uncertainty" in physics. Namely: what is the maximum number of discrete (quantized) bits of information that can ever be recovered, without any errors, from any set of measurements of any continuous function/signal that has a limited duration, bandwidth and signal-to-noise ratio (SNR)? In other words, what is the maximum number of bits that you can ever be "certain" of, in measurements of any received "signal"? What happens if that maximum number happens to be exactly equal to one? What happens is: you get the Heisenberg Uncertainty Principle. That is the origin of the HUP. It has nothing to do with any "weird physics"; it is simply a property of any mathematical function with a severely limited duration, bandwidth and SNR, the very properties that Shannon said he was concerned with in his very first sentence. There are circumstances in which no possible "exchange of bandwidth for signal-to-noise ratio" will ever enable an observer to be certain of the value of more than one single bit of information, within an entire set of measurements of some continuous signals. That is what "entanglement" is all about.
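The "exactly one bit" condition can be put in numbers. A minimal sketch, assuming (as the argument above does) that the total number of error-free recoverable bits over a finite observation is the duration times the Shannon-Hartley capacity; the link from this limit to the HUP is this comment's thesis, not standard textbook material:

```python
import math

def total_recoverable_bits(duration_s, bandwidth_hz, snr_linear):
    """Total error-free bits recoverable from a finite observation,
    assuming the limit T * C = T * B * log2(1 + S/N)."""
    return duration_s * bandwidth_hz * math.log2(1.0 + snr_linear)

# A signal with a unit time-bandwidth product (T*B = 1) and an SNR of
# 1 (0 dB) yields exactly one recoverable bit -- the single-bit regime
# described above. No re-balancing of B against S/N helps, because the
# product T * B * log2(1 + S/N) is what it is.
print(total_recoverable_bits(1.0, 1.0, 1.0))
```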
Rob McEachern