Robert H McEachern wrote on Mar. 5, 2019 @ 15:04 GMT
"Both were interested in how information theory could help illuminate quantum mechanics... the information you gain when you learn something about a system is mathematically defined to be the reduction in your uncertainty about it."
No. This is a fundamental misunderstanding about the nature of information, one that has utterly confused physicists for generations.
"But the instant you make that measurement, the wavefunction collapses into one possibility or another"
No. There is no physical wavefunction - it is merely a computational tool.
Quantum theory only describes the probability of detecting a particle. It does not describe actual measurements of anything, at all. Think about it. In the famous double-slit experiment, the only things ever observed/detected are spots on a screen, or detection counts of particles. The position of the particle being detected is NEVER actually measured. Rather, which detector (from a set of detectors) detected the particle is observed first; then the position of that detector, not of the detected particle, is measured; and finally the particle's position is INFERRED, not measured, to be the same as the detector's.
Nothing else is even a possibility, since the Heisenberg uncertainty principle is equivalent to the statement that only a
single bit of information is being manifested, in a quantum detection process: exactly enough information to answer one yes/no question "Was something just detected?", and nothing more.
"Making sense of this measurement problem is the 'most fundamental problem in all of quantum mechanics'..." Exactly. To make sense of it, you have to first recognize the fact that no "measurement" is actually being performed on the detected entity - only a single-bit detection-decision (AKA wavefunction collapse) is ever being performed.
Rob McEachern
Anonymous wrote on Mar. 6, 2019 @ 19:59 GMT
"There is no physical wavefunction - it is merely a computational tool."
True enough to Quantum Mechanics. But take a well-engineered AM/FM radio with a mechanical air-gap variable capacitor on the front end tuning the antenna gain, and move it around to different places in a room. In some placements, proximity to metallic components of the building construction will act as antennae and the selectivity of the radio reception will be blocked significantly, while only a few feet away the station's signal will be unobstructed and reception will have high sensitivity. We cannot 'see' what a photon, or waveform of EMR, physically 'looks' like, but its physical observation by conductive elements behaves in a manner which indicates that some variable does exist which reacts across the classical spread wavefront of a modulated transmission - while the selectivity of bandwidth is a linear LOS (line-of-sight) affair.
QM gets around this fundamental conundrum, imposed by the dictum of all things being (tiny) hard-particle interactions, by saying 'never mind asking; photons (particles) exist everywhere all at once on the expanding surface of the spherical wavefront until observed where we want to detect them'. And the presumption that Maxwell's Demon can be put to work as a thermodynamic operator without any raise in pay ignores one bit of information about entropy: any form of energy is still firstly the same sort of energy in the raw. A wave form will not go from light velocity to rest instantaneously; it will slow exponentially to a density exhibiting particulate characteristics. That DOES NOT violate entropy. It's still the same quantity of existential energy.
Robert H McEachern replied on Mar. 7, 2019 @ 00:42 GMT
You don't need to do anything fancy to lose FM reception. As a result of multipath interference, I can frequently be sitting at a traffic light in my car and be getting perfect reception, but if I creep forward just a few feet, reception will be totally lost. My HDTV's reception (via a small indoor antenna) goes from perfect to terrible, on windy days, due to the fact that the demodulator cannot track the rapid variation in multipath caused by the swaying tree branches behind my house.
A "well engineered" FM demodulator is far from a linear LOS; a phase locked loop is used to, in effect, continuously retune a narrow bandpass filter, with a bandwidth much less than the overall signal bandwidth, to track the narrow "instantaneous bandwidth" of the FM carrier and thereby greatly reduce the effective receiver noise level - but only as long as the signal remains "above threshold".
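The tracking-filter idea is easy to demonstrate numerically. Below is a toy second-order phase-locked loop at complex baseband (all signal parameters and loop gains are illustrative, not a model of any real receiver): the loop's frequency state continuously re-centers on the carrier's instantaneous frequency, which is the "retuned narrow filter" effect described above.

```python
import numpy as np

fs = 100_000.0                          # sample rate, Hz (illustrative)
fm, dev, foff = 100.0, 200.0, 1_000.0   # FM tone, peak deviation, carrier offset
t = np.arange(0, 0.05, 1.0 / fs)

# complex-baseband FM signal: instantaneous freq = foff + dev*cos(2*pi*fm*t)
x = np.exp(1j * (2 * np.pi * foff * t + (dev / fm) * np.sin(2 * np.pi * fm * t)))

# second-order PLL: phase detector -> proportional + integral correction -> NCO
alpha, beta = 0.05, 0.002      # loop gains (these set the tracking bandwidth)
phi, freq = 0.0, 0.0           # NCO phase (rad) and frequency (rad/sample) state
freq_track = np.empty_like(t)
for n in range(len(t)):
    err = np.angle(x[n] * np.exp(-1j * phi))   # phase error vs. the NCO
    freq += beta * err                         # integral branch: re-tune the NCO
    phi += freq + alpha * err                  # proportional branch + phase advance
    freq_track[n] = freq * fs / (2 * np.pi)    # tracked frequency, in Hz
```

Once the loop acquires, `freq_track` hovers around the 1 kHz offset, swinging with the ±200 Hz modulation - in effect a narrow filter whose center follows the carrier, which only works while the signal stays above threshold.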
"which indicates that some variable does exist which reacts across the classical spread wavefront of a modulated transmission" But that variable has nothing to do with any individual photon, anymore than the properties of water waves are innate properties of water molecules.
QM does not say that "photons (particles) exist everywhere all at once"; as Einstein et al. suggested, only the absurd interpretations of QM say that. But such interpretations are no longer required; it has been demonstrated that classical objects, constructed to manifest only a single bit of information, will reproduce all the seemingly weird behaviors that have so befuddled the physics world for nearly a century.
In that context, it is important to realize that "information" as Shannon defined it has little to do with "entropy" or even physics in general (most unfortunate that von Neumann persuaded him to name it "entropy"); it is a purely mathematical concept concerning the ability to perfectly reconstruct a continuous function from discrete samples. It is most unfortunate that the physics world has confused the two concepts. Shannon's concept is key to understanding the "measurement problem".
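That sense of "information" - perfect reconstruction of a bandlimited continuous function from discrete samples - is the Whittaker-Shannon interpolation formula. A minimal numerical sketch (all values illustrative):

```python
import numpy as np

fs = 10.0                           # sample rate, Hz
f0 = 1.5                            # a tone well below the Nyquist rate fs/2
n = np.arange(-200, 201)            # sample indices; samples taken at t = n/fs
samples = np.cos(2 * np.pi * f0 * n / fs)

def reconstruct(t):
    """Whittaker-Shannon interpolation: x(t) = sum_n x[n] * sinc(fs*t - n)."""
    return np.sum(samples * np.sinc(fs * t - n))

# the continuous function is recovered between the sample points,
# exactly up to truncation of the (in principle infinite) sum
for t in (0.123, 0.537, 1.31):
    assert abs(reconstruct(t) - np.cos(2 * np.pi * f0 * t)) < 0.05
```

The discrete samples carry everything there is to know about the continuous function, provided it is bandlimited - no physics, and no thermodynamic entropy, is involved in that statement.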
It is easy to demonstrate that QM boils down to little more than the mathematical description of an energy-detecting filterbank (a Fourier transform's power spectrum). When you send particles (quanta) with equal energy into the various channels of the detector, the total energy received in a channel, divided by the energy per quantum in that channel, enables one to infer the number of received quanta in each channel, thereby rendering the entire process simply equivalent to a histogram; that is the origin of the Born rule. No mysterious wavefunctions, wafting through the cosmos, are required to understand what is actually going on.
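The histogram picture is easy to simulate: deposit quanta of energy h·f into the channels, then recover the counts by dividing each channel's total energy by its energy-per-quantum (the channel frequencies and arrival probabilities below are made up purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
h = 6.626e-34                                  # Planck's constant, J*s

freqs = np.array([1.0e14, 2.0e14, 4.0e14])     # detector channel frequencies, Hz
probs = np.array([0.5, 0.3, 0.2])              # arrival probability per channel

# send N quanta; each deposits energy h*f_k into the channel that caught it
N = 100_000
arrivals = rng.choice(len(freqs), size=N, p=probs)
energy = np.bincount(arrivals, weights=h * freqs[arrivals], minlength=len(freqs))

# total energy per channel / energy per quantum = inferred count: a histogram
counts = energy / (h * freqs)
rel_freq = counts / counts.sum()               # converges to the probabilities
```

The relative frequencies recovered from the accumulated energies converge on the arrival probabilities - the Born-rule statistics fall out of nothing more than energy bookkeeping.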
Rob McEachern
Anonymous wrote on Mar. 7, 2019 @ 02:55 GMT
As usual, Robert is true enough to QM but also to interpretations thereof.
And yes, selectivity is in and of the FM demodulator; the transmission is a linear LOS groundwave. My '92 GE Superadio 2 has a switched narrow bandpass filter, allowing tuning to high gain with an ear for distortion when a desired station gets ghosted. And I'm rather fond of small Ohio Colleges and Universities. Good Hunting, Guys.
Robert H McEachern replied on Mar. 7, 2019 @ 18:25 GMT
Off topic and beyond the scope of what is appropriate here; but specialized receivers with coupled phase-locked loops have been used to track co-channel FM signals, such that each loop tracks only one signal and then uses it to cancel the interference in the other signal, thereby effectively eliminating the interference from both signals. A lot more complex than your single narrow filter, but with much higher performance.
Rob McEachern
Anonymous wrote on Mar. 7, 2019 @ 20:39 GMT
Not so off topic, Rob, this is jrc by the way... I just had to get a new laptop that is so overloaded by the OS that I don't respond if I have to open an account (for anything).
Really, we must rely on technical higher-order effects to deduce a local realistic picture of the micro realm, so specialized receivers such as you briefly describe do provide some means of figuring out what is realistically happening. And maybe, just maybe, some insight towards a conjecture of what a 'photon' or particle would physically 'look like'. And Classicism has dropped the ball on that pursuit as badly as have the ad hoc interpretations of QM. I am in complete agreement with you as to QM being a Math, not a physical theory, and think that has to be confronted directly in any effort to rationalize QM. The symmetry is baked into the math.
Towards that, RF signal transmission has enough infrastructure that the spherical broadcast wavefront in contrast to the localized multipath interference of reception suggests that a multitude of linearly directional 3D waveforms are more the reality of the Quantum Jump than a single 'photon'. And that the 'Quantum' photo-electric effect is time dependent on a rapidity work function. But that would mean that each Planck Quanta would only periodically precipitate at the pinch points of a volumetric wavetrain. The statistics of QM do work effectively, and the quest (however Demonic) to discover Why it does, could include a hypothesis that the wave form of EMR interacts only marginally with the electrostatic domain range of the macro world of subluminal particulate aggregates, due to velocity dependent super low densities. And only at near particulate densities periodically, does physical interaction respond to the variables inherent to matter. This creates a realistic picture of a very crowded landscape even in the most rarified regions of intergalactic free space. But then, empiricism calculates we have a lot of uncatalogued energy to yet account for.
Best as always. jrc
Anonymous wrote on Mar. 8, 2019 @ 14:20 GMT
"No physical process can have as its sole result the erasure of information."
I suspect that conclusion would be dependent on arbitrary constraints. Information is firstly our own invention; like mathematics, it is essentially experimental. The number line of equal increments is an arbitrary constraint, in contrast to the different lengths of my fingers and the asymmetry of my two hands. Entirely experimental.
"...information...is a purely mathematical concept concerning the ability to perfectly reconstruct a continuous function from discrete samples." - R. McEachern
We are assigning parameters before we can say that 'one bit of information' has been detected, and the best we can do is to limit to that one bit. And if we are working on the yet-to-be-rationalized assumption of Planck's Constant being the finite least 'bit', we might be able to perfectly reconstruct it in a continuous quadratic algorithm as the root mean square. But it is quite possible to start with Planck's Constant and treat light velocity as the root exponential mean in a continuous function of energy distribution in a spherical volume, and arrive at finite results in which the tiny value of Planck trivializes out of computation, and a true continuous function is then obtained as a physical process. So would that be an erasure of information, or is Planck's Constant an arbitrary starting point that physically represents only an averaged least observable value?
Lorraine Ford wrote on Mar. 11, 2019 @ 23:24 GMT
Re “In the quantum realm, observers typically know very little about where a particle is, or how fast it is moving, or how it is spinning. They have to make a measurement to reduce that uncertainty”:
This article seems to assume that the only problem is the observer: i.e. that there are “deep limits on what observers can know” about “where a particle is”. The assumption is that the particle always has a definite position, and that position outcomes are 100% determined by some law of nature.
But quantum mechanics is about the fact that, in the microrealm, particles don’t always have a definite position AND particles don’t exist everywhere (i.e. all positions) all at once AND particle position outcomes are inherently unpredictable to an observer (i.e. there exists no law of nature rule that always determines outcome positions for individual particles).
The “measurement problem … the ‘most fundamental problem in all of quantum mechanics’” is only a “problem” because, like climate change deniers, physicists refuse to face quantum reality.
The real issue is how to INTERPRET the quantum randomness and indeterminacy of the universe, as seen from the point of view of an observer of the microrealm.
Georgina Woodward replied on Mar. 12, 2019 @ 23:00 GMT
Without any context or reference, attributes such as orientation, direction of motion, even speed of motion are not applicable. Only when 'in relation to this (or that)' is applied can the relation of both beable and reference/apparatus/observer give a determination that in this (or that) context beable x has attribute (state) A. An attribute, not an intrinsic property. It doesn't mean the unmeasured is without orientation, direction and speed of motion relative to objects in its environment, but the context has not been determined by the experimenter. The 'in relation to this' is not known. Orientation of the beables may be the source of randomness. As for location at one time (not a probability distribution or over-time characterization), there has to be location relative to the environment even if unknown. Detection provides the 'in relation to this' context.
Lorraine Ford replied on Mar. 13, 2019 @ 14:52 GMT
In physics, there are NO relationships between THINGS. This is an important distinction. (Law of nature) relationships only ever exist between seemingly natural categories of information like particle mass, particle relative position, “how fast it is moving” [1] and “how it is spinning” [1]. So, this purportedly existing relationship does not actually exist: “the relation of both beable and reference/apparatus/observer”. Similarly, there is no such thing as “motion relative to objects” or “location relative to the environment”.
And contrary to what you say, “attribute[s] such as orientation, direction of motion, even speed of motion” are ALWAYS “applicable”. These seemingly natural categories of information, and their lawful relationships to other such categories, continue to apply in the universe no matter what the “context or reference” of an observer. Maybe you are trying to say something about (what we represent as) the NUMBER values that apply to the natural categories?
But seemingly in the quantum microrealm, natural categories (e.g. relative particle position) can sometimes have NO numbers applying to them: i.e. there is no information available to an observer, there is nothing for an observer to detect.
1. Thermo-Demonics by M. Mitchell Waldrop, https://fqxi.org/community/articles/display/234
Georgina Woodward replied on Mar. 13, 2019 @ 21:53 GMT
Lorraine, you wrote "In physics, there are NO relationships between THINGS"; perhaps that is where the problem lies. I agree that properties are considered in physics and not, in general, the beable thing that has those properties, or attributes. Nor is the beable environment considered. Not being thought about isn't the same as not existing. It makes sense to me that the properties or attributes, even though 'distilled' at measurement, pertain to something and are not orphan information. 'Heads up' on a table is not a coin but a state that pertains to the beable coin's orientation in relation to the beable table. The motion attributed to an object is relative to the motion of the observer, and the orientation relative to the reference used to describe the orientation.
Georgina Woodward replied on Mar. 13, 2019 @ 22:47 GMT
Correction:The motion attributed to an object is relative to the motion of the observer, or apparatus or other reference object.
If the velocity of X is given as 10 m/s, one should ask in what context that is true: who says so, and what is their relation to the measured, or what is the motion being considered in relation to? Am I talking about number values? - yes, when the state can be described with a number, otherwise not. Saying something like 'orientation is not applicable in the absence of context' means orientation cannot be given/told (if it is given/told, there has been a hidden reference used in the determination).
Georgina Woodward replied on Mar. 13, 2019 @ 23:30 GMT
Correction: The motion attributed to an object is relative to the motion of the observer, or apparatus or other reference object or phenomenon.
For example; the orientation or motion state could be given in relation to the gradient of a gravitational field or a magnetic field, rather than the object sources of those phenomena.
Lorraine Ford replied on Mar. 14, 2019 @ 01:30 GMT
Georgina,
I would question the use of the words “property” and “properties” [1] by physicists and others like yourself. Is “property” the correct term for classifications like mass and relative position?
Clearly mass and position, for example, have no independent existence. They are not something that is “possessed” by the universe or by objects because they only exist as relationships: they are more correctly seen as relationships. Therefore, classifications like mass and position are more correctly seen as categories (i.e. as relationships).
1. Property: “(mass noun) A thing or things belonging to someone; possessions collectively…An attribute, quality, or characteristic of something”, https://en.oxforddictionaries.com/definition/property
Georgina Woodward replied on Mar. 14, 2019 @ 05:27 GMT
Lorraine, property is probably not the best term for measurables. States of being such as atomic number and chemical structure are properties belonging wholly to the beable. But a measurable is in part due to the beable measured and also in part due to the context of measurement, the method used. The state or value formed by the relation is attributed to the beable. A coin caught and revealed palm up might be heads, then flipped onto the back of the opposite hand and revealed as tails, then slid carefully from hand onto table top - still tails. Once the measurement relation is applied, the state or value that will be 'discovered' is already 'decided', making the model comprising different possible outcomes obsolete; but there needs to be a mental switch also, from thinking about the measurement to knowing it instead. What is being considered has altered; it isn't the same thing.
Georgina Woodward replied on Mar. 14, 2019 @ 09:13 GMT
I wrote 'The state or value formed by the relation is attributed to the beable.' The value or state is usually attributed to a named object. (That name, in such a circumstance, even if not acknowledged, pertains to a beable.)
Lorraine Ford replied on Mar. 14, 2019 @ 23:36 GMT
Georgina,
I think that it is necessary to attempt to conceptualise the difference between categories and properties. But as it is not directly about my original criticism of the article, I have put it in a separate spot (see below).
Lorraine Ford wrote on Mar. 14, 2019 @ 23:24 GMT
If mass is a category, then is number a property carried by a particle?
A law of nature relationship (e.g. mass can be represented as a lawful mathematical relationship) is seemingly a natural category. These lawful categories don’t necessarily have to have any numbers equated to them. A category like mass or position can seemingly just “exist” as a relationship without any numbers being applied to the categories.
But if numbers are equated to these categories then a new world of possibilities opens up. Equating a number to a category requires (what we would represent as) the introduction of a new mathematical relationship to the universe-system. These (rational? irrational?) numbers, applied to categories like mass or position, are probably best described as properties of particles; they are the specific information carried by particles. But these numbers are contextualised by being equated to categories: i.e. the number information means nothing without the category (i.e. relationship) information.
So, what is a number? A number is not a category, because a number can be constructed by dividing a category by itself (we would represent this as a mathematical relationship between categories), leaving a thing without a category.
Georgina Woodward replied on Mar. 15, 2019 @ 02:03 GMT
Mass is a category of measurable. I'd say number is not a property carried by a particle; rather, a measured or calculated value and units can be attributed to a particle, allowing comparison with others. If different units are used, the number changes. What doesn't change when the units are changed is the amount of existence as something or somethings, un-quantified, and the where - location un-quantified, in comparison to its local environment. Many different measurements of location in comparison to other things in the environment could be made, giving many different number-and-units outcomes. The beable stuff of the particle is a property. So atomic number and number of electrons or of other sub-atomic constituents are properties - wholly owned by the particle.
Georgina Woodward replied on Mar. 15, 2019 @ 23:40 GMT
I said mass was a category of measurable but that is only the half of it. The numerical value (and units) obtained for mass via measurement of weight or comparison of weight is the value of the mass measurable. A relation of the measured and measuring apparatus is needed to obtain it. The value output is knowable information.
Intrinsic mass un-quantified is a beable actuality, a property belonging solely to the object. Type and number of each type of constituent are also beable actualities.
Measurable value information and beable actuality are distinct categories.
Georgina Woodward replied on Mar. 16, 2019 @ 02:50 GMT
To be accurate, I ought to say the value output can be a source of knowable information. To be knowable it must be in a form accessible to the senses: most usually visual or auditory potential stimuli, emitted or reflected from the 'read out', or emitted 'sound waves'. The beable read-out of an apparatus un-illuminated is not (generally speaking) knowable, though it might be felt if the numbers are raised or indented.
Lorraine Ford wrote on Mar. 16, 2019 @ 21:37 GMT
Re: “The framework, which is being developed by physicist Benjamin Schumacher… and mathematician Michael Westmoreland… put information at the center—along with a hypothetical, microscopic observer known as Maxwell’s demon. ...Schumacher and Westmoreland …were interested in how information theory could help illuminate quantum mechanics… I started wondering if information is more fundamental than probabilities”:
In the real world, fundamental-level information cannot exist as binary digits (true/false, 1/0, yes/no or on/off), because a string of these symbols can have no inherent context or meaning, and no inherent relationship to any other such string of binary digits.
In the real universe, fundamental-level information seems to exist as categories: mass, position and velocity (speed and direction) seem to be natural categories, where every category has context and meaning because it is built out of relationships between other, seemingly pre-existing, categories. Categories seem to exist as part of a network of relationships, relationships that we represent with mathematical symbols.
Information also seems to exist as (what we would represent as) numbers. Seemingly categories must have pre-dated numbers in the universe because: 1) categories/relationships can seemingly exist without numbers being applied to them; 2) the number one can be constructed by dividing a category by itself, and (rational) numbers can be constructed using the number one; and 3) standalone numbers have no inherent context unless they are applied to a category.
Information can also be built out of (what we would represent as) algorithms. So, Maxwell’s demon 1) does an algorithmic analysis of the particle’s velocity, and 2) acts on the results of that analysis. If the number representing the particle speed is greater than a cut-off number, and if the particle’s direction is towards the chamber door, then the demon opens the door to let the particle through. To put it another way: If condition 1 is true, and condition 2 is true, then action is true. The true/false binary digit information only exists in the context of the existing category/relationship information, the existing number information, and the algorithmic question asked about that information.
So, the information that the hypothetical Maxwell’s demon acquires is highly sophisticated: it is not fundamental-level information.
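The two-condition decision described above can be written out directly; in code form, the dependence of the single true/false output on the underlying category and number information is explicit (the cutoff value and the Particle fields are of course invented purely for illustration):

```python
from dataclasses import dataclass

@dataclass
class Particle:
    speed: float       # the number applied to the "speed" category
    toward_door: bool  # the direction category, reduced to "toward the door?"

SPEED_CUTOFF = 1.0     # the demon's reference number (arbitrary units)

def demon_opens_door(p: Particle) -> bool:
    """If condition 1 is true, and condition 2 is true, then action is true."""
    condition_1 = p.speed > SPEED_CUTOFF   # higher-level info about a number
    condition_2 = p.toward_door            # higher-level info about a direction
    return condition_1 and condition_2

# a fast particle headed at the door: the demon opens it
assert demon_opens_door(Particle(speed=1.7, toward_door=True))
# a fast particle moving away, or a slow one approaching: door stays shut
assert not demon_opens_door(Particle(speed=1.7, toward_door=False))
assert not demon_opens_door(Particle(speed=0.4, toward_door=True))
```

The final boolean only means anything given the categories, the numbers applied to them, and the algorithmic question asked about those numbers - which is the point being made above.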
Lorraine Ford replied on Mar. 17, 2019 @ 00:44 GMT
(continued from the above post)
Re “there is a trap door in the partition operated by a tiny being [Maxwell’s demon] …If the being saw a high-energy molecule approaching the partition from, say, the left half of the box, it could briefly open the trap door to let that molecule pass through to the right side. And likewise, it could let low energy molecules pass through from right to left”:
The door open/closed outcome was not random.
And the door open/closed outcome was not necessary i.e. it was not necessitated/ caused by any law of nature relationship (a law of nature relationship is represented as an equation).
The not-random, not-necessary outcome was due to (what we would represent as) an algorithm.
But there is no way you can derive an algorithm from an equation i.e. there is no way that nature can evolve (what we would represent as) an algorithm from (what we would represent as) an equation.
So, in any universe that included Maxwell’s demon, (what we would represent as) algorithms must be an inherent part of the nature of that universe.
Lorraine Ford replied on Mar. 17, 2019 @ 22:05 GMT
(continued from the above post)
Re “Or to put it another way, the being [Maxwell’s demon] could cause heat to spontaneously flow from cold to hot—a violation of the Second Law of Thermodynamics…Maxwell left this paradox to later generations of physicists as a kind of homework assignment: Where was the flaw in this thought experiment? What would keep Maxwell’s ’demon’, as other physicists took to calling it, from violating the second law? Did the demon’s ability to observe, think, and act change the fundamental physics in some way? Or was its ’intelligence’ still governed by natural law?”:
As explained in the above 2 posts: 1) Maxwell’s demon possesses sophisticated contextual high-level algorithmic information; and 2) Maxwell’s demon possesses the ability to create non-random, non-deterministic (i.e. not determined by any laws of nature) outcomes.
The possession of contextual high-level information, plus the ability to create required outcomes is what can “violat[e]...the Second Law of Thermodynamics”.
“[T]he demon’s ability to observe, think, and act” DOESN’T “change the fundamental physics” IF contextual algorithmic information, and the ability to act independent of the laws of nature, is already part of the physics of the universe.
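The "violation" itself is simple to exhibit numerically. In the toy sketch below (the speed distribution, cutoff, and chamber bookkeeping are all invented for illustration), one sorting pass by the demon's rule leaves the right chamber measurably hotter than the left, with no work done on the gas anywhere in the model:

```python
import numpy as np

rng = np.random.default_rng(1)

# a 2-D ideal gas in equilibrium: Maxwell speeds, particles split between sides
N = 10_000
speeds = rng.rayleigh(scale=1.0, size=N)   # 2-D Maxwell speed distribution
side = rng.integers(0, 2, size=N)          # 0 = left chamber, 1 = right chamber

cutoff = np.median(speeds)                 # the demon's reference number

# the demon's rule: fast particles pass left->right, slow ones right->left
side[(side == 0) & (speeds > cutoff)] = 1
side[(side == 1) & (speeds <= cutoff)] = 0

# mean kinetic energy per chamber (taking m = 2): right is now hotter than left
ke_left = np.mean(speeds[side == 0] ** 2)
ke_right = np.mean(speeds[side == 1] ** 2)
```

Heat has "flowed" from cold to hot purely because of the demon's contextual information and its conditional action; the thermodynamic cost of acquiring and erasing that information is exactly what the standard resolutions of the paradox charge the demon for.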
Anonymous wrote on Mar. 17, 2019 @ 15:43 GMT
"(what we would represent as) algorithms must be an inherent part of the nature of the universe." L. Ford
This is a clear (and fine) distinction between math and its assignment for analysis. There may be something we would call a 'Math' that would adequately describe a physical phenomenon of space, time and energy in unity; such that a quantity of energy would naturally assume a preferred volume and shape in a universe with an abundance of energy, and thus determine the extremely limited number of sub-atomic particle species currently identified by the Standard Model.
From a perspective of Topology, wherein an object is defined in relation to its own constituent reference points independent of an external reference, the Ford/Woodward dialogue seems to be wrestling with a challenge of establishing a convention of terminology in strictly limiting definition to an If and Only If constraint.
Lorraine Ford wrote on Mar. 19, 2019 @ 23:01 GMT
Maxwell’s demon has free will. The demon acquires the following information, derived from observation of a particle, as a basis for action:
1) the speed and direction (velocity) categories [1]; and 2) the numbers that apply to these categories.
Depending on this information, the demon opens the trap door. The demon acts, causing outcomes that are:
1) not random; and 2) not determined by laws of nature.
The above process can be represented as an algorithm [2]. I.e. the structure of free will is represented by algorithms:
1) the algorithms are not necessitated by laws of nature or numbers that apply to e.g. speed or direction; 2) the algorithms represent the acquisition of information
about information [3]; 3) the algorithms represent causal factors that are independent of laws of nature; and 4) the algorithms could be one-off, or “physically locked in”.
..........
1. Categories are essentially transposed law of nature relationships.
2. See “Lorraine Ford wrote on Mar. 16, 2019 @ 21:37 GMT”
3. The “lower level” information is the numbers pertaining to the speed and direction of the particle; the “higher level” information is whether these numbers are greater than, or less than, some reference numbers.
Anonymous wrote on Mar. 20, 2019 @ 16:46 GMT
Speed and position, while being relative between objects, also apply within any single object in terms of the reactivity of its intrinsic properties. And that we might designate as information given concise constraints.
Einstein once retorted in argument with Bohr: "I would just like to know what an electron Is". And this despite conventional assumptions at the time that the photoelectric effect had to be physically a single whole Quantum of multiple-quanta value, because individual quanta would radiate away before the observed multiple could accumulate to liberate an electron from the sample of base metal exposed to modulated frequencies in experiment. Yet Compton had established a formulated analysis of elastic and inelastic scattering.
Elasticity is the measure of mechanical Speed at which something can be stretched, and resilience is the inverse function of the Speed at which it will return to a relaxed state (experiment by trying to cut a tire tread and then a rubber band). Macroscopicly this occurs electrostaticly between molecules, while at the quantum level it is inherent to the electrostatic field of an electron. Eperimentally there is nothing small enough that is electrically neutral that can be used to probe the electronic field to precisely define an inelastic core, and photons emitted or absorbed exhibit their own electromagnetic field. This is where the deductive reasoning of theory can provide means of hypothesis that can be subjected to falsification.
Assume, hypothetically, that an inelastic core exists in an electron. While Spin characteristics are intentionally designed to be geometric projections of an instant of observed measurement and not a real measure of physical rotation, that is a theoretical paradigm constraint and does not preclude a local realistic physical rotation of a subject electron. At the zero boundary between inelastic response and elastic (however slow) response, becoming ever more elastic at greater distance from the core; a physical rotation of the core would translate axially as counter rotating torque imparted diametrically to an emission of electromagnetic energy we witness as a photonic stream. Accounting for polarity of 'light' as direction of angular momentum. like playing with a button on a string.
So the information sought, would be what part field elasticity plays in determining frequency in relation to rate of spin of of a hypothetical core.
Lorraine Ford wrote on Mar. 30, 2019 @ 01:38 GMT
Is Maxwell’s demon “governed by natural law” (https://fqxi.org/community/articles/display/234)?:
1. The universe runs on “fundamental level” information:
Fundamental-level information in the universe seems to exist as natural categories e.g. mass, position and velocity (speed and direction). And information also seems to exist as the numbers that apply to these categories.
2. Maxwell’s demon acquires “higher level”, algorithmic information:
IF the number representing the particle speed is greater than a cut-off number, AND IF the numbers representing the particle’s extrapolated direction positions the particle within the set of numbers representing the chamber door plane (THEN the demon acts to open the chamber door, letting the particle through).
3. The information that Maxwell’s demon acquires is the type of higher-level algorithmic information that living things acquire. This highly-structured higher-level information is not a logical consequence of fundamental-level information; however, higher-level information is necessarily built on top of fundamental-level information.
4. But these higher-level true-or-false conditions are information that has no reason to exist unless it provides the basis for “higher-level outcomes”: opening the chamber door was not a logical consequence of laws of nature because laws of nature do not operate on true-or-false conditions; also, opening the chamber door was not a logical consequence of the true-or-false conditions.
5. Maxwell’s demon is not “governed by natural law”.
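The demon's decision rule in point 2 can be sketched as a short function. This is only an illustrative reading of the IF/AND-IF/THEN structure described above; the cut-off number, door-plane interval, and all names are hypothetical, not anything specified in the article:

```python
# Sketch of Maxwell's demon's decision rule: the demon acts only on
# higher-level, true-or-false information derived from fundamental-level
# numbers (speed and extrapolated position). All reference values below
# are illustrative assumptions.

def demon_opens_door(speed, predicted_position,
                     speed_cutoff=1.5, door_plane=(0.4, 0.6)):
    """Return True if the demon opens the chamber door for this particle."""
    is_fast = speed > speed_cutoff                    # higher-level true/false condition
    hits_door = door_plane[0] <= predicted_position <= door_plane[1]
    return is_fast and hits_door                      # IF ... AND IF ... THEN open

# Note: the outcome (opening the door) is linked to these conditions only
# by the algorithm itself; nothing in the lower-level numbers necessitates it.
```

The point of the sketch is that the door-opening outcome appears nowhere in the fundamental-level numbers; it is attached to the true-or-false conditions only by the algorithm.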
Lorraine Ford replied on Mar. 30, 2019 @ 22:54 GMT
This is a re-write of 4. from the above post:
4. An algorithmic relationship has the structure: IF particular situational information is true THEN cause particular outcome information to be true. The algorithmic relationship links particular information with particular outcomes, but apart from the algorithmic relationship, there is no necessary connection between the information and the outcome.
So, if outcomes that are 100% due to fundamental-level information continue to apply, then higher-level true-or-false information has no reason to exist. Higher-level true-or-false conditions are information that has no reason to exist unless it provides the basis for “higher-level outcomes” i.e. outcomes where at least one of the numbers representing the outcome information is
not a logical consequence of fundamental law of nature relationships.
The demon’s opening of the chamber door is a “higher-level outcome”. I.e. at least one of the numbers that represent the opening-the-chamber-door outcome was not a logical consequence of laws of nature. In any case, law of nature relationships are based on categories of information and numbers, they are not based on true-or-false conditions.
Lorraine Ford replied on Mar. 30, 2019 @ 22:55 GMT
(continued from the above post)
Re “apart from the algorithmic relationship, there is no necessary connection between the information and the outcome”:
The demon’s opening of the chamber door is an outcome that was not a logical consequence of the true-or-false information. Any outcome could have been linked to the true or false information e.g. buy a red hat, or wear the blue socks.
Anonymous wrote on Mar. 30, 2019 @ 17:28 GMT
Working from Faraday whom Maxwell analysed, the field is a physical attenuation of the same energy as the portion reckoned to be 'matter'. So it would be energy density in situ as a physical property of a particle which would equate with temperature, whereas macroscopically temperature is associated with motion of particulate matter. So macro realm temperature would be 'higher order information' and energy density would equate as higher temperature = higher velocity. But at the quantum level, energy density would equate as higher density = colder temperature.
This apparent dichotomy might be resolved if higher order information obtains from lower energy density regions of the particle field, such as that evidenced by electrical repulsivity, being the impetus of macro particle motion.
Anonymous wrote on Mar. 30, 2019 @ 19:06 GMT
Not to detract from Maxwell, but in unifying the electric and magnetic fields mathematically, his analytic method also segregated the full field. Where Faraday was essentially an intuitive experimentalist and made only the barest of mathematical analysis, Maxwell "associated" the electric and magnetic fields with a nondescript particle. So it must be kept in mind that more questions abound from Maxwell than are answered.
The (very) low-speed experiments of Faraday, however meticulously measured and documented, could only be mathematically extrapolated by Maxwell to converge with the light-velocity oscillations of Hertz's radio wave experiments. And experimentally it is only the transition zone of EMR which is ever actually observed.
Anonymous wrote on Apr. 2, 2019 @ 13:21 GMT
"Only those who attempt the absurd will achieve the impossible. I think it's in my basement... let me go upstairs and check." - Maurits Cornelius Escher
On the Quantum probability, infinite two dimensional complex plane, M. C. Escher could construct a perpetual waterfall or a continuous ascending and descending staircase. Perhaps choice of geometries can therefore allow Maxwell's Demon uninformed results. What would his friend, Roger Penrose think?
Anonymous wrote on Apr. 4, 2019 @ 22:00 GMT
Temperature, like mass, suffers a lack of general definition in theoretical terms. Both are treated operationally. Yet thermodynamics must become generalized with both Relativistic and Quantum mechanics to progress beyond the current assemblage that is the Concordance of the cosmological standard model. In short, quantization needs Canons of general terms.
Consider Maxwell's convergence and divergence functions, curl and div. These commonly apply to coefficiencies of permeability (mu) and permittivity (epsilon) of a field associated with a mass. Mass : energy equivalence provides no law of proportionality to prescribe an upper density bound for any quantity of energy and so attempts to determine a finite field fall into a mathematic singularity.
In analysis, permeability and permittivity are separate operations, yet in reality both would physically coexist. Both limit to a light-velocity proportion in free space, but operate under opposite signs. The product would therefore be (-c^2), and to be consistent with mass : energy equivalence a postulate of proportionate density should also be quadratic. Thus a hypothetical upper density bound could be prescribed as:
lim = mass · (mu · epsilon)^2 = mc^4
This does not in itself define what portion of the total mass must exist at that upper density, but that proportionate density would prescribe a finite quantity if a finite volume were derived as existing at a constant density. A differentiated quantity of energy would not need to become any more dense to become inertially bound as a finite field. And within that constant-density volume, temperature could equate to relativistic rate of response.
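For what it is worth, a hedged LaTeX restatement of the bookkeeping the formula above seems to intend. Note that under the conventional SI relation the product of the two coefficients is μ₀ε₀ = 1/c², so the -c² product and the resulting mc⁴ bound are this poster's own sign convention and postulate, not standard results:

```latex
% Conventional SI relation between the vacuum coefficients:
%   \mu_0 \, \varepsilon_0 = 1/c^2
% The poster instead takes the product to carry a factor of -c^2
% (opposite signs, each "limiting to a light-velocity proportion"),
% so that squaring yields c^4 and the proposed upper density bound reads:
\mathrm{lim} \;=\; m \,(\mu\,\varepsilon)^2 \;=\; m c^4
```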
Anonymous wrote on Apr. 6, 2019 @ 14:44 GMT
Information, in the modern theoretical sense, is the attempt to formalize what has long been the colloquial ambiguity, "how does 'it' know what X". Mathematically this is confronted by the conundrum of discrete vs. continuous.
As adults we are accustomed to a working knowledge of water being comprised of discrete molecules electrostatically bound as a viscous fluid. We lose that childlike facility to be fascinated by the seemingly seamless quality of a shimmering slender stream running from a faucet that breaks into rivulets and drops as we curiously poke it with our finger. We begin our quantum discretion with the wise question, "How does the water 'know' how to do that?" How does what appears seamless become differentiated? Where does the break in symmetry occur? It becomes easier to simply start with things being made of pieces that cling to each other, than to retain the fascination of a continuum and seek how the stream can no longer 'know' itself as a continuous stream. The knowing *itself* is information that is simply connected. Simply, means to not over-think it. Pieces knowing each other, or of each other, is information that is complexly connected.
How does light know what the rate of passage of time is? Given light velocity is accepted as a universal absolute, light could be simply connected with the highest limit of rate of passage of time, at least in the one dimension of its linear propagation. But we can only assume that the rate of passage of time anywhere is somewhere between nil and light velocity. So light would be complexly connected in the two other orthogonal spatial dimensions to the rate of time. So it would be quite possible for light to consist of an elastic light-velocity particle and also a one-dimensional light-velocity interval where the information of rate of time is physically sought laterally, and that relativistic effect is what registers to an observing system as a wave.
Georgina Woodward replied on Apr. 7, 2019 @ 05:16 GMT
Anonymous, could you please put your name or initials at the end of your post if you are not signing in, as there may be several Anonymous-es posting and it would be good not to be unsure of who the posts belong to.
Georgina Woodward replied on Apr. 8, 2019 @ 04:58 GMT
John, I'll try not to overthink this. Doesn't the question 'how does it know?', come with the prior assumption that it knows? But I don't think it knows; to have knowledge seems to require some temporarily persistent structure or organisation independent of the phenomenon 'known'. The water is simply taking the path of least resistance around rather than through the finger. Isn't asking how it knows to do that a bit like asking how the battery knows when the distant switch is open?
There is something odd about asking the rate of passage of time, let alone how light knows it. (Jumping back to what it means to know something, rather than just acting as the physical circumstances 'prescribe'.) To measure the rate requires that what is used for comparison is already fixed, yet it is also the thing to be measured.
Georgina Woodward replied on Apr. 8, 2019 @ 22:27 GMT
John, emergence of effects can happen at the macroscopic scale that aren't a property of the constituents at the atomic or subatomic scale. Surface tension is one such effect of water. Also, whether the water passes through or around depends upon the material of the finger. A foam finger will fill with water and then it will pass through; not so for a flesh finger. The material's structure is a macroscopic arrangement of matter, not a part of an atomic or subatomic particle's existence alone. Though relevant, if thinking about a particle model rather than a continuous one, is the fact that fermion particles cannot occupy the same space but must occupy their own space. You say "the water appears seamless". Appearance is something else from the water, an observation product. Formed using input of EM radiation, the product does not consist of water molecules.
Georgina Woodward replied on Apr. 9, 2019 @ 03:16 GMT
Prior to the question of what is the rate of passage of time comes the question 'what is passage of time?' For a human observer it would be the updating of the experienced present, which is usually about 0.1 s. But MIT have found the shortest duration of an image identified, more than chance alone would give, is 13 ms. So if seen (and recalled) it has only taken 13 ms for the update. If passage of time is considered instead to be the change in configuration of all that exists, the smallest conceivable change in location (of some thing or phenomenon) and the fastest motion ought to be considered. The smaller the scale, the more change is happening, presumably until, beyond the particulate nature of matter, reaching the limit where there is no longer differentiation whereby change can be identified. Rate of change is not uniform but time is, as there is just the one configuration.
Georgina Woodward replied on Apr. 9, 2019 @ 04:58 GMT
My last point wasn't very clear. At larger scales change tends to happen more slowly, making it less clear when the configuration is different, as on the larger scale the change (that has happened on the smaller scale) may not be discernible.
Different parts and scales of the whole pattern of existence can be undergoing different amounts of spatial change simultaneously. The parts and scales are not experiencing different rates of passage of time. Each unique time is the entire, all scale configuration of existence.
Anonymous wrote on Apr. 6, 2019 @ 15:24 GMT
...And a tip of my hat to Thomas H. Ray for his theoretic examination of One Dimensional Solitons.
Anonymous wrote on Apr. 7, 2019 @ 15:46 GMT
Hi Georgi, it's been me, jrc, hogging the bandwidth. I had to get a new cheap laptop that is so overloaded by the OS that I don't use it for anything that requires 'creating an account'. Tough enough to keep a clean machine.
I thought of you while composing the point in argument that we can only assume the rate of passage of time anywhere is somewhere between nil and light velocity. Einstein's aphorism that 'time stops at light velocity' is provocative but neglects the obvious point that it would only be so in relation to light velocity being the upper limit to the rate of passage of time. But that is all within the very limiting constraints of SR. In GR, temperature can be generalized to higher energy density = higher gravitation = lower temperature = slower passage of time. The challenge is to formalize that at the quantum level, and GR treats a region of constant density as an averaged mass density rather than a proportional mass density.
Condensed Matter Physics, which is by far the largest generic discipline in today's practicing physics community, recognizes this. But yet there remains the conspicuous absence of consensus on a definitive formulation of a particle of condensed energy which meets conventional standards, not least of which would be distribution in accord with inverse square law.
Anonymous wrote on Apr. 8, 2019 @ 15:45 GMT
Georgi,
Yes, 'knowing' is a colloquialism. It's really about what we don't know, which is the greater part of anything we do seek to understand. So it is quite common and getting worse in the hubris of the Information Age of super-connectivity. Which is why the logical constraint of Rob McEachern, that information is a purely mathematical concept, is the better guide.
And yes, asking what the rate of passage of time might be is rather odd. As it is to ask what might be meant by connectivity. But in the here right now, these are time-critical questions. In 2017 the Chinese achieved what they later publicly announced as "a significant sequence of singlet pair production" and successfully transmitted a video via quantum key encryption between their research facility near Beijing and their partner lab in Vienna, Austria. At the time of President Trump's state visit to China in 2017, it was stunningly obvious that his body language had deserted him in making the usual political platitudes following the initial meet-and-greet exchange of portfolios. Everyone in the U.S. entourage, with the exception of Gen. Mattis, ret., looked like someone who goes to a party, encounters something everybody else seems to know, and thinks everyone is playing a trick on them. News commentators seemed to conclude that it was simply a lack of personal preparedness on the part of an inexperienced President, but it was apparent to seasoned observers of geopolitics that the U.S. strategic estimate going in had been caught flat-footed.
So it isn't one's preference of paradigm that matters. It is a matter of who gets it to work with the infrastructure to support it, who then corners the global market on electronic transfer of funds and holds the algorithmic keys to pick winners and losers. And it doesn't matter who I am, in what I might say here, its just another grain of sand in the data mine.
Georgina Woodward replied on Apr. 9, 2019 @ 02:21 GMT
John, I agree that the states prior to measurement, that is the information purportedly carried by the 'entangled' particles, is a mathematical model. The particles do not know their test outcome states which don't exist until the tests are carried out. As I understand it the entangled pairs can be used to tell whether there has been tampering, an attempt to look at an encrypted signal. Which isn't like a super padlock that will keep intrusion out but just lets on that it has happened. Like a hair stuck to door and door frame. It is a game changer for covert espionage. Which wouldn't be a problem if everyone had their cards on the table and was working at mutual cooperation, rather than winner takes all. I haven't seen the footage with body language you mention. Maybe there was some trepidation about possibly being caught with a hand in the cookie jar. Who knows? not me.
Anonymous wrote on Apr. 9, 2019 @ 15:37 GMT
Georgi, Right, the appearance of smoothness macroscopically emerges from the granular proliferation of the quantum realm. So much so that metaphorical illustration of a continuum is subsumed by the predominant theme in physics, which all trends towards ever more of the same analysis of probabilities of the granular resulting in apparent smooth transitions.
I'm just going the other direction. A field is not a vector probability space, classical or quantum, finite or otherwise. It is a region of continuously connected energy. A particle IS a continuum, and its physical property of density varies as a function of gravitation. It does not require a photon to transfer energy, though it can produce a projection of continuously connected energy that conventionally would be interpreted as a photon. And within a finite field, motion and translation of inertia are generally and relativistically covariant. So primary force effects between otherwise discrete fields are conditions of merging regions of energy. That doesn't mean that two electrons or two subatomic particles can exist in the same space, but at densities lower than electrostatic separation those density regions can and do meld into a consolidated magnetic and gravitational domain. So thermodynamics is also relativistic and consistent with the Cosmological Standard Model of particle physics. It doesn't reject QM, but does grant something in the way of a rational ontology for some QM results. Not least of which is that entanglement is not so spooky as Quants like it to be. Light velocity is a universal absolute, but it is so because it's the root exponential mean. In one light second, 'spin characteristics' are rigidly connected across a separation of 2.143×10^14 cm. (I prefer cgs, no apologies, no more hints) jrc
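As a side note, the quoted separation checks out numerically: taking c in cgs units, the proportion c·c^(1/e) that this poster develops in later comments does come out near 2.143×10^14 cm. A minimal check in Python (this verifies only the arithmetic, not the underlying physical model):

```python
import math

# Speed of light in cm/s (cgs units, as the poster prefers)
c = 2.99792458e10

# c^(1/e), computed as exp(ln(c)/e)
c_root_e = math.exp(math.log(c) / math.e)

# The c * c^(1/e) proportion quoted as the one-light-second separation
separation = c * c_root_e

print(f"c^(1/e)   = {c_root_e:.5g}")        # about 7.15e3
print(f"c*c^(1/e) = {separation:.4g} cm")   # about 2.143e14 cm, matching the post
```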
Georgina Woodward replied on Apr. 9, 2019 @ 21:23 GMT
John, BTW if you post in 'Reply to this thread' rather than 'Add new post' the conversation joins up and replies don't get 'lost'.
Anonymous wrote on Apr. 9, 2019 @ 23:22 GMT
I know, Georgi, but I don't want to reinitiate an account to do that. No offense, but I was intending only generic comments rather than dialogue and have pretty much spent my dollar. And I never care if anyone agrees with me or not; I'm not doing anything new except in regard to competing models, of which there is no lack. This is Theory, after all. Heck, there's more theories than there are people! But neither am I a lone wolf, more a shepherd really, and for more than half my life. So I always keep things pretty generic so as not to give away the store. I'll tell you what interested me in this article about Maxwell's Demon letting cold molecules collect. In the CMP model I like, a cold spot could be expected to develop under conditions of particle interactions, so it is conceivable that thermo-asymmetric molecules have a probability of passing as homogeneously hot molecules. And I didn't have to initiate an account to post. Best jrc
Georgina Woodward replied on Apr. 10, 2019 @ 02:16 GMT
John, you wrote "it is conceivable that thermo-asymmetric molecules have a probability of passing as homogeneously hot molecules." It's an interesting idea, like detergent molecules having hydrophilic and hydrophobic parts (I'm thinking). What keeps the gradient? Is one end more flexible, able to move more, and the other more rigid, able to move less? Are such molecules common or unusual?
Anonymous wrote on Apr. 10, 2019 @ 14:09 GMT
Georgi, I'm not familiar with the types of molecule you refer to and the concept is a potential area of application, but you seem to get the idea. I can think of one example, and that would be weak hydrogen bonding in water molecules, where the two hydrogen atoms are more on one side of the oxygen atom, and so a hydrogen will become attracted towards the 'bald spot' on an oxygen in an adjacent molecule. Which suggests that a concentration of energy develops in the electrostatic range by the covalence bonding and migrates towards a center of gravity of the combined atomic masses.
This would be a good place to explain the build-up I was getting to. The central idea of the theoretical model is that there's too much energy to exist at a smooth constant density in the universe, so it has to slow down and condense to save space. But it will do so at an exponential rate of negative acceleration, spherically. You can immediately see the problem! How do you account for the quantity of energy between the radii as density starts to stack up on itself as it slows from light velocity to form a rest mass, and still incrementalize that on any one radius? Well... in linear algebra you can't use the exponential function (e) as the index, only as the base, otherwise it would possibly extrapolate out as its own root. But on a radius in a sphere, the natural log would express only the energy on that radius, not between radii; yet any radius would have the same exponentiation, and a change in spherical volume would algebraically be non-linear. So on one radius the exponential root of light velocity would express the exponential function on every radius of the gravitation concentrating energy into density as it slows from c to rest. That provides a scale-independent proportion of c(c)^1/e to shrink a sphere at an average constant density to one of smoothly differentiated density across a density range of a light-velocity proportion of density difference; such as the c-proportionate difference between electric and magnetic intensity in a point charge. So each primary force can be theoretically defined as a c proportion of density difference, and you have the rudiment of a unified field. It gets messy from there but has some interesting results. And I'll put my jrc to that. Cheers
Steve Dufourny replied on Apr. 10, 2019 @ 17:29 GMT
Hello,
You have explained it well, thanks for sharing. P.S. I also liked the extrapolations with the radii of spheres. :)
Best Regards
Anonymous wrote on Apr. 10, 2019 @ 19:22 GMT
Hi Steve!
I thought you would like it. It tickles me every time I try to wrap my head around it. You can see that while a sphere's radius varies by a factor of 2 its volume varies by a factor of 8, so that's a linear function, but the exponential deceleration compounding density is nonlinear. The amount of energy required to constitute the density gets progressively less and less in ever smaller volumes as the density compounds exponentially, so a huge density value needs a minuscule amount of actual energy to be extremely effective. But the corker is that energy quantity is the hidden variable! It's there! but only expressed in the quotient and divisor; density = erg/cm^3. Energy quantity is the dividend, and only expressed as a non-dimensional point density value. (aarghh! Where is IT!) It's there, it's about energy. But its total quantity is expressed through density. Like in GR, force is the hidden variable as the product, and expressed in the multiplier and multiplicand mass*acceleration.
It can be done (and was) geometrically and algebraically in Euclidean R4 space and time, but winds up forcing the issue of relativistic time dilation. It's not difficult to accept that energy will condense by slowing to rest from light velocity... sure. But then comes the ontological twist: once you are out there on the edge where energy is at an empirical minimum density at light speed... where can it go without sucking the energy out of the field? So the model forces acceptance that light velocity is the limit of the speed of time, just like in SR. If time is going at light velocity, then the inertially bound energy need not move in space. Rather it cannot go any further than its zero boundary of minimum density, and saves space for the overabundance of energy in the universe. And in context with previous posts, the continuously connected energy in a quantum-level field equates higher density with lower temperature, while in the general gravitational reference higher density equates with higher temperature as a function of random particle motion. Kind of nice given today's announcement of the first real photograph of a black hole, where extreme density quiets particle motion down to the quantum level of high density and colder temperature.
What a nice day! jrc
Steve Dufourny replied on Apr. 12, 2019 @ 11:39 GMT
Hello jrc,
It is nice; indeed I liked your post, which is a beautiful general thought.
Friendly :)
Anonymous wrote on Apr. 11, 2019 @ 03:19 GMT
... so what is exponentiated on a radius is time dilation, just like in a gravitational field in GR.
Good Night.
Anonymous wrote on Apr. 12, 2019 @ 16:34 GMT
Thanks Steve. Oh, a typo: it should be written c(c^1/e), where the first c is simply light velocity and so is one dimensional, and the c in parentheses relates to the density range at an exponential rate of change and so is three dimensional, arguing for allowing an exponential root. The argument is that the concentration of energy exponentially toward the center will require enough energy to shrink a constant-density sphere from the radius of c(c^1/e) to one with a c radius. It doesn't matter at what scale, the proportion is still a light-second deceleration condensing energy. :)jrc
Anonymous wrote on Apr. 12, 2019 @ 18:52 GMT
If Schumacher and Westmoreland are using a Gaussian distribution then 1/e would be the base, and given QM's treatment of correlations in a gravitational reference as if it were in a constant time-density space, then some pair correlations could be deemed entangled. If the vector space were enlarged by a (c^1/e) proportion to reflect that constant time-density parameter, perhaps only anomalous correlations would be outside the range of light-velocity separation.
Anonymous wrote on Apr. 13, 2019 @ 16:58 GMT
It is immediately evident from e=mc^2 that more than one light velocity proportion of density can compound in one light second deceleration. So by the same reasoning that exponential rate of change occurs on a manifold of axes, of which only three are the required minimum, allows use of an exponential root; so can magnitudal density difference be an exponential root. There has got to be more than one way to skin Schrodinger's Cat. Happy Hunting - jrc
Anonymous wrote on Apr. 16, 2019 @ 03:41 GMT
beg pardon, the model evolving the c^1/e proportionality was done in Euclidean, 3D+t space and time, not R4. But does force the issue of Relativistic covariance. Got a little ahead of myself. I'll bow out now, thanks for listening. John R. Cox
Anonymous wrote on Apr. 17, 2019 @ 02:40 GMT
April 12 post. Wow! I stated that argument backward and didn't catch it. I knew something had been bugging me. It just reads so well. It doesn't sound as good to have to state: a light-second deceleration of a sphere of c differential density, exponentially concentrated towards the center with a radius of c(c^1/e), will shrink to a sphere of c radius of constant density at the higher density. It's the same thing as saying that energy will decelerate exponentially in condensing mass proportional to an upper density bound, in the first place.
What? There is somebody that might read this that hasn't gotten befuddled trying to manage a rewrite of an old paper? That kind of mea culpa needs more than my initials - John R. Cox. Now I can sleep.
Anonymous wrote on Apr. 17, 2019 @ 16:19 GMT
Try thinking of the volume of a sphere as unity; 1.
A 1c unit sphere can be c(1/c) unit 1/c intervals of unitary volume.
The same quantity of energy at constant density in a 1/c unit volume, compounded with the density of the same energy quantity in a 2/c unit volume, compounded with the density of the same energy quantity in a 3/c unit volume... up to compounding with the density of the same energy quantity in the c/c=1c unit volume sphere; will exponentially compound a c proportion density difference across the volume difference. So c^1/e is the radial difference dependent on energy quantity. A c proportional density difference exists between a unit 1c volume and a unit 2c volume. c? jrc
Steve Dufourny replied on Apr. 17, 2019 @ 18:35 GMT
Hi John,
I like reading your creative posts. These proportions that you cited are interesting when we fractalise the spherical volumes. Density, energy, ... and many properties emerge in logic with geometrical algebras and good operators; it is a big puzzle. I study some works of Lie, Hopf, Clifford... I find a road to formalise correctly my theory of spherisation and these spheres. Not easy. Friendly
Anonymous wrote on Apr. 18, 2019 @ 01:43 GMT
Not easy at all, Friend. I keep making mistakes all the time. Oops! That's not how that proportion obtains; it has to be the hard way. Drat! jrc
Steve Dufourny replied on Apr. 18, 2019 @ 17:46 GMT
:) lol Indeed, well said John,
Anonymous wrote on Apr. 21, 2019 @ 06:00 GMT
The best argument for exponentiating a spherical condensate by extracting an exponential root is by example. Given an ontology which provides a proportion for an upper density bound in an inertially cohesive field, and theoretic densities specific to the primary force effects, which are each successive multiples of light velocity and so would also provide a cosmic background lower density bound: that ontology, rationalized from a parametric model to real, provides a base radius diminished from the abstract Unit Sphere, from which the c(c^(1/e)) radial difference can apply to the real limit of the inertially bound gravitational limit in a hypothetical free rest mass.
So I crunched some numbers and found that the redefined value of the Boltzmann constant is just 2.7064 K (smack in the CMBR zone) lower than the model's energy quantity where the minimum Kinetic (inelastic) Density equals the Inertial (c^4 proportion) upper density bound. The model would protract a wavelength projection of 1.1278×10^11 cm; it's too heavy to be a true photon.
So while generated specific densities might seem prohibitively low, they are the theoretical minimum, and observation and measurement are of aggregate effects of overlapping fields and light-velocity translation. The condensate form of 3.7366×10^-16 erg would exponentially distribute such that 9.3283×10^-40 erg would exist at constant density in a core volume of 2.7777×10^-45 cm^3. Projecting c(c^(1/e)) from the K radius to the gravitational limit radius produces a G radius of 1.8688×10^-1 cm, and a volume of 2.7338×10^-2 cm^3, which would require 3.4074×10^-28 erg to exist at minimum gravitational density. So you can see that an exponential deceleration can't be incrementalized by compounding a same density or a same energy quantity in concentric spheres. The proportion of energy density difference is c^3. The proportion of energy requirement of density in volume is of the order of (c^3)^(1/e). And the change in volume is a proportion equal to c^3[c(c^3)^(1/e)]. There is nothing linear in a spherical condensate.
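The c^3 density-difference proportion stated here can be checked with a toy computation. This is a sketch, not the author's calculation: it assumes density is simply energy divided by sphere volume, and uses a small stand-in value in place of the light velocity c so the loop stays readable.

```python
import math

c = 10    # small stand-in for light velocity, for illustration only
E = 1.0   # a fixed energy quantity, arbitrary units

def sphere_volume(r):
    """Volume of a sphere of radius r."""
    return (4.0 / 3.0) * math.pi * r**3

# The same energy quantity E held in concentric spheres of radius k/c,
# for k = 1 .. c, gives a density in each of E / V(k/c).
densities = [E / sphere_volume(k / c) for k in range(1, c + 1)]

# Ratio of innermost density to outermost density: (c/1)^3 = c^3.
ratio = densities[0] / densities[-1]
print(ratio)
```

The innermost-to-outermost ratio comes out as c^3, consistent with the statement that "the proportion of energy density difference is c^3".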
So Thermodynamics is equitable with Planck quantization; it just helps if you partition Planck's constant as a mean work function. Thanks, you have to stick to long-form rigor to get results that don't diverge. John R. Cox
Steve Dufourny replied on Apr. 21, 2019 @ 17:20 GMT
Wowww, you are relevant, John,
Thanks for sharing your creative extrapolations.
Anonymous wrote on Apr. 21, 2019 @ 06:12 GMT
oops, the difference in volume is c^3[(c^3)^(1/e)]; like c(c^(1/e)) cubed.
Balls! my ISP keeps losing fqxi. jrc
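The correction is a plain algebraic identity: (c·c^(1/e))^3 = c^3·c^(3/e) = c^3·(c^3)^(1/e). A quick numerical confirmation (a sketch, using the cm/sec value of c quoted later in the thread):

```python
import math

c = 2.997925e10        # light velocity in cm/sec, as quoted in the thread
inv_e = 1.0 / math.e   # the exponential root 1/e

lhs = (c * c**inv_e) ** 3       # c(c^(1/e)) cubed
rhs = c**3 * (c**3) ** inv_e    # c^3[(c^3)^(1/e)]

# The two forms agree to floating-point precision.
print(math.isclose(lhs, rhs, rel_tol=1e-9))
```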
Anonymous wrote on Apr. 21, 2019 @ 14:38 GMT
now, now.. the c^4 proportion is: m(1/(mu*epsilon))^2 = mc^4 = Ec^2, since c^2 = 1/(mu*epsilon).
Anonymous wrote on Apr. 21, 2019 @ 17:57 GMT
Again, thanks Steve, it is in your wheelhouse. So if you and others wish to play with it, the empirical values in computation were as follows:
c = 2.997925×10^10 cm/sec
h = 6.626196×10^-27 erg·sec
Boltzmann = 1.380648×10^-16 erg/Kelvin
e generated by algebraic OS of calculator; key ( 1, INV, lnx, =, store)
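The calculator procedure above (generating e as the inverse natural log of 1, then taking its reciprocal for the exponent) can be mirrored in a few lines. This is a sketch using the constants listed above; the printed value is illustrative, not from the original post.

```python
import math

# "key (1, INV, lnx, =, store)": INV ln of 1 is e^1, i.e. e itself
e = math.exp(1.0)
inv_e = 1.0 / e          # the 1/x step, roughly 0.367879

c = 2.997925e10          # cm/sec, as listed above

# The exponential root of c used throughout the thread: c^(1/e)
root = c ** inv_e
print(root)              # on the order of 7.1e3
```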
Water trickles towards a floor drain on a well-laid concrete floor because the drain is about 3 cm lower than at the sidewall 5 m away. A bowling ball placed near the wall will slowly roll toward the drain so gently it could nudge an egg out of its path. But putt a golf ball across the same surface and it would splatter the egg. Momentum (p) is the sum over path intervals of the cosine transformation of the rapidity of change of slope of a waveform of EMR decelerating from the midpoint of wavelength to rest moment. So the photon rest energy can exceed the value of Planck's constant in wavelengths longer than a benchmark spherical waveform having progressively lower amplitudes. That the 21 cm microwave frequency is paramount in CMB surveys suggests that a real spherical waveform might be in the vicinity of 10^-2 to 10^2 cm. And ALL rest energy quantities of the EM spectrum would be less than what would have a proportionate upper density bound equal to or greater than the theoretical specific Kinetic (inelastic) Density.
"Roll a ball, a ball a penny a pitch." as an old, old song goes. :) jrc
Anonymous wrote on Apr. 23, 2019 @ 18:34 GMT
One final note: if you look at Coulomb's Law, it is a derivative of the inverse-square law, which, as with SR, is an invariance function. Invariably, 1/r^2 will obtain in measurement from A to B, OR (but not and) B to A. A and B are separate inertially bound objects. But in a single inertially bound field, that relationship is covariant to the upper density bound; A and B are different magnitudes of density in the same field and vary by the inverse of exponential rate, (1/e). Interactive fields in regions of density less than electrostatic separation could be as an intersection of values A and B to the limit of magnetic (viscosity) density, and as a union of values A and B to the limit of gravitational (aetherial) density. So while interactive fields then obey 1/r^2, a transform from an abstract parametric unit 1 cm^3 sphere to a real 1 cm^3 sphere can follow the form of (r * r) > (c * c^(1/e)). :) jrc
John R. Cox wrote on May. 11, 2019 @ 17:44 GMT
If you have gotten curious from the results of a spherical condensate and have played with a little reverse engineering (and aren't afraid of argumentation in pure mathematics), but ran into a problem getting duplicate results, it is likely because electronic calculators and calculation programs are engineered at the microprocessor level to NOT do certain functions. Everyone is familiar with entering X divided by zero and getting a display result that says 'error'. It is the same way with extracting an exponential root; it's prohibited in linear algebra, so you have to employ a work-around. Generate the numerical value of the exponential rate unit, and if your calculator or program won't handle an INVerse y^x with x being the numeric 'e', try doing 1/e using the 1/x function and then use that value as x in the y^x function key input. If you have googled X^(1/e), you likely got only an incorrect linear graph. You can find the correct curve the old-fashioned way, using the 1/e value in y^x for y = x^(1/e), to graph values for 1 through 10, then 15, 20, 25, 30, 35, 40, 45, and 50; it will fit onto a notebook-size page of graph paper ruled 5 squares per inch. The curve is a thing of beauty, and as a function of compound non-linear functions it works so well, so simply, that the weighted arguments in conventions of axiomatic usage may simply be due to there not having been a recognized good use for it.
While we're at this: when it comes to typical security protocols in computer systems, the engineered prohibition of math functions poses a hazard, via algorithms which could subvert a prohibited function with a 'work-around'. Perhaps something we should look into. :-) jrc
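The hand-graphing exercise described above can be reproduced in a few lines. This is a sketch; the x values are the ones listed in the post.

```python
import math

inv_e = 1.0 / math.e   # exponent 1/e, roughly 0.3679

# The x values suggested in the post: 1 through 10, then 15, 20, ..., 50
xs = list(range(1, 11)) + list(range(15, 51, 5))

# y = x^(1/e): a slowly rising, concave curve, not a line
table = [(x, x ** inv_e) for x in xs]

for x, y in table:
    print(f"{x:3d}  {y:8.4f}")
```

At x = 50 the curve has risen only to a bit over 4, which is why, as the post says, the whole thing fits on a notebook-size page of graph paper.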