FQXi Essay Contest - Is Reality Digital or Analog?
Quantum Theory without Quantization by Ken Wharton
The only evidence we have for a discrete reality comes from quantum measurements; without invoking these measurements, quantum theory describes continuous entities. This seeming contradiction can be resolved via analysis that treats measurements as boundary constraints. It is well known that boundaries can induce apparently-discrete behavior in continuous systems, and strong analogies can be drawn to the case of quantum measurement. If quantum discreteness arises in this manner, this would not only indicate an analog reality, but would also offer a solution to the so-called "measurement problem".

Author Bio
Ken Wharton is an Associate Professor in the Department of Physics and Astronomy at San Jose State University. His research is focused on the foundations of quantum theory, with a particular interest in fully time-symmetric approaches.
I am very intrigued by your essay! You may be equally intrigued by my essay “A World Without Quanta?” submitted and posted in this FQXi contest!!! In my essay I provide (among many other connections) a simple continuous derivation of Planck's Law without using energy quanta. I show that this Law is really a mathematical identity that describes the 'interaction of measurement' (and more generally 'energy exchange'). I also provide an explanation for our observations of 'energy quanta' which goes well along your ideas of how discreteness arises out of an underlying continuity when measurement takes place.
This is very encouraging! I have been a 'voice in the wilderness' for a number of years now, and it is good to know someone else has similar ideas. I should include in this short list Hayrani Oz (professor of aerospace engineering at Ohio State University). He too has been using 'time-integrals of energy' (my quantity 'eta') very successfully for many years. We are coauthoring a chapter of a thermodynamics book coming out later this spring based on our results. Also, I should give great credit to FQXi, which has provided me the space and opportunity to engage others in good meaningful discussion and not be shouted down and blocked from participating.
Some very interesting ideas, though I would have a few quibbles.
For one thing, the essay question is "Is reality digital or analog?", not "Is the foundation of reality digital or analog?". If it were the latter, the premise of your argument would be correct: the foundation of reality is analog, but it is this emergent digitalization which forms reality as we perceive it. It is a bit of a dualism between top-down digitalization and bottom-up analog.
The next point is that you use the constraints of the Big Bang to frame your theory of emergent digitalization, but isn't the primary assumption of an expanding universe that the only way purely digital quanta of light can be redshifted is by the actual recession of the source? If light is actually analog, there are quite a few ways it could be redshifted.
Not to mention that the premise of "tired light" was dismissed, based on the assumption of a discrete model of the photon, since there was no observed scattering.
As for the premise that there are only two models of time, block time versus instantaneous points, what if time is fundamentally fuzzy?
It's not that the present flows (Newton) or exists along some dimension (Einstein) from past to future, but that the changing configuration of what exists turns the future into the past. Tomorrow becomes yesterday because the earth rotates.
We do proceed from past events to future ones, but the physical reality is what is present. So it is the present that is the constant, while events coalesce out of future potential into present circumstance and are then replaced, receding into the past.
Consider that there is no way to calculate all possible input into any event, as it could be arriving from opposite directions at the speed of light. So prior to the actual occurrence of an event, its total cause is still in the future. Once it has occurred, the event recedes into the past. So in this sense, the future is cause and the past is effect.
Time, then, is an effect of motion rather than the basis for it. Therefore there cannot be a dimensionless point in time without freezing the very motion creating it. It would be like trying to take a picture with the shutter speed set at zero. This means that a particle cannot be isolated from its motion. It has no fixed position. Frozen and motionless, it would cease to exist.
Same with Schrodinger's cat. Death is not an instantaneous point. It is that collapse of future probabilities into past effects which creates the process of time. Not a progression from a determined past into a probabilistic future that only seems to yield multiworlds.
Nor would we need the potential conceptual problems caused by block time, whether conservation of energy issues, determinism, or time travel.
Welcome back to the netherworld of FQXi discussions.
John -- Thanks for the "quibbles", but I guess we just have a different take on where our human perceptions intersect with both the physics (the time issues you mentioned) and the purpose of the essay question itself. So I'll just leave you with a nice quote...
"Our present QM formalism is a peculiar mixture describing in part laws of Nature, in part incomplete human information about Nature -- all scrambled up together by Bohr into an omelette that nobody has seen how to unscramble. Yet we think the unscrambling is a prerequisite for any further advance in basic physical theory..." Edwin T. Jaynes
II. BOUNDARY-INDUCED QUANTIZATION
I call this temporary quantization. When nature puts the squeeze on degrees of freedom, you get a quantum number and quantization of energy levels. When WE put a constraint on the freedom of particles, we too cause quantization: it is observation-related!
A free photon has a free direction. If we squeeze the photon through a slit, its direction is constrained; direction is now a temporary quantum number and is quantized, i.e. interference.
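This squeeze-induced discreteness is exactly the textbook particle-in-a-box effect: a perfectly continuous wave equation whose energy spectrum becomes discrete only because of the walls. A minimal numerical sketch of that (my own illustration, not from the essay; units with hbar = m = box width = 1 and the grid size are arbitrary choices):

```python
import numpy as np

# Particle in a box via finite differences: a continuous wave equation whose
# spectrum is discretized purely by the Dirichlet walls.  Units hbar = m = 1,
# box width L = 1; the grid size N is an arbitrary numerical choice.
N = 800
x = np.linspace(0.0, 1.0, N + 2)[1:-1]    # interior points; psi = 0 at walls
dx = x[1] - x[0]

# -(1/2) psi'' = E psi  ->  tridiagonal Hamiltonian with the walls built in
main = np.full(N, 1.0 / dx**2)
off = np.full(N - 1, -0.5 / dx**2)
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
E = np.linalg.eigvalsh(H)

# The discrete levels E_n = n^2 pi^2 / 2 emerge from the boundaries alone
print(E[:3])   # close to [4.93, 19.74, 44.41]
```

The levels n²π²/2 are not put in by hand; they emerge from demanding that a continuous solution vanish at both boundaries.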
Although I'm obviously in general agreement with your overall point, I'll recommend possibly changing your terminology into something that's more neutral when it comes to the difference between spatial and temporal boundary constraints. Maybe "Local quantization"? (Although the word "local" comes with a lot of unfortunate baggage...) "Regional quantization"?
Knowing that the ideas in your essay will interest Hayrani Oz (Prof. of Aerospace Engineering, Ohio State University), I took the liberty of forwarding him your essay. His reply was lengthy, so I made a pdf and am attaching it here. Your ideas and Hayrani's 'Enerxaction Dynamics' appear to match well. I hesitate to include Hayrani's email in this open forum, so if you have a follow-up reply I can likewise forward it to him.
Thank you for your interest, and for forwarding Hayrani's comments... I'll follow up via email.
Nice essay. Actually, I'm finding your time-symmetric stuff more and more intriguing as time goes by. It's interesting to note that your broader conclusion is essentially the same as both mine and Dean Rickles' (and a few others that I've read so far).
Hope all is well with you! I nominated you for membership in FQXi, by the way.
Hi Ian - thanks for the nomination!
Although you're right that we have a big point of agreement (that real-life measurements are never going to be able to answer this question one way or the other) I wouldn't characterize that point as my "broader conclusion"... More of a preliminary point that I quickly got out of the way. Your essay certainly tackles that question in a much broader and comprehensive manner.
In that sense, my essay is sort of a sequel to yours -- after all, near the end of your essay you say: "So perhaps the more enlightening question would be, are all "quantum" theories necessarily discrete?" You said your instincts were on the "yes" side of this question... Any chance my essay has swayed your opinion on this issue? :-)
When I get a chance I'll head over to your own topic and post some questions of my own... Cheers!
Hmmm. I don't know if you have swayed me on this, but I will say that I have a much better understanding of the block universe concept now. I still think it is discrete on some level, but I think I'm thinking of discreteness in a slightly different way. So, kind of like the reverse of the usual way of thinking, imagine that everything is locally continuous (which I don't necessarily think it is, but let's just suppose for the sake of argument it is). How could you tell if your little local part of the universe wasn't just some discrete point in a much larger system? Or, for that matter, what if our universe is a discrete point in some strange system of multiple universes? Wacky stuff, but hopefully it illustrates the way in which I envisage "discrete" here.
Ian -- Yes, of course you're right that there's no way to prove there's not a discrete substructure, and even if one was found, it would still be possible that there was a continuous sub-substructure under that! (etc., etc.)
But my point is that if you take QM measurements away, there's no *evidence* that anything is discrete. And if those same discrete measurements can be explained as an emergent feature of a continuous system, as I'm proposing, then there's no evidence for anything fundamentally discrete at all.
Sure, it still may turn out to be that way, but one shouldn't just instinctively point to QM as evidence that reality is discrete, especially given the measurement problem.
I liked your essay. I think that your boundary induced quantization is similar to a path integration condition. Certainly with respect to past and future this seems to be the case. The individual paths will constructively and destructively interfere with each other so as to match the endpoint (BC) conditions.
There is a bit of a point which I am pondering. The big bang as a boundary for quantization makes sense if spacetime is classical or continuous at the start; otherwise your BIQ is approximate. If spacetime in the very early universe has quantum fluctuations, or is quantized, then I am less certain how one can apply that as a boundary. If, on the other hand, spacetime is fundamentally classical, where quantum gravity only refers to some other field from which spacetime emerges, then this theory should be more exact. Even still, I am not sure how one would treat the quantum field that spacetime emerges from.
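The picture of paths interfering to match endpoint conditions can be sketched with a one-parameter family of free-particle paths: contributions far from the stationary-action (classical) path largely cancel. This is an illustrative toy sum with arbitrary units (hbar = m = 1), not anything from the essay:

```python
import numpy as np

# Toy path sum (hbar = m = 1, illustrative numbers only): paths from
# (0, 0) to (T, X) made of two straight segments through a midpoint height y.
T, X = 1.0, 1.0
y = np.linspace(-20.0, 21.0, 100000)   # one-parameter family of paths
S = (y**2 + (X - y)**2) / T            # classical action of each bent path
amp = np.exp(1j * S)                   # exp(i S / hbar) for each path

# The action is stationary at the straight-line path, y = X/2.
y_classical = y[np.argmin(S)]
print(y_classical)                     # ~0.5

# Restricting the sum to a window around the classical path changes little:
# the distant paths destructively interfere.
full = amp.sum()
near = amp[np.abs(y - X / 2) < 3].sum()
print(abs(full - near) / abs(full))    # small compared to 1
```

Most of the sum comes from paths near the straight line y = X/2; the distant "bent" paths destructively interfere, which is the stationary-phase mechanism behind matching the endpoint conditions.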
Thanks for the kind comments... I was waiting to reply until my latest paper (relevant to your first comment) was up on the arXiv, but it now looks like it's going to be another week, so I'll post the link later.
As far as the point that you're pondering... Just because something is "classical", does not mean that it must emerge from some deeper quantum level. One can look for quantum phenomena to emerge from a classical foundation, rather than the other way around -- the central point of my essay, really. Still, I think it's important to take a sufficiently broad view of a "classical foundation" -- say, a local Lagrangian field density on a classical spacetime manifold. From that unproblematic starting point one can study various "nonclassical" rules and constraints, and see how they might lead to higher-level quantum behavior.
Good to see you, and congratulations on this lucid essay. You have a deep understanding of the quantum and the ability to explain it so well. And I don't say this just because I agree so much with what you wrote :) [paper]. I would like just to comment that the system is quantized, with or without the boundary conditions. What these conditions can bring in is the discretization of the spectra, as you explained so well.
Cristi Stoica, Infinite Resolution
(this year I focused on singularities in GR)
But I'm confused: what do you mean by "the system is quantized"? Clearly not the discrete outcomes... What features of a generic system determine whether it is "quantized", if not any discretization? (Deep question there, I know, but you seem to have something particular in mind...)
I did not intend to be cryptic, I just wanted to be concise :). Thanks for the feedback; indeed I need to give detail. If I understand well, you start with an equation describing a quantum system (e.g. Schrödinger, Klein-Gordon or Dirac), then exhibit in the system described by that equation discrete behavior arising from appropriate boundary conditions. I think you did right. I think that the wavefunction is fundamental, and physical (and I don't think that the original idea of Schrödinger, who interpreted the square of the wavefunction as charge density of the electron, is that bad, only that it has some trouble when more particles are involved). This is why I agree with your approach. You name this method "Boundary-Induced Quantization". I think that your usage of the word "quantization" is appropriate, because it shows that discrete behavior arises from a continuous field (which can be the wavefunction, an electromagnetic field, etc.). On the other hand, what I intended to point out is that there is a standard usage of the term "quantization". This usage refers to procedures which are applied to a classical theory in order to obtain a quantum one. In the Hamiltonian of a classical system one replaces the classical variables (functions) with operators. This way, we obtain the Schrödinger equation from nonrelativistic systems of classical particles, the Klein-Gordon and Dirac equations from relativistic systems of classical particles, and QFT from classical fields.
Schrödinger started with his equation, which, according to the definition above, represented a quantized system, and showed that for electrons bound in an atom the only possible energy eigenstates correspond to discrete modes. So he explained the discrete part of the energy spectrum of an electron in this way. We can distinguish two steps: 1. Obtain the quantum equation from the classical one, and 2. Show that bound electrons exhibit discrete behavior. According to the terminology I mention, the first step is quantization. According to the meaning of the word, I agree that you can call the second step quantization too. I just wanted to clarify, because one may wonder what the relation is between BIQ and canonical quantization, geometric quantization, various prescriptions for the "second" quantization, etc.
Given that I see two steps, and I associate BIQ only with the second, I need to mention that I do not intend by this to say that your view is incomplete. The first step, passing from a classical description to a quantum one, is only due to the historical accident that we understood the classical systems before discovering the quantum behavior. The fundamental one is the quantum system, and there is no need to show how we go from classic to quantum. It is an artifact due to the original impression that the classical is obtained immediately just by h->0 (which turned out to be insufficient). What we need to show is the reverse, how to obtain classic from quantum.
Ah -- I now see where you are coming from, but I disagree with the "out" that you've provided me. I do not think physicists should simply "start" with operator-valued equations and explain classical physics as some limit of those equations. Especially given that there is another approach.
It turns out that the Klein-Gordon equation *is* the classical equation for a classical scalar field. There's also a classical Dirac field (see the last chapter in Goldstein on classical fields.) There's nothing "quantum" about these equations until you start interpreting them via operators as you describe. Yes, if you start with particles you have a problem, but recall my premise is that everything is continuous, so one is forced to start with classical fields anyway.
So why do we then go to operators? It's the easiest way to get to a framework that can predict discrete outcomes. But if there is some other way to get a near-discreteness without operators -- as argued in my essay -- then there would never be any reason to do your "step #1" in the first place.
Now, after several years spent hoping that one could get all of quantum theory by applying closed-hypersurface boundary conditions to classical field equations, I've finally come to terms with the fact that this alone isn't going to work. But I'm far from giving up on the classical field framework itself. (I've just dropped back from field equations to the classical Lagrangian densities that generate those equations in some -- perhaps approximate -- limit.) After all, if you "start" from an equation that one can't even interpret, none of the consequences are going to be interpretable, either.
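The point that the Klein-Gordon equation is an ordinary classical field equation can be made concrete by simply integrating it as a PDE, with no operators anywhere. A rough leapfrog sketch (grid, mass, and initial data are arbitrary illustrative choices, not from the essay):

```python
import numpy as np

# Klein-Gordon as a plain classical PDE:  phi_tt = phi_xx - m^2 phi,
# integrated with a leapfrog scheme on a periodic grid.  c = hbar = 1;
# the grid, mass, and initial data are arbitrary illustrative choices.
N, dx, dt, m = 512, 0.1, 0.05, 1.0
x = np.arange(N) * dx
phi = np.exp(-((x - N * dx / 2) ** 2))    # Gaussian pulse, initially at rest
phi_prev = phi.copy()

for _ in range(200):
    lap = (np.roll(phi, 1) - 2 * phi + np.roll(phi, -1)) / dx**2
    phi_next = 2 * phi - phi_prev + dt**2 * (lap - m**2 * phi)
    phi_prev, phi = phi, phi_next

# The field disperses but stays bounded: ordinary classical wave behavior
print(np.max(np.abs(phi)))
```

Nothing in this evolution requires operator-valued fields; on this view, discreteness would only appear once boundary constraints are imposed.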
That was the most clearly written and beautifully insightful essay I've read. It also gave me many new answers, viewpoints and much confidence in my own model of discrete fields (DFM). I've also learned a lot from your refined explanations. You certainly have a 10 from me, but what I'd like is for you to read my own logic-based but rather agricultural local-reality iteration of.. ..well, really of what you seem to be suggesting may be true (which shows limits to Bell's domain). http://fqxi.org/community/forum/topic/803
I've been struggling with it, as I can't seem to falsify or find the errors in how the DFM seems to fully unify SR/GR and QT. It uses a real quantum symmetry-breaking boundary transition process implementing energy changes in an underlying field structure. It also made my hair stand on end when you referred to the ultimate boundary of the big bang, as I recently posted a short pre-print paper reaching a logical and very physical solution to how exactly that.... anyway the paper, which only took 2hrs to write as a part derivative of a full one under consideration, is at http://vixra.org/abs/1102.0016 . I would really appreciate you reading both, and advising me precisely where the errors are, as unfortunately no-one has found them yet.
There are a number of other papers looking at implications, which are quite extraordinary, seeming to resolve issues right across physics. It seems to suggest our failure has been one of complex logical thought involving visualisation skills with multiple variables, and over reliance on mathematical abstraction.
I'll say no more for now, but just thank you, for your essay, and in advance for your time and hopefully comments.
Thanks for the kind words -- but I'm afraid I don't really see any connections between our two essays. Still, I'm glad my essay gave you some useful ideas.
My only comment on your essay would be that I think you would find it beneficial to treat light as a wave, especially when it comes to analyzing light in a moving dielectric or plasma. The distinction between phase velocity and group velocity is particularly crucial to your analysis (the phase velocity in a plasma is actually c*n, not c/n, for example.)
Many thanks. I agree with the wave treatment. You'll have noticed I consistently referred to signal not phase velocity to avoid confusion. I've studied and researched optics for many years and there is still poor understanding, within but particularly outside optics. Optic Fibre and plasmon science has helped, but, well just look at; Nano letters DOI;10.1021/nl103408h. and Science, vol331,p892. to see how poor the science of just a few years ago was.
I wrote a paper clarifying much re: superposition, harmonics, plasma and refraction, but to the specialist editors it's not 'new discovery', just a clearer way of explaining what we've already discovered, and to general journals it's too far from the ruling paradigms to be considered! We have to smile!! I've now been asked to agree to publication in a less mainstream journal. What does one do!?
I could have written a whole essay on the wave aspects, but omitted it all to avoid red herrings as it is the overview that's important.
I'm not sure if you saw the fundamental derivation from correctly treating time-averaged Poynting vectors in co-moving ion media, or missed it. It did require slow reading, difficult multi-variable visualisation, and consideration of the consequences. Essentially it derives SR and GR from pure logic, with a preferred 3rd frame and a quantum mechanism, and it's falsifiable.
Or perhaps you disagreed with the logic for some reason? Please do advise if you can find the time. (Don't get confused by plasma waves as we're dealing only with the block reference frame of the medium).
I think my response to your email is more appropriately posted here, since others may benefit by our discussion.
“I'm quite interested in new ideas of how to get quantum behavior to emerge from classical fields.”
This was also what attracted my attention to your essay, as this is exactly what I am doing in my essay. What I mathematically demonstrate is that Planck's Law for blackbody radiation can be derived using continuous processes, without using energy quanta and statistics. Since Planck's Law is at the very roots (historical as well as theoretical) of modern physics, this result is very significant.
But more than that! In my essay I show that Planck's Law is an exact mathematical tautology that describes the interaction of measurement! This, in my view, explains why Planck's Law fits the experimental data so remarkably well. Check the blackbody spectrum obtained from measurements against that obtained from Planck's Law: "The FIRAS data match the curve so exactly, with error uncertainties less than the width of the blackbody curve, that it is impossible to distinguish the data from the theoretical curve." Naturally, the measurements will be exactly the same as the tautology that describes the measurements.
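For readers who want to sanity-check the FIRAS comparison, the Planck curve is an ordinary smooth function of frequency; evaluating it at the measured CMB temperature (2.725 K) reproduces the well-known ~160 GHz peak. This sketch just evaluates the standard formula with CODATA constants; it is not the tautology derivation discussed above:

```python
import numpy as np

# Planck's law evaluated as an ordinary smooth function of frequency, at the
# FIRAS-measured CMB temperature.  Standard CODATA constants.
h, c, k = 6.62607015e-34, 2.99792458e8, 1.380649e-23

def planck(nu, T):
    """Spectral radiance B(nu, T) in W sr^-1 m^-2 Hz^-1."""
    return (2.0 * h * nu**3 / c**2) / np.expm1(h * nu / (k * T))

T_cmb = 2.725                        # kelvin
nu = np.linspace(1e9, 1e12, 200000)  # 1 GHz .. 1 THz
nu_peak = nu[np.argmax(planck(nu, T_cmb))]

# Wien displacement (frequency form): peak near 2.821 kT/h ~ 160 GHz
print(nu_peak / 1e9)   # ~160
```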
I also show in my essay why it is mathematically true that energy is proportional to frequency and why the uncertainty principle must hold. But this is only some of the results in my essay. Too many to list here in this post!
You further write,
“Really, I was stumped at "mathematical identity" -- at that point you are claiming to derive a physical conclusion with no physical assumptions...? Surely there is some link to physical reality in this math, or it wouldn't mean anything. So what's the underlying picture of reality that this math is assuming to be true?”
I fully understand why you were “stumped” at the mathematical identity nature of Planck's Law. I was anticipating just such a response!
But there is nothing unusual about finding mathematical tautologies in physics – and without these having a 'physical basis'! If I were to measure a distance of 3 miles going east, follow that with a measure of 4 miles going north, and then measure that I am 5 miles from where I started, do I need to have derived the Pythagorean Theorem from some 'physical basis' in order for this Theorem to apply to my physical measurements? Likewise with Planck's Law, as I show in my essay!
Concerning my photoelectric effect paper: I am surprised that you actually read it, since I don't discuss this result in my essay!
“1) There is no experimental delay between the time that a weak photon source is turned on and the time that the detectors start registering the photons. If the energy had to "build up" over time, one would expect to see such a delay.”
As I explain in the paper, the time required for an 'accumulation of energy' h to occur (the minimum threshold needed for energy to manifest) is h/kT. I think you will agree that this is a very short time! I don't think any experimental claims are for a shorter time.
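Putting a number on that claim: at room temperature the quantity h/kT is indeed extremely short. (This just evaluates the expression from the post with standard constants; T = 300 K is a representative choice.)

```python
# Evaluating the accumulation time h/kT from the post, at room temperature.
# Standard constants; T = 300 K is an arbitrary representative choice.
h = 6.62607015e-34   # Planck constant, J s
k = 1.380649e-23     # Boltzmann constant, J/K
T = 300.0            # room temperature, K
tau = h / (k * T)
print(tau)           # ~1.6e-13 s, far below any claimed experimental delay
```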
But there is a more general principle about 'instantaneous' that you raise which I find very important. Do you really believe that if the 'source' is turned on at say t=s, the 'sensor' will detect the photon at t=s also? That 'instantaneously' (in the sense t=s) the photon will be detected? I show in my essay that the Second Law of Thermodynamics implies that some positive duration of time is required for a physical event to manifest. Physical events have both 'extension' in space and 'duration' in time. Your view that events happen 'instantaneously' at t=s in my opinion violates this fundamental Law.
You further say,
“2) A related issue is when the average field is very weak everywhere, but there are many detectors. If the energy has to build up to hv on one particular detector, they would all take a long time to fire -- but in fact one of them will fire quite quickly, as if all the energy in the whole field somehow was "directed".”
This would be a paradox if you assumed 'ballistic photons' carrying an energy of hv (how? don't ask!) following a path trajectory and striking some one detector! But in my view, the 'photon emitted at the source' is not the same as the 'photon detected at the sensor'. These are separate but related events, as I also argue in my explanation of the double-slit experiment. A detector will 'fire' when it has 'a minimal accumulation of energy that can be manifested'. If a detector does not 'fire', it means that it does not have that threshold to 'trip' the detector. You may ask, what happens to the 'lower than threshold' energy at a detector? It's possible that eventually it just dissipates into the Cosmos, undetected and undetectable. Or it may linger around a bit for the next photon to 'strike'. Since all this is below our 'veil of observation', we just won't know.
Finally you say,
“I simply don't understand how you can simply assume that the energy is always exponentially increasing with time.”
What is exponentially increasing with time is the 'time dependent local representation' E(t). But this is at the level of 'accumulation before manifestation'. When energy becomes 'manifested', an amount of energy hv (in agreement with the quantization hypothesis!) is absorbed and the 'exponential representation collapses' (see my essay for a fuller description of this).
Ken, in my essay I present exactly what you are also seeking: Quantum Theory without Quantization. This is what brought me to your corner!
I was interested in your comments to Constantinos. You seem to be saying he may be wrong regarding the delay. In fibre optics the delay is established with great accuracy as polarisation mode dispersion (PMD) delay, somewhat frequency- and polarity-dependent (birefringence) but fully consistent with what Constantinos derives. This is the 'charging' or 'momentum' delay of scattering.
Also consistent with this, and of topical interest, are the latest results reported in Science vol 331, p892, and p16 of the 26th Feb NS, where particles were charged and 'bounced off', or were re-emitted by, the fine structure ABOVE the surface of matter (done here with coated glass). This is equivalent to reflective scattering. The 19th Feb NS (p18) shows plasmons 'grabbing photons' through a nano hole and re-emitting them, when an 'empty' hole won't let them through at all! All equivalent to QED, with electrons 're-emitting' photons, and always at the relative 'c' of the electrons if in a refractive medium co-moving wrt an incident medium. And we find the greater the relative motion, the higher the 'fine structure' surface plasma or 'plasmasphere'. Is that purely a coincidence? The discrete field model (DFM) explores the implications if not. It's consistent with Constantinos's and Edwin's, and we haven't been able to falsify it yet.
You ask about "how to get quantum behavior to emerge from classical fields". It is worth considering the converse: how to get classical relativity to emerge from quantum behaviour. With refractive dispersion this seems to emerge naturally.
Food for thought?
Thanks for all the experimental facts you brought to my defense! I had no idea there is so much evidence for such 'time delay'. I think Eckard Blumschein would also add to this list the Gompf et al. false measurements of single-photon counting. Rethinking this issue over again, I would like to add to these supportive arguments and experimental evidence the Heisenberg Uncertainty Principle. Clearly, QM uncertainty results in some positive duration of time for an amount of energy 'delta E' to manifest.
I think the rejection to my proof that Planck's Law is a mathematical tautology that describes the interaction of measurement is more 'disbelief' than 'refutation'. It cuts so deeply into the grain and fibre of modern physical thinking. It's just hard for physicists to accept.
Really interesting, accessible, clear, enjoyable. Nice introduction explaining your approach to the question and where you are going with it. I definitely want to spend more time re-reading it, as it is full of good ideas and explanations.
PS. I have used a quote from your essay on the FQXi Time Travel blog forum (where you very clearly explain the static nature of spacetime).
Good luck, Georgina.
Ken / Costas
There's much more on delay time too. Also look at the Mössbauer effect (1957), where the charge/emission scattering delay is attributed to 'recoil'. There is a logical discrepancy here related to continuous processes, which is probably why his results are oft ignored, but the actual results have been repeated and confirmed (at one of the major US universities, I think).
The frequency dependence of PMD in fibre optics is fascinating, as it also reverses at a certain frequency! My work focussed on harmonics, which explains this and absorption bands in terms of Huygens/Fresnel principle (HFP), in similar terms to superconductivity. Waves are still very poorly understood!
I agree with your arguments about the fact that discreteness is just a consequence of our models, but in the same way continuity is also just a consequence of our models. Until now we have ignored that the properties of nature we see are conditioned by our models, particularly by the logic we use to study nature; this is not a philosophical idea but a mathematical reality. In my essay I try to explain how our perception of quantum reality is blurred by the use of classical-logic tools. I would like to hear your opinion about it.
Thanks for a fascinating essay. I agree with the premise of taking the unpopular route of making QT more compatible with GR. Focusing on the measurement problem from the GR point of view is both novel and creative. I would also like to point out the work of Joy Christian, which I was introduced to on the forum of FQXi's very own website, and which seems to support your work, although he concentrates on non-locality. He uses topological and division-algebra arguments to conclude *that "quantum non-locality" is nothing but a make-belief of the topologically naive.*
Having said that I have one small "quibble" of my own. You wrote: "First and foremost, GR is a theory of spacetime."
It was my understanding that GR was first and foremost a theory of gravity, one that includes spacetime. Isn't it true that GR is actually agnostic as to the ontology of spacetime? Although gravity is assumed to be the curvature of spacetime, isn't it indistinguishable from a field in an arbitrary background? For example, see here. In the words of Kip Thorne, isn't the "curved spacetime paradigm" equivalent to the "flat spacetime paradigm" in GR?
I would be interested in your response, and thanks again for a beautiful essay.
Thanks for your comments... You certainly make a fair point -- I'm sure that a dozen different physicists would give you nearly as many different answers to the fill-in-the-blank sentence: "First and foremost, GR is a theory of ___ ". I was coming at it from the perspective that GR is more naturally about block-spacetime than it is about the dynamics of instantaneous 3-geometries... but now that you mention it, I certainly should have hedged my pronouncement somewhat.
That said, the problem with thinking of gravity as a field on flat spacetime is that it raises the possibility of *other* fields that aren't coupled to gravity in the ordinary way. You avoid this issue by setting the other fields directly into curved spacetime. Sure, maybe the equivalence principle will fail and one will be forced to consider this possibility, but one shouldn't confuse this (evidence-free) motivation with the more typical motivation: we don't know how to implement standard quantum theories in curved spacetime. To me, this is all the more reason to drop back to classical fields (which work perfectly well in curved spacetime), and try to figure out how quantum-like behavior might emerge from those GR-compatible entities.
For more of my thoughts on these issues, you could try Reference , which is also online at arxiv.org/abs/0706.4075 .
I'm glad you clarified this point. I didn't have a problem understanding it, though I did note by their comments that at least a couple of other people in the contest did.
I agree heartily with your research program. If you haven't had a chance to read my essay, I do hope you can before the polls close.
Thanks for your response. You have given me some food for thought. I have read your ref. , and it confirms what I had deduced from your essay; your work is essential to gaining a better understanding of the foundations of the "quantum realm".
Wishing you continued success,
Further to my post above, Dan Bruiger has just sent me this, on the reverse Doppler shift, apparently surprising but consistent with discrete fields (DFM) and the CD/Harmonics paper I referred to above. http://physicsworld.com/cws/article/news/45366
My theory predicts the effect, emerging as a natural result of harmonics, which proves the Regaza delay factor (found experimentally anyway in PMD), and also its reverse over short harmonic frequencies, where wave-particle 'polarisation' inverts.
I haven't read the paper yet, but was sure you'd be interested too.
"At the very least, the "measurement problem" should give one pause when drawing digital conclusions from quantum theory. As for the best, there is promise that we can solve the measurement problem by framing it in our continuous block universe."
I don't see a commitment to one or the other, but I do see an open mind that perhaps cagily leans toward analogue, which is my perspective as well, though less well argued.
Well written essay with many good arguments for a continuous reality. You are being a little unfair with the Wikipedia quote. If the word "bound" means stable forever (i.e. infinite lifetime) then the energy must have a single discrete value. Also as you know I don't like the block universe model. Such a model doesn't allow for causal chains. I do like retro-causation in the quantum world where future events can bring reality to properties which are initially undefined in the past. Causal chains in the macroworld go forward in time and this gives time its sense of flow from past to future. A block universe doesn't have any sense of time flow.
Finally, it is good that you acknowledge that we will probably never know for sure if reality is discrete or continuous.
Thanks, everyone, for the nice comments... I apologize for not finding the time lately to respond to everyone personally.
The essay's reference  (with co-authors David Miller and Huw Price) is finally ready for public viewing... It's now at http://arxiv.org/abs/1103.2492 . Comments on that paper are probably best sent via email, rather than here.
Congratulations on your dedication to the competition and your much-deserved top-35 placing. I have a question that has been bugging me, which I've also posed to all the potential prize winners, btw:
Q: Coulomb's law of electrostatics was modelled by Maxwell by mechanical means, after his mathematical deductions, as an added verification (thanks for that bit of info, Edwin), which I highly admire. To me, this gives his equation some substance. I have a problem with the laws of gravity, though, especially the mathematical representation that "every object attracts every other object equally in all directions." The 'fabric' of spacetime model of gravity doesn't lend itself to explaining the law of electrostatics. Coulomb's law denotes two types of matter, one 'charged' positive and the opposite type 'charged' negative. An Archimedes-screw model for the graviton can explain *both* the gravity law and the electrostatic law, whilst the 'fabric' of spacetime can't. Doesn't this by definition make the helical screw model better than anything else that has been suggested for the mechanism of the gravity force? Otherwise the unification of all the forces is an impossibility, imo. Do you have an opinion on my analysis at all?
Thank you for the citation added (even if after a long delay) in your paper arXiv:1003.4273 [Time-symmetric boundary conditions and quantum foundations] to my paper arXiv:0903.3680 [Compact Time and Determinism for bosons: foundation].
As you have already noticed, in my essay "Clockwork Quantum Universe" (http://www.fqxi.org/community/forum/topic/901) I explore the possibility of a consistent interpretation of quantum mechanics in terms of boundary conditions (thank you also for the congratulations on this first phase of the contest).
I would like to introduce myself in quantum terminology and share with you the truth that I have experienced. Who am I?
I superposed myself to be me, to disentangle reality from virtuality and reveal the absolute truth.