


CATEGORY: Blog
TOPIC: The Quantum Hourglass—How a Quantum Time-keeper Can Replicate Continuum Temporal Events

Blogger Mile Gu wrote on Mar. 1, 2018 @ 20:17 GMT
This post has been co-written by Mile Gu and Thomas Elliott.

Artist's impression of a quantum hourglass
In 2006, Oxford University Press set out to determine the most commonly used nouns in the English language. Topping the list was ‘time’, with ‘year’ and ‘day’, both delineating periods of time, ranking 3rd and 5th respectively. This highlights how deeply embedded the concept of time is within the human psyche.

This prevalence of time is perhaps not too surprising—observing and tracking time are integral to our daily lives. Our work days are carried out according to schedules; meetings are planned with set durations. We synchronise with others by fixing times to meet, and we plan each day knowing that it contains a set number of hours. To prepare for what is going to happen in the future, time-keeping is essential. Indeed, one can argue that one of the foundations on which modern civilization is built is our capacity to track time and anticipate future events.

In this context, the hourglass is distinguished for its role as one of the first accessible means of tracking time to the accuracy of seconds in medieval society. Indeed, its function is so familiar that ‘the sands of time’ has become a popular idiom, referring to the visual metaphor that the passage of time appears to flow like falling sand, steadily and irreversibly progressing from past to future.

Zoom in close enough on an hourglass, and one will see the individual grains of sand. At this level, the flow is not smooth, but inherently granular. At any moment, a finite number of grains of sand will have fallen, incrementing temporal progress in discrete packets. Time itself however appears, at any observable level, to be continuous. The hourglass analogy thus extends only so far.

This limit illustrates a pertinent observation—objects with a finite configuration space can only mimic the passage of time to finite precision. This isn’t particularly surprising: a finite configuration space can support only a finite memory. Thus, irrespective of any other physical limitations, such a system can only store the current time to a finite precision.

Indeed, this limitation is all too familiar to those in scientific fields that involve digital modelling or simulation. Theoretical models almost always assume time operates on a continuum. Whether modelling neuronal spike trains or the dynamics of quantum systems, time is generally represented by some parameter t that takes on real values. Digital simulations of the resulting systems must, however, inevitably approximate such dynamics by discretising time.

As an illustration, consider the simulation of a particularly simple delayed Poisson process. This consists of a single system that emits only a single type of output at probabilistic points in time. The probability of an emission occurring during each infinitesimal time-interval δ is constant, with one catch: no output is ever emitted within a fixed relaxation period τ after an emission. For a device to replicate these statistics correctly, it would need to record whether it is in such a relaxation period, and if so, precisely how much time has elapsed since the last emission. Let’s call this elapsed time t. Only by storing t can the simulator know precisely how much longer it must wait (i.e., the value of τ – t) before it is okay to emit the next output.

However, the variable t is real, and can take on a continuum of values. The more decimal places we wish to store about it, the more memory is required. In the limit where we want to faithfully predict the next emission to an arbitrary level of precision, this memory becomes unbounded. In practice, when writing computer code to perform the simulation, we would approximate t by discretising time into granular packets. For example, we could take some sufficiently small Δ and call it a day, provided we are happy to track time only to the nearest multiple of Δ (i.e., as kΔ for some integer k). A typical program for simulating such a process would use pseudocode along the lines of:

repeat from start:
    if t is less than τ,
        increment time by setting t = t + Δ
    otherwise,
        emit an output and set t = 0 with probability pΔ

Each iteration of the code simulates one timestep Δ. As Δ goes to zero, the output of this simulator becomes statistically identical to the original delayed Poisson process. The cost, though, is that the number of values t can take scales as 1/Δ, growing to infinity as Δ goes to zero.
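For concreteness, the pseudocode above can be fleshed out into a short runnable sketch (a hypothetical Python implementation with names of our own choosing; p, τ and Δ are the quantities from the text):

```python
import random

def simulate_delayed_poisson(p, tau, delta, n_steps, seed=0):
    """Discretised simulator for the delayed Poisson process described above.

    p:     emission probability per unit time
    tau:   relaxation period after each emission
    delta: timestep of the discretisation
    Returns the emission times and the number of distinct clock values
    the simulator must be able to represent (its memory demand).
    """
    rng = random.Random(seed)
    t = tau                    # start outside the relaxation period
    emissions = []
    for step in range(n_steps):
        if t < tau:
            t += delta                    # still relaxing: just advance the clock
        elif rng.random() < p * delta:    # eligible: emit with probability p*delta
            emissions.append(step * delta)
            t = 0.0                       # reset the relaxation clock
    # the clock must distinguish about tau/delta values, so memory grows as 1/delta
    n_clock_states = round(tau / delta) + 1
    return emissions, n_clock_states
```

Halving Δ doubles n_clock_states, which is the 1/Δ memory scaling discussed above made explicit.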

Thus, the more accurately that we wish to simulate the process, the more memory we need to invest. There is an ever-present trade-off between temporal precision and memory, and perfect statistical replication requires the allocation of unlimited resources. An hourglass, with a finite amount of sand, can thus never achieve exact replication.

While this trade-off is clearly of practical relevance, it is also fascinating from a more foundational perspective. Many scientists seriously consider the possibility that we live within a simulatable reality, wherein everything in nature can be thought of as information processing. If the memory capacity of this underlying computer is finite, how could it, running a program like the pseudocode above, simulate processes that operate in continuous time? One may then be tempted to conclude that either we live in a computer with unlimited resources, or that continuous time exists only as a theoretical idealisation. Perhaps our universe is itself like sand in an hourglass—zoom in closely enough, and everything appears granular.

While this could be a valid possibility, is there perhaps a way to avert this conundrum?

What we have not yet considered here is the quantum nature of information. The key element is that every bit of data is embodied in some physical system with two different configurations—one which we label |0>, the other |1>. Provided the system can be isolated sufficiently well from its environment, we can also steer it into quantum mechanical degrees of freedom that possess coherence: superposition states represented by α|0>+β|1>, simultaneously |0> and |1> with specific weights dictated by α and β. The difference now is that α and β are intrinsically continuous degrees of freedom. Thus this quantum bit—a finite physical system—contains within it a continuous parameter. Could this be leveraged to encode the continuity of time?
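As a toy illustration of that last point (our own sketch, not the construction from the paper), a single qubit's amplitudes can be deformed continuously, so a real-valued elapsed time t can be mapped into them without any discretisation:

```python
import math

def hourglass_qubit(t, tau):
    """Encode an elapsed time t in [0, tau] into the amplitudes of a
    single qubit |psi> = alpha|0> + beta|1>.  The angle theta varies
    continuously with t, so no finite discretisation of t is imposed.
    (Illustrative only; the paper's actual encoding differs.)"""
    theta = math.pi * (t / tau)      # deform the weights continuously with time
    alpha = math.cos(theta / 2.0)    # weight on |0>: sand still in the top bulb
    beta = math.sin(theta / 2.0)     # weight on |1>: sand already fallen
    return alpha, beta

# At t = 0 all weight sits on |0>; at t = tau it has flowed entirely to |1>,
# and every intermediate t gives a distinct, properly normalised state.
```

The point is only that α and β are continuous degrees of freedom of a finite system; how much timing information a measurement can actually extract from them is, of course, a separate question.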

In our latest article, published in npj Quantum Information (4, Article number: 18 (2018)), we show that this ingredient gives us exactly what we need. Instead of using a classical hourglass where each grain of sand has either fallen or is yet to fall, we employ a quantum time-keeper where the grains of sand are in a weighted superposition of both possibilities. By deforming the weights continuously with the passage of time, we are able to prove that the delayed Poisson process can be modelled with perfect precision using finite memory.

Pragmatically, this result could immediately lead to memory savings in continuous time simulations. Numerical evidence indicates that our results apply to much more general cases, where the waiting time distribution between successive emissions is arbitrary. Such general processes, known as renewal processes, are relevant in many diverse fields of study—from modelling the firing of neurons to arrival times of buses. Thus a means to simulate such systems with less memory could have direct practical use. Similar advantages can be found when considering other continuous variables, such as position, as was shown in a companion article recently published in New Journal of Physics (2017).

The foundational consequences, however, are perhaps more exciting. Let us again entertain the scenario where we live in a simulated reality. Would the architects of this reality prefer the use of classical or quantum information? Our work shows that if they are intent on constructing a universe where time flows smoothly, then quantum mechanics may be the only feasible method. Time, should it be continuous, could well necessitate an underlying quantum reality.

Mile Gu is a physicist at Nanyang Technological University and the Centre for Quantum Technologies, Singapore. Thomas J. Elliott is a physicist at Nanyang Technological University, Singapore. Their research was supported in part by FQXi.


John Brodix Merryman wrote on Mar. 2, 2018 @ 03:40 GMT
The problem is that we do experience reality as those discrete flashes of perception, aka thoughts, and so naturally think of it as this "flow" from prior to succeeding events.

The logical explanation for this effect, though, is that there is only this dynamic physical state, and its changing configuration creates the effect of time, such that it is the future becoming past: probable, to actual, to residual. Tomorrow becomes yesterday, because the earth turns.

So the constant is this present state, much of it changing at the speed of light and any way we have to measure this change is of a distinct and particular process. Duration is the present, as events coalesce and dissolve.

Which is why different clocks can run at different rates and remain in the same present, as with metabolism, since they are separate actions.

This is why time is asymmetric: action is inertial. The earth turns one direction, not both. Time and its direction arise with the first law of thermodynamics, the conservation of energy, in that energy is always and only present. As Einstein observed, at the speed of light, time stops. No change is possible beyond that. It is only in the slower and quantized units that change exists.

Given the narrative effect is foundational to memory, history and civilization, it is no wonder we would consider it foundational to reality, but then we still see the sun moving across the sky, from east to west and run our lives accordingly.

Steve Agnew replied on Mar. 10, 2018 @ 18:17 GMT
You like the time metaphor of earth spinning and say that earth's spin means that tomorrow becomes yesterday. And of course, yesterday did become the present as well...and the present moment will become tomorrow as well. What is not so clear is why any of these statements help understand time since they are all already implicitly imbued with time.

It is true that entropy points a very nice arrow for macroscopic time except for those Maxwell demons that seem to mess up the arrow of microscopic time. It is quantum phase decay that points the arrow of microscopic time and as part of quantum phase decay, the earth not only spins, but earth's spin slows down over time and that spin decay also sets the arrow of time.

Mainstream science attributes all of the slowing of earth's spin to gravitational tidal friction. However, some of the slowing of earth's spin is also due to quantum phase decay, and the question is how much. Quantum phase decay seems to be pervasive.

This article here simply truncates quantum phase decay by assuming a characteristic time, tau. Although this can be a very good approximation for discrete time intervals, strictly speaking the authors could have simply kept a truncated exponential decay and gotten to the same place much more easily...


John Brodix Merryman replied on Mar. 11, 2018 @ 16:49 GMT

That is a bit like saying atoms bouncing around are all already implicitly imbued with temperature.

It is a simple question: does action create time, or is time foundational to action? That is why the question of whether it is the point of the present "flowing" past to future, or change turning future to past, is relevant.

The first necessarily assumes the underlying dimension of events (past, present, future) must exist external to this situation called "now," and now is little more than a light flashing across them. The second sees the events as an ephemeral effect of an underlying dynamic of creation and dissolution, and the term "now" is simply our experience of the process as it is occurring.

So just as temperature is a way to quantify the effect of mass action, time is a way to quantify the frequencies of that action.

Both on the micro and macro levels. For instance, employment statistics amount to a form of temperature reading of human activity. So if I use the earth turning as an example of dynamics creating a particular example of time/frequency, it is no more or less fundamental than quantum activity, any more than Kelvin is a more or less fundamental measure of temperature than Fahrenheit or Celsius.

If you go back to the very first essay contest, The Nature of Time, and the winner, Julian Barbour's entry, it makes a similar argument:

Basically the principle of least action as the most reliable clock. This would apply to any scale.


Steve Agnew replied on Mar. 15, 2018 @ 03:30 GMT
Of course, atoms bouncing around are what temperature is and so what you say is self evident.

Does time emerge from the action of matter? Yes.

Or does action emerge from the motion of particles in space and time? No.

You do have a wonderful intuition about the nature of reality and that I respect. The causal set granulated universe does seem to be the answer to unification of charge and gravity and so I really like it right now...


Robert H McEachern wrote on Mar. 7, 2018 @ 22:50 GMT
Sounds rather like the authors have reinvented the concept of a fractional delay-line, for producing a continuous range of delays within a sampled signal. See IEEE Signal Processing Magazine, "Splitting the Unit Delay", January 1996, pages 30-60.

Rob McEachern

Eckard Blumschein replied on Mar. 8, 2018 @ 06:14 GMT
I quote from an IOP article:

"As such, the quantum simulator cannot tolerate overlap between the states $|s_j\rangle$ and $|s_{j+1}\rangle$, and must store them orthogonally (allowing them to be distinguished). In this scenario, the quantum simulator cannot demonstrate any advantage in memory cost over its classical analogue."


Was I wrong in how I interpreted your decision not to participate in the last FQXi contest? Did you become aware of opinions by Traill, Kadin, Klingman, Jackson, Watson, Bollinger, Szangelois, and De Santos, which I consider closely related to your classical result?

May I ask you for a comment on Kadin's prediction?


You know that, strictly speaking, I don't share your (and Einstein's) idea of a present state between past and future. I also persistently object to "Tomorrow becomes yesterday, because the earth turns". Isn't time more fundamental than the rotation of a particular body? The words yesterday and tomorrow don't belong to the usual scale of time agreed upon in physics, but rather to a different point of view, one anchored at the border between elapsed and future time, which moves relative to that scale.

Eckard Blumschein


Robert H McEachern replied on Mar. 9, 2018 @ 17:46 GMT

I did not participate in the essay contest, because I see no advantage to posting a paper on FQXi, over posting it on viXra.

I have previously glanced at the essays you noted, but while I see that they all address similar issues, I think there is a fundamental difference in our outlook.

I agree with Max Planck's statements regarding the historical causes for the lack...



John Brodix Merryman replied on Mar. 10, 2018 @ 00:38 GMT

Einstein seems to be the popularizer of the eternalist view: that all events exist on the time dimension and "now" is as subjective as "here."

My argument is there is only this state we refer to as "present" and its changing configuration creates the effect of time. This is commonly referred to as the presentist view.

My particular observation is that our perception of the flow from past to future can only exist in the present state and so it is the events which form and dissolve, thus going future to past.

Potential, to actual, to residual.

It's a bit like trying to figure out how the cosmos swings from east to west, before taking into account the earth turning west to east. Our perception is subjective.


Steve Agnew wrote on Mar. 10, 2018 @ 17:43 GMT
Thanks for the nice review and the technical paper was useful as well. However, the concept of interpolating a continuous time from an interval of discrete actions seems very straightforward, even for the q-bits of quantum information.

The basic proposition seems to be that defining a characteristic decay-time window, as a block function or any other finite time, then allows knowledge of a precise time. Of course, the uncertainty principle only allows a precise time given infinite energy, or memory in your terms. You have bounded the infinite energy with the assumption of a well-defined decay time.

What is not really so clear is why you need all this complex math for such a simple notion...

The discrete matter and action are what make up our granulated universe and so time is fundamentally granulated as well. Time and space both emerge from the discrete actions of the discrete matter of an ever-changing universe of coherent phase decay. The earth spins and sets our time, but the earth spin decays as well and earth's spin decay is part of the phase decay of the universe that sets the direction of time.

John Brodix Merryman replied on Mar. 11, 2018 @ 17:48 GMT

"The discrete matter and action are what make up our granulated universe"

Isn't that a bit of a tautology? How can we be sure this discreteness and granulation are not prerequisites of deriving information, as opposed to processes creating such formations?

If there was no discreteness, then the logical effect would be one of fuzziness and indeterminateness. How do we know there are not two sides of the coin?

For example, consider waves: the information we derive from them is frequency and amplitude, but that only describes the effect of the underlying energies. Is it possible that, when we are exploring subatomic worlds, the information we perceive is equivalent to frequencies and amplitudes, i.e. descriptive of a dynamic process that would otherwise appear fuzzy, like a picture taken with a slow shutter speed is blurry?

Consider that if you leave the shutter open longer, you are letting in more light and more information, but the effect is to blur the picture, because it is naturally in motion. So to make it clear, we try to get a faster shutter speed and less light. Just as with physics, we try to get clearer pictures of nature by looking at smaller and smaller units, yet can't quite explain away the normal fuzziness.

As I've argued elsewhere, energy and information effectively go in opposite directions of time. Energy, being conserved, is only present, but its changing configuration makes it go from one event to the next, thus past to future. While information, being created and dissolved, goes future to past. Much as the crests of those waves come and go, but the energy is only transferred to other forms.

Think of a factory: The discrete product goes from start to finish, future to past, while the production line points the other direction, consuming material and expelling product.


Steve Agnew replied on Mar. 15, 2018 @ 03:40 GMT
I have recently found the causal set granulated universe, which has a large body of peer-reviewed and public information. However, there are perhaps 10-20 researchers in this field, and they all insist that causal sets are not yet ready for prime time.

My review of causal sets suggests that they deserve much wider scrutiny, and that many of the causal set axioms seem to make sense. Instead of the spacetime universe of absolute time from A to B, the causal set universe orders action from parents to children. We are caused by our parents, and from that family relation space and time emerge...


Jonathan J. Dickau replied on Mar. 16, 2018 @ 14:48 GMT
Yes they are quite interesting...

Causal Sets are in the category of causal structure theories in quantum gravity, along with LQG (loops), CDT (triangulations), and a handful of others. Ben Dribus was an avid participant here at FQXi a few years back, and he has since written a book on Causal Set theory. I got to hear Lee Smolin give a talk at GR21 about 'Energetic Causal Sets' which was very interesting indeed. Check out:


Enjoy, Jonathan


Steve Agnew wrote on Apr. 7, 2018 @ 23:48 GMT
To be useful, all a quantum computer needs is a long-enough quantum phase superposition across an acausal set of qubits. There seem to be many examples, with the latest being this. How useful quantum computing might actually be is a different question, but quantum superposition has been demonstrated time after time.

The causal set of discrete matter and action represents quantum norms and so superposition does not play much of a role in the reality of a causal universe. It is the acausal set of quantum superposition that upsets many people with its uncertain and therefore indeterminate futures. Once quantum phase decays, though, we are back to the causal norm of determinate gravity action.

Time is simply the parents and children of action of a discrete causal set and both space and time emerge from these family histories. It is the inherent decoherence of quantum phase that sets the arrow of time, not really entropy or temperature. Note that thermodynamics has an autocorrelation function that is essentially the same as quantum decoherence.

Eggs never unbreak in the causal universe, but quantum neutrons decay by emitting neutrinos, and neutrinos also create neutrons through proton absorption. However, quantum phase decay favors neutron decay over creation and so points the arrow of quantum time. This means that while quantum time is reversible, quantum phase decay sets time's arrow.


Steve Agnew wrote on Apr. 8, 2018 @ 00:03 GMT
All quantum matter oscillates with amplitude and phase and the Fourier transform of that oscillation is a matter spectrum that also has both amplitude and phase. The power spectrum of quantum matter is the norm of that causal set and is what we think of as determinate relativistic gravity reality.

However, matter's phase spectrum does not have a classical meaning since relativistic gravity matter ignores quantum phase. The Fourier transform of quantum charge, then, gives both amplitude and phase of charge amplitude and it is the phase that leads to superposition states.

Quantum superposition means that two complementary neutrons can coexist in the same space and time and it also means that a single neutron can coexist in complementary states in two different places and times as well. Quantum superposition then seems to violate spacetime's causal order since a single neutron only exists in one place. But really spacetime's causal order derives from the causal set of quantum norms and not the acausal set that includes quantum phases.

Eckard Blumschein replied on Apr. 9, 2018 @ 13:07 GMT
Steve A,

While I don't have any problem with relative phase, Fourier transformation requires one to arbitrarily choose an absolute phase reference. Cosine transformation avoids this arbitrariness.
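The signal-processing point at issue in this exchange can be illustrated with a small pure-Python sketch (our own example): shifting the time origin, i.e. the choice of phase reference, changes the phases of a signal's discrete Fourier coefficients while leaving their magnitudes untouched:

```python
import cmath
import math

def dft(x):
    """Plain discrete Fourier transform (no normalisation)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

N = 16
# The same cosine sampled from two different time origins (phase references):
x0 = [math.cos(2 * math.pi * n / N) for n in range(N)]        # origin at n = 0
x1 = [math.cos(2 * math.pi * (n + 3) / N) for n in range(N)]  # origin shifted by 3

X0, X1 = dft(x0), dft(x1)
# Magnitudes are origin-independent...
mag_equal = all(abs(abs(a) - abs(b)) < 1e-9 for a, b in zip(X0, X1))
# ...but the phase of the k = 1 bin moves by 2*pi*3/N with the chosen origin.
phase_shift = cmath.phase(X1[1]) - cmath.phase(X0[1])
```

A cosine transform, by contrast, yields purely real coefficients, leaving no phase to depend on an arbitrary reference, which appears to be the arbitrariness Blumschein has in mind.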



Steve Agnew replied on Apr. 10, 2018 @ 04:16 GMT
...except when you realize that you, the observer, have phase just as the source has phase. Therefore, it is impossible to know the absolute phase of a source, since you cannot know your own phase.

This is the quantum conundrum...


Eckard Blumschein replied on Apr. 10, 2018 @ 15:23 GMT
Steve A,

Quantum conundrum? My dictionary tells me: "A conundrum is a difficult or confusing problem". Where is the problem with the reference point that must be chosen in order to apply Fourier transformation? How does the putative problem relate to quantum theory? I don't feel confused.

The lack of a natural reference time or phase is due to a very fundamental concept: in order to take advantage of simultaneity and a ubiquitous time scale, all Christians, and all physicists who use the common notion of time, agreed on the same event of reference, the birth of Christ at GMT=0. In this sense, everybody may know "his" absolute time or phase. Of course, the choice of Christ's birth as reference point is not a natural one.

Alternatively, everybody may refer time or phase not to a chosen event but to the actual natural border between what has already happened and what is expected to happen. The corresponding scale of elapsed time or phase shifts steadily relative to the usual one. Where is the problem?

Decades ago, I became aware of the undeniable fact that our two cochleas cannot know an arbitrarily chosen reference of time and phase, while interaural time/phase differences contribute well to localization by ear. Even if most experts refuse to admit the corresponding weakness in the theory of signal processing, the usual event-related scales of time and phase might be inappropriate.

I guess that the basics of QM are also affected. This became obvious in the 1920s with the interpretation in the complex plane. For instance, when Dirac denied negative frequency, he failed to clarify that negative frequencies are a necessary consequence of applying the FT when future data are not available in advance. It should not be a problem to admit mistakes.


