Yesterday afternoon at the quantum foundations meeting in Erice (supported by COST), we celebrated the 80th birthday (somewhat in advance) of GianCarlo Ghirardi, who famously worked on collapse models in an attempt to deal with the quantum measurement problem. He’s the “G” in GRW collapse theory. (Ghirardi is pictured here — a bit fuzzily — being presented with a gift by Catalina Oana Curceanu and Detlef Duerr.)
Physicists Angelo Bassi and GianCarlo Ghirardi discuss collapse models. From the COST quantum foundations meeting in Erice, Italy.
I’ve just posted a special podcast with interviews with physicist and meeting organiser Angelo Bassi and with Ghirardi himself. Bassi talks a bit about the motivation behind collapse models and what they are: in essence, they try to explain why the classical world we see around us involves people and things in definite places, while on small scales particles exist in a fuzzy, uncertain realm.
The idea is that the wavefunction of particles can undergo spontaneous collapse, but in the case of individual particles, the odds of this happening are slim, so on the microscopic level you should see the same sort of things that standard quantum mechanics predicts. But when you bring lots of particles together in a macroscopic object, the probability of collapse shoots up — and hence they behave classically.
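The amplification at work here can be put into rough numbers. In GRW-style models each nucleon has a tiny spontaneous collapse rate (of order 10^-16 per second in the original proposal), but a collapse of any one constituent localises the whole entangled object, so the effective rate scales with particle number. A back-of-the-envelope sketch (the rate and particle counts below are standard order-of-magnitude values, not figures taken from the talks):

```python
# Rough GRW scaling: the collapse rate per nucleon is tiny, but the
# effective rate for an N-particle object is N times larger.
# LAMBDA is the order-of-magnitude value from the original GRW
# proposal; the particle counts used below are illustrative.

LAMBDA = 1e-16            # spontaneous collapse rate per nucleon (1/s)
SECONDS_PER_YEAR = 3.15e7

def collapse_time_seconds(n_particles):
    """Mean waiting time before any one constituent collapses,
    localising the whole object."""
    return 1.0 / (LAMBDA * n_particles)

# One isolated particle: effectively never collapses on lab timescales,
# so standard quantum predictions survive at the microscopic level.
print(f"{collapse_time_seconds(1) / SECONDS_PER_YEAR:.1e} years")

# A dust grain of ~1e18 nucleons: localised in about a hundredth of a
# second, which is why macroscopic objects look classical.
print(f"{collapse_time_seconds(1e18):.1e} seconds")
```

The single particle waits hundreds of millions of years on average, while the grain collapses almost instantly, which is the whole trick of the model.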
But how do you test this idea? There’s nothing in principle in standard quantum mechanics that prevents ever larger objects (even cats) being in quantum superposition, if you can prepare your experiment carefully enough (which is tough to do). By contrast, GRW predicts that above a certain mass limit, collapse is inevitable, no matter how pristine your experiment. So that gives you something different to search for.
In a previous post, I mentioned matter-wave interferometry experiments. Yesterday, FQXi’s Hendrik Ulbricht, of the University of Southampton, talked about efforts to see quantum effects in ever larger objects — cold atoms, molecules, metal clusters or nanoparticles, and even cantilevers — but at the moment these experiments are not well-developed enough to be able to test collapse models. Another problem is that if you carry out such a test and you do see a loss of quantum effects, it might have been caused by problems with the experiment, such as decoherence due to interactions with the environment, rather than revealing something fundamental.
The blurry image shows a dog who apparently loves physics — he gatecrashed the meeting for two days running (in search of Schrödinger’s cat?). The first time, he ran onto the stage with the lecturer, who was speaking about quantum biology. The second time, he stopped in front of the stage and barked loudly at the speaker, who was talking about string theory. Make of that what you will!
Much of the first day was taken up with discussions of matter-wave interferometry. Quantum mechanics tells us that the boundary between waves and particles is murky at small scales. But just how far does this ambiguity stretch? Markus Arndt and Jonas Rodewald, of the University of Vienna, and James Bateman, of the University of Southampton, opened the meeting by talking about ways to test whether ever-larger particles display quantum properties such as superposition — the ability to be in two places at the same time. (Bateman works with FQXi member Hendrik Ulbricht and you can read more about Ulbricht’s Southampton tests of the quantum limits in an article written by reporter Sophie Hebden.)
Bateman’s talk particularly caught my attention when he mentioned plans to put such quantum experiments in space to try to detect dark matter. Although we have good evidence for the existence of dark matter — which is invisible but exerts a gravitational pull on other matter — from astronomical observations, physicists still do not know what it is and have devised numerous clever experiments to try to detect it directly and help identify it.
The inspiration to link dark matter to quantum optics came from an intriguing suggestion in 2013 by C. Jess Riedel that dark matter could be causing decoherence in matter-wave experiments (knocking fragile quantum objects and causing them to lose their nifty quantum properties). As Riedel said in the abstract to a paper in Phys. Rev. D: “The apparent dark matter wind we experience as the Sun travels through the Milky Way ensures interferometers and related devices are directional detectors, and so are able to provide unmistakable evidence that decoherence has Galactic origins.”
It was a provocative notion and Bateman and co decided to investigate further. “This was an interesting idea, but essentially none of the details had been worked out, so we started talking to theoretical particle physicists,” Bateman told me. Those theorists were initially dismissive, but together with the experimental team, they came up with a candidate dark matter particle that would have evaded dedicated dark-matter searches and would not have shown up in high-energy collider experiments, but might show itself in a matter-wave experiment.
It turned out that this candidate could not actually be the decoherence mechanism in any existing matter-wave experiments, says Bateman. The hypothesised particle has a mass of about 0.02 per cent of the electron’s. “It is very low mass, which means its de Broglie wavelength” — the wavelength associated with quantum particles — “is large compared with the nuclei in normal matter,” Bateman explains. This means that it interacts strongly and coherently with normal matter — so much so that “it couldn’t penetrate Earth’s atmosphere, let alone the glass and metal vacuum chambers of, for example, Markus Arndt’s interferometers,” says Bateman.
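To see why such a light particle has a long de Broglie wavelength, you can plug the numbers into the de Broglie relation, lambda = h/(mv). A quick sketch, in which the 230 km/s wind speed is a typical galactic-halo value I am assuming for illustration (it is not quoted by Bateman):

```python
# Illustrative de Broglie wavelength for the candidate particle.
# The 230 km/s dark-matter wind speed is a typical galactic-halo
# value assumed here for illustration, not a figure from the post.

H = 6.626e-34           # Planck constant, J*s
M_ELECTRON = 9.109e-31  # electron mass, kg

m_candidate = 2e-4 * M_ELECTRON   # ~0.02 per cent of the electron mass
v_wind = 2.3e5                    # m/s, assumed wind speed

# de Broglie relation: lambda = h / (m * v)
wavelength = H / (m_candidate * v_wind)

print(f"de Broglie wavelength: {wavelength:.1e} m")
# Tens of microns: enormous compared with a nucleus (~1e-15 m),
# so the particle scatters coherently off many nuclei at once.
print(f"ratio to nuclear size: {wavelength / 1e-15:.0e}")
```

The wavelength comes out at tens of microns, roughly ten orders of magnitude larger than a nucleus, which is why the interaction is so strongly coherent.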
It’s not too much of a disappointment, though. A dark matter decoherence mechanism, despite sounding cool, isn’t that different from other decoherence mechanisms, in which the environment interacts with a quantum system, destroying its fragile properties. So, notes Bateman, even if it had worked in theory, it wouldn’t have magically solved the quantum measurement problem in a new way.
But this still left open the exciting possibility that a quantum optics experiment carried out beyond the atmosphere could be sensitive enough to pick up signs of this particle. The team outlined such an experiment in a paper published in Scientific Reports, involving a suspended nanoparticle. The way in which the nanoparticle’s position changes will tell them something about dark matter.
The idea is to place the experiment in a spacecraft located at Lagrange point 2 (a quasi-stable point in our solar system beyond the Earth). The nanoparticle’s position can be precisely tracked by firing laser light at the particle and collecting it with a lens. Light that has been scattered by the particle will interfere with the rest of the laser light, changing the intensity of light picked up by detectors, in a way that can be precisely monitored. (Image from Scientific Reports 5, article number: 8058, courtesy of James Bateman.)
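The readout scheme described here is interferometric: the scattered field carries a displacement-dependent phase and beats against the unscattered reference light at the detector. A minimal sketch of that intensity modulation, with the wavelength, the field amplitudes and the back-scattering geometry factor all chosen purely for illustration (they are not the actual design values):

```python
# Minimal sketch of an interferometric position readout: light
# scattered by the particle picks up a phase proportional to the
# particle's displacement x, and interferes with the reference beam.
# All numbers here are illustrative assumptions.
import math

WAVELENGTH = 1.064e-6   # m, a common trapping-laser wavelength (assumed)
E_REF = 1.0             # reference field amplitude (arbitrary units)
E_SCAT = 0.05           # scattered field amplitude (arbitrary units)

def detected_intensity(x):
    """Interference intensity for displacement x along the optical
    axis; back-scattering doubles the path length, hence 2*x."""
    phase = 2 * math.pi * (2 * x) / WAVELENGTH
    return E_REF**2 + E_SCAT**2 + 2 * E_REF * E_SCAT * math.cos(phase)

# Moving the particle by a quarter wavelength swings the fringe
# from fully constructive to fully destructive interference.
print(detected_intensity(0.0))             # bright fringe
print(detected_intensity(WAVELENGTH / 4))  # dark fringe
```

Tracking that fringe over time is what turns tiny changes in the nanoparticle’s position into a measurable signal.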
The team plans to send the experiment into space as part of the MAQRO (macroscopic quantum resonators) consortium. You can read more about space-based quantum tests in a Q&A that reporter Colin Stuart carried out with David Rideout and in an article that I wrote for Nature about the quantum space race between China and Europe.
Our latest podcast has been posted and we’re catching up with a couple of old friends of FQXi to talk about their recent work, and making a couple of new ones.
First up we have quantum physicist Martin Ringbauer, of the University of Queensland, discussing tests investigating whether the quantum wavefunction is real. You may remember that Martin took part in one of our most popular podcast pieces last year, when he chatted to us about simulating time travel in the lab with photons. Now, he and his colleague Alessandro Fedrizzi, also from Queensland, and others have carried out tests to try to uncover whether the mathematical term used to calculate the evolution of quantum systems, the wavefunction, simply represents our lack of knowledge of the true state of reality (an idea favoured by Einstein) or whether it directly corresponds to reality itself. In other words, is Schrödinger's cat really alive and dead at the same time, or is it actually in some set state, and we just don't have the tools to measure it?
The question was discussed in depth on the site a few years ago, when FQXi's Jonathan Barrett and his colleagues Matthew Pusey and Terry Rudolph came up with a no-go theorem that appeared to favour the interpretation that the wavefunction is real. Physicist Oscar Dahlsten wrote a nice summary of that whole debate, and of PBR’s claims, for us at the time. Now Ringbauer, Fedrizzi and their colleagues are tackling the same question from a different angle, performing experiments with polarised photons to try to close in on an answer. In some cases, different polarisations are indistinguishable by single measurements in the lab, and the team have calculated whether the measured level of ambiguity can be explained by the more intuitive classical-style models favoured by Einstein (psi-epistemic models) or not. Their results point to the “not” side, indicating that the wavefunction is real. On the podcast they talk about what exactly their tests have shown so far, which quantum interpretations they have ruled out, and which remain.
Next, we visit another long-running debate, this time based on black hole firewalls. Last year, at the FQXi conference in Vieques, we held a panel discussion, featuring Anthony Aguirre, Raphael Bousso, Andrew Hamilton, Seth Lloyd and David Lowe about what actually happens in and near a black hole:
That debate was inspired by the so-called AMPS paper (by Almheiri, Marolf, Polchinski and Sully), which predicted that if certain quantum laws hold, and information is not lost from the universe when black holes evaporate, then the black hole event horizon must be replaced by a wall of fire. If true, this would contradict the equivalence principle at the heart of general relativity, which requires that an observer crossing the event horizon experiences “no drama”.
Physicists have been analysing the theory to try to come up with a resolution and to answer whether firewalls really exist or not. Now cosmologists Niayesh Afshordi and Yasaman Yazdi have found a new way to think about the issue. They have been looking for possible observational signatures of firewalls, and have come up with the idea that neutrinos may be generated as matter falls through a firewall. It may even be possible that highly energetic neutrinos picked up by the IceCube detector at the South Pole are due to firewalls. (Although, of course, they might also be explained by other sources, as Afshordi explains in the podcast. However, as yet, the most conventional explanations don't fit that well with the data.)
Afshordi is fond of off-the-wall ideas, as you may remember from another extremely popular interview we ran a couple of years ago, about whether our universe was created by the collapse of a 5D star (see “Bye Bye Big Bang?”). On the podcast, Afshordi talks us through how the firewall paradox arose and his ideas for observing firewalls. FQXi member Sabine Hossenfelder has also blogged in some detail about the paper over on Backreaction.
In our last item, reporter Carinne Piekema talks to quantum physicist Jacob Biamonte about the new discipline of quantum network theory, and how it could explain the emergence of classical reality and the origin of time’s arrow. You can read about his work in an article by Carinne too.
Stuck for a last minute present for your loved one for Valentine’s Day?
Not to worry, FQXi has teamed up with Springer to bring you the perfect gift: a compilation of reworked essays inspired by the “Which of Our Basic Physical Assumptions is Wrong?” contest (“50 Shades of Reality” as it were.)
Cutting and pasting the blurb from the back of the book:
As Nobel Laureate physicist Philip W. Anderson realized, the key to understanding nature’s reality is not anything “magical”, but the right attitude, “the focus on asking the right questions, the willingness to try (and to discard) unconventional answers, the sensitive ear for phoniness, self-deception, bombast, and conventional but unproven assumptions.”
Of course, you can still read the original entries on the site, but in the new volume, the winners were invited to expand upon their entries, making them more technical, if needed. So those of you with a mathematical bent may find these even more rigorous and enjoyable.
They have also been revised to take into account feedback from the discussions on the site, so thank you to all of you who ranked and discussed the essays, in this contest and others.
The compilation includes contributions from Robert W. Spekkens, George Ellis, Benjamin F. Dribus, Israel Perez, Sean Gryb & Flavio Mercati, Daryl Janzen, Olaf Dreyer, Steven Weinstein, Angelo Bassi, Tejinder Singh & Hendrik Ulbricht, Giacomo Mauro D’Ariano, Ken Wharton, Giovanni Amelino-Camelia, Torsten Asselmeyer-Maluga, Sabine Hossenfelder, Michele Arzano, Julian Barbour, Ian T. Durham, and Sara Imari Walker.
This is the first in a series of books inspired by our essay contests. I will keep you posted when more appear.
Here's Eugene Wigner, from "Unreasonable Effectiveness":
"The first point is that the enormous usefulness of mathematics in the natural sciences is something bordering on the mysterious and that there is no rational explanation for it."
I want to make a casual suggestion here, but one about which I am serious. What follows, in other words, is not meant to sound flippant.
Might the famous unreasonable effectiveness of mathematics—its spectacular success in quantifying, model-building and predicting future states of natural systems—be simply a matter of coincidence?
We tend to be amazed that math works again and again. Wigner compares the situation to a man with a bunch of keys finding that the first one or two he chooses always open the door. This would indeed be surprising, but the analogy can be read another way: how many ways are there to get into the house? The keys don't open windows; they don't open walls; they don't open the ceiling, or the yard, driveways or bushes or clouds. Keys fit those things that, well, fit keys. To note that one of your keys can do any task at all would be one thing. To keep being amazed that keys unlock doors is quite different.
Let's say a single key turns out to open a huge number of doors in Eugene's house—perhaps even an infinite number. That is surely an awesome thing, especially to a human mind: What a powerful key! Look at how many tasks it can handle—door upon door upon door!
But it still tells us nothing about what we aren't able to do with it (paint the house, grow the garden). Opening yet another door, excellent as that achievement may be—from the first discoveries in fluid mechanics to the latest in quantum chemistry—is, from the point of view of math's mysterious utility, essentially the same feat as it was the last time around. Look at that! Quantifiable things are still able to be quantified. Who knew?
But even if endless discoveries are made using the abstraction of mathematical tools, we are not justified in assuming that we are, in this manner, making all possible discoveries. Something can be infinite (say, the set of all mathematical expressions that correspond, in some suitably defined way, to nature) without being all-encompassing (say, if that set we just described turns out to be a subset of another, also infinite set, "the set of all truths about nature").
Let's take a different approach. Part of logical positivist epistemology—the "logical" part—regarded mathematical statements as truths that can be known precisely because they are, ultimately, tautological. The whole intimidating edifice (Russell's phrase for Hegel) of modern mathematics is, in this view, simply a restating, or at best a following out, of axioms. Seen in this way, the complexities of any branch of mathematics could, by a sufficiently comprehensive mind, be immediately surmised from its axioms. If you understand that A is B and B is C, you already understand that A is C; you may not have stopped to draw out the steps, but when presented with the claim A = C, no new calculation was required per se. You already "got" that.
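The point can be made concrete in a proof assistant: once the premises are in hand, the "new" claim is discharged by naming a step, not by computing anything. A minimal sketch in Lean (the names A, B and C are just illustrative placeholders):

```lean
-- Given A = B and B = C, the claim A = C requires no new
-- calculation: `Eq.trans` simply chains the two hypotheses
-- that were already there.
example {α : Type} (A B C : α) (h₁ : A = B) (h₂ : B = C) : A = C :=
  h₁.trans h₂
```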
Positivism took it on the chin, and nobody much credits it any more. I am one of the few holdouts. But just grant for the moment their claim that math is an exercise in symbolic logic, and that any mathematical formalism is a tautology whose conclusions are, in a real sense, implicit in its axioms. Imagine, now, an "axiom bundle" as the sharp tip of an enormous glacier of implications. If the premise embedded in the tip is true of anything at all in the real world—any recurrent pattern, any stable quantity—then so is the glacier.
Wigner looks at the glacier and says: Wow! Look at all the things in nature that are mathematical! I'm saying: What you actually mean is that the axiom tip happened to correspond to at least one kind of natural thing.
Is that such a wonder?
To return to our first metaphor, if we poke Wigner's key all over the house, eventually we may find a lock, then be amazed that we have a key and nature has a lock. If all the locks in the house are versions of the first lock, of course, we shouldn't be surprised when the key keeps working; in a sense, we keep repeating the same procedure. But it doesn't follow that we are understanding all that much about the house.
I'm not suggesting—I hasten to clarify, here, at the end—that there are *supernatural* truths we may be missing with the key of naturalism. I'm suggesting that there may be *non-mathematically accessible natural truths.* And no, I don't know what such truths would look like, though I note that some other folks harbor similar suspicions. Stephen Wolfram, in A New Kind of Science, suggests that traditional formulae are inadequate to the task of understanding nature, and that something else is needed (in his view, the empirical study of cellular automata). Thomas Nagel, whom I took to task here for his Mind and Cosmos, thinks consciousness itself cannot be explained using the default of materialism, which would mean there is at least one aspect of nature that isn't reducible to mathematics (following the reductionist arrows from consciousness to neuroscience to biology to chemistry to physics to math, which is where, if you agree with Max, reality stops. By the way, Max, you owe me an email.).
I don't know how convinced I am by my own line of reasoning. (Here's a possible counterargument: Since all truths are logical—that is, even if Mother Nature throws dice, she does not act incoherently—then, if math really is symbolic logic, all truths are mathematical. QED.)
At least, though, what I've sketched out here is a logical possibility—and one that would explain that seemingly "unreasonable effectiveness" of Wigner's key as a tool for uncovering nature's secrets. It may just be a sampling error.