Dropping Schrödinger's Cat Into a Black Hole
Combining gravity with the process that transforms the fuzzy uncertainty of the quantum realm into the definite classical world we see around us could lead to a theory of quantum gravity.
The main theme of August's FQXi conference was the physics of the observer and so, in wrapping up our discussion of the conference, it remains to be asked whether any progress was made toward a better understanding of the concept. As with just about any conference or meeting of researchers from such diverse backgrounds, it is natural to expect that each of us came into the conference with our own opinions, ideas, and, yes, biases. It's difficult to say whether anyone actually changed their mind about anything, or was swayed by a reasonable but opposing argument, as a result of this conference.
Of course, science is not based on opinion. The final arbiter of science must always be carefully constructed experiment. Certainly there are some, including a few attendees of this conference, who have suggested that science, and most notably physics, has moved into a post-experiment era in which an elegant mathematical description is all that is required to prove a physical "truth." As someone who wrote his doctoral thesis on one failed attempt to reduce physics to a purely deductive exercise, I can say that this argument is nothing new. Physicists have entertained this idea at various times throughout the past four centuries. In some sense, Hilbert formally challenged physicists to do exactly this---axiomatize physics---in his famous Sixth Problem (though it should be noted that his actual statement of the problem can be interpreted as having a narrower focus). Experiment, however, persists as the final arbiter of physical "truth" precisely because it has the most direct connection to our senses, which remain the only way in which we directly interact with the world. In other words, we expect science to interpret the world of our experience.
That being said, modern physicists know well that direct experience can often be deceiving. Both relativity and quantum mechanics---the two foundational pillars of modern physics---raise direct issues concerning the role of the observer in our understanding of the world. Indeed, some of these issues have been at the forefront of physics for four centuries. After all, it was Galileo (and not Einstein as many incorrectly believe) who developed the principle of relativity which states that the laws of physics should be the same in all inertial reference frames. Galileo's thought experiment involved someone in a windowless cabin on a ship in calm waters---would that person be able to determine with absolute certainty if they were moving or not? The answer, of course, is no.
The key here is that this says something profound about the nature of the observer---all inertial observers must agree on the laws of physics, though not necessarily on the specific outcomes of individual experiments. In other words, the universe must be self-consistent, i.e. it must operate in the same manner for all inertial observers.
In the quantum realm, however, we learn that the observer can affect the outcome of experiments. Is this a violation of the principle of relativity, then? Not necessarily. The principle simply says that the same rules must apply to all inertial observers. Variation in the outcomes of individual experiments is allowed as long as the rules that led to those results are the same in each case.
And that, right there, is the real key to this entire discussion. Over the course of those same four centuries, science has very carefully built a structure and methodology for addressing these issues that has been a consistently reliable predictor of future outcomes. Arguably this methodology---which defines modern science---is the greatest achievement in the history of humanity. One of its hallmarks, particularly in physics, is its emphasis on rigor and clarity (perhaps due to the close ties between physics and mathematics). We need to know what it is that we're talking about, otherwise we end up just talking past one another, sometimes without realizing it. John Wheeler's argument that we need to move beyond defining things sounds almost Aristotelian in contrast. Indeed, one way in which this grand enterprise we call science has progressed has been by agreeing on definitions and then testing those definitions. In theory, this should lead to further refinement of those definitions. Unfortunately, science, including physics itself, has become fractured enough that universal consensus on some definitions---including "the observer"---is lacking.
A classic example of this, one that recalls several conversations and exchanges from the last few FQXi conferences, is the concept of entropy. Despite the fact that Boltzmann and Gibbs clarified the definition of entropy in the 19th century (a definition that Jaynes so elegantly further clarified in the mid-20th century), there are those who persist in defining entropy via a Clausius relation. While such a relation is certainly a valid manner in which to describe certain types of entropy, it is abundantly clear from even a cursory reading of Boltzmann, Gibbs, and Jaynes that such a relation does not work as a universal definition. It is simply a way in which entropy behaves under certain limiting conditions and tells us nothing about its actual nature.
At any rate, this brings us back to the question at hand. What is an observer? For that matter, what is an event? Does the latter require the former? These were both questions that ostensibly were to be addressed at the conference. In fact, an entire session was dedicated solely to the observer. During the Q&A session of the associated panel, Jeremy Butterfield pointed out that modern philosophy, via Frege, Russell, et al., had established set definitions for many of these "truths," relations, etc. that are generally free from the types of disciplinary bias you get in science. For example, those who prefer to define entropy via a Clausius relation are typically those who work in areas reliant on classical thermodynamics. This comment by Butterfield raises a few interesting points. Notably, it sets a definitive role for philosophy in the advancement of science. Scientists are often dismissive of the role of philosophy, but, just like science, philosophy is not monolithic. While it certainly contains its share of post-modernist rubbish, it also includes some very important and rigorous work including, as Butterfield pointed out, in defining certain important terms used by science. Having these independent, broadly developed ideas and definitions can help to relieve some of the tensions surrounding these definitions.
All of this is to say that we still have no consistent definition of an observer, per se. Even Butterfield did not offer one. The session and the subsequent panel did little to clear the fog on this issue, Butterfield's comments notwithstanding. That's not to say that the others didn't offer interesting and cogent opinions. David Wolpert, for instance, suggested that an observer, whatever one is, must be in a non-equilibrium state. While this is an intriguing idea, the other panelists showed no inclination to jump on that bandwagon. (It bears mentioning that, if observers automatically come equipped with a reference frame and since reference frames naturally break some symmetry when they are introduced to a problem, then Wolpert's idea might be the germ of something more generally valid and useful.)
At any rate, Jim Hartle offered up an amusing anecdote at one point that could be a kind of metaphor for the session. He mentioned that Murray Gell-Mann once compared something to "sticking a pin in the I Ching, but no one understood what the hell he meant." This may sound like a harsh indictment of the session, but it shouldn't be taken in that way. In fact, the session had the essence of one of those good, working sessions from legendary physics conferences of old like the Shelter Island Conference in 1947 (which Oppenheimer felt was the best conference he had ever attended). In that sense, the session lent a feeling to the proceedings that this---the FQXi conference---was, first and foremost, a working conference. And that is what makes these conferences so unique. Some of the brightest minds in physics, philosophy, neuroscience, mathematics, biology, computer science, and elsewhere communicate with one another, poking and prodding at the heart of difficult questions, nudging the scientific process along. This is where the real work gets done.
What exists? On the one hand, this seems like the kind of navel-gazing question that provokes derision and mockery from those more interested in practical matters. I exist, you exist, this blog post exists. It's self-evident, right? Of course one could simply presume that everything they experience is nothing but a dream or illusion and that only they, themselves, actually exist. But solipsism is a useless philosophy when dealing with the IRS, say, or anyone else for that matter. So it may seem to be a silly question to ask.
On the other hand, when one delves into it more than superficially, defining existence turns out to be about as complex as defining consciousness. Putting solipsistic arguments aside, there are some things that quite obviously exist. But then there are grey areas. In my recent blog post on consciousness, I mentioned that there was a good deal of overlap between the nature of consciousness and the nature of existence. Giulio Tononi, as I pointed out, believes that there are gradations to existence that are a result of the causal power of something. I gave the example of a painting that is completed by a painter, but then the painter and painting are immediately engulfed in flames such that no record is left of the painting leaving us wondering if it ever existed in the first place.
At a certain level, it is absurd to think it didn’t exist simply because no record of it was ever made. This is actually just a rehashing of Maxwell’s demon: there is a record of it somewhere in the real universe, because the act of painting it increased the entropy of the universe in some manner. A better question (and, truthfully, the real question I have about the painting) is, what became of the information associated with the aesthetic appreciation of that painting?
To put it another way, I can imagine many fanciful things that I know simply cannot exist because they violate the laws of physics: artificial gravitational fields in relatively small, non-rotating spacecraft, spacecraft that make sound in empty space, etc. While they may not be physically realizable, they nevertheless exist in my imagination which, as part of my mind, very clearly exists. (If you read the article on consciousness, you may recall that this was Tononi’s starting assumption about consciousness.)
In philosophical circles, this is known as ontological commitment and, as the name suggests, refers to a relation between a language and something that is proposed to be “extant” by that language, i.e. something that language says exists. It is generally understood that the “thing” that is proposed to exist does not necessarily have to be physical. One of the earliest and most influential formulations of ontological commitment was given by the philosopher W.V. Quine. What is interesting is that it centers around what can be stated in a formal language. In other words, it would seem to rule out the possibility of the existence of things that are “unspeakable,” i.e. not representable in a formal manner. This is, of course, closely (though not perfectly) aligned with Heidegger’s famous question, ‘What is a thing?’
In recent decades, physicists have even begun to consider the issues of existence and “thingness.” Chris Isham and Andreas Döring, for example, have ventured so far as to discuss the latter directly in their work on topos theory in physics, something most physicists might be tempted to avoid, at least explicitly. So it is that FQXi convened an entire panel at this year’s conference devoted to discussing the concept of existence.
While Tononi was not actually on the panel himself, he did, as I mentioned, address the issue when he discussed consciousness, equating levels of existence with degrees of causal power: maximal existence is possessed by things with maximal causal power. Though the concept of gradations of existence is missing, Rafael Bousso’s theory of existence could be viewed as philosophically kin to Tononi’s. Bousso makes the claim that the only thing that exists is a particular causal “patch” in spacetime (this is somewhat similar to the concept of a causal “diamond”). It is his view that everything that we can measure is in a particular causal patch and it is meaningless to consider anything else. It might be tempting to think that Bousso’s approach is just a restatement of the hard-line operationalist view that would deny the existence of the moon if no one is looking at it, but I think that would be a mischaracterization. What he is really saying is that it is meaningless to talk about the existence of things that we have no hope of ever measuring. For example, he emphasized that this rules out the existence of a typical multiverse since it isn’t contained within our causal diamond (no word on what his theory might say about Wiseman’s many-interacting worlds hypothesis). The causal patch does contain many possible histories which, I suppose, might make it compatible with some consistent history theories of quantum mechanics. But the causal patch, which appears to be Bousso’s only bound on measurability, is fairly limiting. For example, it conveniently does not rule out the non-measurable aspects of string theory (of which he is a proponent). The fact remains that not all limitations on measurability are necessarily due to the dynamics of space and time.
Some of the questions I have already raised are indicative of the types of problems that the concept of a causal patch does not address. For example, Jenann Ismael asked how we can meaningfully talk about evolving interactions between the mind and the world if the mind is in the world. The more general formulation of this question might be to ask how we can meaningfully talk about interactions between a sub-system and the larger system of which it is a part. But then, as Steve Giddings pointed out, how do we properly define a sub-system?
When polled on the concept of existence, the panel offered a wide array of views, from Carlo Rovelli’s musings about the “existence” of Hamlet, to Laura Mersini-Houghton’s assertion that existence requires an observer. Ismael explicitly mentioned Quine by name, though said her views are a somewhat modified version of his arguments about ontological commitment. Bousso took a slightly different tack when pressed on the topic and said that, ultimately, what matters are the fundamental, base axioms from which everything else can be derived. In my notes, I wrote “I’m surprised to find myself agreeing with Rafael” on this point. But in hindsight I’m not sure why I wrote that since every attempt to axiomatically derive physics has, to date, failed. I wrote my PhD thesis on one such failed attempt (Eddington’s). So I suppose I will fall back on Tononi’s position: I know I exist. Perhaps the rest of you can be derived, but perhaps not.
While it isn’t the sexiest topic for a blog post, this year’s FQXi conference did include a panel discussion on science funding that raised a number of salient points worth discussing. I will slightly abuse this space and pontificate a bit.
The panel included physicist Andrew Briggs (who, at one point, mistook me for Časlav Brukner) representing the Templeton World Charity Foundation, Ashley Zauderer from the John Templeton Foundation, Sarah Hreha from the Gruber Foundation, and Federico Faggin from the Federico and Elvia Faggin Foundation. Whatever one may think of these organizations—and they are not free from criticism—they have nevertheless funded a good deal of excellent pure scientific research that would not otherwise have been funded.
And there’s the rub: there is precious little funding for foundational research these days. As governments continue to cut back on non-military discretionary spending, researchers have no choice but to turn to private funding that, due to its very nature as private, will never be completely free from criticism concerning its motivation. The fact is, government funding for fundamental research has traditionally been blind to the philosophical motivations of research—good research stands on its own and should be relatively self-evident via its methodologies. Somewhat ironically, many of those who criticize government as pushing a particular agenda with its funding have actually driven it to do just that by increasing the emphasis on and importance of practical research, i.e. research that leads to applications, particularly those with short-term economic impact. In other words, meddlesome politicians have increasingly been attempting to ensure that government research funding does push an agenda—their own.
This, of course, is terrible for fundamental science. I won’t spend time explaining the importance of fundamental science here as I will assume that most readers understand this importance (nevertheless, see here and here for example). What I will say is that the increasing reliance on private funding is leading to more fractured research. What do I mean by that?
Critics of Templeton and similar organizations assume that these groups pressure researchers to fit their findings into the pre-existing ideologies espoused by the organizations. So, for example, they could assume that the Templeton organizations might try to prevent the publication of Templeton-funded research results that directly contradict what the critics might see as Templeton-supported philosophies such as religion and free enterprise. Within the larger Templeton umbrella, there seems to be little evidence that this takes place. Of course, the same can’t be said of all such organizations. Indeed, there are plenty that do push various agendas and that routinely attempt to silence dissenting opinions. But for every one of these types of foundations there is one that puts no pressure on its researchers.
The greater problem, in my mind, is the increasing specialization of funding areas. So while many (perhaps most) such private funding organizations may not try to directly influence the outcomes of the research they sponsor, they do influence what type of research gets funded in the first place by focusing on particular areas. This is what I mean by “more fractured research.” Under the traditional model, government funding would broadly cover the basic sciences and foundational researchers could often fairly easily justify their research within such a broad call.
This post-World War II model of government agencies as quasi-independent and free from political pressures is a kind of “live-and-let-live” ideal in which the freedom of the researcher to explore (which is the essence of science) is fostered by not limiting the ideas that seed the research in the first place. But with an increasing reliance on private funding, there comes an increasing focus on the interests of the funders themselves. I want to emphasize that there is nothing inherently wrong with that. Foundations are free to spend their money as they wish and, as I mentioned above, they have funded some very important work. The problem is that it creates gaps in funding.
If history has taught us anything it’s that we are terrible at predicting how discoveries in one area may lead to advancements in a seemingly unrelated area. Who, a century ago, would have—indeed could have—predicted that a theory as esoteric as relativity would become an indispensable foundation of modern life (via the ubiquity of GPS-driven technologies)? Luckily, such unlikely connections between fundamental research and later technological advances were at least understood to exist, even by private corporations, many of which used to have phenomenal support for foundational research. Westinghouse, GE, Bell Labs, and IBM come immediately to mind as examples. But an increasing emphasis on short-term profit margins has all but killed off fundamental corporate research.
Of course, there’s another problem that is increasingly exacerbated by this new funding paradigm. Private foundations such as the Templeton organizations, the Gruber Foundation, and the Faggin Foundation simply do not have the resources of governments or large corporations. As such, they are limited in their ability to support anything other than bold new ideas. Scientific results must be reproducible but, as Matt Leifer pointed out in the Q&A session after the initial panel discussion, there is very little funding for projects aimed at reproducing existing research results, particularly in physics. Independent verifiability is a cornerstone of science. As an example, consider Joe Weber’s claim in 1968 that he had detected gravitational waves in his lab at the University of Maryland. None of his results could ever be reproduced despite several attempts. While this led to an eventual consensus that Weber had not, in fact, observed gravitational waves, it also helped spur the development of LIGO, which recently did detect gravitational waves (and, at the press conference announcing the results, Kip Thorne was quick to credit Weber with really starting the field). What if no one had ever really questioned Weber’s results or been able to adequately test them? What if everyone had simply moved on? It’s difficult to say if the world would have been a different place, but it is certainly not clear that LIGO would ever have been built, and the broader applications of the technologies developed for it have yet to be fully realized. Certainly, verification of existing results is not sexy work. But it is important work that gets at the heart of what science is: the embodiment of skepticism.
Where does all of that leave us? Eric Weinstein (also during the Q&A session), in what may have been the most impassioned argument of the conference, entreated scientists to stop “asking for money” and instead demand some kind of royalty payment for creating the backbone upon which modern society itself is founded. Stop for a moment and think about the ubiquity of the worldwide web. Think about how it has so radically changed society and particularly commerce. Think about the trillions of dollars in wealth it produces. Now think about Tim Berners-Lee who created it as a collaboration tool for physicists working at CERN. Tim doesn’t get a penny of that wealth and neither does CERN. Certainly, they may receive some indirect benefits, but there is little appreciation (let alone knowledge) among the general populace of the origins of the web and even less desire to, perhaps, repay some of that.
Of course, Eric’s entreaty, though passionate, would be extremely hard (if not impossible) to implement on a practical level and I’m sure Eric knows this. The broader point that he was making is that the dialogue needs to change. Science needs to stand up for itself, not just on social media and in the press, but also in how it deals with those who control the vast majority of the money that lubricates the economy.
Through his foundation, Federico Faggin funds research in areas that he is passionate about. This is a good thing and we need more funders with Federico’s passion. But we also need to recognize the bigger picture regarding science’s place in the world. This includes not losing sight of two of science’s greatest traits: its inherent skepticism and its gestation in the free exchange of creative ideas. I don’t have easy solutions to any of the issues raised here and neither did any of the panelists. Sometimes it is necessary to simply frame the question. I don’t know if that was the intent behind the creation of this panel and I hesitate to say that they were successful in that regard. But they did get the dialogue started. It’s our job to keep it going.
One of the many highlights of the recent FQXi conference on the Physics of the Observer was the session on consciousness. Consciousness is quite possibly the most enigmatic aspect of human existence. It is at the core of who we are as individuals (and, some argue, as a species) and yet we don’t really know quite what it is, let alone whether or not it has a physical basis.
This fall I am teaching a course on the nature of time (based partly on the 2011 FQXi conference in Bergen and Copenhagen) and early on I introduce the notion of a system. For the purposes of the class, I begin very simply: a system is anything that has measurable properties or characteristics. Properties and characteristics are used to help distinguish two systems from one another or one system in a certain state from that same system in another state (where a state is any configuration of properties and characteristics). For the moment I’ll just say that they are anything that can be measured for the purposes of distinguishing states. I will also refrain from defining the concept of measurement since that could be an entire blog post unto itself.
At any rate, this raises the important question: is consciousness a system? The human body is a system; an automobile is a system; the galaxy is a system. Even language is a system which means systems can be abstract concepts as long as they have measurable properties and characteristics. Finding measurable properties and characteristics of consciousness is one of the goals of Giulio Tononi who, more than a decade ago, first introduced Integrated Information Theory (IIT) as a means of tackling these deep problems of consciousness.
I first met Tononi at the 2014 FQXi conference in Vieques which featured several terrific talks on IIT (and consciousness in general) from the likes of Tononi, Larissa Albantakis, Christof Koch, and Chris Adami. One of the scheduled conference excursions was a kayak outing on a bioluminescent bay. The road out to the kayak launch was bone-jarringly rough (and was not helped by our guide’s penchant for speed and the old van’s completely useless shocks and struts). By the time we arrived I was on the verge of losing what little food I had eaten that day. Tononi drew the short straw and ended up as my kayaking companion. Ever gracious, he did much of the paddling and allowed me some time to recoup on the relatively still waters of the bay. We had a wide-ranging conversation about consciousness and, in particular, some of the disorders he had encountered as a psychiatrist over the years. It was a singularly memorable experience.
David Chalmers (who was also at the conference this year) has argued that any attempt to define consciousness in purely physical terms will eventually run into the so-called hard problem which is the problem of explaining how and why we have phenomenon-based experiences, i.e. how sensations, for example, can acquire properties such as taste. IIT, by contrast, begins simply by assuming consciousness exists, i.e. it takes it as an axiomatic truth. Questioning its existence isn’t going to get us anywhere. In other words, I could assume that my entire life is nothing but a dream, but that won’t stop the IRS from trying to collect my taxes every year. So IIT takes consciousness as being self-evident.
In truth this is how most of science works. We have to start somewhere when developing theories to explain the world and so we create hypotheses, propose axioms, and develop propositions which are then tested and analyzed. We may find that some are correct and we may find that some are not correct. But, again, we have to start somewhere.
Tononi starts with experience. He thinks of consciousness as having an experience. That’s what defines it. As he noted in his talk at this year’s conference, you can’t “squeeze” consciousness out of the brain. In this sense, his approach would seem to circumvent Chalmers’ argument that purely physical definitions will ultimately lead to the hard problem by starting with an axiom rather than a physically measurable phenomenon (as a note, Chalmers appears to be personally agnostic concerning IIT).
Of course any discussion of consciousness is bound to have a strong overlap with any discussion about existence, i.e. “what exists.” Tononi seems to think there are gradations to the concept of existence. In other words, existence isn’t binary. Existence is based on causal power. Anything with maximal causal power definitively exists. Consider the following example which is one I routinely have used when thinking about the nature of information. A painter locks himself (or herself) in a room and paints an absolute masterpiece. Just as the painting is finished, a fire engulfs the building, burning it to the ground, taking the painter and the painting with it. Did the painting exist?
I originally came up with this scenario in order to ask about the nature of the information contained in the painting and, in particular, in the aesthetic appreciation of the painting. The painting itself adds information to the world and burning it simply rearranges that information. But the aesthetic appreciation of that painting also adds information to the world. What happens to that information when the painting is burned if it (the information) is not conveyed to anyone else?
From the standpoint of IIT, this is akin to asking if the painting ever existed in the first place. As Masafumi Oizumi pointed out in his talk, IIT basically says that it is necessary (though not sufficient) for a system to produce information in order to produce consciousness. All of the raw materials of the painting were already in the building when it burned. Creating the painting had no lasting effect; it had no causal power. By the standards of IIT it never existed. As Tononi pointed out, functionally equivalent systems are not necessarily phenomenally equivalent systems and vice-versa. If we were to consider two equivalent rooms in the burned building, both with the same art supplies and each containing an artist, but assume that only one of the artists painted a masterpiece before the building burned to the ground, the two rooms are phenomenally equivalent in the record they left in the world and yet, since the artists each took different actions prior to the fire, the two rooms are not functionally equivalent.
At any rate, Tononi says that if consciousness does exist then its existence must be based on its maximal causal power. This, of course, raises all sorts of interesting sociological and psychological questions about people who go unnoticed by society, but the reason Tononi makes this assertion is because it provides a means by which the theory can be measured—integrated information, the details of which are beyond this blog post. Suffice it to say that the panel discussion at the conference was lively and interesting. But as FQXi’s fearless leader Max Tegmark pointed out, while IIT may or may not be correct, it is at least testable. As such, Christof Koch of the Allen Brain Institute (who was at the 2014 conference but not at this year’s) has said it is “the only really promising fundamental theory of consciousness.” That seems reason enough to study it.
On Friday, cosmologist Sean Carroll spoke about his latest research into the emergence of space — and maybe gravity — from quantum entanglement.
Sophie Hebden has profiled Carroll’s work for us, in the article “In Search of a Quantum Spacetime.” Many physicists, when trying to think about what the world looks like on small scales, start with a classical framework — a picture of the world in which objects have definite properties — and then try to modify it to make it quantum. Carroll and his colleagues argue that nature is fundamentally quantum and work their way back to the world we see around us from that starting point.
Listen to the audio from his talk to find out more about this, and the idea of describing the evolution of systems in terms of “quantum circuits.”
Sean Carroll asks What Happens Inside the Wavefunction? From the 5th FQXi International Meeting.