Zenith Grant Awardee
Nao Tsuchiya
Monash University
Co-Investigators
Hayato Saigo, Nagahama Institute of Bio-Science and Technology, Japan; Steven Phillips, National Institute of Advanced Industrial Science and Technology (AIST), Japan
Project Title
Precision phenomenology: revealing the structure of qualia
Project Summary
Recent progress in consciousness research has been driven by technological advances in measuring neural activity in human brains. This neuroscientific understanding of conscious brains has recently led to several theories of consciousness. Testing these theories remains difficult, however, because we still lack a clear understanding of the properties of conscious sensation. Mathematical phenomenology is an emerging field that tries to model properties of consciousness, such as color, in a way that precisely matches people’s reports. Here, using a mathematical tool called category theory, we propose a model of visual consciousness. Our model is consistent with known facts in psychology and neuroscience while capturing our impressions of experience when seeing a brief flash of a natural photograph. As a test of our model, we examine whether we can indirectly but completely characterize and predict a particular sensation, for example the redness in some part of the photo, through its similarity relationships with all the other colors in the photo. To test this prediction, we will employ the PI's recently developed web-based psychological experimental platform, called the Massive Report Paradigm. This will allow us to “measure” the quality of consciousness for the first time, serving as a litmus test for theories of consciousness.
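As a toy illustration of this test (a minimal sketch, not the project's actual analysis; the patch names, similarity values, and the identify() helper below are hypothetical), the idea is that if each color's full profile of similarities to the other colors in the photo is sufficiently distinctive, a sensation can be picked out from its relations alone:

import numpy as np

# Hypothetical color patches taken from a briefly flashed photograph.
patches = ["red", "orange", "green", "blue", "sky blue"]

# Hypothetical pairwise similarity judgements (0 = not similar, 1 = identical),
# e.g. averaged over many online participants.
similarity = np.array([
    [1.0, 0.8, 0.2, 0.1, 0.1],  # red
    [0.8, 1.0, 0.3, 0.1, 0.1],  # orange
    [0.2, 0.3, 1.0, 0.4, 0.4],  # green
    [0.1, 0.1, 0.4, 1.0, 0.9],  # blue
    [0.1, 0.1, 0.4, 0.9, 1.0],  # sky blue
])

def identify(profile):
    """Return the patch whose row of similarities best matches the probe profile."""
    distances = np.linalg.norm(similarity - profile, axis=1)
    return patches[int(np.argmin(distances))]

# A probe sensation described only by its relations to every other patch
# (here, a noisy copy of the "red" row) is recovered as "red".
rng = np.random.default_rng(0)
probe = similarity[0] + rng.normal(0.0, 0.05, size=len(patches))
print(identify(probe))

In the actual experiments, the similarity matrix would be filled in from participants' judgements rather than assumed, and the empirical question is whether such relational profiles characterize each color uniquely.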
Technical Abstract
Recent progress in understanding the physical substrates of consciousness has been largely technology-driven, with little insight into the mathematical structure of the quality of consciousness, or qualia. As a result, any scientific effort to understand the nature of qualia suffers from this bottleneck: a poor understanding of the structure of the explananda. Here, we employ category theory to address this gap, proposing a model of visual phenomenology that captures impressions of visual experience while remaining consistent with known physiology and psychophysics. Critically, we propose mathematical conditions on particular qualia (e.g., colors) and their similarity relationships. If these conditions are satisfied, we can regard qualia and their relations as constituting an “enriched category”, which allows us to characterize qualia, which are difficult even to describe on their own, through a massive web of relationships with other qualia, via the Yoneda lemma in enriched category theory. This project will therefore first empirically test whether qualia and their relationships satisfy the conditions required for an enriched category. We will overcome the difficulty of collecting the large number of similarity judgements required among all possible qualia of a given type (e.g., color) by using the PI's invention: a web-based experimental platform called the Massive Report Paradigm.
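As a rough mathematical sketch (the specific choice of enriching category below is an illustrative assumption, not fixed by the abstract), suppose the qualia of one type form a category \(\mathcal{Q}\) enriched over the unit interval \([0,1]\), with the hom-object \(\mathcal{Q}(x,y)\) read as the degree of similarity between qualia \(x\) and \(y\); composition then imposes, roughly, a triangle-inequality-like constraint relating \(\mathcal{Q}(x,y)\), \(\mathcal{Q}(y,z)\) and \(\mathcal{Q}(x,z)\), which is the kind of condition the experiments must check. The enriched Yoneda lemma,
\[
  [\mathcal{Q}^{\mathrm{op}}, [0,1]]\bigl(\mathcal{Q}(-,x),\, F\bigr) \;\cong\; F(x),
\]
implies that the Yoneda embedding \(x \mapsto \mathcal{Q}(-,x)\) is fully faithful, so that
\[
  \mathcal{Q}(-,x) \cong \mathcal{Q}(-,y) \quad\Longleftrightarrow\quad x \cong y .
\]
In other words, a quale is determined, up to isomorphism, by its complete web of similarity relationships with all other qualia.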