Multistable Perception across Modalities
Resolving sensory ambiguity in vision and audition

Human sensory systems are continuously confronted with ambiguous information, yet perceptual experience is usually stable, unique and unambiguous. A promising approach to understanding how unambiguous percepts form from ambiguous sensory evidence is to challenge this process experimentally by inducing perceptual multistability. In multistability, a stimulus evokes distinct perceptual experiences that alternate over time without a corresponding change in sensory evidence.

Multistable phenomena have been described in most sensory modalities. In vision, multistability is typically used synonymously with rivalry, reflecting competition between perceptual interpretations. Examples include the Necker cube, Rubin's vase/face reversal, dynamic stimuli (e.g., apparent motion), and binocular rivalry, where the two eyes are presented with conflicting stimuli. In audition, multistability is studied with sequences of tones that are alternately perceived as bound or unbound, and with verbal transformations, where the repeated presentation of the same word induces the perception of rearrangements (e.g., life/fly).

Given the scientific interest in multistability in either modality, the respective research fields are surprisingly segregated, and some effects well known in one field are largely ignored in the other. This is all the more remarkable because both fields debate whether a common process underlies various forms of multistability, and such a process would predict links between modalities beyond those observed so far. To bridge this gap between auditory and visual multistability, the present proposal brings together expertise from both fields. Beyond the transfer of knowledge, dedicated multimodal paradigms will be developed that allow simultaneous measurement of auditory and visual multistability without response bias or interference.
The possibility of assessing several multistable effects at the same time also allows for examining the role of intra-individual fluctuations. Well-established concepts from one modality will be tested in the other (e.g., predictability, multistable object formation, volitional control). Concepts or paradigms for which results in the two modalities appear to conflict (e.g., interruption of stimulus presentation, sequences of stimuli with more than two interpretations) will be compared systematically by matching the respective designs as closely as possible. As main results, we expect a better understanding of multistability in each modality, a joint framework that addresses common principles, and possibly common processes, across modalities, and a separation of modality-specific from modality-general factors. Since multistability is the model case for perceptual organization under constant sensory evidence, the results will also broaden our understanding of how higher-level (cognitive) factors shape the organization of human perception beyond modality-specific accounts.