Projects
For more information on current and past projects, please click on the associated tabs.
Humans integrate sensory stimuli from the physical senses (e.g. sight, hearing, touch) and the chemical senses (e.g. smell, taste) into a coherent multisensory perception of their environment and evaluate it. For example, we perceive food based on its appearance and odour and evaluate whether or not we like it. However, our brain should only integrate and evaluate multisensory stimuli that could have originated from a common cause: when seeing and smelling food, the brain must first infer whether the appearance and smell were actually caused by a single food, and not by different foods or other odour sources. This project uses psychophysical, psychophysiological and EEG methods to investigate how humans infer the causal structure of olfactory-visual food stimuli, integrate or segregate the stimuli and finally evaluate their multisensory pleasantness. The results will significantly expand our understanding of how the brain forms, evaluates and expresses the multisensory perception of food.
Duration: 2023-2026
In our everyday environment, our brain constantly combines a multitude of sensory stimuli into a coherent multisensory perception of the environment. For example, we correctly perceive which voices belong to which faces at a party. In order to link the stimuli veridically, the brain should only combine stimuli from one source (such as a speaker). However, the brain has only limited attentional capacities to process multisensory stimuli and to focus on relevant ones (e.g. the conversation partner). The brain therefore has to solve two challenges: First, it must infer the causal structure of the multisensory stimuli in order to integrate them in the case of a common cause or to segregate them in the case of independent causes. Second, it must use selective attention to focus its limited attentional resources on relevant stimuli amid competing multisensory stimuli. The current project investigates the interplay of multisensory causal inference and attention in audiovisual perception using psychophysics, electroencephalography (EEG) and Bayesian computational modelling.
Duration: 2024-2027
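The Bayesian causal inference step described above can be sketched in a few lines of Python. The sketch assumes Gaussian-noise cues and a Gaussian prior over source location, following the standard Bayesian causal-inference formulation; the function and parameter names (`x_a`, `x_v`, `sigma_a`, etc.) are illustrative, not code from the project.

```python
import math

def common_cause_posterior(x_a, x_v, sigma_a, sigma_v, sigma_p, p_common=0.5):
    """Posterior probability that an auditory cue x_a and a visual cue x_v
    share a single cause, given Gaussian sensory noise (sigma_a, sigma_v)
    and a zero-mean Gaussian prior over source location (sigma_p)."""
    # Likelihood under a common cause: both cues arise from one hidden source,
    # integrated out analytically (standard Gaussian result).
    var_c1 = (sigma_a**2 * sigma_v**2
              + sigma_a**2 * sigma_p**2
              + sigma_v**2 * sigma_p**2)
    like_c1 = math.exp(-((x_a - x_v)**2 * sigma_p**2
                         + x_a**2 * sigma_v**2
                         + x_v**2 * sigma_a**2) / (2 * var_c1)) \
              / (2 * math.pi * math.sqrt(var_c1))
    # Likelihood under independent causes: each cue has its own hidden source.
    var_a = sigma_a**2 + sigma_p**2
    var_v = sigma_v**2 + sigma_p**2
    like_c2 = math.exp(-x_a**2 / (2 * var_a) - x_v**2 / (2 * var_v)) \
              / (2 * math.pi * math.sqrt(var_a * var_v))
    # Bayes' rule over the two causal structures.
    return like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))
```

Coincident cues yield a high common-cause posterior, while widely discrepant cues push the posterior towards independent causes and hence segregation.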
Repeated and persistent rumination (brooding) is a major symptom of depressive disorders, while at the same time physical activity shows positive effects on depressive symptoms. In cooperation with the University of Tübingen, the project investigates whether the positive effect of physical activity is mediated by a reduction in rumination. In an EEG study with depressive individuals, we develop a decoder that recognises ruminative states from neurophysiological EEG activity patterns. This allows us to demonstrate whether physical activity reduces rumination decoded online from neurophysiological data.
Duration: 2021-2025
In addition to spatial vision and hearing, humans can also locate external odour sources by smell, similar to many animals (e.g. dogs). It is unclear whether humans combine spatial olfaction with spatial vision or hearing. In this psychophysical study, we investigate whether our brain integrates spatial odour stimuli (olfactory stimuli) with auditory spatial stimuli. The project is funded by the Special Fund for Scientific Work at FAU.
Duration: 2023-2024
Humans integrate stimuli from different sensory systems into a coherent and reliable representation of the environment. Yet, the human brain only integrates the stimuli, weighted in proportion to their sensory reliability, if a small spatial and temporal disparity between the stimuli suggests a common cause. If large discrepancies suggest independent causes, the brain segregates the stimuli. Our previous and current work suggests that the dorsolateral prefrontal cortex (dlPFC) first determines the causal structure of the stimuli based on their discrepancy, and that the anterior intraparietal sulcus (aIPS) then integrates the stimuli in a reliability-weighted manner in the case of a common signal source. However, this model of multisensory causal inference in cortical hierarchies has so far received little testing. In addition, schizophrenia could alter multisensory causal inference, leading, for example, to hallucinations. In this project, we investigate where, when and how cortical hierarchies make a causal decision and integrate or segregate audiovisual stimuli. In EEG and fMRI studies with healthy volunteers and schizophrenia patients, we investigate the role of the dlPFC in causal decisions and of the aIPS in the integration of audiovisual stimuli. In a TMS study, we then test whether both regions also play a causal role in these decisions and in integration.
Duration: 2018-2021
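The reliability-weighted integration attributed to the aIPS above can be illustrated with a minimal inverse-variance fusion sketch; the function and parameter names are illustrative assumptions, not project code.

```python
import math

def fuse(x_a, x_v, sigma_a, sigma_v):
    """Reliability-weighted (inverse-variance) fusion of an auditory cue x_a
    and a visual cue x_v with Gaussian noise sigma_a and sigma_v."""
    r_a = 1.0 / sigma_a**2                      # reliability = inverse variance
    r_v = 1.0 / sigma_v**2
    w_a = r_a / (r_a + r_v)                     # weight of the auditory cue
    fused = w_a * x_a + (1 - w_a) * x_v         # reliability-weighted average
    sigma_fused = math.sqrt(1.0 / (r_a + r_v))  # fused estimate is never less
    return fused, sigma_fused                   # reliable than either cue alone
```

For equally reliable cues the fused estimate is their midpoint; as one cue becomes noisier, the estimate shifts towards the more reliable cue, and the fused uncertainty is always below that of either cue on its own.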
Psychophysical study on the influence of attention on explicit causal inferences in audiovisual perception. See the resulting DFG project ‘MultiAttend’.
Duration: 2021-2022