Our methods include electrophysiological measures, magnetoencephalography, virtual reality (VR) simulations, longitudinal designs, and experience sampling. The research is carried out in close collaboration with national and international partners.
Our sense of time plays a crucial role in constituting a unified experience of the surrounding world and self-consciousness. The perception of time adaptively changes according to circumstances but can also be seriously disturbed by strong emotional states and psychopathological conditions.
Our team uses immersive virtual reality and brain research methods to examine the effects of action-perception contingencies and emotional states on the perceived passage of time. Time perception is measured using established cognitive paradigms such as the temporal reproduction and bisection tasks.
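To illustrate how such paradigms quantify time perception, here is a minimal sketch of scoring a temporal bisection task. The anchor durations, probe steps, and responses below are hypothetical, not the group's actual stimuli: participants first learn "short" and "long" anchor durations, then classify intermediate probes, and the bisection point is the duration judged "long" on half of the trials.

```python
from collections import defaultdict

# Hypothetical trial data: (probe duration in ms, participant's response).
# Anchors here are 400 ms ("short") and 1600 ms ("long").
trials = [
    (400, "short"), (400, "short"), (600, "short"), (600, "short"),
    (800, "short"), (800, "long"), (1000, "short"), (1000, "long"),
    (1200, "long"), (1200, "long"), (1400, "long"), (1400, "long"),
    (1600, "long"), (1600, "long"),
]

# Proportion of "long" responses at each probe duration.
counts = defaultdict(lambda: [0, 0])  # duration -> [n_long, n_total]
for duration, response in trials:
    counts[duration][0] += response == "long"
    counts[duration][1] += 1
p_long = {d: n_long / n for d, (n_long, n) in sorted(counts.items())}

# Bisection point: linearly interpolate the duration where p("long") = 0.5.
durations = sorted(p_long)
for lo, hi in zip(durations, durations[1:]):
    if p_long[lo] <= 0.5 <= p_long[hi]:
        frac = (0.5 - p_long[lo]) / (p_long[hi] - p_long[lo])
        bisection_point = lo + frac * (hi - lo)
        break

print(f"Bisection point ≈ {bisection_point:.0f} ms")
```

A shift of the bisection point between conditions (for example, under different emotional states) is one standard index of a distorted passage of time.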
To identify the processes and neural networks that underlie this modulated perception of time, we use electroencephalography (EEG), autonomic nervous system measures, and magnetoencephalography to record the participants’ nervous system activity during the timing tasks.
The research aims to shed light on the mechanisms underlying the distorted perception of time in people with psychopathologies. In collaboration with the team of Professor Giulio Jacucci, we are creating a bioadaptive virtual reality system that detects the participant’s psychophysiological state and adapts visual features of the VR environment to recalibrate the person’s perception of time.
Relaxation and meditation techniques can be used to promote stress regulation and emotional wellbeing. We have translated these techniques into technological environments such as virtual reality. Additionally, visualizations of physiological activity obtained from sensors, such as EEG and electrocardiography (ECG), can be provided to the user as biofeedback.
Receiving biofeedback has been shown to facilitate the effects of meditation and relaxation techniques. In our research, the participants are immersed in a virtual meditation environment where they receive biofeedback about their physiological state in the form of colorful visualizations and levitation of their avatar.
Such a setting has been shown to facilitate a meditative state and, in the case of compassion meditation, to evoke empathy when two people are immersed in the same meditation environment and can see each other’s biofeedback.
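The core of such biofeedback is a mapping from a physiological signal to visual parameters. The sketch below is purely illustrative, not the group's actual system: a hypothetical per-frame "relaxation" score derived from ECG/EEG is smoothed and then mapped to a color hue and an avatar levitation height, as a VR engine might update them.

```python
def smooth(samples, alpha=0.2):
    """Exponential moving average to stabilize noisy sensor input."""
    value = samples[0]
    out = []
    for s in samples:
        value = alpha * s + (1 - alpha) * value
        out.append(value)
    return out

def to_feedback(relaxation, hue_range=(0, 240), max_height=1.0):
    """Map a relaxation score in [0, 1] to (hue in degrees, height in m)."""
    relaxation = min(max(relaxation, 0.0), 1.0)  # clamp out-of-range scores
    lo, hi = hue_range
    return lo + relaxation * (hi - lo), relaxation * max_height

# Hypothetical relaxation scores, one per rendered frame.
scores = [0.1, 0.15, 0.4, 0.55, 0.7, 0.9]
for score in smooth(scores):
    hue, height = to_feedback(score)
    print(f"hue={hue:5.1f}°  levitation={height:.2f} m")
```

Smoothing matters here: raw physiological signals fluctuate frame to frame, and abrupt visual jumps would work against the calming intent of the feedback.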
Emotional expressions as manifested in facial movements, voice, and touch are a crucial part of face-to-face interaction. How the brain processes such multimodal emotional information remains poorly understood. Our group has investigated how emotional expressions conveyed through multimodal channels during virtual face-to-face interaction are integrated into perceptual processing and decision-making.
Utilizing event-related brain potentials (ERPs) and recordings of autonomic nervous system responses in simulated interaction setups, we have been able to show that a sender’s facial emotional expressions modulate subsequent touch perception at a very early stage of somatosensory processing. In addition, our findings show how motivational tendencies and gender influence the manner in which people perceive multimodal emotional expressions and make decisions based on them.
As people age, they face a multitude of new challenges in their everyday lives, such as deteriorating cognitive resources, problems in wellbeing, and decreased work ability. Devices such as smartphones give people easy access to digital tools that help tackle these problems.
Our research focuses on detecting problems related to deteriorating cognitive resources due to aging and on creating tools to mitigate these problems using adaptive mobile technologies. We use intensive longitudinal data collected with unobtrusive wearable sensors and smartphone apps to monitor and analyze the dynamic nature of a person’s well-being.
This enables a holistic approach to detecting and predicting periods of decreased well-being and problematic behaviors. The same technology will also be used to deliver treatments: as part of the research, we are developing a mobile app that provides support for behavior change and stress regulation in an automated or counselor-controlled fashion.
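One simple way such intensive longitudinal data can flag periods of decreased well-being is to compare each new observation against a rolling personal baseline. The data, window length, and threshold below are hypothetical, shown only to make the idea concrete:

```python
from statistics import mean, stdev

def flag_low_periods(scores, window=7, z_threshold=-1.5):
    """Return indices of days that fall well below the preceding baseline."""
    flagged = []
    for i in range(window, len(scores)):
        baseline = scores[i - window:i]           # the person's own recent history
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and (scores[i] - mu) / sigma < z_threshold:
            flagged.append(i)
    return flagged

# Hypothetical daily well-being ratings (0-10) from a smartphone app.
daily = [7, 7, 8, 6, 7, 7, 8, 7, 7, 3, 5, 7, 7, 7]
print(flag_low_periods(daily))  # day 9 falls far below the rolling baseline
```

Using each person's own history as the baseline, rather than a population norm, is what makes the approach holistic and person-specific: the same absolute rating can be unremarkable for one individual and an anomaly for another.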
Exploring and modifying the sense of time in virtual environments. Click here for the project pages for VIRTUALTIMES.
Adaptive environments and conversational agent based approaches for healthy ageing and work ability. Click here for the project pages for CO-ADAPT.