
VR seminar series 2022-2023

Mondays – 14:15-15:30
Sharet building, room 214

Tel-Aviv University

7/11

Tom Schoenberg, Head of the TAU-Minerva XR/VR Center, Tel Aviv University, Israel

Introduction to the TAU-Minerva XR/VR Center for the study of human intelligence in immersive, augmented and mixed realities.

A new center, “TAU XR” (extended realities), has been established on campus. In this lab, researchers from across campus can use immersive, augmented and mixed-reality environments to conduct research spanning all disciplines, as well as to enhance students’ learning experience. This opens up unique research and teaching opportunities: the environments are data-rich, yielding quantifiable big data such as eye tracking and head and body position.

19/12

Dominik Bach, University of Bonn, Germany


Critical intelligence: computational characteristics of human escape.

Animals have to cope with immediate threat to survive and reproduce. Many species employ complex and sophisticated defensive behaviours. Rapid decisions between these actions, without much leeway for cognitive or motor errors, pose a formidable computational problem. In this lecture, I will give an overview of these challenges and potential theoretical solutions. I will then present data from experiments designed to reveal the biologically implemented decision algorithms. Here, humans are exposed to various threats in a fully immersive virtual reality, in which they can escape and run for shelter. These data challenge the view that human escape behaviour is instinctive or hard-wired. Instead, the underlying algorithm appears goal-directed, and dynamically updates decisions as the environment changes. In contrast, information-seeking behaviour might rely on simpler computations.

20/3

Elana Golumbic, Bar Ilan University, Israel

"Ecological investigation of real-life attention: Insights from the Virtual Cafe and Virtual Classroom"

The construct of attention has been studied extensively by cognitive psychologists and neuroscientists, using carefully controlled laboratory experiments. These studies have shown that attention facilitates many essential cognitive abilities such as learning, memory, social communication, goal-directed behaviors, and self-control. However, despite extensive research on the cognitive and neural mechanisms of attention, most empirical work is carried out in labs under highly controlled conditions, using artificial paradigms that are a far cry from the challenges of attention in real-life environments. Advances in Virtual Reality and wearable neurophysiological technology now afford the opportunity to bridge the gap between the lab and real life, and to study human attention under ecological conditions that simulate those we face on a daily basis. In this talk, I will present a novel VR-based experimental platform for studying behavioral, ocular, physiological and neural manifestations of real-life attention. We focus specifically on two typical scenarios: the Virtual Café, where individuals attempt to pay attention to the natural speech of a partner amidst a chaotic background, and the Virtual Classroom, where students are challenged to pay attention to the teacher's lesson and resist distraction from external, task-irrelevant stimuli in the environment. In both cases, we show evidence for neural processing of both task-relevant and task-irrelevant stimuli, demonstrating the non-exclusive nature of real-life attention. Moreover, our results highlight important individual differences in the deployment of attention and in sensitivity to irrelevant stimuli in the environment, which are often overlooked in traditional attention research.

1/5

Stefan Debener, Oldenburg University, Germany

Monitoring brain activity - out and about

Unlike other functional neuroimaging modalities, electroencephalography (EEG) shows promise in capturing human brain activity during natural behaviour and whole-body movements. Since human behaviour is context-dependent, advances in the field of cognitive neuroscience can be expected from mobile EEG research. However, interpreting brain activity recorded in complex, uncontrolled situations is very challenging and requires the availability of contextual information, such as sounds, movements or other physiological signals. Moreover, hardware should be unobtrusive and robust, without compromising signal quality. I will discuss the current state-of-the-art and report several studies using mobile EEG for the investigation of cognitive-motor interference and auditory attention tracking in uncontrolled environments.

5/6

Simone Shamay-Tsoory, University of Haifa, Israel

The empathic brain: A two-brain approach for understanding empathy

Empathy allows us to understand and share one another’s emotional experiences. Despite developments in the study of empathy, the vast majority of empathy paradigms focus only on passive observers carrying out artificial empathy tasks in socially deprived environments. This approach significantly limits our understanding of the interactive aspects of empathy and of how empathic responses affect the distress of the sufferer. We recently proposed a brain model that characterizes how empathic reactions alleviate the distress of a target. In a series of experiments, we examined brain-to-brain coupling during empathic interactions. We show that brain-to-brain coupling in the observation-execution (mirror) network increases in empathy conditions. Critically, we found that brain-to-brain coupling predicts distress regulation in the target. We extend this work to understand group interactions and show aberrant synchrony in autism spectrum conditions. We conclude that employing this multi-brain approach may provide a highly controlled setting in which to study social behavior in health and disease.

26/6

Michal Ramot, Weizmann Institute, Israel

Disentangling the physiological and cognitive pathways of anxiety


VR seminar series syllabus