The Minerva Center runs its activities at the TAU XR Center. Please visit TAU XR for more information.
Activities
Course
Research in Extended Realities: Pros and Cons
In recent years there has been an increase in the number of studies using virtual, augmented, and extended realities. In this seminar we will discuss the pros and cons of these novel techniques. Course lecturers as well as students will present different studies that use these techniques to study human behavior. Course Number: 1501-1036
Course
Unity3D for XR - Minerva Course
The latest advancements in real-time 3D technologies and spatial computing platforms (XR) make the creation of “Immersive Experiences” more accessible than ever. The study of human intelligence and behaviour relies on the scenarios and conditions researchers can create for their experiments. Immersive Experiences give researchers full control over their experiments: they can measure all of a subject’s behaviour by accurately tracking every interaction the subject makes with the world, while retaining complete control over the world the subject is immersed in. This 21-hour course aims to provide students with a basic understanding of how to build and design immersive, interactive experiences that are intuitive and self-explanatory.
December 19
14:15
Critical intelligence: computational characteristics of human escape // Prof. Dominik R. Bach
University of Bonn, Germany
Animals have to cope with immediate threats to survive and reproduce. Many species employ complex and sophisticated defensive behaviours. Rapid decisions between these actions, without much leeway for cognitive or motor errors, pose a formidable computational problem. In this lecture, I will give an overview of these challenges and potential theoretical solutions. I will then present data from experiments designed to reveal the biologically implemented decision algorithms. Here, humans are exposed to various threats in a fully immersive virtual reality, in which they can escape and run for shelter. These data challenge the view that human escape behaviour is instinctive or hard-wired. Instead, the underlying algorithm appears goal-directed and dynamically updates decisions as the environment changes. In contrast, information-seeking behaviour might rely on simpler computations.
March 20
14:15
Ecological investigation of real-life attention: Insights from the Virtual Cafe and Virtual Classroom // Prof. Elana Golumbic
Bar Ilan University
The construct of attention has been studied extensively by cognitive psychologists and neuroscientists using carefully controlled laboratory experiments. These studies have shown that attention facilitates many essential cognitive abilities, such as learning, memory, social communication, goal-directed behaviors, and self-control. However, despite extensive research on the cognitive and neural mechanisms of attention, most empirical work is carried out in labs under highly controlled conditions, using artificial paradigms that are a far cry from the challenges of attention in real-life environments. Advances in Virtual Reality and wearable neurophysiological technology now afford the opportunity to bridge the gap between the lab and real life, and to study human attention under ecological conditions that simulate those faced on a daily basis. In this talk, I will present a novel VR-based experimental platform for studying behavioral, ocular, physiological and neural manifestations of real-life attention. We focus specifically on two typical scenarios: the Virtual Café, where individuals attempt to pay attention to the natural speech of a partner amidst a chaotic background, and the Virtual Classroom, where students are challenged to pay attention to the teacher's lesson and resist distraction from external, task-irrelevant stimuli in the environment. In both cases, we show evidence for neural processing of both task-relevant and task-irrelevant stimuli, demonstrating the non-exclusive nature of real-life attention. Moreover, our results highlight important individual differences in the deployment of attention and sensitivity to irrelevant stimuli in the environment, which are often overlooked in traditional attention research.
March 11
14:15
Spatial cognition and interaction in extended realities // Shachar Maidenbaum
Ben Gurion University
Recent years have seen amazing progress in the realms of virtual and augmented reality. From expensive niche tools, these technologies are transforming into mainstream consumer products, and in the process unlocking huge potential for research. This potential is twofold: on the one hand, we can use these tools to naturalistically test fundamental questions of how we perceive and interact with our world; on the other hand, we need to research how humans interact with these new worlds in and of themselves. In my talk I will discuss recent work from our lab focusing on both of these aspects through a spatial prism. First, we will discuss questions of spatial memory and navigation using different reality modalities and interfaces, demonstrating how critical physical motion is for generating naturalistic spatial behavior, and for giving rise to otherwise missing neural representations of spatial behavior, as we will demonstrate via ECoG recordings. Then we will discuss impossible environments and show how we can use such environments to generate sustained sensory clashes in order to investigate multisensory integration. From there we will turn to how humans interact spatially with virtual vs. real objects in mixed reality settings, and the implications of these interactions for the development of practical rehabilitation tools. Finally, I will present some early work from our lab focusing on these tools’ potential for rehabilitation from stroke and for facing PTSD and other mental health challenges.
June 5
14:15
The empathic brain: A two-brain approach for understanding empathy // Prof. Simone Shamay-Tsoory
University of Haifa
Empathy allows us to understand and share one another’s emotional experiences. Despite developments in the study of empathy, the vast majority of empathy paradigms focus only on passive observers carrying out artificial empathy tasks in socially deprived environments. This approach significantly limits our understanding of the interactive aspects of empathy and of how empathic responses affect the distress of the sufferer. We recently proposed a brain model that characterizes how empathic reactions alleviate the distress of a target. In a series of experiments, we examined brain-to-brain coupling during empathic interactions. We show that brain-to-brain coupling in the observation-execution (mirror) network increases under empathy conditions. Critically, we found that brain-to-brain coupling predicts distress regulation in the target. We extend this work to understand group interactions and show aberrant synchrony in autism spectrum conditions. We conclude that employing this multi-brain approach may provide a highly controlled setting in which to study social behavior in health and disease.
May 1
14:15
Monitoring brain activity - out and about // Stefan Debener
Oldenburg University, Germany
Unlike other functional neuroimaging modalities, electroencephalography (EEG) shows promise in capturing human brain activity during natural behaviour and whole-body movements. Since human behaviour is context-dependent, advances in the field of cognitive neuroscience can be expected from mobile EEG research. However, interpreting brain activity recorded in complex, uncontrolled situations is very challenging and requires the availability of contextual information, such as sounds, movements or other physiological signals. Moreover, hardware should be unobtrusive and robust, without compromising signal quality. I will discuss the current state-of-the-art and report several studies using mobile EEG for the investigation of cognitive-motor interference and auditory attention tracking in uncontrolled environments.
Feb 19
14:15
What can we learn from eye tracking in virtual reality? // Prof. Yoni Pertzov
University of Haifa
The integration of eye tracking technology into virtual reality (VR) has ushered in a new era of experimentation, offering unique insights into human behavior. In this presentation, I will explore how eye tracking in VR unveils the intricacies of visual exploration within immersive environments, particularly in relation to memory and the interpretation of complex scenarios. In one set of studies, participants either committed a mock crime or performed an unrelated task. Subsequently, both guilty and innocent participants were immersed in a VR image of the crime scene, to which several salient modifications had been made. Guilty individuals looked more (effect size = 0.88) and earlier (effect size = 1.00) at these alterations than their innocent counterparts. Notably, distinguishing between guilty and innocent participants based on their eye movements toward the modified areas proved significantly effective (AUC = 0.75). In a subsequent study these results were replicated, and the effect of scene modifications was further examined. Beyond shedding light on how attention deployment relates to memory, these studies offer promising implications for forensic applications. In a separate set of studies, participants observed a tense scenario in VR depicting two Palestinians attempting to navigate a checkpoint guarded by soldiers who suspected one of them of carrying explosives. Interestingly, divergent interpretations emerged among observers, with some anticipating that the soldiers would open fire while others disagreed. These distinct viewpoints were reflected in different scanning patterns, highlighting the nuanced connection between visual exploration and situational interpretation. In conclusion, I will explore the advantages of employing VR in eye tracking research and offer insights into the potential future trajectory of the field.