Project description:Virtual reality (VR) is a potentially challenging social environment for effective communication and collaboration. Thus, we conducted a VR study to determine whether increased familiarity with a teammate would improve performance on a joint decision-making task. Specifically, because attitude familiarity, or knowledge of another person's attitudes, has previously been correlated with better relationship functioning, we anticipated that team performance would improve when teammates were first asked to discuss their task-relevant attitudes with one another. We also hypothesized that increased familiarity would be particularly useful in immersive VR, where typical social and other nonverbal cues are lacking. Twenty pairs recruited from a workplace environment were randomly assigned to either the Familiar or Control condition before completing a joint decision-making task both in VR and on desktop monitors. The manipulation of attitude familiarity was successful: pairs in the Familiar condition were significantly more aware of their partners' unique task-relevant attitudes. In VR, Familiar pairs were more accurate at determining patterns in events. Additionally, for teams less experienced in VR, Familiar pairs were also more accurate at predicting future events. However, there was no meaningful statistical difference in pairs' ability to identify information. Familiar teams also took more time to answer questions, and we found no difference in self-reported communication quality. Overall, this was the first successful manipulation of attitude familiarity, and the results indicate that such an intervention may prove useful in a collaborative work environment, as Familiar teams demonstrated greater accuracy, especially in VR.
Project description:Background: Awake craniotomy (AC) with brain mapping for language and motor functions is often performed for tumors within or adjacent to eloquent brain regions. However, other important functions, such as vision and visuospatial and social cognition, are less frequently mapped, at least partly due to the difficulty of defining tasks suitable for the constrained AC environment. Objective: The aim of this retrospective study was to demonstrate, through illustrative cases, how a virtual reality headset (VRH) equipped with eye tracking can open up new possibilities for the mapping of language, the visual field and complex cognitive functions in the operating room. Methods: Virtual reality (VR) tasks performed during 69 ACs were evaluated retrospectively. Three types of VR tasks were used: VR-DO80 for language evaluation, VR-Esterman for visual field assessment and VR-TANGO for the evaluation of visuospatial and social functions. Results: Surgery was performed on the right hemisphere for 29 of the 69 ACs performed (42.0%). One AC (1.4%) was performed with all three VR tasks, 14 ACs (20.3%) were performed with two VR tasks and 54 ACs (78.3%) were performed with one VR task. The median duration of VRH use per patient was 15.5 min. None of the patients had "VR sickness". Only transitory focal seizures of no consequence and unrelated to VRH use were observed during AC. Patients were able to perform all VR tasks. Eye tracking was functional, enabling the medical team to analyze the patients' attention and exploration of the visual field of the VRH directly. Conclusions: This preliminary experiment shows that VR approaches can provide neurosurgeons with a way of investigating various functions, including social cognition, during AC. 
Given the rapid advances in VR technology and the remarkable sense of immersion provided by the most recent devices, there is a need for ongoing reflection and discussion of the ethical and methodological considerations associated with the use of these advanced technologies in AC and brain mapping procedures.
Project description:Purpose: In this work, a virtual environment for interprofessional team training in laparoscopic surgery is proposed. Our objective is to provide a tool to train and improve intraoperative communication between anesthesiologists and surgeons during laparoscopic procedures. Methods: Anesthesia simulation software and laparoscopic simulation software are combined within a multi-user virtual reality (VR) environment. Furthermore, two medical training scenarios for communication training between anesthesiologists and surgeons are proposed and evaluated. Testing was conducted and social presence was measured. In addition, clinical feedback from experts was collected by following a think-aloud protocol and through structured interviews. Results: Our prototype is assessed as a reasonable basis for training and extensive clinical evaluation. Furthermore, the results of testing revealed a high degree of exhilaration and social presence among the involved physicians. Valuable insights were gained from the interviews and the think-aloud protocol with the experts in anesthesia and surgery, which showed the feasibility of team training in VR, the usefulness of the system for medical training, and its current limitations. Conclusion: The proposed VR prototype provides a new basis for interprofessional team training in surgery. It enables training of problem-based communication during surgery and might open new directions for operating room training.
Project description:50,000 cells were injected orthotopically into the inguinal fat pad of a NOD-scid-gamma (NSG) immunocompromised mouse. Injected cells were 80% unlabelled 4T1 cells (parental population) and 20% ZsGreen-labelled 4T1-T cells (a clone isolated in Wagenblast et al., Nature, 2015). Tumours were allowed to develop for 20 days and were then collected at necropsy. Disaggregated cells were processed through the 10X Genomics Single Cell 3' gene expression pipeline. This dataset is intended as an example for a novel virtual reality viewer for single-cell data described in Bressan et al., Nat. Cancer, 2021 (submitted)
Project description:BACKGROUND:Multiplayer games have emerged as a promising approach to increase the motivation of patients involved in rehabilitation therapy. In this systematic review, we evaluated recent publications on health-related multiplayer games involving patients with cognitive and/or motor impairments. The aim was to investigate the effect of multiplayer gaming on game experience and game performance in healthy and non-healthy populations in comparison to individual game play. We further discuss the publications within the context of the theory of flow and the challenge point framework. METHODS:A systematic search was conducted through EMBASE, Medline, PubMed, Cochrane, CINAHL and PsycINFO. The search was complemented by recent publications in robot-assisted multiplayer neurorehabilitation and was restricted to robot-assisted or virtual reality-based training. RESULTS:Thirteen articles met the inclusion criteria. The multiplayer modes used in health-related multiplayer games were competitive, collaborative and co-active. Multiplayer modes positively affected game experience in nine studies and game performance in six studies. Two articles reported increased game performance in single-player mode when compared to multiplayer mode. CONCLUSIONS:The multiplayer training modes reviewed improved game experience and game performance compared to single-player modes. However, the methods reviewed were quite heterogeneous and not exhaustive. One important take-away is that adapting the game conditions can individualize the difficulty of a game to a player's skill level in competitive multiplayer games. Robotic assistance and virtual reality can enhance this individualization by adapting the haptic conditions, for example by increasing haptic support or by providing haptic resistance. 
Flow theory and the challenge point framework support these results and are used in this review to frame the idea of adapting players' game conditions.
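As a toy illustration of the challenge-point idea discussed in this review (matching task difficulty to a player's skill), an adaptive rule might be sketched as follows; the function name, target success rate, and step size are illustrative assumptions, not values from the reviewed studies:

```python
def adapt_difficulty(difficulty, success_rate, target=0.7, step=0.05):
    """Nudge game difficulty (0..1) toward a level where the player
    succeeds about `target` of the time (a minimal staircase sketch)."""
    if success_rate > target:
        # Player finds the game too easy: raise difficulty.
        return min(1.0, difficulty + step)
    # Player is struggling: lower difficulty.
    return max(0.0, difficulty - step)
```

In a competitive multiplayer setting, applying such a rule per player would let two differently skilled players face individually calibrated conditions in the same game.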
Project description:BACKGROUND:Interprofessional team training is needed to improve the nurse-physician communication skills that are lacking in clinical practice. Simulation has proven to be an effective learning approach for team training, yet it has logistical constraints that call for the exploration of virtual environments for delivering team training. OBJECTIVE:This study aimed to evaluate a team training program using virtual reality vs conventional live simulations on medical and nursing students' communication skill performance and teamwork attitudes. METHODS:In June 2018, the authors implemented nurse-physician communication team training using communication tools. A randomized controlled trial was conducted with 120 undergraduate medical and nursing students who were randomly assigned to undertake team training using virtual reality or live simulations. Participants from both groups were tested on their communication performance through team-based simulation assessments. Their teamwork attitudes were evaluated using interprofessional attitude surveys administered before, immediately after, and 2 months after the study interventions. RESULTS:The team-based simulation assessment revealed no significant differences in communication performance posttest scores (P=.29) between the virtual and simulation groups. Both groups reported significant increases in interprofessional attitude posttest scores from baseline, with no significant differences found between the groups over the 3 time points. CONCLUSIONS:Our study outcomes did not show an inferiority of team training using virtual reality when compared with live simulations, which supports the potential use of virtual reality to substitute for conventional simulations in communication team training. 
Future studies can leverage the use of artificial intelligence technology in virtual reality to replace costly human-controlled facilitators to achieve better scalability and sustainability of team-based training in interprofessional education. TRIAL REGISTRATION:ClinicalTrials.gov NCT04330924; https://clinicaltrials.gov/ct2/show/NCT04330924.
Project description:BACKGROUND:Functional characterization of single nucleotide variants (SNVs) involves two steps: the first is to convert DNA to protein, and the second is to visualize protein sequences alongside their structures. As massively parallel sequencing has emerged as a leading technology in genomics, resulting in a significant increase in data volume, direct visualization of SNVs together with their associated protein sequences/structures in a new user interface (UI) would be a more effective way to assess their potential effects on protein function. RESULTS:We have developed BioVR, an easy-to-use, interactive, virtual reality (VR)-assisted platform for integrated visual analysis of DNA/RNA/protein sequences and protein structures, built using Unity3D and the C# programming language. It utilizes the Oculus Rift headset and Leap Motion hand detection, enabling intuitive navigation and exploration of various types of biological data. Using Gria2 and its associated gene product as an example, we present this proof-of-concept software to integrate protein and nucleic acid data. Any amino acid or nucleotide of interest in the Gria2 sequence can be quickly linked to its corresponding location on the Gria2 protein structure and visualized within VR. CONCLUSIONS:Using innovative 3D techniques, we provide a VR-based platform for visualization of DNA/RNA sequences and protein structures in aggregate, which can be extended to view omics data.
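The first step described above, converting DNA to protein so that a nucleotide of interest can be linked to a residue on the structure, can be sketched in a few lines. This is a generic illustration using the standard genetic code, not BioVR's actual Unity3D/C# implementation, and the example sequence is made up rather than taken from Gria2:

```python
# Standard genetic code (NCBI translation table 1), bases in TCAG order.
BASES = "TCAG"
AMINO_ACIDS = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = {
    b1 + b2 + b3: aa
    for aa, (b1, b2, b3) in zip(
        AMINO_ACIDS, ((x, y, z) for x in BASES for y in BASES for z in BASES)
    )
}

def translate(cds):
    """Translate an in-frame coding DNA sequence into a protein sequence."""
    return "".join(CODON_TABLE[cds[i:i + 3]] for i in range(0, len(cds) - 2, 3))

def residue_for_nucleotide(nt_index):
    """Map a 0-based nucleotide index to the 0-based residue it encodes,
    i.e. the position a viewer would highlight on the protein structure."""
    return nt_index // 3
```

For example, `translate("ATGGCC")` yields `"MA"`, and nucleotide index 4 falls within residue 1, the alanine.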
Project description:Virtual reality (VR) technology plays a significant role in many biomedical applications. These VR scenarios enhance the experience of tasks requiring great accuracy with human subjects. Unfortunately, commercial VR controllers have large positioning errors in micro-manipulation tasks. Here, we propose a VR-based framework along with a sensor fusion algorithm to improve the microposition tracking performance of a microsurgical tool. To the best of our knowledge, this is the first application of a Kalman filter in a millimeter-scale VR environment, fusing position data from the VR controller with an inertial measuring device. This study builds and tests two cases: (1) tracking without sensor fusion and (2) location tracking with active sensor fusion. The static and dynamic experiments demonstrate that the Kalman filter can provide greater precision during micro-manipulation in small-scale VR scenarios.
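The kind of sensor fusion described above can be sketched as a one-dimensional Kalman filter that predicts with an inertially derived velocity and corrects with the noisier VR-controller position. This is a minimal scalar illustration under assumed noise variances, not the authors' implementation:

```python
class ScalarKalman:
    """Minimal 1-D Kalman filter for position tracking.

    Predict with an IMU velocity estimate, then correct with the VR
    controller's position reading. q and r are assumed process and
    measurement noise variances, not values from the study.
    """

    def __init__(self, x0=0.0, p0=1.0, q=1e-4, r=0.25):
        self.x = x0  # position estimate (mm)
        self.p = p0  # estimate variance
        self.q = q   # process noise variance
        self.r = r   # measurement noise variance

    def step(self, imu_velocity, dt, controller_pos):
        # Predict: integrate the IMU velocity over the time step.
        self.x += imu_velocity * dt
        self.p += self.q
        # Update: blend in the controller reading via the Kalman gain.
        k = self.p / (self.p + self.r)
        self.x += k * (controller_pos - self.x)
        self.p *= 1.0 - k
        return self.x
```

With noisy controller readings, the gain `k` settles at a value that weights the smooth inertial prediction against each raw measurement, which is what damps the millimeter-scale jitter.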
Project description:Objectives: Even though vestibular rehabilitation therapy (VRT) using a head-mounted display (HMD) has recently been highlighted as a popular virtual reality platform, the HMD itself does not provide an interactive environment for VRT. This study aimed to test the feasibility of interactive components using an eye tracking-assisted strategy through neurophysiologic evidence. Methods: An HMD implemented with an infrared-based eye tracker was used to generate a virtual environment for VRT. Eighteen healthy subjects participated in our experiment, wherein they performed a saccadic eye exercise (SEE) under two conditions: feedback-on (F-on, visualization of eye position) and feedback-off (F-off, non-visualization of eye position). Eye position was continuously monitored in real time under both conditions, but this information was not provided to the participants. Electroencephalogram recordings were used to estimate neural dynamics and attention during SEE, and only valid trials (correct responses) were included in the electroencephalogram analysis. Results: SEE accuracy was higher in the F-on than in the F-off condition (P=0.039). The power spectral density of the beta band was higher in the F-on condition over the frontal (P=0.047), central (P=0.042), and occipital areas (P=0.045). Beta event-related desynchronization was significantly more pronounced in the F-on condition (-0.19 on frontal and -0.22 on central clusters) than in the F-off condition (0.23 on frontal and 0.05 on central) during the preparatory phase (P=0.005 for frontal and P=0.024 for central). In addition, more abundant functional connectivity was revealed under the F-on condition. Conclusion: Considering that substantial gains may come from goal-directed attention and activation of brain networks while performing VRT, our preclinical study of SEE suggests that eye tracking algorithms may work efficiently in vestibular rehabilitation using HMDs.
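The beta-band power comparison reported above can be illustrated with a minimal periodogram-style estimate. This is a generic NumPy sketch with assumed band edges (13 to 30 Hz for beta, 8 to 12 Hz for alpha), not the authors' EEG pipeline:

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Summed periodogram power of `signal` between f_lo and f_hi (Hz)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * len(signal))
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[band].sum()

# A synthetic 20 Hz oscillation concentrates its power in the beta band.
fs = 256
t = np.arange(0, 2.0, 1.0 / fs)
beta_like = np.sin(2 * np.pi * 20.0 * t)
```

In practice an EEG pipeline would average such estimates over windows (Welch's method) and electrodes per cluster, but the band-wise comparison is the same idea.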
Project description:Effective interventions for increasing people's intention to get vaccinated are crucial for global health, especially considering COVID-19. We devised a novel intervention using virtual reality (VR) consisting of a consultation with a general practitioner for communicating the benefits of COVID-19 vaccination and, in turn, increasing the intention to get vaccinated against COVID-19. We conducted a preregistered online experiment with a 2×2 between-participant design. People with eligible VR headsets were invited to install our experimental application and complete the ten-minute virtual consultation study at their own discretion. Participants were randomly assigned across two age conditions (young or old self-body) and two communication conditions (provision of the personal benefit of vaccination only, or the collective and personal benefit). The primary outcome was vaccination intention (score range 1-100), measured three times: immediately before and after the study, as well as one week later. Five hundred and seven adults not vaccinated against COVID-19 were recruited. Among the 282 participants with imperfect vaccination intentions (<100), the VR intervention increased pre-to-post vaccination intentions across intervention conditions (mean difference 8.6, 95% CI 6.1 to 11.1, p<0.0001). The pre-to-post difference significantly correlated with vaccination intention one week later, ρ=0.20, p<0.0001. The VR intervention was effective in increasing COVID-19 vaccination intentions both when only personal benefits were communicated and when personal and collective benefits were communicated, with significant retention one week after the intervention. Utilizing recent evidence from health psychology and embodiment research to develop immersive environments with customized and salient communication efforts could therefore be an effective tool to complement public health campaigns.