VR for Clinical Training

Immersive Virtual Reality Training for Nursing Edge Cases

Preparing nurses for the unexpected — rare, high-stakes clinical scenarios that real-world training can't reliably deliver. Using UNLV's VR infrastructure, clinical nursing expertise, and AI-driven simulation to build competency before it's needed at the bedside.

3 Lead Investigators · 18 GPU Workstations · VR+AI Core Technologies · Edge Case Focus

Nurses Can't Train for What They Rarely See

Clinical edge cases — rare but life-threatening events like obstetric emergencies, sudden cardiac arrest in atypical presentations, or rapidly deteriorating multi-system failures — are precisely the situations nurses are least prepared for, yet most need to handle without hesitation.

Traditional simulation training relies on mannequins and scripted scenarios. High-fidelity VR changes the equation entirely — allowing nurses to experience rare emergencies with realistic spatial, sensory, and time-pressure cues, and repeat scenarios as many times as needed until responses become instinct.

"The best time to encounter a rare emergency for the first time is in a simulator, not at the bedside."

This project brings together UNLV's clinical nursing leadership with one of the most capable VR and AI development teams in the region — combining Vanderlaan's expertise in maternal health and nursing quality with Taghva and Fonseca's deep track record in immersive technology and intelligent systems.

Immersive Clinical Simulation
18+ · NVIDIA RTX A4000 GPU workstations available for concurrent VR development
3+ · Device platforms supported: Quest, HTC VIVE, HoloLens 2, and more
$1M+ · Prior funded VR/AR research and infrastructure investment by the team
Unlimited · Scenario repetitions — no patient risk, unlimited practice

The Edge Cases VR Can Teach

Dr. Vanderlaan's clinical expertise drives the scenario design — identifying the high-stakes, low-frequency situations where nursing preparedness matters most.

01

Obstetric Emergencies

Postpartum hemorrhage, eclamptic seizure, shoulder dystocia, amniotic fluid embolism — maternal emergencies that unfold in minutes and demand flawless coordinated response. VR simulation allows nurses to experience the full sensory and time pressure of these rare but catastrophic events repeatedly, building the muscle memory that saves lives.

02

Rapid Deterioration & Code Response

Recognizing subtle early warning signs of clinical deterioration — before a patient crashes — is a skill built through experience that most nurses rarely get. VR scenarios simulate a patient's trajectory from stable to critical, training nurses to catch the signs, activate Medical Emergency Teams, and perform first-responder interventions with confidence.

03

High-Risk Neonatal Transitions

The first minutes of a compromised neonate's life are defined by nursing response speed and accuracy. Scenarios involving preterm delivery, meconium aspiration, and neonatal resuscitation in a VR delivery suite prepare nurses for emergencies where every second counts — and where real-world training opportunities are inherently limited.

04

Multi-System Failure & Triage

When a patient presents with overlapping, atypical symptoms across multiple body systems — sepsis with cardiac involvement, drug interaction crises, or rare autoimmune presentations — standard protocols break down. VR trains nurses to navigate clinical ambiguity, escalate appropriately, and maintain composure under diagnostic uncertainty.

05

Mass Casualty & Surge Events

Pandemic surges, natural disasters, and mass casualty events demand a level of sustained high-intensity nursing performance that traditional training cannot replicate. VR surge simulations stress-test triage decision-making, resource allocation under scarcity, and team communication across rapidly changing patient loads.

06

Neurological & Psychiatric Crises

Acute stroke recognition, status epilepticus management, and acute behavioral emergencies are high-stakes events nurses may encounter rarely but must handle immediately. Immersive simulation builds the pattern recognition and de-escalation skills that make the difference between an effective and a delayed response.

Why Virtual Reality — Not Just Simulation

Presence & Immersion

Full Sensory Engagement

Unlike mannequin simulation, VR creates a genuine sense of presence in a clinical environment — triggering the same stress responses, spatial reasoning, and motor behaviors a real emergency demands. What you train in VR transfers to the real ward.

Repeatability

Unlimited Safe Repetitions

A rare obstetric emergency happens once in a nurse's career — or never. In VR, it can happen 50 times. Each repetition refines response, builds automaticity, and surfaces gaps in knowledge that can be corrected before they become patient safety issues.

AI-Driven Assessment

Objective Performance Data

VR captures every action, timing, and decision in a training session. AI analysis of this behavioral data surfaces objective competency metrics — replacing subjective instructor assessment with reproducible, data-driven feedback that scales across an entire nursing cohort.
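As a minimal illustration of how logged session data could become objective metrics, here is a Python sketch. The event schema, action names, and metric definitions are illustrative assumptions for this page, not the project's actual analysis pipeline.

```python
from dataclasses import dataclass

# Hypothetical event schema: each VR session emits timestamped actions.
@dataclass
class Event:
    t: float        # seconds since scenario start
    action: str     # e.g. "call_met", "apply_oxygen"

def competency_metrics(events, required_actions):
    """Derive simple objective metrics from one session's event log."""
    first_seen = {}
    for e in events:
        # Record only the first time each action was performed.
        first_seen.setdefault(e.action, e.t)
    completed = [a for a in required_actions if a in first_seen]
    return {
        # Fraction of the required checklist the trainee completed.
        "checklist_adherence": len(completed) / len(required_actions),
        # How quickly the trainee acted at all.
        "time_to_first_action": min(first_seen.values()) if first_seen else None,
        # Gaps to target in the next repetition.
        "missed_actions": [a for a in required_actions if a not in first_seen],
    }

session = [Event(4.2, "assess_airway"), Event(9.8, "call_met"),
           Event(15.0, "apply_oxygen")]
required = ["assess_airway", "call_met", "apply_oxygen", "start_compressions"]
print(competency_metrics(session, required))
```

Because the same metrics are computed identically for every trainee, scores can be compared across a whole cohort or tracked for one nurse over repeated runs.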

Scenario Control

Tunable Difficulty & Variability

VR scenarios can be instantly adjusted for acuity, patient demographics, comorbidities, and resource availability — creating an infinite matrix of training conditions. Each nurse receives tailored challenge levels based on their demonstrated competency.
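One simple way such tailoring could work is a staircase rule: raise scenario acuity when a trainee exceeds a target score, lower it otherwise. The Python sketch below is a hypothetical illustration; the parameter names and mappings are assumptions, not the project's scenario engine.

```python
def next_difficulty(current, score, target=0.8, step=0.1):
    """Nudge scenario acuity up when the trainee beats the target
    competency score, down when they fall short (simple staircase)."""
    if score >= target:
        current = min(1.0, current + step)
    else:
        current = max(0.1, current - step)
    return round(current, 2)

def scenario_params(difficulty):
    """Map one difficulty scalar onto hypothetical scenario knobs."""
    return {
        "bleed_rate_ml_min": 200 + 600 * difficulty,   # hemorrhage severity
        "vitals_decay_speed": 0.5 + difficulty,        # how fast the patient crashes
        "distractor_events": int(5 * difficulty),      # alarms, interruptions
    }

# A trainee scoring 0.9 on a 0.5-acuity scenario is promoted to 0.6.
print(next_difficulty(0.5, 0.9))
print(scenario_params(0.5))
```

Keeping difficulty as a single scalar that fans out into many scenario parameters makes it easy to generate the "infinite matrix" of conditions from one competency signal.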

Team Training

Multiplayer Clinical Teams

Emergency response is rarely a solo act. UNLV's VR infrastructure supports multiplayer environments where nurses, physicians, and respiratory therapists train together in a shared virtual space — building the interprofessional communication skills that critical situations demand.

Accessibility

Training Without Travel

High-fidelity simulation centers are expensive and geographically concentrated. A VR headset can be shipped anywhere — making elite clinical training accessible to rural hospitals, understaffed facilities, and nursing programs that can't afford a traditional simulation lab.

State-of-the-Art VR & AR Devices

UNLV's App Development Team has extensive hands-on experience developing and deploying applications across all major immersive computing platforms.

🥽 Meta Quest 2 / 3

Standalone wireless VR — no PC required. Ideal for deployment in hospital settings and remote training sites.

💻 HTC VIVE

PC-tethered, room-scale VR with precision tracking. Maximum fidelity for high-stakes simulation scenarios.

🔬 Microsoft HoloLens 2

Mixed reality headset overlaying holographic content onto the real world — ideal for procedural overlay training.

📱 iOS / Android AR

Augmented reality on mobile devices — accessible, deployable at scale for lower-acuity training scenarios.

🖥️ Windows & Desktop

Non-immersive desktop simulation for cognitive skill building, protocol review, and decision-tree training.

NVIDIA RTX A4000 Lab

18-machine GPU lab for real-time ray tracing, AI simulation rendering, and concurrent developer workstations.

Built in Unity

All UNLV VR/AR applications are developed in the Unity engine — the same platform used for the team's prior published work including the OCR-Enhanced AR Indoor Navigation System (AIVR 2022). Unity provides cross-platform deployment to all device targets from a single codebase, and supports real-time AI integration for adaptive scenario logic.

The Team Behind the Technology

A Decade of VR/AR Research at UNLV CS

The UNLV Department of Computer Science has been developing immersive technology since the earliest days of consumer VR — teaching Unity development, running funded AR/VR research projects, and building production-quality applications deployed on real devices for real users.

Teaching History

The CS489 and CS689 Unity VR development courses have produced driving simulators, flight simulators, and fully realized VR experiences on HTC VIVE, HoloLens 2, and Quest. The infrastructure for nursing simulation already exists — we're redirecting it.

Published VR Research

Dr. Fonseca's OCR-Enhanced Augmented Reality Indoor Navigation (AIVR 2022, IEEE) demonstrates the team's ability to publish peer-reviewed immersive technology research — a foundation for nursing VR publications to follow.

Funded Track Record

HEERF II ($60K, Fonseca), School of Public Health Walk2School2Day app (Taghva & Fonseca), NSF 222522 ($999,815, STEM AI), and NSF GPU cluster ($500K) — a proven funding history for applied technology projects.

Student Research Workforce

An active Game Development Club and pipeline of undergraduate and graduate VR researchers provide the development talent to build and iterate on clinical simulation environments rapidly and cost-effectively.

Prior Application

Walk2School2Day — Health-Focused App Development

Taghva and Fonseca previously built the Walk2School2Day mobile application for the School of Public Health, promoting healthier lifestyles for K-12 students and families. This project demonstrated the team's ability to execute health-focused technology projects from concept through production deployment.

Hardware Inventory

Ready-to-Use Device Fleet

GPU Workstations: 18 × NVIDIA RTX A4000
VR Headsets: HTC VIVE, Quest 2
AR Devices: HoloLens 2
GPU Cluster: NSF · Switch Cloud
Dev Engine: Unity (cross-platform)
Backend: On-site ML servers

Clinical Vision. Computational Power.

Three investigators whose combined expertise makes this project uniquely possible.

Dr. Jennifer Vanderlaan

Lead Investigator · Associate Professor, School of Nursing

A certified nurse-midwife and PhD-trained researcher, Dr. Vanderlaan provides the clinical foundation of this project — defining the edge cases that matter most, validating scenario realism, and ensuring that every simulation translates into genuine bedside competency. Her expertise in maternal emergencies and nursing quality measurement drives the training content.

Clinical Lead · Scenario Design · Maternal Health

Dr. Kazem Taghva

Co-Investigator · Chair & Professor, Computer Science

Dr. Taghva brings decades of experience in applied AI, large-scale data systems, and computing infrastructure. As chair of Computer Science, he oversees the department's VR/AR research ecosystem — including the GPU lab, server infrastructure, and the student researcher pipeline that powers development at scale.

AI Systems · Infrastructure · Research Lead

Dr. Jorge Fonseca Cacho

Co-Investigator · Assistant Professor, Computer Science

Dr. Fonseca is the team's hands-on VR/AR engineer and health technology developer. His published work in OCR-enhanced AR navigation (AIVR 2022), combined with experience developing health-focused applications under federal and state grants, makes him the ideal technical lead for building the nursing simulation environments themselves.

VR/AR Dev · Unity / AI · Health Tech

Interested in VR Nursing Training?

We're looking to partner with nursing schools, healthcare systems, and simulation centers that want to push the frontier of clinical training. If you're interested in collaborating, piloting scenarios, or funding this work — we'd love to talk.
