Preparing nurses for the unexpected — rare, high-stakes clinical scenarios that real-world training can't reliably deliver. Using UNLV's VR infrastructure, clinical nursing expertise, and AI-driven simulation to build competency before it's needed at the bedside.
Clinical edge cases — rare but life-threatening events like obstetric emergencies, sudden cardiac arrest in atypical presentations, or rapidly deteriorating multi-system failures — are precisely the situations nurses are least prepared for, yet most need to handle without hesitation.
Traditional simulation training relies on mannequins and scripted scenarios. High-fidelity VR changes the equation entirely — allowing nurses to experience rare emergencies with realistic spatial, sensory, and time-pressure cues, and repeat scenarios as many times as needed until responses become instinct.
"The best time to encounter a rare emergency for the first time is in a simulator, not at the bedside."
This project brings together UNLV's clinical nursing leadership with one of the most capable VR and AI development teams in the region — combining Vanderlaan's expertise in maternal health and nursing quality with Taghva and Fonseca's deep track record in immersive technology and intelligent systems.
Dr. Vanderlaan's clinical expertise drives the scenario design — identifying the high-stakes, low-frequency situations where nursing preparedness matters most.
Postpartum hemorrhage, eclamptic seizure, shoulder dystocia, amniotic fluid embolism — maternal emergencies that unfold in minutes and demand flawless coordinated response. VR simulation allows nurses to experience the full sensory and time pressure of these rare but catastrophic events repeatedly, building the muscle memory that saves lives.
Recognizing subtle early warning signs of clinical deterioration — before a patient crashes — is a skill built through experience most nurses rarely get. VR scenarios simulate a patient's trajectory from stable to critical, training nurses to catch the signs, activate Medical Emergency Teams, and perform first-responder interventions with confidence.
The first minutes of a compromised neonate's life are defined by nursing response speed and accuracy. Scenarios involving preterm delivery, meconium aspiration, and neonatal resuscitation in a VR delivery suite prepare nurses for emergencies where every second counts — and where real-world training opportunities are inherently limited.
When a patient presents with overlapping, atypical symptoms across multiple body systems — sepsis with cardiac involvement, drug interaction crises, or rare autoimmune presentations — standard protocols break down. VR trains nurses to navigate clinical ambiguity, escalate appropriately, and maintain composure under diagnostic uncertainty.
Pandemic surges, natural disasters, and mass casualty events demand a level of sustained high-intensity nursing performance that traditional training cannot replicate. VR surge simulations stress-test triage decision-making, resource allocation under scarcity, and team communication across rapidly changing patient loads.
Acute stroke recognition, status epilepticus management, and acute behavioral emergencies are high-stakes events nurses may encounter rarely but must handle immediately. Immersive simulation builds the pattern recognition and de-escalation skills that make the difference between an effective response and a delayed one.
Unlike mannequin simulation, VR creates a genuine sense of presence in a clinical environment — triggering the same stress responses, spatial reasoning, and motor behaviors a real emergency demands. What you train in VR transfers to the real ward.
A rare obstetric emergency happens once in a nurse's career — or never. In VR, it can happen 50 times. Each repetition refines response, builds automaticity, and surfaces gaps in knowledge that can be corrected before they become patient safety issues.
VR captures every action, timing, and decision in a training session. AI analysis of this behavioral data surfaces objective competency metrics — replacing subjective instructor assessment with reproducible, data-driven feedback that scales across an entire nursing cohort.
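As one illustration of how logged session data could become objective metrics, here is a minimal sketch. The event names, expected-action list, and scoring rules are illustrative assumptions for a postpartum-hemorrhage drill, not the project's actual schema:

```python
from dataclasses import dataclass

# Illustrative sketch only: the event names and scoring rules below
# are assumptions, not the project's actual data schema.

@dataclass
class Event:
    t: float      # seconds since scenario start
    action: str   # logged trainee action

# Actions a postpartum-hemorrhage scenario might expect, in order.
EXPECTED = ["recognize_bleeding", "call_for_help",
            "fundal_massage", "administer_uterotonic"]

def competency_metrics(events: list[Event]) -> dict:
    """Turn a raw action log into simple, reproducible metrics."""
    performed = {e.action: e.t for e in events}
    # Fraction of expected critical actions the trainee performed.
    coverage = sum(a in performed for a in EXPECTED) / len(EXPECTED)
    # Time from scenario start to the first critical action, if taken.
    first_response = performed.get(EXPECTED[0])
    # Were the performed actions completed in the intended order?
    times = [performed[a] for a in EXPECTED if a in performed]
    in_order = times == sorted(times)
    return {"coverage": coverage,
            "first_response_s": first_response,
            "correct_order": in_order}

log = [Event(4.2, "recognize_bleeding"),
       Event(9.8, "call_for_help"),
       Event(15.1, "fundal_massage")]
print(competency_metrics(log))
# {'coverage': 0.75, 'first_response_s': 4.2, 'correct_order': True}
```

Because every metric is computed from the same logged events, two instructors reviewing the same session get the same numbers — the reproducibility that subjective assessment lacks.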
VR scenarios can be instantly adjusted for acuity, patient demographics, comorbidities, and resource availability — creating an infinite matrix of training conditions. Each nurse receives tailored challenge levels based on their demonstrated competency.
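A simple rule for scaling challenge to demonstrated competency might look like the sketch below. The parameter names, difficulty bands, and scaling formulas are assumptions for illustration, not the project's actual adaptation logic:

```python
# Illustrative sketch: step scenario difficulty up or down based on the
# trainee's last session score. Thresholds and parameters are assumed.

def next_difficulty(current: int, score: float,
                    lo: float = 0.6, hi: float = 0.85) -> int:
    """Step difficulty (1-10) based on the last session score (0-1)."""
    if score >= hi:
        return min(current + 1, 10)  # mastered: raise acuity
    if score < lo:
        return max(current - 1, 1)   # struggling: ease off
    return current                   # in the target band: hold steady

def scenario_config(difficulty: int) -> dict:
    """Example scenario parameters that might scale with difficulty."""
    return {
        "patient_acuity": difficulty,            # speed of deterioration
        "comorbidities": max(0, difficulty - 4), # added complications
        "staff_available": max(1, 4 - difficulty // 3),
    }

new_level = next_difficulty(5, 0.9)  # high score: difficulty rises to 6
print(scenario_config(new_level))
```

Each nurse thus climbs their own curve: strong performance unlocks harder presentations, while a struggling trainee repeats at a manageable acuity until the fundamentals are solid.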
Emergency response is rarely a solo act. UNLV's VR infrastructure supports multiplayer environments where nurses, physicians, and respiratory therapists train together in a shared virtual space — building the interprofessional communication skills that critical situations demand.
High-fidelity simulation centers are expensive and geographically concentrated. A VR headset can be shipped anywhere — making elite clinical training accessible to rural hospitals, understaffed facilities, and nursing programs that can't afford a traditional simulation lab.
UNLV's App Development Team has extensive hands-on experience developing and deploying applications across all major immersive computing platforms.
Standalone wireless VR — no PC required. Ideal for deployment in hospital settings and remote training sites.
PC-tethered, room-scale VR with precision tracking. Maximum fidelity for high-stakes simulation scenarios.
Mixed reality headset overlaying holographic content onto the real world — ideal for procedural overlay training.
Augmented reality on mobile devices — accessible, deployable at scale for lower-acuity training scenarios.
Non-immersive desktop simulation for cognitive skill building, protocol review, and decision-tree training.
18-machine GPU lab for real-time ray tracing, AI simulation rendering, and concurrent developer workstations.
All UNLV VR/AR applications are developed in the Unity engine — the same platform used for the team's prior published work including the OCR-Enhanced AR Indoor Navigation System (AIVR 2022). Unity provides cross-platform deployment to all device targets from a single codebase, and supports real-time AI integration for adaptive scenario logic.
The UNLV Department of Computer Science has been developing immersive technology since the earliest days of consumer VR — teaching Unity development, running funded AR/VR research projects, and building production-quality applications deployed on real devices for real users.
CS489 and CS689 Unity VR development courses producing driving simulators, flight simulators, and fully realized VR experiences on HTC VIVE, HoloLens 2, and Quest. The infrastructure for nursing simulation already exists — we're redirecting it.
Dr. Fonseca's OCR-Enhanced Augmented Reality Indoor Navigation (AIVR 2022, IEEE) demonstrates the team's ability to publish peer-reviewed immersive technology research — a foundation for nursing VR publications to follow.
HEERF II ($60K, Fonseca), School of Public Health Walk2School2Day app (Taghva & Fonseca), NSF 222522 ($999,815, STEM AI), and NSF GPU cluster ($500K) — a proven funding history for applied technology projects.
An active Game Development Club and pipeline of undergraduate and graduate VR researchers provide the development talent to build and iterate on clinical simulation environments rapidly and cost-effectively.
Taghva and Fonseca previously built the Walk2School2Day mobile application for the School of Public Health, promoting healthier lifestyles for K-12 students and families. This project demonstrated the team's ability to execute health-focused technology projects from concept through production deployment.
Three investigators whose combined expertise makes this project uniquely possible.
A certified nurse-midwife and PhD-trained researcher, Dr. Vanderlaan provides the clinical foundation of this project — defining the edge cases that matter most, validating scenario realism, and ensuring that every simulation translates into genuine bedside competency. Her expertise in maternal emergencies and nursing quality measurement drives the training content.
Dr. Taghva brings decades of experience in applied AI, large-scale data systems, and computing infrastructure. As chair of Computer Science, he oversees the department's VR/AR research ecosystem — including the GPU lab, server infrastructure, and the student researcher pipeline that powers development at scale.
Dr. Fonseca is the team's hands-on VR/AR engineer and health technology developer. His published work in OCR-enhanced AR navigation (AIVR 2022), combined with experience developing health-focused applications under federal and state grants, makes him the ideal technical lead for building the nursing simulation environments themselves.
We're looking to partner with nursing schools, healthcare systems, and simulation centers who want to push the frontier of clinical training. If you're interested in collaborating, piloting scenarios, or funding this work — we'd love to talk.