Augmented and Virtual Reality technologies bridge the gap between the physical and digital realms, crafting immersive experiences that transform our perception of reality. Within healthcare, these technologies are already reshaping patient care: M Health Fairview surgeons have used VR models to gain a deeper understanding of a newborn's chest anatomy, enhancing precision in a crucial tumor removal procedure, while AR workshops help elderly patients understand age-related eye diseases through immersive learning. These stories reveal a fascinating intersection between innovative technologies and patient care.
Emerging use cases span clinical training and medical education, therapeutic and patient care applications, patient data interaction, and surgical guidance.
A 2024 study examining laparoscopic surgical training in Switzerland revealed a crucial finding:
Thirty-two (65%) heads of departments indicated that residents have sufficient practical exposure in the operating room, but the ability to work independently with obtaining the specialist title is seen critically (71%).
While laparoscopic surgery offers significant benefits, surgeons encounter countless obstacles, including reduced haptic perception and feedback, the fulcrum effect causing scaled movements, restricted manipulation freedom, and suboptimal ergonomics. With department chairs criticizing current medical training for its lack of a "competency-based curriculum," the integration of simulation-based technologies offers a valuable opportunity to enhance skills and build confidence.
As for sector-specific innovations: pediatric care is being transformed by immersive VR experiences that engage children during medical procedures, reducing anxiety while providing real-time data on emotional responses. Mental health platforms are breaking down barriers by creating safe virtual environments where patients can confront anxieties under close clinical supervision, all while enabling remote therapy at unprecedented scale. Physical rehabilitation is adopting immersive environments that transform tedious recovery exercises into engaging challenges, complete with precise motion tracking that helps therapists optimize treatment plans. Perhaps most excitingly, the operating room itself is evolving: surgical planning tools convert traditional medical imaging into interactive 3D models, allowing surgeons to virtually rehearse procedures and anticipate complications before making a single incision.
Among established corporations exploring healthcare-focused AR/VR products, notable examples include the Apple Vision Pro (UC San Diego has used it in a surgical environment), the Microsoft HoloLens (which delivers clinical tutorials to medical students), Siemens' use of the Oculus Quest 2 in surgical modules that train surgeons and technicians on operating a C-arm, and ByteDance offering employees VR therapy benefits through XRHealth.
This investigation will additionally cover key drivers of growth, limitations, and emerging startups.
Key Drivers of Growth
In 2024, North America captured 41% of the global AR/VR-in-healthcare market, a regional segment valued at an estimated $1 billion. At a CAGR of 24.81%, the global market is projected to reach $9.24 billion by 2030.
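These figures imply a global 2024 base of roughly $2.44 billion (the $1 billion North American segment divided by its 41% share). A quick sanity check of the projection arithmetic, with the implied base labeled as an inference rather than a stated number:

```python
# Sanity check of the article's market projection. The global 2024 base
# is inferred from the regional figure and share; it is not stated directly.
regional_2024 = 1.0       # North American segment, $B (from the article)
regional_share = 0.41     # North America's share of the global market
cagr = 0.2481
years = 6                 # 2024 -> 2030

global_2024 = regional_2024 / regional_share       # implied: ~$2.44B
global_2030 = global_2024 * (1 + cagr) ** years

print(f"Implied global 2024 base: ${global_2024:.2f}B")
print(f"Projected global 2030:    ${global_2030:.2f}B")  # ~$9.2B
```

Compounding the implied base at 24.81% for six years lands within rounding distance of the $9.24 billion figure, confirming that the projection refers to the global market rather than the North American segment alone.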
A key market catalyst is the accelerating adoption rate, reinforced by the rising number of FDA approvals. As of September 6, 2024, the U.S. Food and Drug Administration had authorized 69 medical devices that incorporate AR/VR. A few of these devices include Pixee Medical's Knee+ (designed to assist orthopedic surgeons during total knee arthroplasty procedures), Augmedics' xvision Spine System (used to treat over 8,500 patients and implant more than 45,000 pedicle screws across 25 US states), and Luminopia One (binocular therapy that uses a VR headset to stream a child's favorite shows). As a key authority in validating healthcare innovations, the FDA's recent approvals of new technologies signal a positive step forward.
Luminopia Inc.’s virtual reality headset system is the first FDA-approved VR binocular therapy for amblyopia in children 4 to 7 years old.
Another significant market driver is the rapid pace of technological advancements within the industry. In the last year, innovations related to diagnostic imaging integrations, directional display technologies, and real-time 3D image rendering have enhanced accuracy for thousands of use cases.
Diagnostic imaging integrations represent a transformative advancement in medical technology, connecting traditional imaging modalities like X-rays, MRIs, and CT scans with AR systems to enhance visualization and accuracy. By overlaying three-dimensional digital information onto physical environments, surgeons are able to visualize critical anatomical structures with enhanced precision, ultimately improving decision-making. In surgical and interventional radiology settings, AR provides real-time guidance by integrating imaging data directly into the clinician’s field of view, enabling more accurate navigation and reducing the need for repeat imaging.
The underlying software processes raw imaging data through multiple stages: deep learning-based segmentation of anatomical structures (centered around AI-powered identification), volume rendering (e.g., stereoscopic volumetric path tracing), and multi-modality fusion combining imaging types. Advanced waveguide-based optical displays and micro-LED arrays deliver high-brightness visualization, while eye-tracking systems ensure precise parallax correction and proper depth perception. The system maintains continuous calibration through real-time deformation compensation algorithms that account for tissue movement and patient positioning changes. Beyond clinical applications, AR significantly impacts medical education and training by offering immersive platforms where healthcare professionals can explore complex anatomical structures and refine procedural skills.
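The multi-stage pipeline described above can be sketched in miniature. This is a toy illustration on synthetic data, not a clinical pipeline: simple thresholding stands in for deep-learning segmentation, a maximum-intensity projection stands in for volumetric path tracing, and an alpha blend stands in for multi-modality fusion.

```python
import numpy as np

def segment(volume, threshold=0.5):
    """Stage 1 (stand-in): label voxels above an intensity threshold.
    Real systems use deep-learning segmentation of anatomical structures."""
    return (volume > threshold).astype(np.float32)

def render_mip(volume, axis=0):
    """Stage 2 (stand-in): maximum-intensity projection to a 2D view.
    Real systems use stereoscopic volumetric path tracing."""
    return volume.max(axis=axis)

def fuse(view_a, view_b, alpha=0.6):
    """Stage 3 (stand-in): weighted fusion of two co-registered modalities."""
    return alpha * view_a + (1 - alpha) * view_b

rng = np.random.default_rng(0)
ct = rng.random((32, 64, 64))   # synthetic "CT" volume
mri = rng.random((32, 64, 64))  # synthetic, already co-registered "MRI"

mask = segment(ct)
overlay = fuse(render_mip(ct * mask), render_mip(mri))
print(overlay.shape)  # (64, 64): a 2D view ready to overlay in an AR display
```

The real engineering difficulty lies in the stages this sketch omits: registering the modalities to each other and to the patient, and compensating for tissue deformation in real time.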
Shifting to directional display technologies, these innovations represent a cutting-edge approach in the realm of visual systems, enabling the precise control of light paths to direct visuals exclusively towards the viewer, enhancing clarity while minimizing energy loss. By leveraging advanced optics and waveguides, directional displays ensure that projected images are bright, crisp, and immersive.
In the context of AR/VR, directional displays play a crucial role in addressing challenges such as limited field-of-view, low resolution, and user discomfort. By focusing light through controlled optical paths, these systems allow for compact, transparent displays that seamlessly integrate virtual images into the user's field of vision without obstructing real-world views. A common challenge in AR/VR technology is vergence-accommodation mismatch, where the brain expects content at a specific distance, but discrepancies in focal depth result in dizziness and headaches. Through the aforementioned advancements in visual integration, medical professionals can work in more natural settings, mitigating any potential challenges with depth perception.
Lastly, real-time 3D image rendering has significantly transformed the market, driven by advancements in graphics processing units (GPUs), real-time ray tracing, and AI-powered rendering methods. Modern GPUs, with their parallel processing capabilities, perform complex computations simultaneously, speeding up rendering and delivering high-quality, interactive visualizations in real time. Real-time ray tracing improves image realism by casting virtual light rays and calculating how they interact with objects, enhancing the precision of virtual environments. Additionally, AI-driven rendering techniques optimize image quality and reduce computational load by intelligently predicting and refining visual elements, minimizing hardware strain.

A few notable milestones: VA Immersive (an initiative of the U.S. Department of Veterans Affairs) deployed more than 1,200 virtual reality headsets across more than 160 VA medical centers and outpatient clinics in all 50 U.S. states last July; the West Cancer Center & Research Institute announced holographic doctor-patient visits for rural communities; and Montreal researchers built a virtual reality "emergency room" that can be used to train doctors to support injured children in life-threatening situations such as car crashes, falls, and fires.
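The core operation behind the ray tracing mentioned above is a ray-primitive intersection test, evaluated per pixel. A minimal illustration of the math for a single ray against a sphere (a real renderer runs millions of these in parallel on the GPU):

```python
import numpy as np

def ray_sphere_hit(origin, direction, center, radius):
    """Return the nearest positive hit distance along a unit-length ray,
    or None if the ray misses the sphere (quadratic discriminant < 0)."""
    oc = origin - center
    b = 2.0 * np.dot(oc, direction)
    c = np.dot(oc, oc) - radius ** 2
    disc = b * b - 4.0 * c          # direction assumed unit-length (a = 1)
    if disc < 0:
        return None                 # ray misses the sphere
    t = (-b - np.sqrt(disc)) / 2.0  # nearer of the two roots
    return t if t > 0 else None

eye = np.array([0.0, 0.0, 0.0])
ray = np.array([0.0, 0.0, 1.0])     # unit ray looking straight ahead
t = ray_sphere_hit(eye, ray, center=np.array([0.0, 0.0, 5.0]), radius=1.0)
print(t)  # 4.0: the ray hits the front face of the sphere
```

Real-time ray tracing hardware accelerates exactly this kind of test (against triangles and bounding volumes rather than spheres), which is what makes physically plausible lighting feasible at interactive frame rates.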
Limitations
Barriers to growth in the market include high initial costs, issues with data privacy, and concerns with patient isolation. A study conducted by researchers from Wright State University’s College of Nursing and Health, in collaboration with Cincinnati Children’s Hospital Medical Center, revealed the following:
Initially, virtual reality training is more expensive, costing $327.78 per participant (totaling $106,951.14) compared to live disaster exercises at $229.79 per participant (totaling $18,617.54 per exercise). However, when development costs for VR are distributed over repeated training sessions across three years, VR becomes less expensive at $115.43 per participant, whereas the cost of live drills remains fixed.
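The study's finding reflects a standard fixed-versus-marginal-cost tradeoff: VR carries a high one-time development cost but a low cost per additional session, while a live drill's per-participant cost stays flat. A sketch of the amortization logic, using illustrative numbers (only the $229.79 live-drill figure is from the study; the VR fixed and marginal costs here are hypothetical):

```python
# Amortization sketch: per-participant cost of a fixed-cost program
# falls as participants accumulate. VR fixed/marginal costs below are
# hypothetical placeholders, not the study's internal figures.

def per_participant(fixed_cost, marginal_cost, participants):
    """Total cost per participant after amortizing the fixed cost."""
    return fixed_cost / participants + marginal_cost

vr_fixed, vr_marginal = 100_000.0, 20.0   # hypothetical VR cost structure
live_cost = 229.79                        # per participant, from the study

for n in (100, 500, 1_000, 5_000):
    vr = per_participant(vr_fixed, vr_marginal, n)
    print(f"{n:>5} participants: VR ${vr:8.2f} vs live ${live_cost:.2f}")

# Break-even: fixed/n + marginal = live  =>  n = fixed / (live - marginal)
breakeven = vr_fixed / (live_cost - vr_marginal)
print(f"VR becomes cheaper after ~{breakeven:.0f} participants")
```

This is why the study's three-year horizon matters: the conclusion flips from "VR is more expensive" to "VR is cheaper" once enough sessions have absorbed the development cost.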
Moreover, data privacy warrants attention. A 2024 study published in JMIR XR and Spatial Computing highlighted significant privacy risks associated with XR technologies. Security risks include immersive manipulation attacks, which exploit unique XR features to compromise system integrity, as well as hardware exploitation targeting XR-specific components. Privacy concerns primarily involve the potential disclosure of sensitive user data, such as biometric and psychological profiles, which can be maliciously extrapolated.
Lastly, a 2022 study conducted by the Turner Institute for Brain and Mental Health at The Melbourne Clinic revealed that many professionals were apprehensive about the isolating effects of virtual reality on patients.
“What that could mean or do in terms of their safety, if they were to have a panic attack as a result of the actual task…also patients who dissociate, feeling out of body, unreal, being in a virtual reality.” (P04, Psychologist)
Such immersive environments, while potentially therapeutic, can also heighten feelings of detachment or dissociation, leading to a sense of disconnection from reality. This isolation not only affects the patient’s experience but also poses challenges for clinicians in monitoring and addressing adverse reactions in real time, especially when patients are immersed in a virtual space.
However, these limitations can be mitigated through thoughtful design, as demonstrated by ventures like Company (Un)hacked, a project enabling complex cybersecurity training in virtual reality, and Sandbox VR, a platform that builds group bonding through shared virtual reality experiences.
Emerging Startups
Focusing on startups at Series B or earlier, here are a few innovative ventures solving critical issues within the market:
Alensia XR: Founded in 2023, Alensia XR offers advanced mixed-reality technology that illuminates the human body in three dimensions for enhanced student comprehension in anatomy, neuroanatomy, and healthcare education. AlensiaXR's HoloAnatomy Suite, originally designed at Case Western Reserve University, offers immersive 3D learning experiences through three core applications: “Designer”, which lets instructors create custom presentations using a vast library of 3D anatomical art; “Viewer”, an XR app allowing students to explore the human body in 3D; and “Dashboard”, providing instructors with real-time control and monitoring of class activities. The suite features an Anatomical Library with over 6,000 detailed 3D illustrations of the human body, promoting flexible curriculum delivery and reducing the need for costly cadaveric resources.
Its second product, HoloAnatomy Neuro Suite, includes similar subcomponents, with a shifted focus towards offering 3D explorations of the nervous system, allowing learners to explore neural pathways, spinal cord cross-sections, and brainstem details. The suite enhances understanding, improves spatial comprehension, and offers innovative features for collaborative learning.
CEO Mark Day previously led sales for the Microsoft HoloLens and is a London Business School graduate.
As for the venture’s traction, notable milestones include successful experiments published in the International Journal of Medical Education and the National Library of Medicine. These studies explored the effects of utilizing mixed reality solutions in medical education, revealing that such approaches "increase the capacity to retain acquired knowledge over a greater retention interval." Moreover, HoloAnatomy software is already in use by more than 20 institutions worldwide, ranging from TCU Burnett’s School of Medicine to Universidad del Norte. Last February, the team completed its $3 million Series A funding round, led by Sopris Capital.
Based on pilot studies, HoloAnatomy enabled medical students to learn anatomy twice as fast compared to cadaver dissection, retaining information 44% better when tested later.
Oxford Medical Simulation: Founded in 2017, OMS offers on-demand VR training solutions to medical institutions. Based in London, the team closed its most recent round, a $12.7M Series A led by Frog Capital in January 2024, bringing total funding to $15M.
At its core, the team's single-user software offers flexible, repeatable simulations to enhance clinical reasoning and decision-making, accessible on-screen or in-headset, while its multi-user training environments enable healthcare teams to collaborate seamlessly, implementing treatment plans with comprehensive feedback (with a focus on essential skills like prioritization, delegation, and rapid task-switching). Through AI-driven voice interactions, users refine communication while mastering clinical procedures with intuitive hand controls and guided, step-by-step instructions.
The platform offers an extensive library of over 250 evidence-based cases, covering situations from diabetes emergencies to investigation interpretation, with specialized content for nursing, pediatrics, and mental health. The software provides immediate, personalized feedback to identify improvements, while precision analytics track progress. In addition, "OMS Create" enables scenario customization to align with specific learning objectives, complemented by comprehensive onboarding, regular check-ins, and impact reviews. Notably, OMS offers significant cost and time savings compared to manikin-based simulation, enriching student experiences while increasing accessibility: it reduces staffing time and estate costs by 74%, with expenses ranging from $2.22 to $14.38 per use, significantly lower than the $28.38 to $394.95 of physical simulations (ROI ranges from 120% to 3,000%, with annual savings of $64,000 reported in nursing education).
The founders, Dr. Jack Pottle and Michael Wallace, bring a diverse set of experiences, including clinical work for NHS England and consulting for Accenture and the Disabled Living Foundation. As for the venture’s traction, notable milestones include partnerships with Boston Children’s Hospital, Mayo Clinic, and the University of Oxford, alongside currently delivering over 35,000 simulations per month.
Deepwell DTX’s “Zengence”: Deepwell DTX, founded in 2022, combines interactive media with digital therapeutics to provide FDA-cleared mental health solutions. Through controlled breathing, Zengence (May 2024) transforms stress management into an exhilarating adventure. Players harness their breath to fuel spectacular abilities, execute precise movements, and maintain razor-sharp focus while battling and traversing treacherous terrain. As the intensity builds and chaos swirls, players discover a powerful paradox: the deeper their calm, the stronger they become.
Current chairman and founder Ryan Douglas was previously the CEO of Verilux (2009-2013) and Nextern (2005-2016), bringing extensive experience in medical device design and manufacturing.
The application, available on the Meta Quest platform, features 30 levels of breathwork and biofeedback-based gameplay. Daily challenges offer fresh, evolving scenarios to refine abilities and promote engagement. For a more serene experience, the “Zen Garden” provides a dedicated space to practice breathwork techniques, rewarding a player's progress with bonuses that enhance regular gameplay. Zengence combines research-backed breathing techniques with immersive gameplay to reduce stress and build resilience. By playing for an estimated fifteen minutes, three times a week, players can see improvements in stress management.
The application boasts an average rating of 4.3 stars (40 reviews), alongside being featured on sites like GeekWire, VentureBeat, and PC Gamer. Moreover, the platform's technology has been endorsed by figures like Dave Anthony (creator of Call of Duty: Black Ops), David Jen (Head of Finance at Google X), and Dr. Melita Moore (Vice President of the Global Esports Federation).
The United States Department of Health & Human Services predicts that, by 2030, mental health will become a leading medical concern. Applications like Zengence by Deepwell DTX exemplify the transformative power of AR/VR in improving mental health, redefining the boundaries of immersive experiences.
eXeX (Expanded Existence): Founded in 2022, the team’s patented technology utilizes artificial intelligence and spatial computing to optimize surgical procedures. A few of the startup’s product offerings include “CreatorX”, which delivers AI-optimized surgical profiles customized for surgeons, procedures, and operating rooms, “ViewerX”, which provides a tablet-based solution for seamlessly setting up, performing, and tracking surgeries, and “ExperienceX”, which utilizes spatial computing to offer interactive, AI-driven capabilities directly within the sterile field.
A core component of the startup's offerings is a procedure-specific digital twin, a virtual model that mirrors a surgeon's plan, allowing for real-time insights and error detection before surgery. Surgeons visualize the entire operation in advance, while staff follow a tailored roadmap for efficiency. Unlike traditional systems, eXeX integrates seamlessly into existing workflows, automating schedules, equipment checks, and team communication, reducing downtime. AI analyzes surgical data to optimize procedures, predicting future needs and identifying best practices. The platform works seamlessly with popular devices, such as the Apple Vision Pro, to display critical information within the sterile field, enabling hands-free access to guides and expert support.
The software yields 50% more accurate OR setups, six-times-faster team onboarding, and a 53% reduction in procedural steps.
As for the team’s traction, its technology has been utilized in over 2,000 live surgeries across three continents, and has additionally won the Synapse Innovation Award for Tech Startup of the Year.
Last August, the team raised $5.8M in seed funding to accelerate platform development and support its upcoming commercial debut. Dr. Robert Masson (CEO) is an internationally recognized neurosurgeon specializing in minimally invasive spine surgery and sports spine medicine, while Nicholas Cambata (COO) previously founded 8112 Studios.