The Rise and Rise of VR/MR/XR Flight Training Devices
- Daniel de Vries


The global aviation industry is navigating a period of significant technological change. Faced with escalating training costs, pilot shortages, and the logistical bottlenecks of legacy infrastructure, training organisations are turning to Extended Reality (XR), encompassing Virtual Reality (VR) and Mixed Reality (MR), to modernise their training options.

Let's be clear - XR devices aren't going to replace Level D Full Flight Simulators (FFS). The sheer physical fidelity, sustained motion capabilities, and zero-flight-time type rating approvals of a Level D FFS remain unmatched and indispensable for the critical stages of pilot certification. XR's true value lies in complementing the FFS: supporting procedural familiarisation, accelerating muscle memory acquisition, and providing access to high-risk scenario training at a lower cost and smaller physical footprint.
So, let's look at the technical architecture, regulatory breakthroughs, and the future trajectory of XR in aviation training.
Supplementing the training landscape
The integration of VR and MR into flight training is not a disruptive replacement, but a strategic extension of existing capabilities. Full flight simulators are capital-intensive assets that require massive facilities and tightly managed schedules.
By deploying compact, cost-effective XR flight training devices (FTDs), training centres can offload tasks such as standard operating procedures (SOPs), checklist flows, and cockpit familiarisation from the FFS. This ensures that when a pilot finally steps into a multi-million-dollar Level D FFS, they are already proficient in the aircraft's architecture, reserving expensive FFS time for the tasks where it excels and provides the most value.
Furthermore, XR allows pilots to safely and repeatedly practice high-risk scenarios that are dangerous to simulate in real aircraft. For helicopter pilots, this includes mastering the "energy triangle" of autorotations, recovering from a Vortex Ring State, or managing Loss of Tail Rotor Effectiveness (LTE) in confined mountainous terrain.
Under the Hood
The hardware and software powering today's advanced XR flight trainers have evolved far beyond consumer-grade gaming rigs. They are highly engineered systems designed to eliminate negative training and meet rigorous qualification standards.
Optics and Visual Fidelity

To replicate the visual demands of real-world flight, modern XR simulators utilise video see-through (VST) technology. By blending a fully rendered virtual external environment with high-definition passthrough cameras, pilots can see and interact with their actual hands, iPads, and physical cockpit panels.
Top-tier headsets, such as the Varjo XR-4 Focal Edition, utilise advanced liquid crystal displays that achieve near human-eye resolution. Crucially, they address the Vergence-Accommodation Conflict (VAC) (a common cause of visual fatigue in VR) by employing gaze-driven autofocus. Integrated eye-tracking data adjusts the focal distance in real time, mimicking natural depth perception.
Mitigating Cybersickness via Latency and Motion
Cybersickness remains a primary human factors hurdle, traditionally triggered by visual-vestibular conflict (when the eyes perceive motion that the inner ear does not feel). To combat this, advanced MR simulators utilise a few different methods:
Ultra-Low Latency: Regulators like the European Union Aviation Safety Agency (EASA) mandate a combined sensor-plus-render latency of ≤20 milliseconds.
Mini Motion Platforms: Fully electric, six-degrees-of-freedom (6-DoF) motion bases, such as those used by TRU Simulation's Veris device, generate precise rotational and translational inertial cues that physically align with the visual data.
Control Loading Systems (CLS): Digital electric CLS provides high-fidelity, haptic force feedback on the flight controls, dynamically altering the resistance of the cyclic or yoke based on simulated airspeed and aerodynamic forces. While this doesn't directly negate cybersickness, it does improve overall immersion.
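The latency requirement above is usually reasoned about as a motion-to-photon budget: every stage between head movement and photons reaching the eye must fit inside the total. The sketch below is a minimal illustration of that budgeting exercise; the stage names and millisecond figures are illustrative assumptions, not measurements from any real headset.

```python
# Sketch of a motion-to-photon latency budget check, assuming the
# <=20 ms combined sensor-plus-render target described above.
# Stage timings (in milliseconds) are illustrative, not measured.

LATENCY_BUDGET_MS = 20.0

def motion_to_photon_ms(stages: dict) -> float:
    """Sum per-stage latencies along the tracking-to-display pipeline."""
    return sum(stages.values())

pipeline = {
    "imu_sample": 1.0,       # head-tracker sensor read
    "pose_fusion": 2.0,      # sensor fusion and pose prediction
    "render": 11.0,          # GPU frame render
    "display_scanout": 4.0,  # panel refresh / scan-out
}

total = motion_to_photon_ms(pipeline)
print(f"total latency: {total:.1f} ms "
      f"({'within' if total <= LATENCY_BUDGET_MS else 'over'} budget)")
```

In practice each stage is measured, not assumed, and renderers lean on pose prediction and late-stage reprojection to claw back budget when the GPU runs long.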
Navigating the Regulatory Airspace
The breakthrough moment for lightweight XR simulators was not purely technical; it was regulatory. Aviation authorities are inherently cautious, but the fidelity of modern XR devices has led to a shift in the way major aviation safety agencies view them.
Earning Official Qualifications
In recent years, the industry has witnessed a few regulatory milestones:
Loft Dynamics: Achieved the first-ever EASA FTD Level 3 qualification for a VR simulator in 2021, followed by an FAA Level 7 FTD qualification for its Airbus H125 simulator in 2024.
Leonardo Helicopters: Secured EASA FTD Level 3 and FAA FTD Level 7 certification for its Virtual Extended Reality (VxR) training system.
Brunner Elektronik: Achieved EASA FNPT II certification for its NOVASIM MR DA42, marking the first fixed-wing mixed reality device to be qualified to European standards.
Evolving Frameworks

Currently, devices are largely qualified on a case-by-case basis via Special Conditions, which can be resource-intensive. However, the regulatory rulebook is catching up. EASA is actively working to consolidate FTD standards in line with ICAO Document 9625. This shift emphasises a task-to-tool approach: evaluating what a device achieves for training outcomes rather than requiring a specific hardware footprint (like a massive projection dome). This will pave the way for scalable, standardised approvals of XR systems globally.
The Future of Flight Training Devices?
The next evolution of XR training is the integration of Artificial Intelligence (AI) and biometric analytics, transforming the simulator from a passive environment into a digitally connected, active coaching ecosystem.
AI-Supported Debriefing
Systems are being developed that automatically evaluate a pilot’s maneuvers against established procedural standards. For instance, AI algorithms can analyze a landing approach, compare it to industry-wide data, and automatically generate a debrief report outlining root-mean-square (RMS) deviations in heading, altitude, and airspeed. This reduces the administrative burden on instructors and aligns perfectly with Evidence-Based Training (EBT) methodologies.
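The RMS deviation mentioned above is a standard way to condense an entire approach into a single tracking-error figure per parameter. The sketch below shows the calculation on an invented altitude profile; the numbers are illustrative, and a real debrief system would run this against recorded simulator telemetry for heading, altitude, and airspeed simultaneously.

```python
import math

def rms_deviation(flown, target):
    """Root-mean-square deviation between flown and target values."""
    assert len(flown) == len(target), "series must be the same length"
    return math.sqrt(
        sum((f - t) ** 2 for f, t in zip(flown, target)) / len(flown)
    )

# Illustrative approach data: target vs flown altitude (feet)
target_alt = [1500, 1400, 1300, 1200, 1100]
flown_alt = [1510, 1390, 1320, 1185, 1105]

print(f"altitude RMS deviation: {rms_deviation(flown_alt, target_alt):.1f} ft")
```

A debrief report would then compare each RMS figure against the tolerance for the manoeuvre (for example, the altitude band allowed on a stabilised approach) to flag segments worth reviewing with an instructor.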
Biometric Workload Monitoring

Because pilots wear advanced headsets, simulators can now capture continuous biometric data. Integrated eye-tracking, pupillometry, and heart-rate variability (HRV) sensors can monitor a pilot's cognitive load and stress levels in real time. High saccade rates or dilated pupils can indicate negative affect or cognitive overload, allowing instructors—or the AI—to objectively measure a pilot's situational awareness under pressure.
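As a toy illustration of how those signals might be fused, the heuristic below flags workload when pupil dilation and saccade rate exceed thresholds. Everything here is an assumption for illustration: the thresholds, the baseline pupil diameter, and the three-level output are invented, and a production system would calibrate per pilot and use far richer models.

```python
def workload_flag(pupil_diameter_mm, saccades_per_s, baseline_pupil_mm=3.5):
    """Crude workload heuristic (illustrative thresholds only):
    flag overload when pupil dilation and saccade rate are both high."""
    dilated = pupil_diameter_mm > baseline_pupil_mm * 1.2  # >20% over baseline
    scanning_fast = saccades_per_s > 3.0                   # rapid gaze shifts
    if dilated and scanning_fast:
        return "overload"
    if dilated or scanning_fast:
        return "elevated"
    return "nominal"

print(workload_flag(pupil_diameter_mm=4.5, saccades_per_s=3.6))
```

The value of even a crude flag like this is that it gives the instructor an objective, time-stamped signal to pair with the flight data during debrief, rather than relying on impressions alone.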
Novel Concepts for XR Aviation Training
While the current trajectory of XR is impressive, the unique capabilities of this technology open the door to entirely novel training concepts that are significantly more difficult in traditional FFS environments:
1. Neuro-Adaptive "Stress Inoculation" Scenarios
By tapping into the biometric telemetry (EEG, GSR, pupillometry) collected by the headset, future XR simulators could feature an AI engine that dynamically scales the difficulty of the flight environment in real time. If the system detects the pilot's cognitive load dropping into a "comfort zone," it could automatically introduce a subtle system degradation, such as a slow hydraulic leak or deteriorating instrument meteorological conditions. Conversely, if the pilot reaches a state of cognitive overload, the AI could dynamically stabilise the weather to prevent negative training, ensuring the pilot is constantly kept at the optimal threshold of learning.
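The core of such an engine is a simple closed control loop: nudge difficulty up when measured load falls below the learning band, down when it overshoots. The sketch below shows that loop in its simplest form; the band limits, step size, and the 0-to-1 scales for both load and difficulty are invented for illustration.

```python
def adapt_difficulty(difficulty, cognitive_load,
                     band_low=0.4, band_high=0.8, step=0.05):
    """One tick of a neuro-adaptive difficulty loop (illustrative
    values): hold estimated cognitive load (0..1) inside the target
    learning band by nudging scenario difficulty (0..1)."""
    if cognitive_load < band_low:
        # Comfort zone: e.g. introduce a slow hydraulic leak,
        # or let the weather deteriorate.
        difficulty += step
    elif cognitive_load > band_high:
        # Overload: stabilise the environment to avoid
        # negative training.
        difficulty -= step
    return min(max(difficulty, 0.0), 1.0)

print(adapt_difficulty(difficulty=0.5, cognitive_load=0.2))
```

Run every few seconds against a smoothed load estimate, a loop like this keeps the pilot hovering at the edge of their ability without the instructor scripting each failure by hand.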
2. Global Asymmetric Multi-Domain Mission Rehearsal
XR devices require a fraction of the infrastructure of traditional simulators, meaning they can be easily networked from all parts of the globe. A pilot sitting in an XR cockpit in London could fly a search-and-rescue (SAR) mission while a hoist operator wearing VR haptic gloves (like the SenseGlove Nova 2) stands in a physical mock-up in New York. Both crew members interact with the same digital twin of the aircraft and the environment, allowing for complex Multi-Crew Cooperation (MCC) and Crew Resource Management (CRM) training across continents before they ever meet on the tarmac. This, of course, will require low-latency connections between the devices, which is reasonably achievable with modern internet connections.
3. Haptic-Obfuscation "Blind Switchology" Labs
Using advanced pneumatic haptic gloves (such as HaptX), training programs could force pilots to build more robust muscle memory. In a novel XR scenario, the system could simulate a severe cockpit smoke event, intentionally blinding the visual feed inside the headset. The pilot would then be forced to rely entirely on the force-feedback and tactile cues of their haptic gloves to locate and operate emergency fire-suppression switches, training a level of spatial and tactile awareness that traditional simulators cannot safely replicate.
Where to from here?
Extended Reality will not replace the venerable Level D Full Flight Simulator, nor is it intended to. Instead, it represents a restructuring of the pilot training pipeline. By combining the immersive power of MR, the analytical capabilities of AI, and the vestibular stimulation of advanced mini-motion systems, the aviation industry is building a safer, more efficient, and more scalable blueprint for the next generation of aviators.



