The demo is always impressive.
A technician puts on the headset. A holographic overlay appears - wiring diagrams, torque specs, step-by-step inspection checklists hovering exactly where they need to be. No manual to flip through. No looking away from the aircraft. The system guides you through the task. Errors drop. Speed increases. Everyone in the conference room nods.
Then you go to an actual maintenance hangar.
I've spent time in those hangars. I've sat across from aircraft maintenance technicians and asked them, in detail, what their work actually looks like - not what the manual says it should look like, but what happens on a Tuesday morning with a tight turnaround window and a part that doesn't match the specification. What I found challenges almost everything the AR demo promises.
The Conference Room Is Not the Hangar
Aviation maintenance is one of the most safety-critical work environments in the world. An error here is not a bad user experience. It can be the beginning of a chain of events that ends in catastrophe. Technicians who have been doing this work for 10 or 20 years know this. Their caution is not a personality trait. It is a professional norm built on the understanding that the cost of a mistake is high and irreversible.
AR/XR systems enter this environment with a specific promise: reduce reliance on paper manuals, surface the right information at the right time, and guide technicians through complex inspection procedures. The technology is real. The potential is real. But between the demo and the hangar floor, something gets lost.
In a series of interviews I conducted with experienced maintenance technicians as part of NSF-funded research, I heard the same concern expressed in different ways. One technician put it simply: "I trust my hands. I don't know if I trust that thing yet."
That sentence tells you more about XR adoption in aviation than most market analyses.
What Technicians Are Actually Worried About
When you ask technicians whether they'd use an XR system, they don't say no. They say it depends - and then they list conditions that most demo environments never account for.
Physical ergonomics under task demands: Aircraft maintenance requires sustained physical work in cramped spaces, under poor lighting, with tools in hand and a body contorted around the structure. A headset that feels fine in a well-lit lab for 20 minutes becomes a different device at hour three of an inspection. Weight distribution, field-of-view occlusion, heat buildup - these matter enormously for a population whose job is physically demanding and whose alertness directly affects safety.
The trust problem with novel information: Experienced technicians have built their knowledge over years of pattern recognition. When an AR/XR system overlays information that doesn't match what they know - or worse, provides information they can't independently verify - it creates a conflict they have no protocol for resolving. Who do they trust: the system or their experience? Right now, there's no clear answer. And in a safety-critical environment, ambiguity is a problem.
Workflow disruption as a hidden cost: Maintenance tasks are not performed in isolation. Technicians coordinate with other crew members, reference multiple systems, and operate under real-time schedule pressure. A technology that interrupts coordination patterns or requires pausing a task to interact with a system interface creates friction that productivity metrics in controlled studies rarely capture.
Fear of dependency: Several technicians raised a concern that doesn't appear in most usability evaluations: what happens when the system goes down? When a technician has always had the XR system to guide them through a specific procedure, do they retain the ability to do that procedure without it? This is not a hypothetical concern for a population whose professional identity is built around knowing how to work under any conditions.
What a Usability Study Misses
Here's the deeper issue. Most XR usability studies - including many good ones - measure task completion, error rate, and time on task in controlled conditions. Participants complete a standardized procedure. Observers note deviations. The system is rated against the manual. If the AR condition outperforms the manual condition, the system is deemed effective.
But a usability study is not a deployment study. The gap between "this system performed better than a manual under these conditions" and "technicians will adopt and effectively use this system in their actual work environment" is enormous - and the field has not fully reckoned with it.
Adoption requires more than usability. It requires trust, and trust is built differently in safety-critical environments. Trust requires understanding how a system fails, not just how it performs. It requires the ability to verify the system's output independently. It requires a period of supervised use that allows technicians to develop calibrated confidence in the technology - confidence that is neither overtrust nor avoidance.
This is what the demo doesn't show. It shows a system working. It doesn't show a technician learning, over months, whether and when to trust it.
The Barriers That Survived the Hype Cycle
Across research into XR implementation in aviation, the barriers to adoption are not primarily technical. The headsets are good enough. The tracking is accurate enough. The content management systems exist.
The barriers are organizational and human. They include:
- The absence of clear authority. Whose information takes precedence when the AR system and the technician's training disagree? Without a policy answer to this question, technicians default to their training - which is the right safety behavior, but it also means the AR system is sidelined precisely when it matters most.
- Training and qualification gaps. Introducing AR into aviation maintenance requires updating training curricula, qualification standards, and maintenance record systems. These are slow-moving, heavily regulated processes that technology adoption timelines routinely underestimate.
- Maintenance of manual competency. Regulators and technicians alike are asking whether AR dependency creates a competency risk. This is a real concern that adoption plans need to address explicitly, not assume away.
What Good Adoption Actually Looks Like
None of this means AR systems don't belong in aircraft maintenance. The evidence for their potential is real. Guided inspection procedures, remote collaboration support, and reduced reliance on paper documentation all represent genuine opportunities to reduce error rates and improve consistency. But the path to adoption runs through the technician's trust, not around it.
That means designing systems that fail gracefully and visibly - so technicians can understand what the system doesn't know. It means structured field trials with experienced technicians in their actual work environments, not controlled lab studies with participants who don't bear the professional consequences of an error. It means giving technicians ownership over the transition: involving them in configuration decisions, building in mechanisms for them to flag when the system is wrong, and treating their skepticism as information rather than resistance.
The technician who told me he didn't know if he trusted "that thing" yet was not being obstructionist. He was being exactly as careful as his job requires.
The question isn't whether AR will be adopted in aviation maintenance. The question is whether the people deploying it are willing to earn that trust properly - in the hangar, not the conference room.
I study human factors in extended reality and AI-enabled systems, with a focus on aviation maintenance and air traffic control. If this resonates with work you're doing in XR deployment or human-centered system design, I'd be glad to hear from you - find me on LinkedIn or read more about my research at mdrashi.com/research.