Airborne AR

Augmented Reality is an emerging technology, with most AR companies focusing on indoor use cases, like an AR experience in the office, a workshop or maintenance facility, or at home. But Augmented Reality is interesting in other domains as well, in particular when airborne.

VR — AR — MR — WhatR these terms?

First, let’s distinguish between interconnected but different terms in this space. VR — Virtual Reality is a term used when we are removed from our surrounding environment and placed into an experience that has nothing to do with our immediate vicinity. AR — Augmented Reality is a term used when one remains embedded in one’s physical surroundings, with the experience enriched by a visual overlay on top of reality. MR — Mixed Reality or Merged Reality is a newer term for AR, with emphasis on real and virtual spaces being intertwined: interactions in the real world have an effect on the virtual, and vice versa.

Ground vs. airborne AR

Most AR offerings, like Magic Leap or the Microsoft HoloLens, focus on ground-based, mostly indoor use cases, using a 3D mapping of the user’s immediate environment via 3D SLAM & similar technologies, hand gesture recognition or small control dongles, and a semantic identification of the environment using AI. The technology stack in these offerings assumes that the user is standing with both feet on the ground, usually in a confined space.

When it comes to airborne use cases, ground-based offerings fail to work, as their base assumptions are no longer valid — the user is looking both at the scenery and at the interior of the vehicle they are in, and as soon as the aircraft starts to move, the user is no longer in an inertial reference frame.

The nature of augmentation is also different. In indoor AR, the system sees & builds an understanding of what is being augmented. In an airborne scenario, the target of augmentation is often not visible at all: it may be hidden by low visibility (nighttime, fog, etc.), obstructed by the airframe, or simply too far away (100+ nautical miles) to be recognizable.

What is the focus of airborne AR?

Airborne Augmented Reality provides a geo-conformal overlay on the scenery, or an overlay on the interior of the aircraft, for example the instrument panel. Geo-conformal means that the augmentation is ‘stuck’ to particular geo-coordinates, and will stay there, even as the aircraft or the user moves around & rotates.
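The geo-conformal idea can be sketched in a few lines: given the aircraft’s position and a target geo-coordinate, compute the bearing & elevation at which the symbology must be drawn so it appears ‘stuck’ to that point. This is a minimal flat-earth illustration of my own (the function name is hypothetical, not from any product); a real system would use WGS-84 geometry and full attitude compensation.

```python
import math

def bearing_elevation(ac_lat, ac_lon, ac_alt_m, tgt_lat, tgt_lon, tgt_alt_m):
    """Bearing & elevation from the aircraft to a geo-coordinate.

    Flat-earth (local tangent plane) approximation -- adequate for
    illustration over short ranges, not for a certified system.
    """
    R = 6371000.0  # mean Earth radius in meters
    d_north = math.radians(tgt_lat - ac_lat) * R
    d_east = math.radians(tgt_lon - ac_lon) * R * math.cos(math.radians(ac_lat))
    d_up = tgt_alt_m - ac_alt_m
    bearing = math.degrees(math.atan2(d_east, d_north)) % 360.0
    ground_dist = math.hypot(d_north, d_east)
    elevation = math.degrees(math.atan2(d_up, ground_dist))
    return bearing, elevation
```

Render the symbology at this bearing & elevation every frame, and it stays at the designated coordinate regardless of how the aircraft moves.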

Airborne AR systems are also referred to as Head Mounted Displays (HMDs) or Head Worn Displays (HWDs) in the aviation context, although symbology shown on a Head Mounted Display is not necessarily AR.

Essential components

The main components of airborne AR systems typically are:

  • Precise vehicle position & attitude
  • Head tracking inside the cockpit (optical + IMU fusion, sometimes electromagnetic + IMU fusion)
  • Vehicle & Head position & orientation fusion
  • Symbology generation (a virtual globe)

If done right, the rendered virtual globe matches the real one.
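The fusion step in the list above can be illustrated with plain rotation composition. This is a yaw-only sketch of my own (real systems fuse full three-axis attitude, typically with quaternions): the head orientation measured in the cockpit frame is composed with the aircraft attitude to obtain a world-frame gaze direction.

```python
import math

def rot_z(angle_deg):
    """3x3 rotation matrix about the vertical axis (heading / azimuth)."""
    c = math.cos(math.radians(angle_deg))
    s = math.sin(math.radians(angle_deg))
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

def mat_mul(a, b):
    """Compose two 3x3 rotation matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def apply(m, v):
    """Rotate a 3-vector by a 3x3 matrix."""
    return [sum(m[i][k] * v[k] for k in range(3)) for i in range(3)]

# Aircraft heading 90 deg, pilot's head turned 30 deg relative to the
# cockpit: the world-frame gaze is the composition of the two rotations.
world = mat_mul(rot_z(90.0), rot_z(30.0))
gaze = apply(world, [1.0, 0.0, 0.0])  # boresight vector of the head frame
```

The renderer then draws the virtual globe as seen along this world-frame gaze, which is what makes the overlay line up with the real scenery.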

It’s like Google Earth on Real Earth™!


Head Worn Display Augmented Reality systems allow any kind of symbology to be displayed tied to geo-coordinates. Such a technology provides a platform where symbology ‘stays’ at the designated coordinate, even as the vehicle & user move & rotate.

Symbology, for the time being, targets pilot-oriented use cases, and as such focuses on navigation & flight safety — navigation points, airways, flight routes, procedures, traffic, etc. This is a good starting point, but the possibilities are much greater than that.

Comparison to Head Up Display systems

Augmented Reality / Head Mounted Display systems have the following advantages over Head Up Display systems:

  • 360° all-around symbology vs. forward-only
  • Can draw through the airframe — no blind spots
  • Can be fitted in any seat — front row, back row, passenger, etc.
  • Shared virtual space — symbology overlay shared between users

Precision is vital

According to the upcoming Head Worn Display recommendation (see below), when looking forward such systems are allowed a total error budget of 5 milliradians (approx. 0.29°), made up of errors from at least the following data sources: aircraft attitude, head tracking & the display. This sounds extremely low, but because the overlay is created for objects far away, even the smallest rotational error results in a significant overlay discrepancy. This can be crucial in low visibility environments, where the real scenery is not visible and the pilot relies on the augmentation alone. The good news is that, as the error is rotational in nature, the closer one gets to the target object, the smaller the spatial error becomes.
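To get a feel for what 5 milliradians mean in practice, the small-angle approximation gives the lateral overlay discrepancy as the angular error times the distance to the target (an illustrative back-of-the-envelope calculation of mine, not taken from the recommendation):

```python
MRAD = 1.0e-3    # one milliradian in radians
NM_TO_M = 1852.0  # one nautical mile in meters

def overlay_error_m(angular_error_mrad, distance_m):
    """Lateral overlay discrepancy for a small rotational error.

    Small-angle approximation: error ~= angle (radians) x distance.
    """
    return angular_error_mrad * MRAD * distance_m

# The full 5 mrad budget at various target distances:
for nm in (1, 10, 100):
    err = overlay_error_m(5.0, nm * NM_TO_M)
    print(f"{nm:>3} NM -> {err:7.1f} m off target")
```

At 1 NM the full budget corresponds to roughly 9 meters of displacement, while at 100 NM it grows to nearly a kilometer — which also illustrates why the spatial error shrinks as the target gets closer.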

One important aspect of these precision requirements is that they include the aircraft attitude error, usually coming from the built-in INS or AHRS system on board. Unfortunately, most general aviation grade INS / AHRS systems have a much higher attitude error, especially regarding aircraft heading. To achieve a sufficiently precise AR experience on such an aircraft, additional, more precise INS systems have to be installed, or the built-in systems have to be upgraded.


An Augmented Reality experience can be achieved in a number of ways, the central point being that the user has to see ‘reality’ and an overlay on top of it. The most obvious approach is a see-through display, exposing the scenery in the background and drawing on top of it. This display can be head worn (e.g. a pair of glasses or mounted on a helmet), or it can be a window. Alternatively, a VR headset can be used with a live video feed, showing a composited live image of the environment with the overlay on top — although this is unusual.

NASA Fused Reality demonstration 2015

Hackernoon provides a good overview of the various display types available — let’s review them briefly.


A waveguide, as the name suggests, is a physical structure in optics that guides a light wave to the user’s eye. This is done by means of internal reflection, with the structure controlling the movement of light between entry and exit. There are multiple types of waveguides, all of which share the same principal concept of operation.


Digital Light Projector (DLP) microdisplays and Liquid Crystal on Silicon (LCoS) microdisplays are also emerging, with the latter used in the Magic Leap One and the Microsoft HoloLens.


Rockwell Collins

The Rockwell Collins F-35 Gen III Helmet Mounted Display System is a highly specialized HWD system for military use with a corresponding price point.

Thales TopMax

The Thales TopMax is the result of repurposing the Thales Visionix Scorpion helmet based system for civilian use.

Elbit SkyLens

The Elbit SkyLens is positioned as a ‘wearable HUD’.

Aero Glass

Aero Glass is a smart glass based full AR aviation solution. (disclaimer: the author of this article is the founder of Aero Glass.)


Red6AR is positioned as a helmet based military training solution.

Use cases

a scene from Oblivion, showing an airspace boundary

For pilots, the obvious use cases are navigation, safety and training. For navigation, vital information can be overlaid on top of reality such as navigation points, airspace boundaries, airways, intended flight route, traffic or even the contour of the ground & mountains. For training, virtual objects such as obstacle courses, virtual enemies, aircraft in formation, refueling aircraft or similar objects can be visualized.

If the aircraft is equipped with an autonomy system, that system may reach non-obvious conclusions that are difficult for a pilot or passengers to trace, and may surprise or startle them. In such cases, an AR solution can visualize the information the autonomy system has used and the conclusion it has reached as an overlay on reality, while also visualizing the ‘next steps’ to be taken by the aircraft, so that the pilot & passengers are aware of why things are happening and what is going to happen. This can reassure all on board that everything is happening as the autonomy system expects.

Additional crew members can also benefit from an AR visualization overlay — such as targeting symbology, the ideal drop point for logistics operations or search & rescue related symbology — e.g. visual cues about the person to be rescued.

A shared virtual space can benefit communication between crew members, multiple aircraft, or between air traffic control and aircraft.

a scene from Iron Man, showing an urban tourism related overlay of a Ferris wheel

Passengers can also benefit from an AR overlay, as they can be reassured about the flight — are we at the right location? Within the appropriate airspace boundaries? Are we following our intended flight path? Are the other aircraft far enough? In addition, tourism related information can be shown as an overlay above an urban environment, which can be very accommodating in an urban flying taxi scenario. Advertisements can be embedded in the overlay as well.

Cockpit integration

Due to the need to track the user’s head, some level of cockpit integration is required by all HMD / HWD solutions. While some of the current solutions look intrusive, future solutions need not be. With the field of computer vision advancing rapidly, future optical tracking solutions will not need special ‘markers’ placed in the cockpit, but will track the shape of the cockpit as it is — or may use face recognition & tracking technology to track the user’s face. In helmet based use cases, tracking the helmet is even simpler.

Would a windshield display work?

Having to wear a helmet or smart glasses is quite obtrusive. A windshield projection based system would allow for a more seamless experience — but would revert functionality to that of Head Up Displays, so that:

  • Only flat glass works as a display surface (for now)
  • 360° all around view is lost
  • Visuals through airframe (blind spot) capability lost
  • Overlay only correct for a single person
  • Head tracking still needed — could be a type of face tracking

Despite these setbacks, such an approach might be useful for passengers: with a windshield projection, people would simply look through a side window and see an overlay that might not be as precise as the pilot needs, but would still be useful & interesting.

Optical focus & alignment

Most displays used in HMDs show their visuals at a single, pre-set optical focus, or distance. This means that when a person wants to see what the display is showing, their eyes have to focus at that particular distance. This breaks the immersive effect of augmentation when the augmented scene is not at the same distance — and most of the time it’s not. Imagine looking at a table 1 meter (3 feet) away, with augmented symbology on top of it, using a smart glass with a focus distance of 5 meters (16 feet). Your eyes will try to focus at 1 meter and 5 meters (3 feet and 16 feet) at the same time, which is confusing and breaks the immersion.

Unfortunately currently there is no real solution to this problem in a practical form factor. Magic Leap offers two planes for visualization — one nearer, the second further. But that is still two pre-set focus distance planes. Creal3D works on a complete solution for this problem (see video above), but is currently in a prototype phase, with a form factor too big to be worn on our heads.

For the aviation use case, most symbology is quite far away, and thus close to infinity focus. Infinity focus displays should therefore work comfortably most of the time. Unfortunately, off-the-shelf smart glass products have optics with a focus distance of typically somewhere between 5–8 meters (16–26 feet), being tailored for indoor use.

Prescription lenses

A large portion of the population wears prescription lenses, which obviously clash with smart glasses. Some smart glasses solve this by allowing a prescription insert to be ‘snapped in’. This insert has to be custom made, making the approach similar to custom prescription sunglasses and not suitable for casual use. Other approaches, especially helmet based systems, allow a prescription lens ‘below’ the head worn display.

Form factor & public acceptance

Current smart glasses are big, heavy & ugly, limiting public acceptance for everyday use cases. Consumer electronics companies like Apple, Samsung, etc. plan to release consumer grade smart glasses by 2022.

Professional users (e.g. pilots) are more open to use specialized hardware, with more bulky smart glasses possibly integrated with headsets or helmets.

Regulatory context

The SAE G-10 HWD (Head Worn Display) committee has been working since 2016 on a new recommendation, Aerospace Recommended Practice 6377, based on ARP 5288 (HUD). The goal of the committee is a specification for Head Worn Displays as primary navigation devices for CFR Part 23, 25, 27 and 29 aircraft. The recommendation deals with flight safety, obstruction of vision, geo-conformality precision and other topics. The geo-conformality precision requirement for forward looking content is 5 milliradians (approx. 0.29°), while sideways precision requirements are more relaxed.


Augmented Reality in aviation is an exciting space that adds to the safety of aviation, can open up additional uses like training, and can also be exciting for passengers. Due to the nature of the technologies used, ‘ground based’ AR solutions do not work in an aviation context, thus specialized solutions are needed. In due time, Head Worn Display solutions will be certified as primary navigation devices on aircraft, making current HUDs or glass cockpits obsolete.

Imagine flying like a bird — without an instrument panel blocking your view, just a pair of glasses and all the scenery laid out in front of you!

Ákos Maróy is an aviation innovation expert, commercial pilot, and the founder of Aero Glass — an award-winning AR startup for pilots.


