[Image: mission computers]

Past Achievements / Experimental Work

We have previously created technology applications in the areas described in this section; however, we are currently focusing elsewhere. For inquiries about this work, please email:

[Image: Autonodyne-enabled mission computers]

Autonodyne-Enabled Mission Computers

We have modified existing Avidyne FAA-certified mission computers, originally designed as the principal devices human pilots use to operate conventional aircraft, communicate with air traffic control, navigate through the airspace, and perform a host of other functions.


Human pilots interact directly with these devices through on-screen displays and input devices such as knobs and buttons. Autonodyne enhanced that system to also allow full interaction and control from off-board locations.


In other words, in the Optionally Piloted Vehicle (OPV) systems we have built and flown, the principal operator was located on the ground, using Autonodyne RCU-1000 control stations connected to the aircraft over a datalink to manipulate the on-board mission computer software.

In this case, we retained all of the FAA-certified code in the aircraft units and added software to accept those off-board control inputs.
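To make the idea concrete, here is a minimal sketch, with all names hypothetical (the source does not describe Autonodyne's actual interfaces): local cockpit inputs and uplinked datalink inputs are normalized into one command stream, so the existing certified command-handling logic is left unchanged and only an adapter is added.

```python
# Hypothetical sketch: both input paths feed the same command type, so the
# certified handler (represented here by handle_command) is untouched.
from dataclasses import dataclass

@dataclass
class Command:
    source: str   # "panel" (on-board knobs/buttons) or "datalink" (off-board)
    name: str     # e.g. "SET_HEADING"
    value: float

def handle_command(cmd: Command) -> str:
    # Stand-in for the pre-existing certified command handler.
    return f"{cmd.name}={cmd.value} (from {cmd.source})"

def from_panel(name: str, value: float) -> Command:
    # Existing on-board input path.
    return Command("panel", name, value)

def from_datalink(packet: dict) -> Command:
    # Added adapter: validate an uplinked packet and translate it into the
    # same Command type the on-board handler already accepts.
    assert packet["name"] in {"SET_HEADING", "SET_ALTITUDE"}, "unknown command"
    return Command("datalink", packet["name"], float(packet["value"]))

print(handle_command(from_panel("SET_HEADING", 270.0)))
print(handle_command(from_datalink({"name": "SET_ALTITUDE", "value": 4500})))
```

The design choice this illustrates is additive integration: the off-board path is a new front end to an unchanged back end, which is what allows the certified code to be retained.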

For clean-sheet mission computer designs, we created software optimized for uncrewed aircraft, where the concept of a human on board looking at displays does not apply.

Vehicle navigation, health and status monitoring, subsystem management, and the rest must still be performed, and we were able to do so with much more efficient software that did not need to accommodate humans on board.

Smart Automation & Simplified Vehicle Operation

In aviation, Smart Automation is defined as automating elements of flight preparation and flight control. Specifically, it targets context-sensitive tasks to reduce both the workload peaks and the overall workload of humans. Examples include:

  • Automating checklists.

  • Extensive monitoring and data logging of a flight system’s state.

  • Using connectivity to create a “distributed cockpit.”

  • Automatic route planning based on vehicle state or contingency.

  • Vastly simplifying “cockpit” displays to be less aviation-specific and more in-line with visualizations people are used to.

  • Enabling reduced crew operations.
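The first two items above can be sketched in a few lines. This is a hypothetical illustration only (none of these names come from Autonodyne's software): each checklist item pairs a label with a check against the monitored vehicle state, and the automation surfaces only the items that still need human attention.

```python
# Hypothetical sketch of an automated checklist driven by monitored state.
state = {"fuel_pump": "on", "flaps": 10, "transponder": "alt"}

checklist = [
    ("Fuel pump ON",            lambda s: s["fuel_pump"] == "on"),
    ("Flaps set for takeoff",   lambda s: s["flaps"] == 10),
    ("Transponder ALT",         lambda s: s["transponder"] == "alt"),
]

def run_checklist(state, checklist):
    # Return the labels of failed items, i.e. the residual human workload.
    return [label for label, check in checklist if not check(state)]

print(run_checklist(state, checklist))  # [] -> checklist complete
```

When every check passes the list is empty, which is the workload-reduction point: the human reviews exceptions rather than walking every item.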

Autonodyne treats Smart Automation as an initial backstop to mitigate the human as a single point of failure in decision making. In the long run, predominantly human-run operations will give way to ultra-reliable automation, ultimately enabling full autonomy.

Flight system and flight mode complexity keeps increasing, as does the variation between aircraft and software versions in fielded legacy systems. These trends alone highlight the need for a simpler approach.

Autonodyne believes Smart Automation starts with flight-critical but deterministic tasks (e.g., nominal checklist usage and system monitoring). As pilots' experience with and confidence in Smart Automation grow, the automation pendulum will swing to include non-deterministic tasks as well (flight/mission planning, contingency planning, decision support, self-preservation, and reaction to imminent threats) in the transition from human-run operations to ultra-reliable automation.

We currently operate an Optionally Piloted Vehicle (an experimental Cessna 182) for experiments in smart automation, simplified vehicle operations, and advanced forms of pilot assistance. We have also modified a Cirrus SR-22 to serve as an OPV and flying testbed for these technologies.

Augmented Reality (AR) Work

[Image: an operator using AR]

Autonodyne has conducted considerable research and development to create an Augmented Reality (AR) control station using commercially available AR devices such as the Microsoft HoloLens and Meta 2. Our AR flight-testing operations have identified several areas where AR can have a profound impact:

  • Having 3D holographic representations of control station functionality.

  • Being able to remotely maintain a flight vehicle.

  • Enabling the operator to monitor and supervise a “theater” of operations in 3D.

  • Providing innovative interactive swarm control.

[Image: example AR view]