

KEF Robotics’s mission is to leverage cameras to provide onboard safety and autonomy capabilities to new and existing aircraft. We write lightweight, robust, and adaptable software that transforms camera images into information, giving an aircraft what it needs - its location, its trajectory, and its surroundings - to fly with confidence.

KEF partners with aircraft developers to design, implement, and integrate autonomy on their aircraft. Partnering with KEF allows an aircraft developer to upgrade their aircraft without hiring their own software team, and to benefit from KEF's years of development and thousands of hours of flight time.

If you're interested in upgrading your aircraft with the latest autonomy solutions, contact KEF to get flying fast.

KEF's Tech Capabilities


KEF can leverage a single camera and optical flow to provide low-drift 'relative' navigation to an aircraft. Our Tailwind software works with nearly any camera - EO or IR, rolling or global shutter, fixed or gimballed mounting. Our software provides robust, high-rate, low-drift outputs day or night, without GPS or landmarks.
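Tailwind itself is proprietary, but the core idea of flow-based relative navigation - recovering frame-to-frame camera motion from how the image shifts - can be sketched with a toy example. The sketch below uses phase correlation on a synthetic image pair; all values are illustrative and none of this reflects Tailwind's actual method.

```python
import numpy as np

# Illustrative only: estimate the pixel shift between two consecutive
# "camera frames" via phase correlation on a synthetic textured scene.
rng = np.random.default_rng(0)
frame1 = rng.random((128, 128))
# Second frame: the same scene shifted 3 px right and 1 px down, as a
# translating camera would see it.
frame2 = np.roll(np.roll(frame1, 3, axis=1), 1, axis=0)

# The normalized cross-power spectrum peaks at the translation.
F1, F2 = np.fft.fft2(frame1), np.fft.fft2(frame2)
cross = F2 * np.conj(F1)
corr = np.fft.ifft2(cross / np.abs(cross)).real
dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
print(dy, dx)  # → 1 3
```

Accumulating such frame-to-frame motion estimates over time is what makes the navigation 'relative': drift grows slowly with distance flown rather than jumping, even with no GPS or landmarks available.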

Visual navigation is one of KEF’s most advanced capabilities, and a technical realm in which we can claim true leadership and best-in-class performance. We’ve flown aircraft as small as 2 pounds using an outdated processor, and aircraft as fast as 100 mph in 0-lux conditions. Our solution represents a step change in reliability and performance for GPS-denied onboard navigation.


KEF can use visual landmarks to determine an aircraft’s location relative to a map, providing a 'global' position estimate much like GPS. Our software provides a solution regardless of season or time of day, and is robust to weather and reduced visibility. Using the latest advances in machine learning, with networks suited to the limited computational power onboard small aircraft, we’re broadening the types of aircraft that can leverage this powerful capability to truly replace GPS.


Stereo vision allows our software to detect hazards using EO and IR cameras and plan safe trajectories around obstacles. We’re working on numerous improvements to our hazard detection, including high-rate, omnidirectional hazard mapping and 'monocular' depth estimation techniques. Our work on passive, day-night hazard detection methods for the Army is pushing the boundaries of what is possible with cameras and machine learning, allowing small aircraft to fly safely even in cluttered urban environments.
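The geometry that stereo hazard detection rests on is simple: for a rectified camera pair, depth is Z = f·B/d, where f is focal length, B the camera baseline, and d the per-pixel disparity. The sketch below is a minimal illustration of that relationship with assumed numbers, not KEF's detection pipeline.

```python
import numpy as np

# Assumed, illustrative camera parameters (not from any KEF system).
focal_px = 400.0    # focal length in pixels
baseline_m = 0.12   # separation between the stereo cameras, meters

# Hypothetical disparities (pixels) measured for three image points.
disparity = np.array([40.0, 8.0, 2.0])

# Depth from stereo geometry: Z = f * B / d.
depth_m = focal_px * baseline_m / disparity
print(depth_m)  # → [ 1.2  6.  24. ]
```

Note how nearby obstacles yield large disparities and therefore precise depth, while far-away points with small disparities are resolved coarsely; that falloff is part of what motivates complementary 'monocular' depth techniques.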


Getting an aircraft to the right spot safely is the first task of KEF’s software, but autonomous aircraft are far more useful if they know what to do when they get ‘there’.  KEF leverages machine learning to recognize and react to mission-relevant observations.  We train lightweight networks to classify, recognize, and localize objects of interest to our customers, allowing aircraft to perform mission services intelligently and return home when the work is done.

KEF’s strengths in this area aren’t in the ML itself, but in its deployment on complex aerial systems in closed-loop flight, where computational run-time constraints are acute.
