Autonomy holds great promise for improving the applications, safety, and efficiency of flight. If little operator input is necessary, unmanned rotorcraft have a wide range of applications, from cargo delivery to inspection. Currently, unmanned rotorcraft are underutilized because they either have to fly preplanned missions at high altitude or require careful teleoperation. A capable autonomous rotorcraft will have to react quickly to previously unknown obstacles, land at unprepared sites, and exploit semantic information to enable long-term autonomy in cluttered environments.
In this talk we present how pushing the performance and safety of these systems requires novel approaches in perception and motion planning. In particular, we address how a supervisory layer in the motion planning system can improve safety, how a sensor-steering system lets us optimize coverage for safe trajectories, and how semantic information can help guide the rotorcraft.
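To make the supervisory-layer idea concrete, the sketch below shows one common way such a layer can gate a planner's output: a candidate trajectory is only committed if it is collision-free in the currently known map and every state along it admits a safe fallback maneuver. This is a minimal illustration of the general concept, not the system presented in the talk; all names (Trajectory, supervisory_check, emergency_stop_from) are hypothetical.

```python
# Sketch of a supervisory safety layer that vets candidate trajectories
# before they are committed for execution. Hypothetical, simplified example.

from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

State = Tuple[float, float, float]  # (x, y, z) position, simplified


@dataclass
class Trajectory:
    states: List[State]  # discretized states along the trajectory
    dt: float            # time step between consecutive states


def emergency_stop_from(state: State) -> Trajectory:
    # Placeholder fallback maneuver: hold position at the given state.
    # A real system would generate a dynamically feasible stopping maneuver.
    return Trajectory(states=[state] * 10, dt=0.1)


def supervisory_check(
    candidate: Trajectory,
    is_collision_free: Callable[[Trajectory], bool],
) -> Optional[Trajectory]:
    """Return the candidate only if it is collision-free in the known map
    and every state on it admits a collision-free fallback; otherwise
    return None so the vehicle keeps its previously committed safe plan."""
    if not is_collision_free(candidate):
        return None
    for state in candidate.states:
        if not is_collision_free(emergency_stop_from(state)):
            return None
    return candidate


if __name__ == "__main__":
    # Toy map: all space with x < 5.0 is free.
    def free(traj: Trajectory) -> bool:
        return all(s[0] < 5.0 for s in traj.states)

    plan = Trajectory(states=[(float(i), 0.0, 1.0) for i in range(4)], dt=0.1)
    print("committed" if supervisory_check(plan, free) else "rejected")
```

The key design choice illustrated here is that the supervisor never improves the plan itself; it only accepts or rejects, so the planner can be aggressive while the committed trajectory always retains a verified safe fallback.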
While great results have been demonstrated, fundamental limitations remain in the fragile, myopic, and static nature of these systems. In our research we address these issues by developing rich planning problem representations and approaches that can adapt to and solve them. This will permit unmanned rotorcraft to operate where they have their greatest advantage: in unstructured, unknown environments at low altitude.
See more in the video at
https://www.microsoft.com/en-us/research/video/safe-robust-autonomous-flight-challenging-conditions/