
Guaranteeing Safety of Learned Perception Modules via Measurement-Robust Control Barrier Functions

Paper PDF | Code

Authors

Sarah Dean (UC Berkeley)*; Andrew Taylor (Caltech); Ryan Cosner (Caltech); Benjamin Recht (UC Berkeley); Aaron Ames (Caltech)

Interactive Session

2020-11-17, 11:10–11:40 PST | PheedLoop Session

Abstract

Modern nonlinear control theory seeks to develop feedback controllers that endow systems with properties such as safety and stability. The guarantees ensured by these controllers often rely on accurate estimates of the system state for determining control actions. In practice, measurement model uncertainty can lead to error in state estimates that degrades these guarantees. In this paper, we seek to unify techniques from control theory and machine learning to synthesize controllers that achieve safety in the presence of measurement model uncertainty. We define the notion of a Measurement-Robust Control Barrier Function (MR-CBF) as a tool for determining safe control inputs when facing measurement model uncertainty. Furthermore, MR-CBFs are used to inform sampling methodologies for learning-based perception systems and quantify tolerable error in the resulting learned models. We demonstrate the efficacy of MR-CBFs in achieving safety with measurement model uncertainty on a simulated Segway system.

Video

Reviews and Rebuttal

Reviews & Rebuttal


Conference on Robot Learning 2020