CoRL 2020, Spotlight Talk 211: Probably Approximately Correct Vision-Based Planning using Motion …

**Probably Approximately Correct Vision-Based Planning using Motion Primitives**
Sushant Veer (Princeton University)*; Anirudha Majumdar (Princeton University)
Publication: http://corlconf.github.io/paper_211/

**Abstract**
This paper presents an approach for learning vision-based planners that provably generalize to novel environments (i.e., environments unseen during training). We leverage the Probably Approximately Correct (PAC)-Bayes framework to obtain an upper bound on the expected cost of policies across all environments. Minimizing the PAC-Bayes upper bound thus trains policies that are accompanied by a certificate of performance on novel environments. The training pipeline we propose provides strong generalization guarantees for deep neural network policies by (a) obtaining a good prior distribution on the space of policies using Evolutionary Strategies (ES) followed by (b) formulating the PAC-Bayes optimization as an efficiently-solvable parametric convex optimization problem. We demonstrate the efficacy of our approach for producing strong generalization guarantees for learned vision-based motion planners through two simulated examples: (1) an Unmanned Aerial Vehicle (UAV) navigating obstacle fields with an onboard vision sensor, and (2) a dynamic quadrupedal robot traversing rough terrains with proprioceptive and exteroceptive sensors.
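To make the abstract's certificate concrete, here is a minimal sketch of how a PAC-Bayes upper bound of this kind is typically evaluated for a discrete posterior over a finite set of policies. All function names and the discrete-posterior setup are illustrative assumptions, not the paper's code; the bound shown is the standard PAC-Bayes-kl form (inverted binary KL), which the paper's convex program would minimize over the posterior.

```python
import numpy as np

def kl_bin(q, p):
    # Binary KL divergence kl(q || p) between Bernoulli parameters in [0, 1].
    eps = 1e-12
    q = min(max(q, eps), 1 - eps)
    p = min(max(p, eps), 1 - eps)
    return q * np.log(q / p) + (1 - q) * np.log((1 - q) / (1 - p))

def kl_inverse(q, c, tol=1e-9):
    # Largest p in [q, 1] with kl(q || p) <= c, found by bisection.
    # This is the "kl^{-1}" appearing in PAC-Bayes-kl bounds.
    lo, hi = q, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if kl_bin(q, mid) <= c:
            lo = mid
        else:
            hi = mid
    return lo

def pac_bayes_bound(posterior, prior, emp_costs, N, delta=0.01):
    # posterior, prior: discrete distributions over a finite policy set
    # emp_costs: per-policy average cost on the N training environments
    # Returns an upper bound on expected cost on novel environments
    # that holds with probability >= 1 - delta (costs assumed in [0, 1]).
    c_hat = float(posterior @ emp_costs)          # empirical posterior cost
    kl_pq = float(np.sum(posterior * np.log(posterior / prior)))
    reg = (kl_pq + np.log(2 * np.sqrt(N) / delta)) / N
    return kl_inverse(c_hat, reg)

# Example: two policies, uniform prior (e.g., from an ES-trained prior),
# posterior kept equal to the prior so KL(P || P0) = 0.
post = np.array([0.5, 0.5])
prior = np.array([0.5, 0.5])
bound = pac_bayes_bound(post, prior, np.array([0.1, 0.2]), N=1000)
print(f"certified expected cost <= {bound:.3f}")
```

The gap between the empirical cost (0.15 here) and the certified bound shrinks as the number of training environments N grows; minimizing this bound over the posterior, as the paper does via a parametric convex program, trades empirical cost against the KL term.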
