WayFASTER

WayFASTER: a Self-Supervised Traversability Prediction for Increased Navigation Awareness

Field Robotics Engineering and Science Hub (FRESH), Illinois Autonomous Farm, University of Illinois at Urbana-Champaign (UIUC), IL
Correspondence to {mvalve2, girishc}@illinois.edu
Accepted for the 2024 IEEE International Conference on Robotics and Automation (ICRA 2024)

Accurate and robust navigation in unstructured environments requires fusing data from multiple sensors. Such fusion ensures that the robot is better aware of its surroundings, including areas of the environment that are not immediately visible but were visible at a different time. To solve this problem, we propose a method for traversability prediction in challenging outdoor environments using a sequence of RGB and depth images fused with pose estimates. Our method, termed WayFASTER (Waypoints-Free Autonomous System for Traversability with Enhanced Robustness), uses experience data recorded from a receding horizon estimator to train a self-supervised neural network for traversability prediction, eliminating the need for heuristics. Our experiments demonstrate that our method excels at avoiding geometric obstacles and correctly identifies terrains such as tall grass as navigable. By using a sequence of images, WayFASTER significantly enhances the robot's awareness of its surroundings, enabling it to predict the traversability of terrains that are not immediately visible. This enhanced awareness contributes to better navigation performance in environments where such predictive capabilities are essential.
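To make the self-supervised idea concrete, the following is a minimal sketch (not the released implementation) of the training signal described above: a network maps a short RGB-D sequence plus relative poses to a bird's-eye-view traversability grid, and is supervised only at the grid cells the robot actually drove over, using traction values recovered by the receding horizon estimator. The module names, tensor shapes, and masking format are illustrative assumptions.

```python
# Sketch of self-supervised traversability training: supervise a BEV grid
# only where the robot drove, with estimator-derived traction as the label.
# All shapes and layer choices here are assumptions for illustration.

import torch
import torch.nn as nn


class TraversabilityNet(nn.Module):
    """Toy encoder that fuses an RGB-D image sequence into a BEV traversability map."""

    def __init__(self, seq_len: int = 3, grid_size: int = 64):
        super().__init__()
        self.grid_size = grid_size
        # Per-frame encoder for 4-channel RGB-D input (assumed layout).
        self.encoder = nn.Sequential(
            nn.Conv2d(4, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(8),
        )
        # Fuse the sequence features (plus 3-DoF relative poses) into a BEV grid.
        self.decoder = nn.Sequential(
            nn.Linear(seq_len * (32 * 8 * 8 + 3), 512), nn.ReLU(),
            nn.Linear(512, grid_size * grid_size), nn.Sigmoid(),
        )

    def forward(self, rgbd_seq, rel_poses):
        # rgbd_seq: (B, T, 4, H, W), rel_poses: (B, T, 3) as (x, y, yaw)
        b, t = rgbd_seq.shape[:2]
        feats = self.encoder(rgbd_seq.flatten(0, 1)).flatten(1).view(b, t, -1)
        fused = torch.cat([feats, rel_poses], dim=-1).flatten(1)
        return self.decoder(fused).view(b, self.grid_size, self.grid_size)


def masked_traction_loss(pred_map, traction_targets, mask):
    """Supervise only the BEV cells the robot traversed (mask == 1)."""
    err = (pred_map - traction_targets) ** 2
    return (err * mask).sum() / mask.sum().clamp(min=1.0)


if __name__ == "__main__":
    net = TraversabilityNet()
    rgbd = torch.rand(2, 3, 4, 120, 160)            # batch of RGB-D sequences
    poses = torch.rand(2, 3, 3)                     # relative (x, y, yaw) per frame
    targets = torch.rand(2, 64, 64)                 # estimator-derived traction labels
    mask = (torch.rand(2, 64, 64) > 0.9).float()    # cells the robot actually drove over
    loss = masked_traction_loss(net(rgbd, poses), targets, mask)
    loss.backward()
    print(float(loss))
```

Restricting the loss to traversed cells is what removes the need for hand-labeled or heuristic supervision: the driven experience itself provides the labels.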

WayFASTER safely navigates a robot through natural environments while keeping track of untraversable areas for longer.

WayFASTER Overview

WayFASTER is a modular architecture. In our method, a sequence of images is used to predict a local traversability map in the bird's-eye view. The traversability map is used by the cost function of the model predictive control (MPC) block. The controller generates locally optimal, goal-oriented trajectories over high-traction terrain that can safely guide the robot.
Schematic of the WayFASTER architecture.
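As an illustration of how the predicted map can enter the controller's cost, the sketch below scores candidate unicycle rollouts by goal progress and by the traversability looked up along each trajectory. The random-shooting sampler, cost weights, and grid conventions are assumptions for exposition; the actual controller is the MPC described above.

```python
# Sketch: using a BEV traversability map inside an MPC-style trajectory cost.
# Model, sampling scheme, and weights are illustrative assumptions.

import numpy as np

GRID = 64      # cells per side of the local BEV map (assumed)
RES = 0.1      # meters per cell (assumed)
HORIZON = 20   # rollout steps
DT = 0.1       # seconds per step


def rollout(state, controls):
    """Integrate a unicycle model: state = (x, y, yaw), control = (v, w)."""
    traj = []
    x, y, yaw = state
    for v, w in controls:
        x += v * np.cos(yaw) * DT
        y += v * np.sin(yaw) * DT
        yaw += w * DT
        traj.append((x, y))
    return np.array(traj)


def traversability_at(trav_map, x, y):
    """Look up the robot-centered map; cells outside the map count as unknown (0.5)."""
    col = int(x / RES + GRID / 2)
    row = int(y / RES + GRID / 2)
    if 0 <= row < GRID and 0 <= col < GRID:
        return trav_map[row, col]
    return 0.5


def trajectory_cost(traj, trav_map, goal, w_goal=1.0, w_trav=5.0):
    """Penalize distance to goal and low predicted traction along the path."""
    goal_cost = np.linalg.norm(traj[-1] - goal)
    trav_cost = sum(1.0 - traversability_at(trav_map, x, y) for x, y in traj)
    return w_goal * goal_cost + w_trav * trav_cost / len(traj)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    trav_map = rng.random((GRID, GRID))   # stand-in for the network's BEV output
    goal = np.array([2.0, 0.5])           # goal in the robot frame (meters)

    # Random-shooting stand-in for MPC: sample control sequences, keep the cheapest rollout.
    best_cost, best_controls = np.inf, None
    for _ in range(256):
        controls = np.column_stack([
            rng.uniform(0.2, 1.0, HORIZON),    # linear velocity v
            rng.uniform(-1.0, 1.0, HORIZON),   # angular velocity w
        ])
        traj = rollout((0.0, 0.0, 0.0), controls)
        cost = trajectory_cost(traj, trav_map, goal)
        if cost < best_cost:
            best_cost, best_controls = cost, controls
    print("best cost:", round(best_cost, 3), "first control:", best_controls[0])
```

Because the map is built from a sequence of images, the cost lookup still penalizes obstacles that have left the camera's current field of view, which is what gives the controller its extended awareness.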