CropNav: a Framework for Autonomous
Navigation in Real Farms

Mateus Valverde Gasparino1,2
Vitor Akihiro Hisano Higuti2
Arun Narenthiran Sivakumar1,2
Andres Eduardo Baquero Velasquez1
Marcelo Becker3
Girish Chowdhary1
1 Field Robotics Engineering and Science Hub (FRESH), Illinois Autonomous Farm, University of Illinois at Urbana-Champaign (UIUC), IL
2 EarthSense Inc., Champaign, IL, USA
3 University of São Paulo, São Carlos, SP, Brazil
Correspondence to {mvalve2, girishc}
Under review

Small robots that can operate under the plant canopy can enable new possibilities in agriculture. However, unlike for larger autonomous tractors, autonomous navigation for such under-canopy robots remains an open challenge because the Global Navigation Satellite System (GNSS) is unreliable under the plant canopy. We present a hybrid navigation system that autonomously switches between different sets of sensing modalities to enable full-field navigation, both inside and outside of crop rows. By choosing the appropriate path reference source, the robot can accommodate loss of GNSS signal quality and leverage row-crop structure to navigate autonomously. However, such switching can be tricky and difficult to execute at scale. Our system provides a solution by automatically switching between an exteroceptive-sensing-based mode, Light Detection And Ranging (LiDAR) row-following navigation, and GNSS waypoint path tracking. In addition, we show how our system can detect when navigation fails and recover automatically, extending the autonomous operation time and reducing the need for human intervention. Our system shows an improvement of about 750 m per intervention over GNSS-based navigation and 500 m over row-following navigation.
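The abstract's core idea, switching the path reference source between GNSS waypoint tracking and LiDAR row following depending on signal quality and position in the field, can be sketched as follows. This is a minimal illustration, not the paper's implementation; the function name `select_mode`, the `in_row` flag, and the quality threshold are assumptions introduced here for clarity.

```python
from enum import Enum

class Mode(Enum):
    WAYPOINT = "gnss_waypoint_tracking"   # GNSS-based path tracking (open field)
    ROW_FOLLOW = "lidar_row_following"    # LiDAR-based row following (under canopy)

def select_mode(gnss_quality: float, in_row: bool,
                quality_threshold: float = 0.5) -> Mode:
    """Hypothetical reference-source selector: fall back to LiDAR row
    following when inside a crop row or when GNSS quality degrades."""
    if in_row or gnss_quality < quality_threshold:
        return Mode.ROW_FOLLOW
    return Mode.WAYPOINT
```

In practice, a supervisor like this would run continuously, with the failure-detection and recovery behavior described in the paper layered on top of the mode selection.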