Whole farm autonomous navigation with automatic mode switching
I led this project, in which we created a hybrid navigation system that autonomously switches between different sensing modalities. The system compensates for unreliable Global Navigation Satellite System (GNSS) reception under the plant canopy by leveraging the row-crop structure to navigate autonomously. This method showed significant improvements, increasing the distance traveled per intervention by about 750 m over GNSS-based navigation and about 500 m over row-following navigation.
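As a rough illustration of the mode-switching logic, here is a minimal Python sketch. The function, threshold names, and values are hypothetical, and the real system's switching criteria are more involved:

```python
from enum import Enum, auto

class NavMode(Enum):
    GNSS_WAYPOINT = auto()   # open-field waypoint tracking on a reliable fix
    ROW_FOLLOWING = auto()   # under-canopy navigation from row-crop structure

def select_mode(gnss_cov: float, row_conf: float, last_mode: NavMode,
                cov_max: float = 0.5, row_min: float = 0.7) -> NavMode:
    """Pick the sensing modality for the next control cycle.

    gnss_cov: horizontal position covariance from the receiver (m^2);
    row_conf: detector confidence that a crop row is in view, in [0, 1].
    Thresholds are illustrative, not the project's tuned values.
    """
    if gnss_cov < cov_max:
        return NavMode.GNSS_WAYPOINT   # satellite fix is trustworthy
    if row_conf > row_min:
        return NavMode.ROW_FOLLOWING   # degraded GNSS: rely on row structure
    return last_mode                   # neither source reliable: hold mode
```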
          
Traversability prediction for unstructured outdoor environments
I led this project, in which we demonstrated a self-supervised approach for predicting traversability for mobile robots in unstructured outdoor environments. The method, called WayFAST, uses RGB and depth data to predict a traversability map from traction estimates produced by a kinodynamic model. This allows the traversability prediction neural network to be trained in a self-supervised manner, without the hand-crafted heuristics required by previous methods. Field testing in varied environments, including sandy beaches, forest canopies, and snow-covered grass fields, demonstrated the effectiveness of WayFAST in avoiding obstacles and untraversable terrain.
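The self-supervision signal can be sketched as follows. This is a simplified stand-in for the idea: WayFAST estimates traction with a kinodynamic model and state estimator, whereas here a plain command-versus-measurement ratio plays that role, with hypothetical variable names:

```python
import numpy as np

def traction_labels(v_cmd, w_cmd, v_meas, w_meas, eps=1e-3):
    """Self-supervised traversability labels from executed motion.

    If the robot commands body velocities (v_cmd, w_cmd) but the state
    estimator measures (v_meas, w_meas), their ratio behaves like a traction
    coefficient in [0, 1]: about 1 on firm ground, near 0 when slipping or
    stuck. These coefficients label the terrain the robot actually drove
    over, so no manual annotation or hand-crafted heuristic is needed.
    """
    mu_lin = np.clip(np.abs(v_meas) / np.maximum(np.abs(v_cmd), eps), 0.0, 1.0)
    mu_ang = np.clip(np.abs(w_meas) / np.maximum(np.abs(w_cmd), eps), 0.0, 1.0)
    return mu_lin, mu_ang
```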
          
Model predictive control for agricultural navigation
I was responsible for designing a model predictive control system that precisely guides a mobile robot through challenging agricultural environments while remaining robust to disturbances. The project required a deep understanding of both control theory and the particular challenges of agricultural settings. The controller works with either visual or LiDAR-based perception.
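A minimal receding-horizon sketch of this kind of controller, using a unicycle model with illustrative weights and limits (not the project's actual model, solver, or tuning):

```python
import numpy as np
from scipy.optimize import minimize

DT, H = 0.1, 10  # control period [s] and horizon length (illustrative)

def rollout(x0, u):
    """Propagate a unicycle model x = [px, py, yaw] under H (v, w) commands."""
    x = np.array(x0, dtype=float)
    traj = []
    for v, w in u.reshape(H, 2):
        x = x + DT * np.array([v * np.cos(x[2]), v * np.sin(x[2]), w])
        traj.append(x.copy())
    return np.array(traj)

def mpc_step(x0, ref):
    """One receding-horizon solve: track H reference points ref (H x 2).

    A minimal sketch of the approach; the real controller also handles
    disturbances and actuator constraints explicitly.
    """
    def cost(u):
        traj = rollout(x0, u)
        tracking = np.sum((traj[:, :2] - ref) ** 2)   # stay on the path
        effort = 0.05 * np.sum(u ** 2)                # penalize control effort
        return tracking + effort

    u0 = np.zeros(2 * H)
    bounds = [(-1.0, 1.0)] * (2 * H)  # |v|, |w| <= 1 (illustrative limits)
    sol = minimize(cost, u0, bounds=bounds, method="L-BFGS-B")
    return sol.x[:2]  # apply only the first (v, w), then re-solve next cycle
```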
          
Under-canopy vision-based navigation
As part of this project, I contributed to the development of CropFollow, a system for visually guided autonomous navigation of under-canopy farm robots. Using machine learning and model predictive control, we overcame the challenges of unreliable GPS and LiDAR, high sensing cost, and difficult farm terrain. I specifically worked on the design and implementation of the model predictive controller, which keeps navigation robust despite the noisy perception of low-cost cameras. Our system outperformed state-of-the-art LiDAR-based systems in extensive field testing spanning over 25 km, making a significant impact in precision agriculture.
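To make the perception-to-control interface concrete, here is a hedged sketch of how two network outputs, heading error and lateral offset relative to the crop row, could be turned into a short reference path for a receding-horizon tracker such as the mpc_step sketch above. Names, sign conventions, and the lookahead distance are illustrative assumptions, not CropFollow's exact interface:

```python
import numpy as np

def row_reference(heading_err, lateral_offset, lookahead=2.0, n=10):
    """Build tracker reference points from vision outputs (a sketch).

    heading_err: robot yaw relative to the row direction [rad];
    lateral_offset: signed distance from the row centerline [m]
    (assumed positive when the robot is left of the line).
    Returns n waypoints (n x 2) along the centerline in the robot frame.
    """
    s = np.linspace(0.0, lookahead, n)          # arc length along the row
    # Centerline in the robot frame: rotate by -heading_err, shift by offset.
    xs = s * np.cos(-heading_err)
    ys = s * np.sin(-heading_err) - lateral_offset
    return np.stack([xs, ys], axis=1)
```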
          
Neural network-based adaptive MPC
In this project, I participated in the development and implementation of a neural network-based adaptive model predictive control system. The system learns uncertainties and model mismatches, allowing it to correct the model dynamics in real time. Because the correction is a neural network, the controller can adapt to highly complex uncertainties, providing safe control even as conditions change.
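The core idea can be sketched in a few lines of PyTorch: a nominal model is augmented with a learned residual that is regressed online against one-step prediction errors. The architecture, sizes, and names here are illustrative assumptions, not the project's implementation:

```python
import torch
import torch.nn as nn

class ResidualDynamics(nn.Module):
    """Learned correction to a nominal model (a sketch of the idea).

    The adaptive MPC propagates
    x_{k+1} = f_nominal(x_k, u_k) + g_theta(x_k, u_k),
    where g_theta is this small network, fitted online to the one-step
    prediction errors of the nominal model.
    """

    def __init__(self, n_x=3, n_u=2, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_x + n_u, hidden), nn.Tanh(),
            nn.Linear(hidden, n_x),
        )

    def forward(self, x, u):
        return self.net(torch.cat([x, u], dim=-1))

def adapt_step(model, opt, x, u, x_next, f_nominal):
    """One online adaptation step: regress the model mismatch."""
    residual_target = (x_next - f_nominal(x, u)).detach()  # nominal-model error
    loss = nn.functional.mse_loss(model(x, u), residual_target)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```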
          
Mirã's navigation
I played a key role in developing the Mirã robot's navigation system, which uses advanced perception and mapping techniques to navigate through soybean crops accurately and safely. The robot's embedded soil analysis capabilities are also a major breakthrough, providing farmers with real-time data about their crops. Working on Mirã was a challenging but rewarding experience, and I am proud to have contributed to a major advancement in precision agriculture.
          
Mirã's perception system
I played a key role in the design and development of a three-dimensional perception system for the Mirã robot. The system is based on LiDAR sensors and builds a costmap from geometric features of the scene, providing safe and reliable autonomous navigation for the robot.
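A simplified sketch of turning LiDAR geometry into a costmap; the actual Mirã pipeline is more elaborate, and the parameters here are illustrative:

```python
import numpy as np

def geometric_costmap(points, cell=0.2, extent=10.0, h_max=0.3):
    """Costmap from LiDAR geometry (a simplified sketch).

    Points (N x 3, robot frame) are binned into a 2-D grid; each cell's cost
    grows with the height spread of the points inside it, so flat ground is
    cheap and stems or obstacles are expensive.
    """
    n = int(2 * extent / cell)
    zmin = np.full((n, n), np.inf)
    zmax = np.full((n, n), -np.inf)
    ij = np.floor((points[:, :2] + extent) / cell).astype(int)
    ok = (ij >= 0).all(axis=1) & (ij < n).all(axis=1)
    for (i, j), z in zip(ij[ok], points[ok, 2]):
        zmin[i, j] = min(zmin[i, j], z)
        zmax[i, j] = max(zmax[i, j], z)
    spread = np.where(np.isfinite(zmin), zmax - zmin, 0.0)
    return np.clip(spread / h_max, 0.0, 1.0)   # 0 = flat ground, 1 = obstacle
```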
          
Design of the Mirã robot
I am proud to have been part of the team responsible for developing the Mirã robot. This innovative robot was designed at the University of São Paulo for soil analysis in soybean crops, and I was directly involved in designing and building the robotic platform, as well as programming the low-level algorithms and controllers that make the robot move. It was a challenging project, but the result is a highly functional and efficient robot that is already making a real difference in the field. Being part of this team was a fantastic experience, and I am excited to continue working on projects that have a positive impact on the world.
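To illustrate the kind of low-level layer this involves, here is a sketch that maps body velocity commands to wheel speeds and closes a PI loop per wheel. The geometry and gains are illustrative assumptions, not Mirã's actual drive configuration or tuning:

```python
def unicycle_to_wheels(v, w, track_width=0.8, wheel_radius=0.15):
    """Map body commands (v [m/s], w [rad/s]) to wheel angular speeds [rad/s]."""
    v_left = v - w * track_width / 2.0
    v_right = v + w * track_width / 2.0
    return v_left / wheel_radius, v_right / wheel_radius

class WheelPI:
    """PI speed loop for one wheel, run at a fixed rate on the motor driver."""

    def __init__(self, kp=0.8, ki=2.0, dt=0.01):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, target, measured):
        err = target - measured
        self.integral += err * self.dt
        return self.kp * err + self.ki * self.integral  # motor effort command
```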