Abstract

This project aims to demonstrate that LiDAR-based distance detection, on top of accurate object classification, is necessary to enhance an autonomous surface vehicle's ability to navigate dynamic water environments, because it adds the dimension of depth to path planning. The annual RoboBoat competition challenges autonomous surface vehicles to navigate a dynamic aquatic environment using computer vision, path planning, and obstacle avoidance. To meet these demands, our vehicle, Team Inspiration's RoboBoat, is equipped with both a long-range camera and a LiDAR sensor. The project demonstrates the fusion of RGB camera data with LiDAR depth data, together with a path planning algorithm that takes advantage of the new depth information. The main result was that the boat followed the path outlined by the buoys much more accurately and no longer veered off course.
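As a minimal sketch of the camera-LiDAR fusion described above, LiDAR points can be transformed into the camera frame and projected onto the image plane, so that each detected buoy's bounding box can be assigned a depth. The intrinsic matrix, extrinsic transform, and function names below are illustrative placeholders, not the team's actual calibration or code.

```python
import numpy as np

# Hypothetical calibration values; real ones come from camera/LiDAR calibration.
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])   # camera intrinsic matrix
T_cam_lidar = np.eye(4)                 # LiDAR-to-camera extrinsic transform (4x4)

def project_lidar_to_image(points_lidar):
    """Project Nx3 LiDAR points into pixel coordinates with per-point depths."""
    n = points_lidar.shape[0]
    homog = np.hstack([points_lidar, np.ones((n, 1))])       # Nx4 homogeneous points
    points_cam = (T_cam_lidar @ homog.T).T[:, :3]            # Nx3 in the camera frame
    in_front = points_cam[:, 2] > 0                          # discard points behind the camera
    points_cam = points_cam[in_front]
    pixels = (K @ points_cam.T).T                            # Nx3 projected coordinates
    pixels = pixels[:, :2] / pixels[:, 2:3]                  # normalize by depth
    depths = points_cam[:, 2]
    return pixels, depths

def buoy_range(pixels, depths, bbox):
    """Estimate a buoy's range as the median depth of points inside its bounding box."""
    x_min, y_min, x_max, y_max = bbox
    inside = ((pixels[:, 0] >= x_min) & (pixels[:, 0] <= x_max) &
              (pixels[:, 1] >= y_min) & (pixels[:, 1] <= y_max))
    return np.median(depths[inside]) if inside.any() else None
```

In this sketch, the depth returned for each classified buoy is what a path planner could use to place the buoy in the vehicle's frame rather than relying on image coordinates alone.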




Hardware Components

Team Inspiration RoboBoat