Software System

Architecture

Our codebase is developed in Python and organized into modular components for scalability and maintainability. The primary modules are Sensor Core, Localization Core, Robot Control Core, and Mission Planner Core. We use the Robot Operating System (ROS) for interprocess communication across these modules.

Sensor data from the IMU, FOG, DVL, and barometer are published to ROS topics and consumed by the RobotControl class via custom APIs. PID controllers use this data to compute the direction and magnitude of both translational and rotational commands. These commands are published to a mavros topic; the flight controller interprets them and converts them into PWM signals for the thrusters.
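The core of this pipeline is the PID update that turns sensor error into a command magnitude. The following minimal sketch shows the standard discrete-time PID form; the class name, gains, and time step are illustrative, not our tuned values or actual API.

```python
class PID:
    """Minimal discrete-time PID controller sketch (illustrative, not our tuned implementation)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        """Return the command magnitude for one control cycle."""
        error = setpoint - measurement
        self.integral += error * self.dt                    # accumulate error over time
        derivative = (error - self.prev_error) / self.dt    # rate of change of error
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

One such controller runs per controlled axis (surge, sway, heave, yaw), and its output is scaled into the command published to the mavros topic.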

Localization & Navigation

Our navigation algorithm mirrors real-world mission planning by dividing the competition field into a grid and executing a predefined sequence of movements. Translational motion is monitored using a Doppler Velocity Log (DVL), while the Inertial Measurement Unit (IMU) and Fiber Optic Gyroscope (FOG) provide accurate azimuth heading data.
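A predefined movement sequence over the grid reduces to a list of (heading, distance) legs whose positions can be integrated dead-reckoning style from DVL translation and IMU/FOG heading. The sketch below is only bookkeeping to illustrate the idea, not our planner; the function name and plan format are assumptions.

```python
import math

def integrate_plan(plan, start=(0.0, 0.0)):
    """Return the grid positions visited by executing a plan of
    (heading_deg, distance_m) legs. Illustrative sketch only."""
    x, y = start
    positions = [start]
    for heading_deg, dist in plan:
        rad = math.radians(heading_deg)
        x += dist * math.cos(rad)  # translation along the commanded heading
        y += dist * math.sin(rad)
        positions.append((x, y))
    return positions
```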

This system enables the AUV to travel to the vicinity of each mission target, where it relies on onboard cameras to detect objects within their limited range. Upon arrival, the AUV initiates a search routine by yawing within a defined angular sweep relative to the expected target position. For instance, when locating the Slalom channel, the AUV may yaw ±60 degrees. If no visual detection occurs, it proceeds forward incrementally, repeating the process until the pipes are identified.
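The sweep-then-advance routine above can be sketched as follows. The callable parameters (`detect`, `yaw_to`, `move_forward`) are stand-ins for the real vision and control calls, and the step sizes are illustrative; only the ±60 degree sweep comes from the text.

```python
def search_sweep(detect, yaw_to, move_forward,
                 sweep_deg=60, step_deg=15, advance_m=0.5, max_advances=10):
    """Yaw through +/-sweep_deg looking for the target; if nothing is
    detected, advance and repeat. Sketch with illustrative parameters."""
    for _ in range(max_advances):
        # sweep the defined angular range relative to the expected target position
        for angle in range(-sweep_deg, sweep_deg + 1, step_deg):
            yaw_to(angle)
            if detect():
                return angle  # relative heading at which the target was found
        move_forward(advance_m)   # no detection: proceed forward incrementally
    return None  # search exhausted without a detection
```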

For tasks requiring precise positioning, such as during the Torpedoes mission or Græy’s standby phase before returning home, we are implementing a station-keeping feature. This function uses DVL data to capture the AUV’s current position vector. A PID control loop then issues forward and lateral corrections to counteract drift, allowing the AUV to hold its position until the function is terminated by another system process.
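A simplified version of that loop is sketched below as a PI controller (the full loop also carries a derivative term). `get_position` and `send_command` stand in for the real DVL read and thruster command APIs, and the gains are illustrative, not our tuned values.

```python
def station_keep(get_position, send_command, hold,
                 kp=0.8, ki=0.05, dt=0.1, steps=100):
    """Hold the (x, y) position `hold` captured from the DVL by issuing
    forward/lateral corrections. PI sketch with illustrative gains."""
    ix = iy = 0.0
    for _ in range(steps):
        x, y = get_position()          # current DVL position estimate
        ex, ey = hold[0] - x, hold[1] - y
        ix += ex * dt                  # integral terms absorb steady drift
        iy += ey * dt
        send_command(forward=kp * ex + ki * ix,
                     lateral=kp * ey + ki * iy)
```

In practice the loop runs until another system process terminates it rather than for a fixed step count.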

The VectorNav IMU is integrated into our software system using its official Software Development Kit (SDK). To enhance data accuracy, we performed bench-top evaluations simulating thruster-induced vibrations using MATLAB. These experiments verified the effectiveness of a single-stage, low-pass Finite Impulse Response (FIR) filter, which we subsequently implemented in the AUV software. 
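A single-stage low-pass FIR filter of the kind described can be built as a windowed-sinc kernel; the sketch below uses a Hamming window, and the tap count and normalized cutoff are illustrative rather than the values selected in our MATLAB study.

```python
import numpy as np

def fir_lowpass(signal, num_taps=21, cutoff=0.1):
    """Apply a single-stage low-pass FIR filter (windowed-sinc design).
    `cutoff` is normalized to the sample rate; values are illustrative."""
    n = np.arange(num_taps) - (num_taps - 1) / 2
    h = np.sinc(2 * cutoff * n) * np.hamming(num_taps)  # windowed-sinc taps
    h /= h.sum()  # unity DC gain so steady readings pass through unchanged
    return np.convolve(signal, h, mode="same")
```

The same tap coefficients verified offline in MATLAB can be applied sample-by-sample to the incoming IMU stream.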

Figure: Noisy vs. filtered IMU data.

Computer Vision (CV)

We developed a robust object detection system leveraging OpenCV and YOLOv8 (You Only Look Once, version 8). Given the challenges posed by underwater environments, such as fluctuating sunlight angles and pervasive blue hues, we prioritized shape and pattern recognition over color-based methods, which tend to be less reliable under these conditions.

To streamline the training process, we implemented HSV-based color segmentation for automatic labeling of Slalom pipes (see Appendix B.6). This enabled rapid generation of training data and improved detection accuracy in complex lighting environments.
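Automatic labeling of this kind boils down to thresholding the image in HSV space and converting the matching pixels into a bounding box. In the real pipeline the mask would typically come from `cv2.inRange`, but the sketch below reproduces the thresholding in plain NumPy so it stands alone; the function name and the YOLO-style normalized box format are assumptions.

```python
import numpy as np

def bbox_from_hsv_mask(hsv, lower, upper):
    """Return a YOLO-style (cx, cy, w, h) box, normalized to [0, 1], for the
    pixels falling inside an HSV range, or None if nothing matches."""
    mask = np.all((hsv >= lower) & (hsv <= upper), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # no pixels matched; nothing to label
    h_img, w_img = mask.shape
    x0, x1 = xs.min(), xs.max() + 1
    y0, y1 = ys.min(), ys.max() + 1
    return ((x0 + x1) / 2 / w_img, (y0 + y1) / 2 / h_img,
            (x1 - x0) / w_img, (y1 - y0) / h_img)
```

Each auto-generated box then becomes one line of a YOLO training label for the corresponding frame.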

Figure: Computer vision running on remote-control footage.

Mission Planning

Each mission is controlled by a system of event flags, which can be triggered by either a mission timeout or a boolean indicator denoting mission success or failure. When a flag is activated, the AUV transitions to the next mission in the sequence. For example, upon successfully dropping the markers for the Bins task, a boolean flag is set to indicate completion, prompting the navigation system to proceed to the following mission. Once Onyx has completed all assigned tasks and all corresponding flags are triggered, she will initiate communication with Græy to begin the Return Home sequence.
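A flag that fires on either an explicit success/failure report or a timeout, whichever comes first, can be sketched with the standard library's `threading` primitives; the class and method names below are illustrative, not our actual API.

```python
import threading

class MissionFlag:
    """Event flag sketch: fires on mission success/failure or on timeout,
    whichever comes first. Names and structure are illustrative."""

    def __init__(self, timeout_s):
        self._done = threading.Event()
        self.success = False  # remains False if the mission times out
        self._timer = threading.Timer(timeout_s, self._done.set)
        self._timer.start()

    def complete(self, success):
        """Called by mission code with its boolean success indicator."""
        self.success = success
        self._timer.cancel()
        self._done.set()

    def wait(self):
        """Block until the flag fires; the sequencer then advances."""
        self._done.wait()
        return self.success
```

The mission sequencer creates one such flag per task, blocks on `wait()`, and transitions to the next mission as soon as the flag fires, regardless of whether it fired on success, failure, or timeout.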

Testing

We followed a systems engineering approach, breaking mission requirements down into simple components to structure our testing. Unit and component tests, conducted via in-air and bench setups, enable quick validation before integration. Subsystem tests assess individual functions or simple missions (e.g., Coin Toss), while system-level tests evaluate full mission sequencing and sensor integration. Each test is documented using a standardized template, with detailed logs shared on our team blog.