
Published on Sep 05, 2023

Abstract

The objective:

Conventional ground robot navigation and path finding is often inefficient and time-consuming, especially in maze-like environments. Aerial monocular vision, however, provides a novel perspective for path finding in robot navigation. A ground robot guided by aerial monocular vision was compared with the ground robot navigating alone, using operational time as the metric.

Methods/Materials

The ground robotics platform was based on an iRobot Create and a laptop. Aerial vision was provided by a Parrot AR.Drone quadrocopter with a built-in camera. The laptop connected to the quadrocopter's camera feed over its wireless network via socket connections, as sketched below. The Java programming language was used for both quadrocopter control and image processing.
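A minimal sketch of that socket connection, assuming the AR.Drone's typical ad-hoc network defaults (IP address 192.168.1.1, video stream on TCP port 5555); these values and the class name are illustrative rather than taken from the project.

import java.io.InputStream;
import java.net.Socket;

public class DroneVideoClient {
    public static void main(String[] args) throws Exception {
        // Assumed defaults for the AR.Drone's own wireless network;
        // the actual project may have used different values.
        String droneIp = "192.168.1.1";
        int videoPort = 5555;

        try (Socket socket = new Socket(droneIp, videoPort)) {
            InputStream in = socket.getInputStream();
            byte[] buffer = new byte[4096];
            int read;
            // Read raw video data from the drone; decoding the stream
            // into images would be handled by a separate component.
            while ((read = in.read(buffer)) != -1) {
                System.out.println("Received " + read + " bytes of video data");
            }
        }
    }
}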

The quadrocopter was launched and hovered above the robot and the maze environment. Acquired images were first processed to classify regions as either obstacle or traversable area, and the start and end point regions were then identified within the image.
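One way to implement the obstacle/traversable classification is to threshold average brightness over a coarse grid of image cells. The sketch below assumes a bright floor and dark obstacles; the grid size, threshold, and class name are illustrative, not the project's actual values.

import java.awt.image.BufferedImage;

public class RegionClassifier {
    // Classifies each cell of a coarse grid over the aerial image as
    // traversable (true) or obstacle (false).
    public static boolean[][] classify(BufferedImage image, int gridSize, int threshold) {
        boolean[][] traversable = new boolean[gridSize][gridSize];
        int cellW = image.getWidth() / gridSize;
        int cellH = image.getHeight() / gridSize;
        for (int row = 0; row < gridSize; row++) {
            for (int col = 0; col < gridSize; col++) {
                long sum = 0;
                for (int y = row * cellH; y < (row + 1) * cellH; y++) {
                    for (int x = col * cellW; x < (col + 1) * cellW; x++) {
                        int rgb = image.getRGB(x, y);
                        int r = (rgb >> 16) & 0xFF, g = (rgb >> 8) & 0xFF, b = rgb & 0xFF;
                        sum += (r + g + b) / 3;
                    }
                }
                long avgBrightness = sum / ((long) cellW * cellH);
                // Bright cells are treated as open floor, dark cells as obstacles.
                traversable[row][col] = avgBrightness > threshold;
            }
        }
        return traversable;
    }
}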

A breadth-first search (BFS) algorithm was employed to determine the shortest navigational path that avoids obstacles. When a traversable path between the detected start and end points is found, the ground robot is sent movement-vector commands to navigate around the obstacles.
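A minimal sketch of the BFS planning step, assuming the classified image is represented as a 4-connected grid of traversable cells; the class and method names are illustrative, not taken from the project code.

import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.Queue;

public class MazePathPlanner {

    // Shortest obstacle-avoiding path from start to goal via breadth-first search.
    // Cells are {row, col}; returns an empty list if no traversable path exists.
    public static List<int[]> shortestPath(boolean[][] traversable, int[] start, int[] goal) {
        int rows = traversable.length;
        int cols = traversable[0].length;
        int[] parent = new int[rows * cols];          // predecessor of each visited cell
        boolean[] visited = new boolean[rows * cols];
        Arrays.fill(parent, -1);

        Queue<Integer> queue = new ArrayDeque<>();
        int startId = start[0] * cols + start[1];
        int goalId = goal[0] * cols + goal[1];
        queue.add(startId);
        visited[startId] = true;

        int[][] moves = {{1, 0}, {-1, 0}, {0, 1}, {0, -1}};  // 4-connected grid
        while (!queue.isEmpty()) {
            int cell = queue.poll();
            if (cell == goalId) {
                break;
            }
            int r = cell / cols, c = cell % cols;
            for (int[] m : moves) {
                int nr = r + m[0], nc = c + m[1];
                if (nr >= 0 && nr < rows && nc >= 0 && nc < cols
                        && traversable[nr][nc] && !visited[nr * cols + nc]) {
                    visited[nr * cols + nc] = true;
                    parent[nr * cols + nc] = cell;
                    queue.add(nr * cols + nc);
                }
            }
        }

        if (!visited[goalId]) {
            return Collections.emptyList();           // no traversable path found
        }
        // Walk the parent links back from the goal to reconstruct the path.
        List<int[]> path = new ArrayList<>();
        for (int cell = goalId; cell != -1; cell = parent[cell]) {
            path.add(new int[]{cell / cols, cell % cols});
        }
        Collections.reverse(path);
        return path;
    }
}

Consecutive cells along the returned path can then be converted into the movement-vector commands sent to the ground robot.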

Results


After a series of trial runs, the novel navigation yielded an average run time of 38.45 seconds while the conventional navigation resulted in an average run time of 140.57 seconds. The addition of aerial vision from the quadrocopter resulted in a 72.6 percent improvement in operation time for the ground robot.
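The reported improvement follows directly from the two average run times:

\[
\frac{140.57\ \text{s} - 38.45\ \text{s}}{140.57\ \text{s}} \approx 0.726 = 72.6\%
\]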

Conclusions/Discussion

These findings demonstrate that the rich data provided by aerial monocular vision significantly improves robot navigation. Although a multi-modal robotics platform adds complexity, the combined system yielded substantial improvements in navigation time.


Science Fair Project done by Kenny Lei