Undergrad Research Project - Turtlebot Human Tracking with Kinect

Fall 2015

Student: Raghav Goyal
Advisor: Stephanie Rosenthal
Project description

This project focuses on robot perception and decision-making, with an emphasis on explaining how and why robots make certain decisions in a given situation. The primary goal is to add object detection and tracking functionality to the Turtlebot robot, focusing specifically on detecting and interacting with humans. These capabilities will allow the robot to perceive, analyze, and interact with its environment based on information extracted from visual feeds, in essence giving the robot a new platform for interaction. We will build on this capability by having the robot complete reactive tasks based on what it observes in its environment.

We will use the Microsoft Kinect sensor for our image processing needs. By integrating the Kinect with the robot's Robot Operating System (ROS) framework, we will be able to use visual feedback to drive the robot's decision-making processes. The Kinect's speech recognition offers another medium of interaction and will be used to gather input from humans to enhance the robot's decision-making. The image processing pipeline will be modular so that object detection techniques can be swapped out to handle different types of objects. Prototyping and preliminary testing will be conducted in a Turtlebot simulation before moving on to the real robot for further testing and improvement.
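To illustrate the intended architecture, the sketch below shows a minimal ROS node that subscribes to a Kinect depth stream, runs a pluggable detector, and publishes velocity commands. The topic names ('/camera/depth/image_raw', '/cmd_vel_mux/input/navi'), the NearestBlobDetector placeholder, and all gains are illustrative assumptions, not part of the project's actual codebase.

#!/usr/bin/env python
# Minimal sketch of the Kinect -> detector -> motion command pipeline.
# Topic names and the placeholder detector are assumptions for illustration.
import rospy
import numpy as np
from cv_bridge import CvBridge
from sensor_msgs.msg import Image
from geometry_msgs.msg import Twist


class NearestBlobDetector(object):
    """Placeholder detector: returns (column, depth in meters) of the nearest pixel."""
    def detect(self, depth_m):
        depth = np.nan_to_num(depth_m)
        depth[depth == 0] = np.inf            # ignore invalid readings
        row, col = np.unravel_index(np.argmin(depth), depth.shape)
        return col, float(depth[row, col])


class KinectFollowerNode(object):
    def __init__(self, detector):
        self.detector = detector              # swap in other detection techniques here
        self.bridge = CvBridge()
        self.cmd_pub = rospy.Publisher('/cmd_vel_mux/input/navi', Twist, queue_size=1)
        rospy.Subscriber('/camera/depth/image_raw', Image, self.on_depth)

    def on_depth(self, msg):
        depth = self.bridge.imgmsg_to_cv2(msg, desired_encoding='passthrough')
        depth = np.asarray(depth, dtype=np.float32)
        if msg.encoding == '16UC1':
            depth /= 1000.0                   # raw Kinect depth is in millimeters
        col, dist = self.detector.detect(depth)
        cmd = Twist()
        # Turn toward the detection and creep forward if it is still far away.
        cmd.angular.z = -0.002 * (col - depth.shape[1] / 2.0)
        cmd.linear.x = 0.2 if dist > 1.0 else 0.0
        self.cmd_pub.publish(cmd)


if __name__ == '__main__':
    rospy.init_node('kinect_follower_sketch')
    KinectFollowerNode(NearestBlobDetector())
    rospy.spin()

Because the detector is passed in as an object with a single detect() method, other techniques (for example a person detector) could be substituted without changing the node's ROS wiring, which is the flexibility the modular design aims for.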

The minimal expectation for this study is to have the Kinect integrated with the Turtlebot software stack and to be able to control robot actions using information gathered by the Kinect. We will evaluate our progress by having the robot perform tasks autonomously. The primary task is having the robot track and follow a human. Secondary objectives that build upon the minimal expectation include having the robot follow another robot, using speech recognition to respond to spoken commands, and integrating LEDs or other means of communicating the robot's thought process in real time.
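As a sketch of the primary human-following task, the function below converts a tracked person's bearing and distance (assumed to come from an upstream Kinect-based tracker) into forward and turning velocities using simple proportional control. The 1 m standoff distance, gains, and speed limits are illustrative values, not tuned project parameters.

# Minimal sketch of a proportional person-following policy, assuming an
# upstream tracker reports the person's bearing (radians, left positive)
# and distance (meters). All constants are illustrative assumptions.

def follow_command(bearing_rad, distance_m,
                   standoff_m=1.0, k_linear=0.5, k_angular=1.5,
                   max_linear=0.3, max_angular=1.0):
    """Return (linear_velocity, angular_velocity) that keeps the person
    centered in view at roughly the standoff distance."""
    linear = k_linear * (distance_m - standoff_m)
    angular = k_angular * bearing_rad
    # Clamp to a comfortable speed range for the Turtlebot base.
    linear = max(-max_linear, min(max_linear, linear))
    angular = max(-max_angular, min(max_angular, angular))
    return linear, angular


if __name__ == '__main__':
    # Person detected 2 m ahead and slightly to the left: drive forward and turn left.
    print(follow_command(bearing_rad=0.2, distance_m=2.0))

The same structure extends to the secondary objectives: following another robot only requires a different tracker feeding the same controller, and spoken commands or LEDs can gate or annotate the commands it produces.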
