CSE 591: Human-aware Robotics

Class: CAVC 359, Tue/Thu, 9:00--10:15 AM

Office Hours: BYENG 558, Tue/Thu, 10:30--11:30 AM



Selected Project Demos


Group 1: Making PeopleBot a People’s Bot


For a robot to act naturally in a human-robot teaming scenario, it needs to participate actively in the team. To achieve this, we designed a robotic system that recognizes known human entities by their facial features and voice modulations. To participate actively, the robot first needs to understand the discussion, which we achieved using Google Speech-to-Text conversion. The resulting text can be sent to NLP modules to extract contextual meaning, though that was beyond the scope of this project. The questions we tried to answer are: who are the members of the meeting, who is speaking in the discussion, and what is being said (in terms of producing an exact text transcription).
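One way to sketch the "who is speaking" step is to compare an incoming voice-feature vector against stored profiles of known meeting members. The code below is an illustrative toy, not the group's implementation: the member names, feature vectors, and similarity threshold are all hypothetical, and a real system would use learned speaker embeddings rather than hand-picked vectors.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def identify_speaker(features, known_members, threshold=0.8):
    """Return the best-matching member name, or None if nothing is close enough."""
    best_name, best_score = None, threshold
    for name, reference in known_members.items():
        score = cosine_similarity(features, reference)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Hypothetical voice profiles for two known meeting members.
members = {
    "alice": [0.9, 0.1, 0.3],
    "bob":   [0.2, 0.8, 0.5],
}
print(identify_speaker([0.85, 0.15, 0.28], members))  # -> alice
```

The same nearest-match-above-threshold pattern applies to the face-recognition side of the system, with image embeddings in place of voice features.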

Group 2: EEG-based Attainment of Human-Robot Kumbaya


The aim of this project is to identify pathways for integrating electroencephalographic (EEG) feedback into the high-level task planning process of a robot. The first part of the project deals with isolating EEG signals that may be useful in this regard and validating their applicability to human-robot teaming scenarios. The second part seeks to exploit the affective states detected in the first part to train a learning agent to produce, or conform to, the expectations those states entail.
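One simple way the second part could be realized is reward shaping: fold the detected affective state into the reward signal the learning agent receives, so behavior that frustrates the human is penalized. The sketch below is purely illustrative and assumed, not the project's code; the state labels and weights are hypothetical.

```python
# Hypothetical mapping from a detected affective state to a reward adjustment.
AFFECT_REWARD = {
    "engaged":    +1.0,  # human appears comfortable with the robot's behavior
    "neutral":     0.0,
    "frustrated": -1.0,  # robot behavior violates the human's expectations
}

def shaped_reward(task_reward, affect, weight=0.5):
    """Combine the task reward with an affect-based bonus or penalty."""
    return task_reward + weight * AFFECT_REWARD.get(affect, 0.0)

print(shaped_reward(1.0, "frustrated"))  # -> 0.5
print(shaped_reward(1.0, "engaged"))     # -> 1.5
```

Any standard reinforcement-learning agent could then be trained on this shaped reward instead of the raw task reward.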

Group 3: American Sign Language Detection and Recognition


Sign language is widely used around the world and plays an important role in the lives of people who are unable to speak or hear. These users are more likely than others to need assistance from a robot, and for a robot, understanding what a person is expressing in sign language is the first step toward helping them. That is why we wanted to build a system to capture and recognize sign language.

Group 7: An Adaptive Game-Playing Robot CSE 591 Final Project Report
(PEOPLE'S CHOICE DEMO AWARD!)


This project presents a robotic system capable of playing the competitive board game Checkers against a human opponent. The novelty of the system lies in the manner in which it plays: unlike typical computer game opponents, our system attempts to model the confidence of the human opponent based on their hand movements and facial expressions. This confidence is used to infer how skilled the opponent is at Checkers, and the robot scales its level of play to that of the human in order to provide a satisfying experience. We evaluate the system by integrating it with a physical UR-5 robot and successfully demonstrate that it is capable of playing against a human opponent.
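The "scale its level of play" idea can be sketched as mapping the inferred confidence score to the depth of the robot's game-tree search, so a confident (presumably skilled) opponent faces deeper lookahead. This is an assumed illustration, not the group's actual mechanism; the depth bounds and the linear mapping are hypothetical.

```python
# Hypothetical bounds on the Checkers engine's minimax search depth.
MIN_DEPTH, MAX_DEPTH = 2, 8

def search_depth(confidence):
    """Linearly scale an opponent-confidence score in [0, 1] to a search depth.

    The confidence score is assumed to come from the project's hand-movement
    and facial-expression analysis.
    """
    confidence = max(0.0, min(1.0, confidence))
    return MIN_DEPTH + round(confidence * (MAX_DEPTH - MIN_DEPTH))

print(search_depth(0.0), search_depth(0.5), search_depth(1.0))  # -> 2 5 8
```

Clamping the score keeps noisy confidence estimates from pushing the depth outside the playable range.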