Show simple item record

dc.contributor.advisor	Roppel, Thaddeus
dc.contributor.author	Yuan, Bowen
dc.date.accessioned	2017-11-30T16:23:34Z
dc.date.available	2017-11-30T16:23:34Z
dc.date.issued	2017-11-30
dc.identifier.uri	http://hdl.handle.net/10415/6024
dc.description.abstract	Autonomous mobile robots (AMR's) are defined by their ability to perform some tasks independently of direct human interaction. However, interaction with humans is necessary at some point. In some situations, humans can give instructions to robots most effectively with physical gestures. Examples include scenarios that require silence, such as covert military operations or hospital rooms, as well as loud assemblies, underwater environments, and space. Artificial neural networks (ANN's) have long been used for pattern recognition. If well-trained with a suitable dataset, an ANN can provide a satisfactory result for a complex task. ROS (Robot Operating System) is a mainstream software framework for robotics research throughout the world. It can streamline the development of robots through code reuse. This thesis describes a partial solution to help robots understand human gestures by combining an ANN and ROS, with a Kinect sensor as the primary input. Five gestures are trained and then recognized when performed by people other than the trainer. The overall success rate of gesture recognition in this study is 80 percent. Some gestures are recognized with more than 90 percent success.	en_US
dc.subject	Electrical and Computer Engineering	en_US
dc.title	Neural Network Based Gesture Recognition Robot	en_US
dc.type	Master's Thesis	en_US
dc.embargo.length		en_US
dc.embargo.status	NOT_EMBARGOED	en_US
dc.contributor.committee	Reeves, Stanley
dc.contributor.committee	Gong, Xiaowen

