{"64825":{"#nid":"64825","#data":{"type":"news","title":"How Can Robots Get Our Attention?","body":[{"value":"\u003Cp\u003EGetting someone\u2019s attention can be easy with a loud noise or a shout, but what if the situation calls for a little more tact? How can a robot use subtle cues to attract a human\u2019s notice and tell when it has captured it? In a preliminary study, researchers at the Georgia Institute of Technology have found that they can program a robot to understand when it gains a human\u2019s attention and when it falls short. The research is being presented today at the Human-Robot Interaction conference in Lausanne, Switzerland.\u003C\/p\u003E\u003Cp\u003E\u201cThe primary focus was trying to give Simon, our robot, the ability to understand when a human being seems to be reacting appropriately, or in some sense is interested now in a response with respect to Simon and to be able to do it using a visual medium, a camera,\u201d said Aaron Bobick, professor and chair of the School of Interactive Computing in Georgia Tech\u2019s College of Computing.\u003C\/p\u003E\u003Cp\u003EUsing the socially expressive robot Simon, from Assistant Professor Andrea Thomaz\u2019s Socially Intelligent Machines lab, researchers wanted to see if they could tell when he had successfully attracted the attention of a human who was busily engaged in a task and when he had not. \u003C\/p\u003E\u003Cp\u003E\u201cSimon would make some form of a gesture, or some form of an action when the user was present, and the computer vision task was to try to determine whether or not you had captured the attention of the human being,\u201d said Bobick.\u003C\/p\u003E\u003Cp\u003EWith close to 80 percent accuracy Simon was able to tell, using only his cameras as a guide, whether someone was paying attention to him or ignoring him.\u003C\/p\u003E\u003Cp\u003E\u201cWe would like to bring robots into the human world. 
That means they have to engage with human beings, and human beings have an expectation of being engaged in a way similar to the way other human beings would engage with them,\u201d said Bobick.\u003C\/p\u003E\u003Cp\u003E\u201cOther human beings understand turn-taking. They understand that if I make some indication, they\u2019ll turn and face someone when they want to engage with them and they won\u2019t when they don\u2019t want to engage with them. In order for these robots to work with us effectively, they have to obey these same kinds of social conventions, which means they have to perceive the same thing humans perceive in determining how to abide by those conventions,\u201d he added.\u003C\/p\u003E\u003Cp\u003EResearchers plan to go further with their investigations into how Simon reads communication cues, studying whether he can tell from a person\u2019s gaze, elements of language, or other actions whether that person is paying attention.\u003C\/p\u003E\u003Cp\u003E\u201cPreviously people would have pre-defined notions of what the user should do in a particular context and they would look for those,\u201d said Bobick. \u201cThat only works when the person behaves exactly as expected. 
Our approach, which I think is the most novel element, is to use the user\u2019s current behavior as the baseline and observe what changes.\u201d\u003C\/p\u003E\u003Cp\u003EThe research team for this study consisted of Bobick, Thomaz, doctoral student Jinhan Lee and undergraduate student Jeffrey Kiser.\u003C\/p\u003E\u003Cp\u003EVideo: \u003Ca href=\u0022http:\/\/www.youtube.com\/watch?v=F1SOoGJGT3I\u0026amp;feature=channel_video_title\u0022 target=\u0022_blank\u0022\u003EHow Can Robots Get Our Attention?\u003C\/a\u003E\u003C\/p\u003E","summary":null,"format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003EA research group including Aaron Bobick and Andrea Thomaz (\u003Cem\u003Eboth Interactive Computing\u003C\/em\u003E) has found that they can program a robot to understand when it gains a human\u2019s attention and when it falls short. \u003Cem\u003ESource: GT Communications \u0026amp; Marketing\u003C\/em\u003E\u003C\/p\u003E","format":"limited_html"}],"field_summary_sentence":"","uid":"27154","created_gmt":"2011-03-08 13:32:53","changed_gmt":"2016-10-08 03:08:22","author":"Louise Russo","boilerplate_text":"","field_publication":"","field_article_url":"","dateline":{"date":"2011-03-08T00:00:00-05:00","iso_date":"2011-03-08T00:00:00-05:00","tz":"America\/New_York"},"extras":[],"groups":[{"id":"47223","name":"College of Computing"}],"categories":[],"keywords":[{"id":"12286","name":"Aaron Bobick"},{"id":"11526","name":"Andrea Thomaz"},{"id":"11892","name":"RIM@GT"},{"id":"1356","name":"robot"},{"id":"667","name":"robotics"},{"id":"168887","name":"simon"}],"core_research_areas":[],"news_room_topics":[],"event_categories":[],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[{"value":"\u003Cp\u003EDavid Terraso\u003C\/p\u003E\u003Cp\u003E404-385-2966\u003C\/p\u003E","format":"limited_html"}],"email":[],"slides":[],"orientation":[],"userdata":""}}}