{"64794":{"#nid":"64794","#data":{"type":"news","title":"Teaching Robots to Move Like Humans","body":[{"value":"\u003Cp\u003EWhen people communicate, the way they move has as much to do\nwith what they\u2019re saying as the words that come out of their mouths. But what\nabout when robots communicate with people? How can robots use non-verbal communication\nto interact more naturally with humans? Researchers at the Georgia Institute of\nTechnology found that when robots move in a more human-like fashion, with one\nmovement leading into the next, that people can not only better recognize what\nthe robot is doing, but they can also better mimic it themselves. The research\nis being presented today at the Human-Robot Interaction conference in Lausanne,\nSwitzerland.\u003C\/p\u003E\n\n\n\n\u003Cp\u003E\u201cIt\u2019s important to build robots that meet people\u2019s social\nexpectations because we think that will make it easier for people to understand\nhow to approach them and how to interact with them,\u201d said Andrea Thomaz, assistant\nprofessor in the School of Interactive Computing at Georgia Tech\u2019s College of\nComputing.\u003C\/p\u003E\n\n\n\n\u003Cp\u003EThomaz, along with Ph.D. student Michael Gielniak, conducted\na study in which they asked how easily people can recognize what a robot is\ndoing by watching its movements.\u003C\/p\u003E\n\n\n\n\u003Cp\u003E\u201cRobot motion is typically characterized by jerky movements,\nwith a lot of stops and starts, unlike human movement which is more fluid and\ndynamic,\u201d said Gielniak. \u201cWe want humans to interact with robots just as they\nmight interact with other humans, so that it\u2019s intuitive.\u201d\u003C\/p\u003E\n\n\n\n\u003Cp\u003EUsing a series of human movements taken in a motion-capture\nlab, they programmed the robot, Simon, to perform the movements. 
They also optimized\nthat motion to allow for more joints to move at the same time and for the\nmovements to flow into each other in an attempt to be more human-like. They\nasked their human subjects to watch Simon and identify the movements he made.\u003C\/p\u003E\n\n\n\n\u003Cp\u003E\u201cWhen the motion was more human-like, human beings were able\nto watch the motion and perceive what the robot was doing more easily,\u201d said\nGielniak.\u003C\/p\u003E\n\n\n\n\u003Cp\u003EIn addition, they tested the algorithm they used to create\nthe optimized motion by asking humans to perform the movements they saw Simon\nmaking. The thinking was that if the movement created by the algorithm was\nindeed more human-like, then the subjects should have an easier time mimicking\nit. Turns out they did.\u003C\/p\u003E\n\n\n\n\u003Cp\u003E\u201cWe found that this optimization we do to create more\nlife-like motion allows people to identify the motion more easily and mimic it\nmore exactly,\u201d said Thomaz.\u003C\/p\u003E\n\n\n\n\u003Cp\u003EThe research that Thomaz and Gielniak are doing is part of a\nbroader effort to get robots to move more like humans do. 
In future work, the pair\nplan on looking at how to get Simon to perform the same movements in various\nways.\u003C\/p\u003E\u003Cp\u003E\u201cSo, instead of having the robot move the exact same way\nevery single time you want the robot to perform a similar action like waving, you\nalways want to see a different wave so that people forget that this is a robot\nthey\u2019re interacting with,\u201d said Gielniak.\u003C\/p\u003E\u003Cp\u003EVideo: \u003Ca href=\u0022http:\/\/www.youtube.com\/watch?v=GHmWtnqkObg\u0022 target=\u0022_blank\u0022\u003ETeaching Robots to Move Like Humans\u003C\/a\u003E\u003C\/p\u003E","summary":null,"format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003EAndrea Thomaz (\u003Cem\u003EInteractive Computing\u003C\/em\u003E) hopes that building robots to \u0022meet people\u0027s social expectations\u0022 will make it easier for people to interact with them. \u003Cem\u003ESource: GT Communications \u0026amp; Marketing\u003C\/em\u003E\u003C\/p\u003E","format":"limited_html"}],"field_summary_sentence":[{"value":"Researchers find people can better understand robot movements when robots move in a more human way."}],"uid":"27154","created_gmt":"2011-03-07 13:58:37","changed_gmt":"2016-10-08 03:08:18","author":"Louise Russo","boilerplate_text":"","field_publication":"","field_article_url":"","dateline":{"date":"2011-03-07T00:00:00-05:00","iso_date":"2011-03-07T00:00:00-05:00","tz":"America\/New_York"},"extras":[],"groups":[{"id":"47223","name":"College of Computing"}],"categories":[],"keywords":[{"id":"11526","name":"Andrea Thomaz"},{"id":"654","name":"College of Computing"},{"id":"4887","name":"GVU Center"},{"id":"12250","name":"Michael Gielniak"},{"id":"11892","name":"RIM@GT"},{"id":"1356","name":"robot"},{"id":"166848","name":"School of Interactive 
Computing"},{"id":"168887","name":"simon"}],"core_research_areas":[],"news_room_topics":[],"event_categories":[],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[{"value":"\u003Cp\u003EDavid Terraso\u003C\/p\u003E\u003Cp\u003E404-385-2966\u003C\/p\u003E","format":"limited_html"}],"email":[],"slides":[],"orientation":[],"userdata":""}}}