{"115111":{"#nid":"115111","#data":{"type":"news","title":"Teach Your Robot Well (Georgia Tech Shows How)","body":[{"value":"\u003Cp\u003E\u003Cstrong\u003EATLANTA \u2013 March 8, 2012 \u2013\u003C\/strong\u003E Within a decade, personal robots could become as common in U.S. homes as any other major appliance, and many if not most of these machines will be able to perform innumerable tasks not explicitly imagined by their manufacturers. This opens up a wider world of personal robotics, in which machines do whatever their owners can get them to do, even though those owners are not programmers.\u003C\/p\u003E\u003Cp\u003ELaying some helpful groundwork for this world is a new study by researchers in Georgia Tech\u2019s \u003Ca href=\u0022http:\/\/robotics.gatech.edu\/\u0022 target=\u0022_self\u0022\u003ECenter for Robotics \u0026amp; Intelligent Machines\u003C\/a\u003E (RIM), who have identified the types of questions a robot can ask during a learning interaction that are most likely to make for a smooth and productive human-robot relationship. These questions are about features of tasks, more so than labels of task components or real-time demonstrations of the task itself, and the researchers identified them not by studying robots, but by studying the everyday (read: non-programmer) people who one day will be their masters. The findings were detailed in the paper \u201cDesigning Robot Learners that Ask Good Questions,\u201d presented this week in Boston at the \u003Ca href=\u0022http:\/\/hri2012.org\/program\/\u0022 target=\u0022_blank\u0022\u003E7th ACM\/IEEE Conference on Human-Robot Interaction\u003C\/a\u003E (HRI).\u003C\/p\u003E\u003Cp\u003E\u201cPeople are not so good at teaching robots because they don\u2019t understand the robots\u2019 learning mechanism,\u201d said lead author Maya Cakmak, a Ph.D. student in the School of Interactive Computing. 
\u201cIt\u2019s like when you try to train a dog, and it\u2019s difficult because dogs do not learn like humans do. We wanted to find out the best kinds of questions a robot could ask to make the human-robot relationship as \u2018human\u2019 as it can be.\u201d\u003C\/p\u003E\u003Cp\u003ECakmak\u2019s study attempted to discover the role \u201cactive learning\u201d concepts play in human-robot interaction. In a nutshell, active learning refers to giving machine learners more control over the information they receive. Simon, a humanoid robot created in the lab of Andrea Thomaz (assistant professor in Georgia Tech\u2019s School of Interactive Computing and a co-author of the study), is well acquainted with active learning; Thomaz and Cakmak are programming him to learn new tasks by asking questions.\u003C\/p\u003E\u003Cp\u003ECakmak designed two separate experiments (\u003Ca href=\u0022http:\/\/www.youtube.com\/watch?v=6FKaEOSVczM\u0022 target=\u0022_blank\u0022\u003Esee video\u003C\/a\u003E): first, she asked human volunteers to assume the role of an inquisitive robot attempting to learn a simple task by asking questions of a human instructor. Having identified the three main question types (feature, label and demonstration), Cakmak tagged each of the participants\u2019 questions as one of the three. 
The overwhelming majority (about 82 percent) of questions were feature queries, showing a clear cognitive preference in human learning for this query type.\u003C\/p\u003E\u003Cp\u003EExamples of the three question types:\u003C\/p\u003E\u003Cul\u003E\u003Cli\u003E\u003Cstrong\u003ELabel query:\u003C\/strong\u003E \u201cCan I pour salt like this?\u201d\u003C\/li\u003E\u003Cli\u003E\u003Cstrong\u003EDemonstration query:\u003C\/strong\u003E \u201cCan you show me how to pour salt from here?\u201d\u003C\/li\u003E\u003Cli\u003E\u003Cstrong\u003EFeature query:\u003C\/strong\u003E \u201cCan I pour salt from any height?\u201d\u003C\/li\u003E\u003C\/ul\u003E\u003Cp\u003ENext, Cakmak recruited humans to teach Simon new tasks by answering the robot\u2019s questions and then rating those questions on how \u201csmart\u201d they thought they were. 
Feature queries once again came out on top, with 72 percent of participants calling them the smartest questions.\u003C\/p\u003E\u003Cp\u003E\u201cThese findings are important because they help give us the ability to teach robots the kinds of questions that humans would ask,\u201d Cakmak said. \u201cThis in turn will help manufacturers produce the kinds of robots that are most likely to integrate quickly into a household or other environment and better serve the needs we\u2019ll have for them.\u201d\u003C\/p\u003E\u003Cp\u003EGeorgia Tech is fielding five of the 38 papers accepted for \u003Ca href=\u0022http:\/\/hri2012.org\/program\/\u0022 target=\u0022_blank\u0022\u003EHRI\u2019s technical program\u003C\/a\u003E, making it the largest academic contributor to the conference. The five are:\u003C\/p\u003E\u003Cul\u003E\u003Cli\u003E\u201c\u003Ca href=\u0022http:\/\/www.cc.gatech.edu\/social-machines\/papers\/cakmak12_hri_active.pdf\u0022 target=\u0022_self\u0022\u003EDesigning Robot Learners that Ask Good Questions\u003C\/a\u003E,\u201d by Maya Cakmak and Andrea L. Thomaz\u003C\/li\u003E\u003Cli\u003E\u201cReal World Haptic Exploration for Telepresence of the Visually Impaired,\u201d by Chung Hyuk Park and Ayanna M. 
Howard\u003C\/li\u003E\u003Cli\u003E\u201cThe Domesticated Robot: Design Guidelines for Assisting Older Adults to Age in Place,\u201d by Jenay Beer, Cory-Ann Smarr, Tiffany Chen, Akanksha Prakash, Tracy Mitzner, Charles Kemp and Wendy Rogers\u003C\/li\u003E\u003Cli\u003E\u201c\u003Ca href=\u0022http:\/\/www.cc.gatech.edu\/social-machines\/papers\/gielniak12_hri_exaggeration.pdf\u0022 target=\u0022_self\u0022\u003EEnhancing Interaction Through Exaggerated Motion Synthesis\u003C\/a\u003E,\u201d by Michael Gielniak and Andrea Thomaz\u003C\/li\u003E\u003Cli\u003E\u201cTrajectories and Keyframes for Kinesthetic Teaching: A Human-Robot Interaction Perspective,\u201d by Baris Akgun, Maya Cakmak, Jae Wook Yoo and Andrea L. Thomaz\u003C\/li\u003E\u003C\/ul\u003E\u003Cp\u003EAll five papers describe research geared toward the realization of in-home robots assisting humans with everyday activities. Ph.D. student Baris Akgun\u2019s paper, for example, assumes the same real-life application scenario as Cakmak\u2019s\u2014a robot learning new tasks from a non-programmer\u2014and examines whether robots learn more quickly from continuous, real-time demonstrations of a physical task, or from isolated keyframes in the motion sequence. The research is nominated for Best Paper at HRI 2012.\u003C\/p\u003E\u003Cp\u003E\u201cGeorgia Tech is certainly a leader in the field of human-robot interaction; we have more than 10 faculty across campus for whom HRI is a primary research area,\u201d Thomaz said. 
\n\u201cAdditionally, the realization of \u2018personal robots\u2019 is a shared vision \nof the whole robotics faculty\u2014and a mission of the RIM research center.\u201d\u003C\/p\u003E\u003Cp\u003E###\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003EContacts\u003C\/strong\u003E\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003EMichael Terrazas\u003C\/strong\u003E\u003C\/p\u003E\u003Cp\u003EAssistant Director of Communications\u003C\/p\u003E\u003Cp\u003ECollege of Computing at Georgia Tech\u003C\/p\u003E\u003Cp\u003E\u003Ca href=\u0022mailto:mterraza@cc.gatech.edu\u0022\u003Emterraza@cc.gatech.edu\u003C\/a\u003E\u003C\/p\u003E\u003Cp\u003E404-245-0707\u003C\/p\u003E","summary":null,"format":"limited_html"}],"field_subtitle":[{"value":"Study on human-robot interaction highlights strong Institute showing at conference"}],"field_summary":[{"value":"\u003Cp\u003E\u003Cstrong\u003EATLANTA \u2013 March 8, 2012 \u2013 \u003C\/strong\u003EA new study by Maya Cakmak and Andrea Thomaz (\u003Cem\u003EInteractive\nComputing\u003C\/em\u003E) identifies the types of questions a robot can ask during a\nlearning interaction that are most likely to characterize a smooth and\nproductive human-robot relationship. 
\u003Cem\u003ESource: Office of Communications\u003C\/em\u003E\u003C\/p\u003E","format":"limited_html"}],"field_summary_sentence":"","uid":"27174","created_gmt":"2012-03-08 10:51:03","changed_gmt":"2016-10-08 03:11:48","author":"Mike Terrazas","boilerplate_text":"","field_publication":"","field_article_url":"","dateline":{"date":"2012-03-08T00:00:00-05:00","iso_date":"2012-03-08T00:00:00-05:00","tz":"America\/New_York"},"extras":[],"hg_media":{"115121":{"id":"115121","type":"image","title":"Simon Learning","body":null,"created":"1449178241","gmt_created":"2015-12-03 21:30:41","changed":"1475894733","gmt_changed":"2016-10-08 02:45:33","alt":"Simon Learning","file":{"fid":"194225","name":"simon_learning.jpg","image_path":"\/sites\/default\/files\/images\/simon_learning_0.jpg","image_full_path":"http:\/\/www.tlwarc.hg.gatech.edu\/\/sites\/default\/files\/images\/simon_learning_0.jpg","mime":"image\/jpeg","size":188218,"path_740":"http:\/\/www.tlwarc.hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/images\/simon_learning_0.jpg?itok=ahd9evLW"}}},"media_ids":["115121"],"groups":[{"id":"47223","name":"College of Computing"}],"categories":[],"keywords":[{"id":"11526","name":"Andrea Thomaz"},{"id":"9167","name":"machine learning"},{"id":"26421","name":"maya cakmak"},{"id":"26431","name":"personal robots"},{"id":"12920","name":"RIM center"},{"id":"26441","name":"robot learning"},{"id":"12919","name":"robotics \u0026 intelligent machines"},{"id":"166848","name":"School of Interactive Computing"}],"core_research_areas":[],"news_room_topics":[],"event_categories":[],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[{"value":"\u003Cp\u003EMichael Terrazas\u003C\/p\u003E\u003Cp\u003E\u003Ca 
href=\u0022mailto:mterraza@cc.gatech.edu\u0022\u003Emterraza@cc.gatech.edu\u003C\/a\u003E\u003C\/p\u003E\u003Cp\u003E404-245-0707\u003C\/p\u003E","format":"limited_html"}],"email":["mterraza@cc.gatech.edu"],"slides":[],"orientation":[],"userdata":""}}}