{"624130":{"#nid":"624130","#data":{"type":"news","title":"\u0027MacGyver\u0027-like Robot Can Build Own Tools By Assessing Form, Function of Supplies","body":[{"value":"\u003Cp\u003EThanks to new technology that enables them to create simple tools, robots may be on the verge of their own version of the Stone Age.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EUsing a novel capability to reason about shape, function, and attachment of unrelated parts, researchers have for the first time successfully trained an intelligent agent to create basic tools by combining objects.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe breakthrough comes from Georgia Tech\u0026rsquo;s \u003Ca href=\u0022http:\/\/www.rail.gatech.edu\/\u0022\u003ERobot Autonomy and Interactive Learning\u003C\/a\u003E (RAIL) research lab and is a significant step toward enabling intelligent agents to devise more advanced tools that could prove useful in hazardous \u0026ndash; and potentially life-threatening \u0026ndash; environments.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe concept may sound familiar. It\u0026rsquo;s called \u0026ldquo;MacGyvering,\u0026rdquo; based off the name of a 1980s \u0026mdash; and recently rebooted \u0026mdash; television series. In the series, the title character is known for his unconventional problem-solving ability using differing resources available to him.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EFor years, computer scientists and others have been working to provide robots with similar capabilities. In their new robot-MacGyvering work, RAIL lab researchers led by Associate Professor \u003Cstrong\u003ESonia Chernova\u003C\/strong\u003E used as a starting point a robotics technique previously developed by former Georgia Tech Professor \u003Cstrong\u003EMike Stilman\u003C\/strong\u003E.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EIn this latest work, a robot trained using the team\u0026rsquo;s novel approach is given a set of optional parts and told to make a specific tool. 
Much like its human counterparts, the robot first examines the shape of each part and how one might be attached to another.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EUsing machine learning, the robot is trained to match form to function \u0026ndash; which object shapes facilitate a particular outcome \u0026ndash; from numerous examples of everyday objects. For example, by learning that the concavity of bowls enables them to hold liquids, the robot can apply this knowledge when constructing a spoon. Similarly, the robot was taught how to attach objects together from examples of materials that could be pierced or grasped.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EIn the study, researchers successfully created hammers, spatulas, scoops, squeegees, and screwdrivers.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026ldquo;The screwdriver was particularly interesting because the robot combined pliers and a coin,\u0026rdquo; said \u003Cstrong\u003ELakshmi Nair\u003C\/strong\u003E, a Ph.D. student in the \u003Ca href=\u0022http:\/\/www.ic.gatech.edu\u0022\u003ESchool of Interactive Computing\u003C\/a\u003E and one of the researchers on the project. \u0026ldquo;It reasoned that the pliers were able to grasp something and said that the coin sort of matched the head of a screwdriver. Put them together, and it creates an effective tool.\u0026rdquo;\u003C\/p\u003E\r\n\r\n\u003Cp\u003ECurrently, the robot\u0026rsquo;s reasoning is limited to shape and attachment. 
It cannot yet effectively reason about particular material properties, a crucial step toward real-world use.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Ca href=\u0022https:\/\/www.ic.gatech.edu\/news\/623044\/robot-able-instantly-identify-household-materials-using-near-infrared-light\u0022\u003E\u003Cstrong\u003E[RELATED: Robot Able to Instantly Identify Household Materials Using Near-Infrared Light]\u003C\/strong\u003E\u003C\/a\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026ldquo;People reason that hammers are sturdy and strong, so you wouldn\u0026rsquo;t make a hammer out of foam blocks,\u0026rdquo; Nair said. \u0026ldquo;We want to reach that level of reasoning in our work, which is something we\u0026rsquo;re working on now.\u0026rdquo;\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe inspiration for the work comes from the popular story of Apollo 13, the doomed seventh crewed flight of the Apollo space program. After an oxygen tank in the ship\u0026rsquo;s service module exploded two days into the mission, crew members were forced to improvise modifications to the carbon dioxide removal system.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EDespite a dangerously tight window of time and extremely high tension among all aboard and at mission control, the rescue proved successful. Nair and collaborators hope this research will prove foundational to future robotics technology that could reason faster and without the burden of stress.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026ldquo;They were able to make this filter, but the solution took a long time to come up with,\u0026rdquo; Nair said. 
\u0026ldquo;We want to make robots that can assist humans in these kinds of scenarios to take the pressure off of them to come up with innovative solutions and potentially save their lives.\u0026rdquo;\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThis work was presented at the 2019 Robotics: Science and Systems conference in a paper titled \u003Ca href=\u0022http:\/\/www.roboticsproceedings.org\/rss15\/p09.pdf\u0022\u003E\u003Cem\u003EAutonomous Tool Construction Using Part Shape and Attachment Prediction \u003C\/em\u003E\u003C\/a\u003E(Lakshmi Nair, \u003Cstrong\u003ENithin Shrivatsav\u003C\/strong\u003E, \u003Cstrong\u003EZackory Erickson\u003C\/strong\u003E, Sonia Chernova). It is supported in part by grants from the \u003Ca href=\u0022https:\/\/www.nsf.gov\/\u0022\u003ENational Science Foundation\u003C\/a\u003E and the \u003Ca href=\u0022https:\/\/www.onr.navy.mil\/\u0022\u003EOffice of Naval Research\u003C\/a\u003E.\u003C\/p\u003E\r\n","summary":null,"format":"limited_html"}],"field_subtitle":"","field_summary":"","field_summary_sentence":[{"value":"The breakthrough is a significant step toward enabling intelligent agents to devise more advanced tools that could prove useful in hazardous and potentially life-threatening environments."}],"uid":"33939","created_gmt":"2019-08-07 21:04:09","changed_gmt":"2019-08-12 20:08:25","author":"David Mitchell","boilerplate_text":"","field_publication":"","field_article_url":"","dateline":{"date":"2019-08-07T00:00:00-04:00","iso_date":"2019-08-07T00:00:00-04:00","tz":"America\/New_York"},"extras":[],"hg_media":{"624128":{"id":"624128","type":"image","title":"Robot MacGyvering - Lakshmi Nair 1","body":null,"created":"1565210646","gmt_created":"2019-08-07 20:44:06","changed":"1565210646","gmt_changed":"2019-08-07 20:44:06","alt":"Lakshmi Nair stands next to a robotic arm with tool parts on a table","file":{"fid":"237702","name":"Macgyvering 
MAIN.jpg","image_path":"\/sites\/default\/files\/images\/Macgyvering%20MAIN.jpg","image_full_path":"http:\/\/www.tlwarc.hg.gatech.edu\/\/sites\/default\/files\/images\/Macgyvering%20MAIN.jpg","mime":"image\/jpeg","size":200873,"path_740":"http:\/\/www.tlwarc.hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/images\/Macgyvering%20MAIN.jpg?itok=g6K5xKB2"}}},"media_ids":["624128"],"related_links":[{"url":"http:\/\/rail.gatech.edu","title":"Robot Autonomy and Interactive Learning Lab"},{"url":"https:\/\/www.ic.gatech.edu\/content\/robotics-computational-perception","title":"Robotics and Computational Perception Research at Georgia Tech"}],"groups":[{"id":"47223","name":"College of Computing"},{"id":"1299","name":"GVU Center"},{"id":"576481","name":"ML@GT"},{"id":"431631","name":"OMS"},{"id":"50876","name":"School of Interactive Computing"}],"categories":[],"keywords":[{"id":"181920","name":"cc-research; ic-ai-ml; ic-robotics"}],"core_research_areas":[{"id":"39501","name":"People and Technology"},{"id":"39521","name":"Robotics"}],"news_room_topics":[{"id":"71881","name":"Science and Technology"}],"event_categories":[],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[{"value":"\u003Cp\u003EDavid Mitchell\u003C\/p\u003E\r\n\r\n\u003Cp\u003ECommunications Officer\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Ca href=\u0022mailto:david.mitchell@cc.gatech.edu\u0022\u003Edavid.mitchell@cc.gatech.edu\u003C\/a\u003E\u003C\/p\u003E\r\n","format":"limited_html"}],"email":[],"slides":[],"orientation":[],"userdata":""}}}