<nodes> <node id="174741">  <title><![CDATA[Squirrels and Birds Inspire Researchers to Create Deceptive Robots]]></title>  <uid>27560</uid>  <body><![CDATA[<p>Using deceptive behavioral patterns of squirrels and birds, researchers at the Georgia Institute of Technology have developed robots that are able to deceive each other. The research is funded by the Office of Naval Research and is led by Professor Ronald Arkin, who suggests the applications could be implemented by the military in the future. The research is <a href="http://ieeexplore.ieee.org/xpl/articleDetails.jsp?tp=&amp;arnumber=6365199&amp;contentType=Journals+%26+Magazines&amp;sortType%3Dasc_p_Sequence%26filter%3DAND%28p_IS_Number%3A6365196%29">highlighted</a> in the November/December 2012 edition of IEEE Intelligent Systems.</p><p>Arkin and his team learned from biological research that squirrels gather acorns and store them in specific locations. The animal then patrols the hidden caches, routinely going back and forth to check on them. When another squirrel shows up, hoping to raid the hiding spots, the hoarding squirrel changes its behavior. Instead of checking on the true locations, it visits empty cache sites, trying to deceive its competitor.</p><p>Arkin and his Ph.D. student Jaeeun Shim implemented the same strategy in a robotic model and demonstration. The deceptive behaviors worked. The deceiving robot lured the “predator” robot to the false locations, delaying the discovery of the protected resources.</p><p>“This application could be used by robots guarding ammunition or supplies on the battlefield,” said Arkin, a Regents Professor in Georgia Tech’s School of Interactive Computing. 
“If an enemy were present, the robot could change its patrolling strategies to deceive humans or another intelligent machine, buying time until reinforcements are able to arrive.”</p><p>Click <a href="http://play.media.gatech.edu/s/gatech.edu/comm/d8f76719-6053-590e-9092-7d6fd2299df2">here</a> to see a lab video of the demonstration.</p><p>Arkin and his student Justin Davis have also created a simulation and demo based on birds that might bluff their way to safety. In Israel, Arabian babblers in danger of being attacked will sometimes join other birds and harass their predator. This mobbing process causes such a commotion that the predator will eventually give up the attack and leave.</p><p>Arkin's team investigated whether a simulated babbler is more likely to survive if it feigns strength it does not actually have. The team’s simulations, based on biological models of dishonesty and the handicap principle, show that deception is the best strategy when the addition of deceitful agents pushes the size of the group to the minimum level required to frustrate the predator enough for it to flee. He says the reward for deceit in a few of the agents sometimes outweighs the risk of being caught.</p><p>“In military operations, a robot that is threatened might feign the ability to combat adversaries without actually being able to effectively protect itself,” said Arkin. “Being honest about the robot’s abilities risks capture or destruction. Deception, if used at the right time in the right way, could possibly eliminate or minimize the threat.”</p><p>From the Trojan Horse to D-Day, deception has always played a role during wartime. In fact, there is an entire Army field manual on its use and value on the battlefield. But Arkin is the first to admit that there are serious ethical questions about robots deceiving humans.</p><p>“When these research ideas and results leak outside the military domain, significant ethical concerns can arise,” said Arkin. 
“We strongly encourage further discussion regarding the pursuit and application of research on deception for robots and intelligent machines.” &nbsp;</p><p>This isn’t the first time Arkin has worked in this field. In 2010, he and Georgia Tech Research Institute Research Engineer Alan Wagner studied how robots <a href="http://www.gatech.edu/newsroom/release.html?nid=60881">could use deceptive behavior to hide</a> from humans or other intelligent machines.</p>]]></body>  <author>Jason Maderer</author>  <status>1</status>  <created>1354531358</created>  <gmt_created>2012-12-03 10:42:38</gmt_created>  <changed>1475896398</changed>  <gmt_changed>2016-10-08 03:13:18</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Using deceptive behavioral patterns of squirrels and birds, researchers at the Georgia Institute of Technology have developed robots that are able to deceive each other.]]></teaser>  <type>news</type>  <sentence><![CDATA[Using deceptive behavioral patterns of squirrels and birds, researchers at the Georgia Institute of Technology have developed robots that are able to deceive each other.]]></sentence>  <summary><![CDATA[<p>Using deceptive behavioral patterns of squirrels and birds, researchers at the Georgia Institute of Technology have developed robots that are able to deceive each other. The research is funded by the Office of Naval Research and is led by Professor Ronald Arkin, who suggests the applications could be implemented by the military in the future. 
The research is <a href="http://ieeexplore.ieee.org/xpl/articleDetails.jsp?tp=&amp;arnumber=6365199&amp;contentType=Journals+%26+Magazines&amp;sortType%3Dasc_p_Sequence%26filter%3DAND%28p_IS_Number%3A6365196%29">highlighted</a> in the November/December 2012 edition of IEEE Intelligent Systems.</p>]]></summary>  <dateline>2012-12-03T00:00:00-05:00</dateline>  <iso_dateline>2012-12-03T00:00:00-05:00</iso_dateline>  <gmt_dateline>2012-12-03 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[maderer@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Jason Maderer<br />Media Relations<br /><a href="mailto:maderer@gatech.edu">maderer@gatech.edu</a><br />404-385-2966 </p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>174711</item>          <item>174721</item>          <item>174771</item>      </media>  <hg_media>          <item>          <nid>174711</nid>          <type>image</type>          <title><![CDATA[Deceptive Robots]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[screen_shot_2012-12-03_at_9.19.30_am.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/screen_shot_2012-12-03_at_9.19.30_am_0.png]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/screen_shot_2012-12-03_at_9.19.30_am_0.png]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/screen_shot_2012-12-03_at_9.19.30_am_0.png?itok=Ihw_TCSf]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[Deceptive Robots]]></image_alt>                    <created>1449179022</created>          <gmt_created>2015-12-03 21:43:42</gmt_created>          <changed>1475894816</changed>          
<gmt_changed>2016-10-08 02:46:56</gmt_changed>      </item>          <item>          <nid>174721</nid>          <type>image</type>          <title><![CDATA[Deceptive Robots 2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[screen_shot_2012-12-03_at_9.18.46_am.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/screen_shot_2012-12-03_at_9.18.46_am_0.png]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/screen_shot_2012-12-03_at_9.18.46_am_0.png]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/screen_shot_2012-12-03_at_9.18.46_am_0.png?itok=WZ27FHrY]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[Deceptive Robots 2]]></image_alt>                    <created>1449179022</created>          <gmt_created>2015-12-03 21:43:42</gmt_created>          <changed>1475894816</changed>          <gmt_changed>2016-10-08 02:46:56</gmt_changed>      </item>          <item>          <nid>174771</nid>          <type>image</type>          <title><![CDATA[Ronald Arkin]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ron_arkin_sept_2009.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ron_arkin_sept_2009_0.jpg]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/ron_arkin_sept_2009_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ron_arkin_sept_2009_0.jpg?itok=K40uBeaP]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Ronald Arkin]]></image_alt>                    <created>1449179022</created>          <gmt_created>2015-12-03 
21:43:42</gmt_created>          <changed>1475894816</changed>          <gmt_changed>2016-10-08 02:46:56</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.cc.gatech.edu/aimosaic/faculty/arkin/]]></url>        <title><![CDATA[Ronald Arkin Research Page]]></title>      </link>          <link>        <url><![CDATA[http://www.cc.gatech.edu/ai/robot-lab/online-publications/jaeeun_sab2012_final.pdf]]></url>        <title><![CDATA[Paper: Biologically-Inspired Deceptive Behavior for a Robot]]></title>      </link>          <link>        <url><![CDATA[http://www.cc.gatech.edu/ai/robot-lab/online-publications/deception_in_mobbing.pdf]]></url>        <title><![CDATA[Paper: Mobbing Behavior and Deceit and its role in Bio-inspired Autonomous Robotic Agents]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1183"><![CDATA[Home]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="654"><![CDATA[College of Computing]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="2352"><![CDATA[robots]]></keyword>          <keyword tid="14444"><![CDATA[ron arkin]]></keyword>          <keyword tid="166848"><![CDATA[School of Interactive Computing]]></keyword>          <keyword tid="171241"><![CDATA[Squirrels]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata>      <![CDATA[]]>  </userdata></node><node id="172561">  <title><![CDATA[Swarm robots perform classical &#039;scores&#039; inside 
Georgia Tech&#039;s GritsLab]]></title>  <uid>27560</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[]]></body>  <author>Jason Maderer</author>  <status>1</status>  <created>1353507496</created>  <gmt_created>2012-11-21 14:18:16</gmt_created>  <changed>1475893569</changed>  <gmt_changed>2016-10-08 02:26:09</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Engadget]]></publication>  <article_dateline>2012-11-20T00:00:00-05:00</article_dateline>  <iso_article_dateline>2012-11-20T00:00:00-05:00</iso_article_dateline>  <gmt_article_dateline>2012-11-20T00:00:00-05:00</gmt_article_dateline>  <article_url><![CDATA[http://engt.co/WxgLm9]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>      </categories>  <keywords>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata>      <![CDATA[]]>  </userdata></node><node id="165841">  <title><![CDATA[Robots Get Around by Mimicking Primates]]></title>  <uid>27556</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>By mimicking how primates visualise an unfamiliar environment - a process called mental rotation - researchers are building a new kind of guidance system for robots. Ronald Arkin <em>(Interactive Comp) </em>is leading the effort to incorporate this technique into software for controlling robots. 
<em>Source: New Scientist</em><br /><br /></p>]]></body>  <author>Michaelanne Dye</author>  <status>1</status>  <created>1351506836</created>  <gmt_created>2012-10-29 10:33:56</gmt_created>  <changed>1475893564</changed>  <gmt_changed>2016-10-08 02:26:04</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[New Scientist]]></publication>  <article_dateline>2012-10-26T00:00:00-04:00</article_dateline>  <iso_article_dateline>2012-10-26T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2012-10-26T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.newscientist.com/article/mg21628885.700-robots-get-around-by-mimicking-primates.html]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="143"><![CDATA[Digital Media and Entertainment]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="48071"><![CDATA[Ronald Arkin; robots; primates]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata>      <![CDATA[]]>  </userdata></node><node id="252741">  <title><![CDATA[Finally, A Robot With The Ingenuity Of MacGyver]]></title>  <uid>27255</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1383949875</created>  <gmt_created>2013-11-08 22:31:15</gmt_created>  <changed>1475893608</changed>  <gmt_changed>2016-10-08 02:26:48</gmt_changed>  <promote>0</promote>  
<sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[redOrbit]]></publication>  <article_dateline>2012-10-15T00:00:00-04:00</article_dateline>  <iso_article_dateline>2012-10-15T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2012-10-15T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.redorbit.com/news/technology/1112712697/robot-rescue-macgyver-georgia-tech-101512/]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="78811"><![CDATA[Institute for Robotics and Intelligent Machines]]></keyword>          <keyword tid="16551"><![CDATA[Mike Stilman]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata>      <![CDATA[]]>  </userdata></node><node id="160721">  <title><![CDATA[Robots Using Tools: With New Grant, Researchers Aim to Create ‘MacGyver’ Robot]]></title>  <uid>27560</uid>  <body><![CDATA[<p>Robots are increasingly being used in place of humans to explore hazardous and difficult-to-access environments, but they aren’t yet able to interact with their environments as well as humans. If today’s most sophisticated robot were trapped in a burning room by a jammed door, it would probably not know how to locate and use objects in the room to climb over any debris, pry open the door, and escape the building.</p><p>A research team led by Professor Mike Stilman at the Georgia Institute of Technology hopes to change that by giving robots the ability to use objects in their environments to accomplish high-level tasks. 
The team recently received a three-year, $900,000 grant from the Office of Naval Research to work on this project.</p><p>“Our goal is to develop a robot that behaves like MacGyver, the television character from the 1980s who solved complex problems and escaped dangerous situations by using everyday objects and materials he found at hand,” said Stilman, an assistant professor in the School of Interactive Computing at Georgia Tech. “We want to understand the basic cognitive processes that allow humans to take advantage of arbitrary objects in their environments as tools. We will achieve this by designing algorithms for robots that make tasks that are impossible for a robot alone possible for a robot with tools.”</p><p>The research will build on Stilman’s previous work on navigation among movable obstacles, which enabled robots to autonomously recognize and move obstacles that blocked their path from point A to point B.</p><p>“This project is challenging because there is a critical difference between moving objects out of the way and using objects to make a way,” explained Stilman. “Researchers in the robot motion planning field have traditionally used computerized vision systems to locate objects in a cluttered environment to plan collision-free paths, but these systems have not provided any information about the objects’ functions.”</p><p>To create a robot capable of using objects in its environment to accomplish a task, Stilman plans to develop an algorithm that will allow a robot to identify an arbitrary object in a room, determine the object’s potential function, and turn that object into a simple machine that can be used to complete an action. 
Actions could include using a chair to reach something high, bracing a ladder against a bookshelf, stacking boxes to climb over something, and building levers or bridges from random debris.</p><p>By providing the robot with basic knowledge of rigid body mechanics and simple machines, the robot should be able to autonomously determine the mechanical force properties of an object and construct motion plans for using the object to perform high-level tasks.</p><p>For example, exiting a burning room with a jammed door would require a robot to travel around any fire, use an object in the room to apply sufficient force to open the stuck door, and locate an object in the room that will support its weight while it moves to get out of the room.</p><p>Such skills could be extremely valuable in the future as robots work side-by-side with military personnel to accomplish challenging missions.</p><p>“The Navy prides itself on recruiting, training and deploying our country’s most resourceful and intelligent men and women,” said Paul Bello, director of the cognitive science program in the Office of Naval Research (ONR). “Now that robotic systems are becoming more pervasive as teammates for warfighters in military operations, we must ensure that they are both intelligent and resourceful. Professor Stilman’s work on the ‘MacGyver-bot’ is the first of its kind, and is already beginning to deliver on the promise of mechanical teammates able to creatively perform in high-stakes situations.”</p><p>To address the complexity of the human-like reasoning required for this type of scenario, Stilman is collaborating with researchers Pat Langley and Dongkyu Choi. Langley is the director of the Institute for the Study of Learning and Expertise (ISLE), and is recognized as a co-founder of the field of machine learning, where he championed both experimental studies of learning algorithms and their application to real-world problems. 
Choi is an assistant professor in the Department of Aerospace Engineering at the University of Kansas.</p><p>Langley and Choi will expand the cognitive architecture they developed, called ICARUS, which provides an infrastructure for modeling various human capabilities like perception, inference, performance and learning in robots.</p><p>“We believe a hybrid reasoning system that embeds our physics-based algorithms within a cognitive architecture will create a more general, efficient and structured control system for our robot that will accrue more benefits than if we used one approach alone,” said Stilman.</p><p>After the researchers develop and optimize the hybrid reasoning system using computer simulations, they plan to test the software using Golem Krang, a humanoid robot designed and built in Stilman’s laboratory to study whole-body robotic planning and control.</p><p>&nbsp;<em>This research is sponsored by the Department of the Navy, Office of Naval Research, through grant number N00014-12-1-0143. 
Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the Office of Naval Research.</em></p><p>&nbsp;</p>]]></body>  <author>Jason Maderer</author>  <status>1</status>  <created>1349770940</created>  <gmt_created>2012-10-09 08:22:20</gmt_created>  <changed>1475896378</changed>  <gmt_changed>2016-10-08 03:12:58</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[New project is designed to teach robots how to use objects in the environment to accomplish high-level tasks]]></teaser>  <type>news</type>  <sentence><![CDATA[New project is designed to teach robots how to use objects in the environment to accomplish high-level tasks]]></sentence>  <summary><![CDATA[<p>A Georgia Tech research team has received a grant from the Office of Naval Research to work on a project that intends to teach robots how to use objects in their environment to accomplish high-level tasks.</p>]]></summary>  <dateline>2012-10-09T00:00:00-04:00</dateline>  <iso_dateline>2012-10-09T00:00:00-04:00</iso_dateline>  <gmt_dateline>2012-10-09 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon<br />Research News &amp; Publications Office<br /> <a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a> <br /> 404-894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>160691</item>          <item>160701</item>          <item>160711</item>      </media>  <hg_media>          <item>          <nid>160691</nid>          <type>image</type>          <title><![CDATA[MacGyver Grant, Photo 1]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[macgyver-1-cropped_0.jpg]]></image_name>            
<image_path><![CDATA[/sites/default/files/images/macgyver-1-cropped_0_0.jpg]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/macgyver-1-cropped_0_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/macgyver-1-cropped_0_0.jpg?itok=99JL9P7P]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[MacGyver Grant, Photo 1]]></image_alt>                    <created>1449178896</created>          <gmt_created>2015-12-03 21:41:36</gmt_created>          <changed>1475894796</changed>          <gmt_changed>2016-10-08 02:46:36</gmt_changed>      </item>          <item>          <nid>160701</nid>          <type>image</type>          <title><![CDATA[MacGyver Grant, Photo 2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[macgyver-robot-9680.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/macgyver-robot-9680_0.jpg]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/macgyver-robot-9680_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/macgyver-robot-9680_0.jpg?itok=RSagP2F7]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[MacGyver Grant, Photo 2]]></image_alt>                    <created>1449178896</created>          <gmt_created>2015-12-03 21:41:36</gmt_created>          <changed>1475894796</changed>          <gmt_changed>2016-10-08 02:46:36</gmt_changed>      </item>          <item>          <nid>160711</nid>          <type>image</type>          <title><![CDATA[MacGyver Grant, Photo 3]]></title>          <body><![CDATA[]]></body>                      
<image_name><![CDATA[macgyver-robot-9651.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/macgyver-robot-9651_0.jpg]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/macgyver-robot-9651_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/macgyver-robot-9651_0.jpg?itok=LNUTWw4p]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[MacGyver Grant, Photo 3]]></image_alt>                    <created>1449178896</created>          <gmt_created>2015-12-03 21:41:36</gmt_created>          <changed>1475894796</changed>          <gmt_changed>2016-10-08 02:46:36</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.cc.gatech.edu/~mstilman/]]></url>        <title><![CDATA[Mike Stilman Website]]></title>      </link>          <link>        <url><![CDATA[http://www.cc.gatech.edu/]]></url>        <title><![CDATA[College of Computing]]></title>      </link>          <link>        <url><![CDATA[http://www.ic.gatech.edu/about]]></url>        <title><![CDATA[School of Interactive Computing]]></title>      </link>          <link>        <url><![CDATA[http://robotics.gatech.edu/]]></url>        <title><![CDATA[Center for Robotics & Intelligent Machines]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="654"><![CDATA[College of Computing]]></keyword>          <keyword 
tid="45961"><![CDATA[Golem Krang]]></keyword>          <keyword tid="45951"><![CDATA[MacGyver]]></keyword>          <keyword tid="11527"><![CDATA[Mike Stilman]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata>      <![CDATA[]]>  </userdata></node><node id="158691">  <title><![CDATA[Tony Yezzi Named as Ken Byers Professor]]></title>  <uid>27241</uid>  <body><![CDATA[<p>Anthony J. Yezzi has been named as a Ken Byers Professor, effective October 1. Dr. Yezzi has been a faculty member with the School of Electrical and Computer Engineering at Georgia Tech since 1999.</p><p>As the director of the Laboratory of Computational Computer Vision, Dr. Yezzi has over 20 years of research experience in image processing, computer vision, and shape optimization using geometric partial differential equations. Applications of his research group's work include medical imaging, 3D surface reconstruction, and visual tracking. An active industrial consultant, Dr. Yezzi recently formed Vintinura Imaging, Inc., a startup company that is being hosted by ATDC/VentureLab and is focused on image analysis solutions, particularly those connected with shape detection, tracking, and optimization. He is the author of almost 200 peer-reviewed publications and holds a patent for user-interactive 3D medical MRI image segmentation, which has been used within General Electric's MRI image analysis package for the past 10 years.</p><p>A strong proponent of international education opportunities, Dr. Yezzi has spent the last five years fostering international relationships between Georgia Tech and top Italian engineering universities. He has been the director of the dual master's degree program between Georgia Tech and Politecnico di Torino (PdT) since 2008. He has also been instrumental in developing joint Ph.D. 
programs between Georgia Tech and both PdT and Politecnico di Milano that were recently approved by the University System of Georgia Board of Regents.&nbsp;</p><p>&nbsp;</p><p>&nbsp;</p>]]></body>  <author>Jackie Nemeth</author>  <status>1</status>  <created>1349196507</created>  <gmt_created>2012-10-02 16:48:27</gmt_created>  <changed>1475896374</changed>  <gmt_changed>2016-10-08 03:12:54</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[ECE Professor Anthony J. Yezzi has been named as a Ken Byers Professor, effective October 1.]]></teaser>  <type>news</type>  <sentence><![CDATA[ECE Professor Anthony J. Yezzi has been named as a Ken Byers Professor, effective October 1.]]></sentence>  <summary><![CDATA[<p>ECE Professor Anthony J. Yezzi has been named as a Ken Byers Professor, effective October 1.</p>]]></summary>  <dateline>2012-10-02T00:00:00-04:00</dateline>  <iso_dateline>2012-10-02T00:00:00-04:00</iso_dateline>  <gmt_dateline>2012-10-02 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jackie.nemeth@ece.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Jackie Nemeth</p><p>School of Electrical and Computer Engineering</p><p>404-894-2906</p><p><a href="mailto:jackie.nemeth@ece.gatech.edu">jackie.nemeth@ece.gatech.edu</a></p><p>&nbsp;</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>159731</item>      </media>  <hg_media>          <item>          <nid>159731</nid>          <type>image</type>          <title><![CDATA[Anthony J. 
Yezzi]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[tony_yezzi_2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/tony_yezzi_2_0.jpg]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/tony_yezzi_2_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/tony_yezzi_2_0.jpg?itok=rgxAVUdb]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Anthony J. Yezzi]]></image_alt>                    <created>1449178896</created>          <gmt_created>2015-12-03 21:41:36</gmt_created>          <changed>1475894794</changed>          <gmt_changed>2016-10-08 02:46:34</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.gatech.edu/]]></url>        <title><![CDATA[Georgia Tech]]></title>      </link>          <link>        <url><![CDATA[http://www.ece.gatech.edu/]]></url>        <title><![CDATA[School of Electrical and Computer Engineering]]></title>      </link>          <link>        <url><![CDATA[http://www.ece.gatech.edu/faculty-staff/fac_profiles/bio.php?id=116]]></url>        <title><![CDATA[Anthony J. 
Yezzi]]></title>      </link>          <link>        <url><![CDATA[http://www.ece.gatech.edu/research/labs/lccv/]]></url>        <title><![CDATA[Lab of Computational Computer Vision]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1255"><![CDATA[School of Electrical and Computer Engineering]]></group>      </groups>  <categories>          <category tid="134"><![CDATA[Student and Faculty]]></category>          <category tid="145"><![CDATA[Engineering]]></category>      </categories>  <news_terms>          <term tid="134"><![CDATA[Student and Faculty]]></term>          <term tid="145"><![CDATA[Engineering]]></term>      </news_terms>  <keywords>          <keyword tid="45311"><![CDATA[Anthony J. Yezzi]]></keyword>          <keyword tid="109"><![CDATA[Georgia Tech]]></keyword>          <keyword tid="166855"><![CDATA[School of Electrical and Computer Engineering]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39541"><![CDATA[Systems]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata>      <![CDATA[]]>  </userdata></node><node id="252751">  <title><![CDATA[How Rethink Robotics Built Its New Baxter Robot Worker]]></title>  <uid>27255</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>Building robots for these small and medium-size companies “is a fantastic opportunity,” says&nbsp;<a href="http://www.hichristensen.net/">Henrik I. Christensen</a>, a professor of robotics at the Georgia Institute of Technology, in Atlanta, who’s an expert in industrial automation. (He has no ties to Rethink.) There are many tasks, he says, that don’t require the speed and precision of today’s industrial robots, and these tasks are begging to be automated.  
</p>]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1383950716</created>  <gmt_created>2013-11-08 22:45:16</gmt_created>  <changed>1475893608</changed>  <gmt_changed>2016-10-08 02:26:48</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[IEEE Spectrum]]></publication>  <article_dateline>2012-09-18T00:00:00-04:00</article_dateline>  <iso_article_dateline>2012-09-18T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2012-09-18T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://spectrum.ieee.org/robotics/industrial-robots/rethink-robotics-baxter-robot-factory-worker]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="78861"><![CDATA[Henrik I. 
Christensen]]></keyword>          <keyword tid="78811"><![CDATA[Institute for Robotics and Intelligent Machines]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata>      <![CDATA[]]>  </userdata></node><node id="155051">  <title><![CDATA[Robot researchers learn lessons of lizard locomotion]]></title>  <uid>27560</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[]]></body>  <author>Jason Maderer</author>  <status>1</status>  <created>1347955997</created>  <gmt_created>2012-09-18 08:13:17</gmt_created>  <changed>1475893557</changed>  <gmt_changed>2016-10-08 02:25:57</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Reuters]]></publication>  <article_dateline>2012-09-17T00:00:00-04:00</article_dateline>  <iso_article_dateline>2012-09-17T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2012-09-17T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.reuters.com/video/2012/09/17/reuters-tv-robot-researchers-learn-lessons-of-lizar?videoId=237799756&amp;videoChannel=118065]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>      </categories>  <keywords>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata>      <![CDATA[]]>  </userdata></node><node id="154791">  <title><![CDATA[Andrea Thomaz Named "Brilliant 10"]]></title>  <uid>27659</uid>  <body><![CDATA[<p>The top-tier science magazine,<em>&nbsp;Popular Science</em>, has named <strong>Andrea Thomaz, assistant professor in Georgia Tech’s School of Interactive Computing</strong>, one of 2012’s “Brilliant 10,” an award given by the 
publication to ten scientists under 40 whose innovations will change the world.&nbsp; Thomaz, along with nine other researchers, is featured in the October issue of the magazine.</p><p>As Director of the College of Computing’s Socially Intelligent Machines research lab, Thomaz focuses her research on all aspects of human-robot interaction and, specifically, on machines that learn new tasks and goals from ordinary people in everyday environments. This research works from the assumption that machines meant to learn from people can better take advantage of the ways in which people naturally approach teaching.</p><p>Through the development of new computational models, Thomaz is working to build machines that participate in social learning environments.&nbsp; By designing interactive learning algorithms based on how people naturally teach, she has improved both a machine's learning performance and the experience of the human teacher, fostering a smoother human-robot relationship. Thomaz’s work opens up a wider world of personal robotics, in which machines can do anything their owners teach them to do, without the owners having to be programmers.<br /><br />In 2009, Thomaz was awarded the prestigious “MIT Tech Review 2009 Young Innovators Under 35” for her work in robot-human interaction and the development of Simon.&nbsp; Additionally, she has been named a College of Computing Professor of Excellence for her outstanding contributions to the Institute and to her field of study. Thomaz holds a Ph.D. 
from the Massachusetts Institute of Technology.</p>]]></body>  <author>Christopher Ernst</author>  <status>1</status>  <created>1347886773</created>  <gmt_created>2012-09-17 12:59:33</gmt_created>  <changed>1475896370</changed>  <gmt_changed>2016-10-08 03:12:50</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>news</type>  <sentence><![CDATA[]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2012-09-13T00:00:00-04:00</dateline>  <iso_dateline>2012-09-13T00:00:00-04:00</iso_dateline>  <gmt_dateline>2012-09-13 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>      </media>  <hg_media>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.cc.gatech.edu/news/andrea-thomaz-named-brilliant-10]]></url>        <title><![CDATA[Source: CoC Newsroom]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1299"><![CDATA[GVU Center]]></group>      </groups>  <categories>          <category tid="129"><![CDATA[Institute and Campus]]></category>      </categories>  <news_terms>          <term tid="129"><![CDATA[Institute and Campus]]></term>      </news_terms>  <keywords>      </keywords>  <core_research_areas>          <term tid="39501"><![CDATA[People and Technology]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata>      <![CDATA[]]>  </userdata></node><node id="252821">  <title><![CDATA[Robotics Student Wins President’s Undergraduate Research Award]]></title>  <uid>27255</uid>  <body><![CDATA[<p>Ramya Ramakrishnan, who is advised by <a href="http://robotics.gatech.edu/team/faculty/thomaz">Andrea Thomaz</a>, received the <a 
href="http://www.undergradresearch.gatech.edu/pura" target="_blank">President’s Undergraduate Research Award (PURA)</a>. This award will support the project “Improving Robot Behavior through Both Self and Human Social Learning.”</p>]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1383999825</created>  <gmt_created>2013-11-09 12:23:45</gmt_created>  <changed>1475896518</changed>  <gmt_changed>2016-10-08 03:15:18</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>news</type>  <sentence><![CDATA[]]></sentence>  <summary><![CDATA[<p>Ramya Ramakrishnan (advisor Dr. Andrea Thomaz) received the President’s Undergraduate Research Award (PURA). This award will support the project “Improving Robot Behavior through Both Self and Human Social Learning.”</p>]]></summary>  <dateline>2012-09-10T00:00:00-04:00</dateline>  <iso_dateline>2012-09-10T00:00:00-04:00</iso_dateline>  <gmt_dateline>2012-09-10 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[josie@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Josie Giles<br />IRIM Marketing Communications<br />404-385-8551</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>      </media>  <hg_media>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="134"><![CDATA[Student and Faculty]]></category>          <category tid="8862"><![CDATA[Student Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="134"><![CDATA[Student and Faculty]]></term>          <term tid="8862"><![CDATA[Student Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="79371"><![CDATA[Andrea 
L. Thomaz]]></keyword>          <keyword tid="9266"><![CDATA[President&#039;s Undergraduate Research Awards]]></keyword>          <keyword tid="1421"><![CDATA[PURA]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata>      <![CDATA[]]>  </userdata></node><node id="152661">  <title><![CDATA[Ayanna Howard Named as Motorola Foundation Professor]]></title>  <uid>27241</uid>  <body><![CDATA[<p>Ayanna Howard has been named as the Motorola Foundation Professor, effective August 15. She is a professor in the School of Electrical and Computer Engineering at Georgia Tech.</p><p>Dr. Howard started working at Georgia Tech in 2005, where she serves as the director of Human-Automation Systems Lab in ECE and as the advisor of eight Ph.D. students and two M.S. thesis students. In addition, Dr. Howard serves as the chair of the Institute's multidisciplinary robotics Ph.D. program.</p><p>Her area of research is centered around the concept of humanized intelligence, the process of embedding human cognitive ability into the control path of autonomous systems. Dr. Howard's work has resulted in over 100 peer-reviewed publications about projects ranging from scientific rover navigation in glacier environments to assistive robots for the home. Some new research directions include robotic applications for child therapy and rehabilitation, tele-presence for persons with visual impairments, haptic and wearable device interfaces, behavior modeling for diagnosis and intervention, and medical and health care mobile apps.<br /><br />To date, Dr. Howard's work has been highlighted through a number of awards, articles, and televised interviews and programs, including <em>USA Today</em>, <em>TIME Magazine</em>, CNN, and American Public Television. 
Her recent awards include the 2008 Georgia Tech Faculty Woman of Distinction Award, the 2008 ECE Outreach Award, and the 2009 Janice A. Lumpkin Educator of the Year Award, given by the National Society of Black Engineers.</p><p>&nbsp;</p>]]></body>  <author>Jackie Nemeth</author>  <status>1</status>  <created>1347203110</created>  <gmt_created>2012-09-09 15:05:10</gmt_created>  <changed>1475896367</changed>  <gmt_changed>2016-10-08 03:12:47</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[ECE Professor Ayanna Howard has been named as the Motorola Foundation Professor, effective August 15.]]></teaser>  <type>news</type>  <sentence><![CDATA[ECE Professor Ayanna Howard has been named as the Motorola Foundation Professor, effective August 15.]]></sentence>  <summary><![CDATA[<p>ECE Professor Ayanna Howard has been named as the Motorola Foundation Professor, effective August 15.</p>]]></summary>  <dateline>2012-09-09T00:00:00-04:00</dateline>  <iso_dateline>2012-09-09T00:00:00-04:00</iso_dateline>  <gmt_dateline>2012-09-09 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jackie.nemeth@ece.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Jackie Nemeth</p><p>School of Electrical and Computer Engineering</p><p>404-894-2906</p><p><a href="mailto:jackie.nemeth@ece.gatech.edu">jackie.nemeth@ece.gatech.edu</a></p><p>&nbsp;</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>152671</item>      </media>  <hg_media>          <item>          <nid>152671</nid>          <type>image</type>          <title><![CDATA[Ayanna Howard]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ayanna_howard.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ayanna_howard_0.jpg]]></image_path>            
<image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/ayanna_howard_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ayanna_howard_0.jpg?itok=v2ybtrpK]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Ayanna Howard]]></image_alt>                    <created>1449178848</created>          <gmt_created>2015-12-03 21:40:48</gmt_created>          <changed>1475894787</changed>          <gmt_changed>2016-10-08 02:46:27</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.gatech.edu/]]></url>        <title><![CDATA[Georgia Tech]]></title>      </link>          <link>        <url><![CDATA[http://www.ece.gatech.edu/]]></url>        <title><![CDATA[School of Electrical and Computer Engineering]]></title>      </link>          <link>        <url><![CDATA[http://www.ece.gatech.edu/faculty-staff/fac_profiles/bio.php?id=135]]></url>        <title><![CDATA[Profile]]></title>      </link>          <link>        <url><![CDATA[http://humanslab.ece.gatech.edu/humansWeb/Home.html]]></url>        <title><![CDATA[Human-Automation Systems Lab]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1255"><![CDATA[School of Electrical and Computer Engineering]]></group>      </groups>  <categories>          <category tid="136"><![CDATA[Aerospace]]></category>          <category tid="134"><![CDATA[Student and Faculty]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="154"><![CDATA[Environment]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="136"><![CDATA[Aerospace]]></term>          <term tid="134"><![CDATA[Student and Faculty]]></term>          <term tid="145"><![CDATA[Engineering]]></term>   
       <term tid="154"><![CDATA[Environment]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="825"><![CDATA[Ayanna Howard]]></keyword>          <keyword tid="109"><![CDATA[Georgia Tech]]></keyword>          <keyword tid="166855"><![CDATA[School of Electrical and Computer Engineering]]></keyword>      </keywords>  <core_research_areas>          <term tid="39441"><![CDATA[Bioengineering and Bioscience]]></term>          <term tid="39451"><![CDATA[Electronics and Nanotechnology]]></term>          <term tid="39501"><![CDATA[People and Technology]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata>      <![CDATA[]]>  </userdata></node><node id="144381">  <title><![CDATA[Micron-Scale Swimming Robots Could Deliver Drugs & Carry Cargo Using Simple Motion]]></title>  <uid>27303</uid>  <body><![CDATA[<p>When you’re just a few microns long, swimming can be difficult. At that size scale, the viscosity of water is more like that of honey, and momentum can’t be relied upon to maintain forward motion.</p><p>Microorganisms, of course, have evolved ways to swim in spite of these challenges, but tiny robots haven’t quite caught up. Now a team of researchers at the Georgia Institute of Technology has used complex computational models to design swimming micro-robots that could overcome these challenges to carry cargo and navigate in response to stimuli such as light.</p><p>When they’re actually built some day, these simple micro-swimmers could rely on volume changes in unique materials known as hydrogels to move tiny flaps that will propel the robots. 
The micro-devices could be used in drug delivery, lab-on-a-chip microfluidic systems – and even as micro-construction robots working in swarms.</p><p>The simple micro-swimmers were described July 23 in the online advance edition of the journal <em>Soft Matter</em>, published by the Royal Society of Chemistry in the United Kingdom.</p><p>“We believe that our simulations will give experimentalists a reason to pursue development of these micro-swimmers to go beyond what is available now,” said <a href="http://www.me.gatech.edu/faculty/alexeev">Alexander Alexeev</a>, an assistant professor in the <a href="http://www.me.gatech.edu/">George W. Woodruff School of Mechanical Engineering</a> at Georgia Tech. “We wanted to demonstrate the principle of how robots this small could move by determining what is important and what would need to be used to build a real system.”</p><p>The simple swimmer designed by Alexeev and collaborators Hassan Masoud and Benjamin Bingham consists of a responsive gel body about ten microns long with two propulsive flaps attached to opposite sides. A steering flap sensitive to specific stimuli would be located at the front of the swimmer.</p><p>The responsive gel body would undergo periodic expansions and contractions triggered by oscillatory chemical reactions, oscillating magnetic or electric fields, or by cycles of temperature change. These expansions and contractions – the chemical swelling and de-swelling of the material – would create a beating motion in the rigid propulsive flaps attached to each side of the micro-swimmer. Combined with the movement of the gel body, the beating motion would move the micro-swimmer forward.</p><p>The trajectory of the micro-swimmer would be controlled by a flexible steering flap on its front. 
The flap would be made of a material that deforms based on changes in light intensity, temperature or magnetic field.</p><p>“The combination of these flaps and the oscillating body creates a very nice motion that we believe can be used to propel the swimmer,” said Alexeev. “To build a device that is autonomous and self-propelling at the micron-scale, we cannot build a tiny submarine. We have to keep it simple.”</p><p>Key to the operation of the micro-swimmer would be the latest generation of hydrogels, materials whose volume changes in a cyclical way. The hydrogels would serve as “chemical engines” to provide the motion needed to move the device’s propulsive flaps. Such materials currently exist and are being improved upon for other applications.</p><p>“We are using the state-of-the art in materials science, changing the properties of the material,” explained Masoud, a Ph.D. candidate in the School of Mechanical Engineering. “We have combined the materials with the principles of hydrodynamics at the small scale to develop this new swimmer.”</p><p>As part of their modeling, the researchers examined the effects of flaps of different sizes and properties. They also studied how flexible the micro-swimmer’s body needed to be to produce the kind of movement needed for swimming.</p><p>“You can’t swim at the small scale in the same way you swim at the large scale,” Alexeev said. “There is no inertia, which is how you keep moving at the large scale. What happens at the small scale is counterintuitive to what you expect at the large scale.”</p><p>The computational fluid modeling the researchers used allowed them to study a wide range of parameters in materials, oscillation rates and flexibility. 
What they learned, Alexeev said, will give experimentalists a starting point for actually building prototypes of the flexible gel robots.</p><p>“We have captured the solid mechanics of the periodically-oscillating body, the fluid dynamics of moving through the viscous liquid, and the coupling between the two,” he said. “From a computational fluid dynamics standpoint, it’s not an easy problem to model at this scale.”</p><p>Ultimately, the researchers hope to work with an experimental team to actually build the micro-swimmers. Combining their theoretical work with actual experiments could be a powerful approach to building robots on this size scale.</p><p>“This is a simulation that we hope to see in real life one day,” Alexeev said. “We have learned how experimentalists can pursue fabrication of these devices without extensive trial-and-error. We can use the simulations to look inside what will happen by using the laws of physics to explain it.”</p><p>The researchers envision groups of micro-swimmers carrying cargo through microfluidic chips or other devices. Swarms of them could one day work together as tiny construction robots moving materials to desired locations for assembly.</p><p>But the micro-swimmers won’t win any Olympic competitions. Alexeev estimates that their top speed could be on the order of a few micrometers per second – which should be enough to accomplish their mission.</p><p>“If your body is micrometers in size, that kind of speed is really not too bad,” he said. “The swimming speed will be rather slow, but at that size scale, you don’t really need to go very fast since you only need to go short distances.”</p><p><strong>Citation</strong>: Hassan Masoud, Benjamin I. Bingham and Alexander Alexeev, Soft Matter, 2012, Advance Article. 
DOI: 10.1039/C2SM25898F.<br /><br /><strong>Research News &amp; Publications Office</strong><br /><strong>Georgia Institute of Technology</strong><br /><strong>75 Fifth Street, N.W., Suite 309</strong><br /><strong>Atlanta, Georgia&nbsp; 30308&nbsp; USA</strong><br /><br /><strong>Media Relations Contact</strong>: John Toon (404-894-6986)(<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>).</p><p><strong>Writer</strong>: John Toon</p><p>&nbsp;</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1344205038</created>  <gmt_created>2012-08-05 22:17:18</gmt_created>  <changed>1475896356</changed>  <gmt_changed>2016-10-08 03:12:36</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Computational modeling shows how micro-swimmers could overcome the challenges of swimming at the micron scale.]]></teaser>  <type>news</type>  <sentence><![CDATA[Computational modeling shows how micro-swimmers could overcome the challenges of swimming at the micron scale.]]></sentence>  <summary><![CDATA[<p>Researchers have used complex computational models to design micro-swimmers that could overcome the challenges of swimming at the micron scale. 
These autonomous micro-robots could carry cargo and navigate in response to stimuli such as light.</p>]]></summary>  <dateline>2012-08-05T00:00:00-04:00</dateline>  <iso_dateline>2012-08-05T00:00:00-04:00</iso_dateline>  <gmt_dateline>2012-08-05 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News &amp; Publications Office</p><p>(404) 894-6986</p><p><a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>144371</item>      </media>  <hg_media>          <item>          <nid>144371</nid>          <type>image</type>          <title><![CDATA[Image of Simulated Micro-Swimmer]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[microswimmer.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/microswimmer_0.jpg]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/microswimmer_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/microswimmer_0.jpg?itok=Y3zxbE0z]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Image of Simulated Micro-Swimmer]]></image_alt>                    <created>1449178739</created>          <gmt_created>2015-12-03 21:38:59</gmt_created>          <changed>1475894777</changed>          <gmt_changed>2016-10-08 02:46:17</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="145"><![CDATA[Engineering]]></category>          <category 
tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="39581"><![CDATA[Alexander Alexeev]]></keyword>          <keyword tid="39591"><![CDATA[computational modeling]]></keyword>          <keyword tid="3356"><![CDATA[hydrogel]]></keyword>          <keyword tid="39571"><![CDATA[micro-robot]]></keyword>          <keyword tid="39561"><![CDATA[micro-swimmer]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="167377"><![CDATA[School of Mechanical Engineering]]></keyword>      </keywords>  <core_research_areas>          <term tid="39471"><![CDATA[Materials]]></term>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata>      <![CDATA[]]>  </userdata></node><node id="252811">  <title><![CDATA[Robots, Re-shoring and America’s Manufacturing Renaissance]]></title>  <uid>27255</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>In this review of&nbsp;robots helping to raise the curtain on a new U.S. industrial landscape, Henrik Christensen addresses the fear of robots replacing workers, saying these fears are natural but unfounded. 
He cites the 2011 Metra Martech market research that claims the robotics industry will create one million new jobs; robots and humans will be manufacturing things “together” for a long time to come.</p>]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1383998400</created>  <gmt_created>2013-11-09 12:00:00</gmt_created>  <changed>1475893608</changed>  <gmt_changed>2016-10-08 02:26:48</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Robotics Trends]]></publication>  <article_dateline>2012-08-02T00:00:00-04:00</article_dateline>  <iso_article_dateline>2012-08-02T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2012-08-02T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.roboticstrends.com/industry_manufacturing/article/robots_re_shoring_and_americas_manufacturing_renaissance]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="78861"><![CDATA[Henrik I. Christensen]]></keyword>          <keyword tid="215"><![CDATA[manufacturing]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata>      <![CDATA[]]>  </userdata></node><node id="143171">  <title><![CDATA[New Robots Giving the Disabled Independence]]></title>  <uid>27556</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>Charlie Kemp (<em>Interactive Comp</em>) and his team are working to build robots that help the disabled with everyday tasks; a stroke victim is already successfully using one of the robots in his home. 
<em>Source: CBS News</em></p>]]></body>  <author>Michaelanne Dye</author>  <status>1</status>  <created>1343645042</created>  <gmt_created>2012-07-30 10:44:02</gmt_created>  <changed>1475893551</changed>  <gmt_changed>2016-10-08 02:25:51</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[CBS News]]></publication>  <article_dateline>2012-07-30T00:00:00-04:00</article_dateline>  <iso_article_dateline>2012-07-30T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2012-07-30T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.cbsnews.com/8301-18563_162-57481945/new-robots-giving-the-disabled-independence/?tag=showDoorFlexGridLeft;flexGridMod]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="39141"><![CDATA[Charlie Kemp; College of Engineering; Robotics; Disabled; Robots;]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata>      <![CDATA[]]>  </userdata></node><node id="140221">  <title><![CDATA[Musical Glove Improves Sensation, Mobility for People with Spinal Cord Injury]]></title>  <uid>27560</uid>  <body><![CDATA[<p>Georgia Tech researchers have created a wireless, musical glove that may improve sensation and motor skills for people with paralyzing spinal cord injury (SCI).</p><p>The gadget was successfully used by individuals with limited feeling or movement in their hands due to tetraplegia. 
These individuals had sustained their injury more than a year before the study, a time frame when most rehab patients see very little improvement for the remainder of their lives.&nbsp; Remarkably, the device was primarily used while the participants were going about their daily routines.</p><p>The device is called <a href="http://www.youtube.com/watch?v=Zi6t89pi17c">Mobile Music Touch</a> (MMT). The glove, which looks like a workout glove with a small box on the back, is used with a piano keyboard and vibrates a person’s fingers to indicate which keys to play. While learning to play the instrument, several people with SCI experienced improved sensation in their fingers.</p><p>Researchers at Georgia Tech and Atlanta’s <a href="http://www.shepherd.org/">Shepherd Center</a> recently completed a study focusing on people with weakness and sensory loss due to SCI.</p><p>“After our preliminary work in 2011, we suspected that the glove would have positive results for people with SCI,” said Ph.D. graduate Tanya Markow, the project’s leader. “But we were surprised by how much improvement they made in our study. For example, after using the glove, some participants were able to feel the texture of their bed sheets and clothes for the first time since their injury.”</p><p>Markow worked with individuals with SCI who had limited feeling or movement in their hands. Each suffered a spinal injury more than a year prior to the study. The eight-week project required study participants to practice playing the piano for 30 minutes, three times a week.&nbsp; Half used the MMT glove to practice; half did not.</p><p>The MMT system works with a computer, MP3 player or smart phone. A song, such as Ode to Joy, is programmed into a device, which is wirelessly linked to the glove. As the musical notes are illuminated on the correct keys on the piano keyboard, the gadget sends vibrations to “tap” the corresponding fingers. 
The participants play along, gradually memorizing the keys and learning additional songs.&nbsp;</p><p>However, these active learning sessions with MMT were not the primary focus of the study.&nbsp; The participants also wore the glove at home for two hours a day, five days a week, feeling only the vibration (and not playing the piano).&nbsp; Previous studies showed that wearing the MMT system passively in this manner helped participants learn songs faster and retain them better.&nbsp; The researchers hoped that the passive wearing of the device would also have rehabilitative effects.&nbsp;</p><p>At the end of the study, participants performed a variety of common grasping and sensation tests to measure their improvement.&nbsp; Those who used the MMT system performed significantly better than those who just learned the piano normally.</p><p>“Some people were able to pick up objects more easily,” said Markow. “Another said he could immediately feel the heat from a cup of coffee, rather than after a delay.”</p><p>Markow believes the increased motor abilities could be caused by renewed brain activity that sometimes becomes dormant in persons with SCI. The vibration might be triggering activity in the hand’s sensory cortex, which leads to firing in the brain’s motor cortex. Markow would like to expand the study to include functional MRI results.&nbsp;</p><p>The glove has evolved in recent years under the leadership of Georgia Tech’s Thad Starner and Ellen Yi-Luen Do, as well as Deborah Backus, director of multiple sclerosis research at Shepherd Center. The initial concept, <a href="http://www.gatech.edu/newsroom/release.html?nid=39815">Piano Touch</a>, developed with the team by then master’s student Kevin Huang, demonstrated that people could easily learn to play the piano by wearing the glove and feeling its vibrations. 
It didn’t take long for Starner to see the larger health benefits.</p><p>“Equipment used for hand rehabilitation may seem monotonous and boring to some, and doesn’t provide any feedback or incentive,” said Starner, who oversees the <a href="http://www.cc.gatech.edu/%7Ethad/">Contextual Computing Group</a>. “Mobile Music Touch overcomes each of those challenges and provides surprising benefits for people with weakness and sensory loss due to SCI. It’s a great example of how wearable computing can change people’s lives.”</p><p>Starner is an associate professor in the School of Interactive Computing. Do is a professor in the Schools of Interactive Computing and Industrial Design.</p>]]></body>  <author>Jason Maderer</author>  <status>1</status>  <created>1342429150</created>  <gmt_created>2012-07-16 08:59:10</gmt_created>  <changed>1475896349</changed>  <gmt_changed>2016-10-08 03:12:29</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech has created a wireless, musical glove that may improve sensation and motor skills for people with paralyzing spinal cord injury.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech has created a wireless, musical glove that may improve sensation and motor skills for people with paralyzing spinal cord injury.]]></sentence>  <summary><![CDATA[<p>Researchers at Georgia Tech and Atlanta's Shepherd Center have created a wireless, musical glove that may improve sensation and motor skills for people with spinal cord injuries (SCI). 
The gadget, Mobile Music Touch, was successfully used by individuals with tetraplegia who suffered their injury more than a year before the study, a time frame when most rehab patients see very little improvement for the remainder of their lives.</p>]]></summary>  <dateline>2012-07-17T00:00:00-04:00</dateline>  <iso_dateline>2012-07-17T00:00:00-04:00</iso_dateline>  <gmt_dateline>2012-07-17 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[maderer@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Jason Maderer<br />Media Relations<br /><a href="mailto:maderer@gatech.edu">maderer@gatech.edu</a><br />404-385-2966</p><p>&nbsp;</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>140181</item>          <item>140191</item>          <item>140201</item>      </media>  <hg_media>          <item>          <nid>140181</nid>          <type>image</type>          <title><![CDATA[Mobile Music Touch Glove 1]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[dscn1051.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/dscn1051_0.jpg]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/dscn1051_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/dscn1051_0.jpg?itok=JkiD-jaZ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Mobile Music Touch Glove 1]]></image_alt>                    <created>1449178710</created>          <gmt_created>2015-12-03 21:38:30</gmt_created>          <changed>1475894771</changed>          <gmt_changed>2016-10-08 02:46:11</gmt_changed>      </item>          <item>          <nid>140191</nid>          <type>image</type>          
<title><![CDATA[Mobile Music Touch Glove 2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[dscn1056.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/dscn1056_0.jpg]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/dscn1056_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/dscn1056_0.jpg?itok=VqFcYzt3]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Mobile Music Touch Glove 2]]></image_alt>                    <created>1449178710</created>          <gmt_created>2015-12-03 21:38:30</gmt_created>          <changed>1475894771</changed>          <gmt_changed>2016-10-08 02:46:11</gmt_changed>      </item>          <item>          <nid>140201</nid>          <type>image</type>          <title><![CDATA[Mobile Music Touch Glove 3]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[dscn1057.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/dscn1057_0.jpg]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/dscn1057_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/dscn1057_0.jpg?itok=1louUD6D]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Mobile Music Touch Glove 3]]></image_alt>                    <created>1449178710</created>          <gmt_created>2015-12-03 21:38:30</gmt_created>          <changed>1475894771</changed>          <gmt_changed>2016-10-08 02:46:11</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.youtube.com/watch?v=Zi6t89pi17c]]></url>        
<title><![CDATA[Mobile Music Touch Demonstration]]></title>      </link>          <link>        <url><![CDATA[http://www.cc.gatech.edu/~thad/]]></url>        <title><![CDATA[Contextual Computing Group]]></title>      </link>          <link>        <url><![CDATA[http://www.cc.gatech.edu/]]></url>        <title><![CDATA[College of Computing]]></title>      </link>          <link>        <url><![CDATA[http://www.coa.gatech.edu/]]></url>        <title><![CDATA[Georgia Tech College of Architecture]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1183"><![CDATA[Home]]></group>      </groups>  <categories>          <category tid="42891"><![CDATA[Georgia Tech Arts]]></category>          <category tid="42941"><![CDATA[Art Research]]></category>          <category tid="148"><![CDATA[Music and Music Technology]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="42891"><![CDATA[Georgia Tech Arts]]></term>          <term tid="42941"><![CDATA[Art Research]]></term>          <term tid="148"><![CDATA[Music and Music Technology]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="654"><![CDATA[College of Computing]]></keyword>          <keyword tid="1946"><![CDATA[GVU]]></keyword>          <keyword tid="38081"><![CDATA[Mobile Music Touch]]></keyword>          <keyword tid="1942"><![CDATA[Piano Touch]]></keyword>          <keyword tid="1944"><![CDATA[Thad Starner]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata>      <![CDATA[]]>  </userdata></node><node id="140771">  <title><![CDATA[Lizard bot shows how to scamper over the sand]]></title>  <uid>27195</uid>  
<summary><![CDATA[]]></summary>  <body><![CDATA[<p>Daniel Goldman's CRAB lab creates a lizard bot that can run on sand, which&nbsp;could provide insights that will allow better Martian rovers to be built.</p>]]></body>  <author>Colly Mitchell</author>  <status>1</status>  <created>1342532163</created>  <gmt_created>2012-07-17 13:36:03</gmt_created>  <changed>1475893548</changed>  <gmt_changed>2016-10-08 02:25:48</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Lizard bot shows how to scamper over the sand]]></publication>  <article_dateline>2012-07-17T00:00:00-04:00</article_dateline>  <iso_article_dateline>2012-07-17T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2012-07-17T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.newscientist.com/article/mg21528735.500-lizard-bot-shows-how-to-scamper-over-the-sand.html]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="1292"><![CDATA[Parker H. Petit Institute for Bioengineering and Bioscience (IBB)]]></group>      </groups>  <categories>          <category tid="147"><![CDATA[Military Technology]]></category>          <category tid="150"><![CDATA[Physics and Physical Sciences]]></category>      </categories>  <keywords>          <keyword tid="12040"><![CDATA[Daniel Goldman]]></keyword>          <keyword tid="38331"><![CDATA[Lizard bot shows how to scamper over the sand]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata>      <![CDATA[]]>  </userdata></node><node id="252831">  <title><![CDATA[Georgia Tech Student Wins Best Student Paper Award at Robotics Science & Systems Meeting]]></title>  <uid>27255</uid>  <body><![CDATA[<p>Authored by&nbsp;Ph.D. 
student Feifei Qian, "Walking and Running on Yielding and Fluidizing Ground"&nbsp;has been awarded the Best Student Paper Award at the <a href="http://www.roboticsproceedings.org/rss08/index.html" target="_blank">Robotics Science &amp; Systems (RSS) 2012 meeting</a> in Sydney, Australia.</p><p>Qian is advised by <a href="http://robotics.gatech.edu/team/faculty/goldman">Daniel Goldman</a>, and the paper discusses Qian's work on the detailed locomotor mechanics of a small, lightweight robot (DynaRoACH, 10 cm, 25 g), revealing a mechanism by which small animals can achieve high performance on granular substrates, work that also advances the design and control of small robots in deformable terrains.</p>]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1384000631</created>  <gmt_created>2013-11-09 12:37:11</gmt_created>  <changed>1475896518</changed>  <gmt_changed>2016-10-08 03:15:18</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>news</type>  <sentence><![CDATA[]]></sentence>  <summary><![CDATA[]]></summary>  <dateline>2012-07-13T00:00:00-04:00</dateline>  <iso_dateline>2012-07-13T00:00:00-04:00</iso_dateline>  <gmt_dateline>2012-07-13 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[josie@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Josie Giles<br />IRIM Marketing Communications<br />404-385-8551</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>      </media>  <hg_media>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="134"><![CDATA[Student and Faculty]]></category>          <category tid="8862"><![CDATA[Student Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          
<term tid="134"><![CDATA[Student and Faculty]]></term>          <term tid="8862"><![CDATA[Student Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="12040"><![CDATA[Daniel Goldman]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata>      <![CDATA[]]>  </userdata></node><node id="252801">  <title><![CDATA[Human-like Eye Movement Could Aid Robots]]></title>  <uid>27255</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1383997867</created>  <gmt_created>2013-11-09 11:51:07</gmt_created>  <changed>1475893608</changed>  <gmt_changed>2016-10-08 02:26:48</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Hungtang Ko]]></publication>  <article_dateline>2012-07-06T00:00:00-04:00</article_dateline>  <iso_article_dateline>2012-07-06T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2012-07-06T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.upi.com/Science_News/Technology/2012/07/06/Human-like-eye-movement-could-aid-robots/UPI-26341341607592/#ixzz20FdmkHrL]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata>      <![CDATA[]]>  </userdata></node><node id="138981">  <title><![CDATA[Robot Vision: Muscle-Like Action Allows Camera 
to Mimic Human Eye Movement]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Using piezoelectric materials, researchers have replicated the muscle motion of the human eye to control camera systems in a way designed to improve the operation of robots. This new muscle-like action could help make robotic tools safer and more effective for MRI-guided surgery and robotic rehabilitation.</p><p>Key to the new control system is a piezoelectric cellular actuator that uses a novel biologically inspired technology that will allow a robot eye to move more like a real eye. This will be useful for research studies on human eye movement as well as for making video feeds from robots more intuitive. The research is being conducted by Ph.D. candidate Joshua Schultz under the direction of assistant professor <a href="http://www.me.gatech.edu/faculty/ueda">Jun Ueda</a>, both from the <a href="http://www.me.gatech.edu/">George W. Woodruff School of Mechanical Engineering</a> at the Georgia Institute of Technology.</p><p>“For a robot to be truly bio-inspired, it should possess actuation, or motion generators, with properties in common with the musculature of biological organisms,” said Schultz. “The actuators developed in our lab embody many properties in common with biological muscle, especially a cellular structure. Essentially, in the human eye muscles are controlled by neural impulses. Eventually, the actuators we are developing will be used to capture the kinematics and performance of the human eye.”</p><p>Details of the research were presented June 25, 2012, at the IEEE International Conference on Biomedical Robotics and Biomechatronics in Rome, Italy. The research is funded by the National Science Foundation. 
Schultz also receives partial support from the Achievement Rewards for College Scientists (ARCS) Foundation.</p><p>Ueda, who leads the Georgia Tech Bio-Robotics and Human Modeling Laboratory in the School of Mechanical Engineering, said this novel technology will lay the groundwork for investigating research questions in systems that possess a large number of active units operating together. The application ranges from industrial robots, medical and rehabilitation robots to intelligent assistive robots.</p><p>“Robustness against uncertainty of model and environment is crucial for robots physically interacting with humans and environments,” said Ueda. “Successful integration relies on the coordinated design of control, structure, actuators and sensors by considering the dynamic interaction among them.”</p><p>Piezoelectric materials expand or contract when electricity is applied to them, providing a way to transform input signals into motion. This principle is the basis for piezoelectric actuators that have been used in numerous applications, but use in robotics applications has been limited due to piezoelectric ceramic's minuscule displacement. &nbsp;</p><p>The cellular actuator concept developed by the research team was inspired by biological muscle structure that connects many small actuator units in series or in parallel.</p><p>The Georgia Tech team has developed a lightweight, high speed approach that includes a single-degree of freedom camera positioner that can be used to illustrate and understand the performance and control of biologically inspired actuator technology. This new technology uses less energy than traditional camera positioning mechanisms and is compliant for more flexibility.</p><p>“Each muscle-like actuator has a piezoelectric material and a nested hierarchical set of strain amplifying mechanisms,” said Ueda. 
“We are presenting a mathematical concept that can be used to predict the performance as well as select the required geometry of nested structures. We use the design of the camera positioning mechanism’s actuators to demonstrate the concepts.”</p><p>The scientists’ research shows mechanisms that can scale up the displacement of piezoelectric stacks to the range of the ocular positioning system. In the past, the piezoelectric stacks available for this purpose have been too small.</p><p>“Our research shows a two-port network model that describes compliant strain amplification mechanisms that increase the stroke length of the stacks,” said Schultz. “Our findings make a contribution to the use of piezoelectric stack devices in robotics, modeling, design and simulation of compliant mechanisms. It also advances the control of systems using a large number of motor units for a given degree of freedom and control of robotic actuators.”</p><p>In the study, the scientists sought to resolve a previous conundrum. A cable-driven eye could produce the eye’s kinematics, but rigid servomotors would not allow researchers to test the hypothesis for the neurological basis for eye motion.</p><p>Some measure of flexibility could be used in software with traditional actuators, but it depended largely on having a continuously variable control signal and it could not show how flexibility could be maintained with quantized actuation corresponding to neural recruitment phenomena.</p><p>“Each muscle-like actuator consists of a piezoelectric material and a nested hierarchical set of strain amplifying mechanisms,” said Ueda. “Unlike traditional actuators, piezoelectric cellular actuators are governed by the working principles of muscles - namely, motion results by discretely activating, or recruiting, sets of active fibers, called motor units.</p><p>“Motor units are linked by flexible tissue, which serves a two-fold function,” said Ueda. 
“It combines the action potential of each motor unit, and presents a compliant interface with the world, which is critical in unstructured environments.”</p><p>The Georgia Tech team has presented a camera positioner driven by a novel cellular actuator technology, using a contractile ceramic to generate motion. The team used 16 amplified piezoelectric stacks per side.</p><p>The use of multiple stacks addressed the need for more layers of amplification. The units were placed inside a rhomboidal mechanism. The work offers an analysis of the force-displacement tradeoffs involved in the actuator design and shows how to find geometry that meets the requirement of the camera positioner, said Schultz.</p><p>“The goal of scaling up piezoelectric ceramic stacks holds great potential to more accurately replicate human eye motion than previous actuators,” noted Schultz. “Future work in this area will involve implantation of this technology on a multi-degree of freedom device, applying open and closed loop control algorithms for positioning and analysis of co-contraction phenomena.”</p><p>Future research by his team will continue to focus on the development of a design framework for highly integrated robotic systems. This ranges from industrial robots to medical and rehabilitation robots to intelligent assistive robots. <br /><br /><strong>Research News &amp; Publications Office</strong><br /><strong>Georgia Institute of Technology</strong><br /><strong>75 Fifth Street, N.W., Suite 309</strong><br /><strong>Atlanta, Georgia&nbsp; 30308&nbsp; USA</strong><br /><br /><strong>Media Relations Contact</strong>: John Toon (404-894-6986)(<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>).<br /><strong>Writer</strong>: Sarah E. 
Goodwin</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1341495522</created>  <gmt_created>2012-07-05 13:38:42</gmt_created>  <changed>1475896349</changed>  <gmt_changed>2016-10-08 03:12:29</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Stacks of piezoelectric actuators that simulate the action of real muscles could give robots more human-like eyes.]]></teaser>  <type>news</type>  <sentence><![CDATA[Stacks of piezoelectric actuators that simulate the action of real muscles could give robots more human-like eyes.]]></sentence>  <summary><![CDATA[<p>Using piezoelectric materials, researchers have replicated the muscle motion of the human eye to control camera systems in a way designed to improve the operation of robots. This new muscle-like action could help make robotic tools safer and more effective for MRI-guided surgery and robotic rehabilitation.</p>]]></summary>  <dateline>2012-07-05T00:00:00-04:00</dateline>  <iso_dateline>2012-07-05T00:00:00-04:00</iso_dateline>  <gmt_dateline>2012-07-05 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News &amp; Publications Office</p><p><a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a></p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>138951</item>          <item>138961</item>          <item>138971</item>      </media>  <hg_media>          <item>          <nid>138951</nid>          <type>image</type>          <title><![CDATA[Piezoelectric-vision1]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[piezoelectric-vision1.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/piezoelectric-vision1_0.jpg]]></image_path>            
<image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/piezoelectric-vision1_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/piezoelectric-vision1_0.jpg?itok=F81Tfwsj]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Piezoelectric-vision1]]></image_alt>                    <created>1449178698</created>          <gmt_created>2015-12-03 21:38:18</gmt_created>          <changed>1475894769</changed>          <gmt_changed>2016-10-08 02:46:09</gmt_changed>      </item>          <item>          <nid>138961</nid>          <type>image</type>          <title><![CDATA[Piezoelectric-vision2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[piezoelectric-vision2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/piezoelectric-vision2_0.jpg]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/piezoelectric-vision2_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/piezoelectric-vision2_0.jpg?itok=X2G1wwV7]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Piezoelectric-vision2]]></image_alt>                    <created>1449178698</created>          <gmt_created>2015-12-03 21:38:18</gmt_created>          <changed>1475894769</changed>          <gmt_changed>2016-10-08 02:46:09</gmt_changed>      </item>          <item>          <nid>138971</nid>          <type>image</type>          <title><![CDATA[Piezoelectric-vision4]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[piezoelectric-vision4.jpg]]></image_name>            
<image_path><![CDATA[/sites/default/files/images/piezoelectric-vision4_0.jpg]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/piezoelectric-vision4_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/piezoelectric-vision4_0.jpg?itok=q47qc-Gl]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Piezoelectric-vision4]]></image_alt>                    <created>1449178698</created>          <gmt_created>2015-12-03 21:38:18</gmt_created>          <changed>1475894769</changed>          <gmt_changed>2016-10-08 02:46:09</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="13887"><![CDATA[Jun Ueda]]></keyword>          <keyword tid="7699"><![CDATA[piezoelectric]]></keyword>          <keyword tid="37861"><![CDATA[piezoelectric actuator]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="167377"><![CDATA[School of Mechanical Engineering]]></keyword>          <keyword tid="820"><![CDATA[vision]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata>      <![CDATA[]]>  </userdata></node><node id="138041">  <title><![CDATA[Shimi the Dancing Robotic Smartphone Dock]]></title>  <uid>27556</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>Gil Weinberg (<em>Interactive Computing</em>) has developed a one-foot-tall 
(30 cm) smartphone-enabled robot called Shimi, which can recommend songs, dance to the beat and play tunes based on listener feedback. <em>Source: Gizmag</em></p>]]></body>  <author>Michaelanne Dye</author>  <status>1</status>  <created>1340797988</created>  <gmt_created>2012-06-27 11:53:08</gmt_created>  <changed>1475893546</changed>  <gmt_changed>2016-10-08 02:25:46</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[student registration]]></publication>  <article_dateline>2012-06-28T00:00:00-04:00</article_dateline>  <iso_article_dateline>2012-06-28T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2012-06-28T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://bit.ly/M4AtkR]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="143"><![CDATA[Digital Media and Entertainment]]></category>          <category tid="148"><![CDATA[Music and Music Technology]]></category>      </categories>  <keywords>          <keyword tid="1939"><![CDATA[Gil Weinberg]]></keyword>          <keyword tid="1180"><![CDATA[Music]]></keyword>          <keyword tid="1309"><![CDATA[music technology]]></keyword>          <keyword tid="37381"><![CDATA[musical companion]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="168949"><![CDATA[Shimi]]></keyword>          <keyword tid="168908"><![CDATA[smartphone]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata>      <![CDATA[]]>  </userdata></node><node 
id="261011">  <title><![CDATA[Georgia Tech Robot Repairs Road Ruptures]]></title>  <uid>27255</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1386976905</created>  <gmt_created>2013-12-13 23:21:45</gmt_created>  <changed>1475893614</changed>  <gmt_changed>2016-10-08 02:26:54</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[sidewalk]]></publication>  <article_dateline>2012-06-26T00:00:00-04:00</article_dateline>  <iso_article_dateline>2012-06-26T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2012-06-26T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.govtech.com/transportation/Georgia-Tech-Robot-Repairs-Road-Ruptures.html]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="416"><![CDATA[GTRI]]></keyword>          <keyword tid="82181"><![CDATA[pavement crack detection system]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="2352"><![CDATA[robots]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata>      <![CDATA[]]>  </userdata></node><node id="252781">  <title><![CDATA[Advanced Manufacturing Will Drive U.S. 
Economic Engine]]></title>  <uid>27255</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>Recognizing the importance of advanced manufacturing in rebuilding the economy, the Obama administration created an Office of Manufacturing Policy and the $500 million “Advanced Manufacturing Partnership” (AMP), which calls for the creation of 500,000 credentialed workers in advanced manufacturing with industry certifications. The Georgia Institute of Technology, one of six academic institutions on the AMP steering committee, is a leader in manufacturing robotics technology and will use a gift of nearly $1 million in robotics equipment from Coca-Cola Bottling Co. to create a Manufacturing Robotics Logistics Laboratory on campus.</p>]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1383997258</created>  <gmt_created>2013-11-09 11:40:58</gmt_created>  <changed>1475893608</changed>  <gmt_changed>2016-10-08 02:26:48</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[autonomic nervous system]]></publication>  <article_dateline>2012-06-22T00:00:00-04:00</article_dateline>  <iso_article_dateline>2012-06-22T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2012-06-22T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.areadevelopment.com/EconomicsGovernmentPolicy/Summer2012/Advanced-Manufacturing-drives-USA-economic-engine-255422.shtml]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="14299"><![CDATA[Advanced Manufacturing Partnership]]></keyword>          <keyword tid="215"><![CDATA[manufacturing]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>      
</keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata>      <![CDATA[]]>  </userdata></node><node id="136951">  <title><![CDATA[Mexican Jumping Bean Robots Rock and Roll]]></title>  <uid>27560</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[]]></body>  <author>Jason Maderer</author>  <status>1</status>  <created>1340268737</created>  <gmt_created>2012-06-21 08:52:17</gmt_created>  <changed>1475893546</changed>  <gmt_changed>2016-10-08 02:25:46</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Delta Jacket]]></publication>  <article_dateline>2012-06-20T00:00:00-04:00</article_dateline>  <iso_article_dateline>2012-06-20T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2012-06-20T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://bit.ly/KyNQZj]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="1183"><![CDATA[Home]]></group>      </groups>  <categories>      </categories>  <keywords>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata>      <![CDATA[]]>  </userdata></node><node id="252791">  <title><![CDATA[Pentagon&#039;s Robot Sewing Machines Aim at China&#039;s Factories]]></title>  <uid>27255</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1383997544</created>  <gmt_created>2013-11-09 11:45:44</gmt_created>  <changed>1475893608</changed>  <gmt_changed>2016-10-08 02:26:48</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Anglel Cabrera]]></publication>  <article_dateline>2012-06-07T00:00:00-04:00</article_dateline>  
<iso_article_dateline>2012-06-07T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2012-06-07T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.technewsdaily.com/5841-pentagon-robot-sewing-machines.html]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata>      <![CDATA[]]>  </userdata></node><node id="131891">  <title><![CDATA[Could We Trust Killer Robots?]]></title>  <uid>27556</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>Since 2006, Ronald Arkin (<em>Computer Science</em>) has been working to develop robot drones that are capable not only of carrying out pinpoint attacks but of deciding on their own when it is permissible to fire on a particular target. 
<em>Source: Wall Street Journal</em></p>]]></body>  <author>Michaelanne Dye</author>  <status>1</status>  <created>1337686083</created>  <gmt_created>2012-05-22 11:28:03</gmt_created>  <changed>1475893540</changed>  <gmt_changed>2016-10-08 02:25:40</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Anton Leykin]]></publication>  <article_dateline>2012-05-20T00:00:00-04:00</article_dateline>  <iso_article_dateline>2012-05-20T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2012-05-20T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://online.wsj.com/article/SB10001424052702303448404577410032825529656.html?mod=googlenews_wsj]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="50875"><![CDATA[School of Computer Science]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="147"><![CDATA[Military Technology]]></category>      </categories>  <keywords>          <keyword tid="3336"><![CDATA[army]]></keyword>          <keyword tid="34141"><![CDATA[Drones]]></keyword>          <keyword tid="525"><![CDATA[military]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="14444"><![CDATA[ron arkin]]></keyword>          <keyword tid="11106"><![CDATA[Ronald Arkin]]></keyword>          <keyword tid="7263"><![CDATA[unmanned]]></keyword>          <keyword tid="545"><![CDATA[Weapons]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata>      <![CDATA[]]>  </userdata></node><node id="128531">  <title><![CDATA[Robot Reveals the Inner Workings of 
Brain Cells]]></title>  <uid>27206</uid>  <body><![CDATA[<p>Gaining access to the inner workings of a neuron in the living brain offers a wealth of useful information: its patterns of electrical activity, its shape, even a profile of which genes are turned on at a given moment. However, achieving this entry is such a painstaking task that it is considered an art form; it is so difficult to learn that only a small number of labs in the world practice it.</p><p>But that could soon change: Researchers at MIT and the Georgia Institute of Technology have developed a way to automate the process of finding and recording information from neurons in the living brain. The researchers have shown that a robotic arm guided by a cell-detecting computer algorithm can identify and record from neurons in the living mouse brain with better accuracy and speed than a human experimenter.</p><p>The new automated process eliminates the need for months of training and provides long-sought information about living cells’ activities. Using this technique, scientists could classify the thousands of different types of cells in the brain, map how they connect to each other, and figure out how diseased cells differ from normal cells.</p><p>The project is a collaboration between the labs of Ed Boyden, associate professor of biological engineering and brain and cognitive sciences at MIT, and <a href="http://www.me.gatech.edu/faculty/forest.shtml" target="_blank">Craig Forest</a>, an assistant professor in the <a href="http://www.me.gatech.edu" target="_blank">George W. Woodruff School of Mechanical Engineering at Georgia Tech</a>.</p><p>“Our team has been interdisciplinary from the beginning, and this has enabled us to bring the principles of precision machine design to bear upon the study of the living brain,” Forest says. 
His graduate student, Suhasa Kodandaramaiah, spent the past two years as a visiting student at MIT, and is the lead author of the study, which appears in the May 6 issue of <a href="http://dx.doi.org/10.1038/nmeth.1993" target="_blank"><em>Nature Methods</em></a>.</p><p>The method could be particularly useful in studying brain disorders such as schizophrenia, Parkinson’s disease, autism and epilepsy, Boyden says. “In all these cases, a molecular description of a cell that is integrated with [its] electrical and circuit properties … has remained elusive,” says Boyden, who is a member of MIT’s Media Lab and McGovern Institute for Brain Research. “If we could really describe how diseases change molecules in specific cells within the living brain, it might enable better drug targets to be found.”</p><p><strong>Automation</strong></p><p>Kodandaramaiah, Boyden and Forest set out to automate a 30-year-old technique known as whole-cell patch clamping, which involves bringing a tiny hollow glass pipette in contact with the cell membrane of a neuron, then opening up a small pore in the membrane to record the electrical activity within the cell. This skill usually takes a graduate student or postdoc several months to learn.</p><p>Kodandaramaiah spent about four months learning the manual patch-clamp technique, giving him an appreciation for its difficulty. “When I got reasonably good at it, I could sense that even though it is an art form, it can be reduced to a set of stereotyped tasks and decisions that could be executed by a robot,” he says.</p><p>To that end, Kodandaramaiah and his colleagues built a robotic arm that lowers a glass pipette into the brain of an anesthetized mouse with micrometer accuracy. As it moves, the pipette monitors a property called electrical impedance — a measure of how difficult it is for electricity to flow out of the pipette. If there are no cells around, electricity flows and impedance is low. 
When the tip hits a cell, electricity can’t flow as well and impedance goes up.</p><p>The pipette takes two-micrometer steps, measuring impedance 10 times per second. Once it detects a cell, it can stop instantly, preventing it from poking through the membrane. “This is something a robot can do that a human can’t,” Boyden says.</p><p>Once the pipette finds a cell, it applies suction to form a seal with the cell’s membrane. Then, the electrode can break through the membrane to record the cell’s internal electrical activity. The robotic system can detect cells with 90 percent accuracy, and establish a connection with the detected cells about 40 percent of the time.</p><p>The researchers also showed that their method can be used to determine the shape of the cell by injecting a dye; they are now working on extracting a cell’s contents to read its genetic profile.</p><p>Development of the new technology was funded primarily by the National Institutes of Health, the National Science Foundation and the MIT Media Lab.</p><p><strong>New era for robotics</strong></p><p>The researchers recently created a startup company, Neuromatic Devices, to commercialize the device.</p><p>The researchers are now working on scaling up the number of electrodes so they can record from multiple neurons at a time, potentially allowing them to determine how different parts of the brain are connected.</p><p>They are also working with collaborators to start classifying the thousands of types of neurons found in the brain. This “parts list” for the brain would identify neurons not only by their shape — which is the most common means of classification — but also by their electrical activity and genetic profile.</p><p>“If you really want to know what a neuron is, you can look at the shape, and you can look at how it fires. Then, if you pull out the genetic information, you can really know what’s going on,” Forest says. “Now you know everything. 
That’s the whole picture.”</p><p>Boyden says he believes this is just the beginning of using robotics in neuroscience to study living animals. A robot like this could potentially be used to infuse drugs at targeted points in the brain, or to deliver gene therapy vectors. He hopes it will also inspire neuroscientists to pursue other kinds of robotic automation — such as in optogenetics, the use of light to perturb targeted neural circuits and determine the causal role that neurons play in brain functions.</p><p>Neuroscience is one of the few areas of biology in which robots have yet to make a big impact, Boyden says. “The genome project was done by humans and a giant set of robots that would do all the genome sequencing. In directed evolution or in synthetic biology, robots do a lot of the molecular biology,” he says. “In other parts of biology, robots are essential.”</p><p>Other co-authors include MIT grad student Giovanni Talei Franzesi and MIT postdoc Brian Y. Chow.&nbsp;</p><p><strong>Research News &amp; Publications Office<br /> Georgia Institute of Technology<br /> 75 Fifth Street, N.W., Suite 314<br /> Atlanta, Georgia 30308 USA</strong></p><p><strong>Media Relations Contacts:</strong> Abby Robinson (abby@innovate.gatech.edu; 404-385-3364) or Caroline McCall (cmccall5@mit.edu; 617-253-1682)</p><p><strong>Writer: </strong>Anne Trafton, MIT News</p>]]></body>  <author>Abby Vogel Robinson</author>  <status>1</status>  <created>1336328111</created>  <gmt_created>2012-05-06 18:15:11</gmt_created>  <changed>1475896329</changed>  <gmt_changed>2016-10-08 03:12:09</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Researchers have automated the process of finding and recording information from neurons in the living brain.]]></teaser>  <type>news</type>  <sentence><![CDATA[Researchers have automated the process of finding and recording information from neurons in the living brain.]]></sentence>  <summary><![CDATA[<p>Researchers have automated the 
process of finding and recording information from neurons in the living brain. A robotic arm guided by a cell-detecting computer algorithm can identify and record from neurons in the living mouse brain with better accuracy and speed than a human experimenter.</p>]]></summary>  <dateline>2012-05-06T00:00:00-04:00</dateline>  <iso_dateline>2012-05-06T00:00:00-04:00</iso_dateline>  <gmt_dateline>2012-05-06 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[]]></email>  <location></location>  <contact><![CDATA[<p>Abby Robinson<br /> Research News and Publications<br /> <a href="mailto:abby@innovate.gatech.edu">abby@innovate.gatech.edu</a><br /> 404-385-3364</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>128501</item>          <item>128521</item>          <item>128511</item>      </media>  <hg_media>          <item>          <nid>128501</nid>          <type>image</type>          <title><![CDATA[Craig Forest robotic neural recordings]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[forest_autopatching_hires.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/forest_autopatching_hires_0.jpg]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/forest_autopatching_hires_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/forest_autopatching_hires_0.jpg?itok=Tn6gcGqJ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Craig Forest robotic neural recordings]]></image_alt>                    <created>1449178622</created>          <gmt_created>2015-12-03 21:37:02</gmt_created>          <changed>1475894751</changed>          <gmt_changed>2016-10-08 
02:45:51</gmt_changed>      </item>          <item>          <nid>128521</nid>          <type>image</type>          <title><![CDATA[Whole-cell patching robot schematic]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[autopatching_schematic_hires.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/autopatching_schematic_hires_0.jpg]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/autopatching_schematic_hires_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/autopatching_schematic_hires_0.jpg?itok=zjv6olyQ]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Whole-cell patching robot schematic]]></image_alt>                    <created>1449178622</created>          <gmt_created>2015-12-03 21:37:02</gmt_created>          <changed>1475894751</changed>          <gmt_changed>2016-10-08 02:45:51</gmt_changed>      </item>          <item>          <nid>128511</nid>          <type>image</type>          <title><![CDATA[Neuromatic Devices research team]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[autopatching_team_hires.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/autopatching_team_hires_0.jpg]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/autopatching_team_hires_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/autopatching_team_hires_0.jpg?itok=K6bXY_bu]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Neuromatic Devices research team]]></image_alt>                    <created>1449178622</created>     
     <gmt_created>2015-12-03 21:37:02</gmt_created>          <changed>1475894751</changed>          <gmt_changed>2016-10-08 02:45:51</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="1912"><![CDATA[brain]]></keyword>          <keyword tid="32681"><![CDATA[brain cell]]></keyword>          <keyword tid="594"><![CDATA[college of engineering]]></keyword>          <keyword tid="12333"><![CDATA[Craig Forest]]></keyword>          <keyword tid="32711"><![CDATA[electrical activity]]></keyword>          <keyword tid="7276"><![CDATA[neuron]]></keyword>          <keyword tid="1304"><![CDATA[neuroscience]]></keyword>          <keyword tid="32691"><![CDATA[patch clamp]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="167377"><![CDATA[School of Mechanical Engineering]]></keyword>          <keyword tid="32701"><![CDATA[whole-cell patch clamping]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata>      <![CDATA[]]>  </userdata></node><node id="252851">  <title><![CDATA[Science Teachers Build Robots]]></title>  <uid>27255</uid>  
<summary><![CDATA[]]></summary>  <body><![CDATA[]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1384001922</created>  <gmt_created>2013-11-09 12:58:42</gmt_created>  <changed>1475893608</changed>  <gmt_changed>2016-10-08 02:26:48</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[strategic intelligence]]></publication>  <article_dateline>2012-05-03T00:00:00-04:00</article_dateline>  <iso_article_dateline>2012-05-03T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2012-05-03T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.gpb.org/news/2012/05/03/science-teachers-build-robots]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="42901"><![CDATA[Community]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="12814"><![CDATA[GT-Savannah]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata>      <![CDATA[]]>  </userdata></node><node id="252861">  <title><![CDATA[Daniel Goldman Receives 2012 DARPA Young Faculty Award]]></title>  <uid>27255</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1384002350</created>  <gmt_created>2013-11-09 13:05:50</gmt_created>  <changed>1475893608</changed>  <gmt_changed>2016-10-08 02:26:48</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[climate attitudes]]></publication>  <article_dateline>2012-05-01T00:00:00-04:00</article_dateline>  
<iso_article_dateline>2012-05-01T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2012-05-01T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.physics.gatech.edu/content/prof-daniel-goldman-receives-2012-darpa-award]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="134"><![CDATA[Student and Faculty]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="12040"><![CDATA[Daniel Goldman]]></keyword>          <keyword tid="79391"><![CDATA[DARPA Young Faculty Award]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata>      <![CDATA[]]>  </userdata></node><node id="125041">  <title><![CDATA[Georgia Tech Honored by Boeing for Exceptional Performance]]></title>  <uid>27462</uid>  <body><![CDATA[<p>The Georgia Institute of Technology was honored by Boeing on April 18 for its exceptional performance and contributions to the company’s overall success during 2011.</p><p>Georgia Tech was one of 16 organizations to receive a Boeing Supplier of the Year Award. The Institute was selected from a pool of more than 17,500 Boeing suppliers in more than 50 countries.</p><p>Georgia Tech was honored in the category of Academia, which recognizes outstanding performance as a strategic university. 
As one of Boeing’s eight strategic universities, Georgia Tech provides increased knowledge and understanding of fluid flow, advanced manufacturing technology, design and aircraft technology through basic and applied research, which is based in Georgia Tech’s Manufacturing Research Center (MaRC).</p><p>The Boeing award recognizes multidisciplinary research by Georgia Tech Mechanical Engineering Professors Steve Danyluk and Ari Glezer, Industrial &amp; Systems Engineering Professor Leon McGinnis, Aerospace Engineering Professor Dimitri Mavris and College of Computing Professor Henrik Christensen.&nbsp;</p><p>Boeing supports various research activities at Georgia Tech related to manufacturing technologies, such as control and control systems on cranes, mobile platforms and robotics for moving parts in a factory environment and active flow control for wing tips, said Danyluk, professor and Morris M. Bryan Jr. Chair in Mechanical Engineering for Advanced Manufacturing Systems.</p><p>“I am very pleased that Boeing has expressed their confidence and support in Georgia Tech by providing the resources to conduct research and development on manufacturing problems of critical significance to their business,” said Danyluk, former director of MaRC. “Our faculty are excited and energized by the Supplier of the Year Award, and we’ll continue to excel in developing the tools and processes that will keep the U.S. in a lead position in manufacturing sciences.”</p><p>Other professors and research engineers from across campus who help support Georgia Tech’s work for Boeing include&nbsp;Bert Bras, Jon Colton, Bill Singhose, Rick Cowan, Shreyes Melkote, Russell Peak, Chris Paredis, Tina Guldberg, Marc Goetschalckx, Joshua Vaughn, Frank Mess and Andrew Dugenske.&nbsp;</p><p>The Boeing global supply chain is among the most geographically dispersed in manufacturing. 
The company annually purchases more than $50 billion in goods and services from approximately 28,000 suppliers that employ more than 1.2 million people around the world.</p><p>“In today’s challenging business environment, an agile supply chain that continuously delivers excellent performance is critical,” said Jack House, vice president of Supplier Management for Boeing Defense, Space and Security and the leader of Boeing’s companywide Supplier Management program. “The supplier partners receiving 2011 Supplier of the Year Awards have demonstrated outstanding commitment to providing our customers with the best-value, highest-quality products and services, while meeting the customers’ requirements and anticipating their needs for the future.”&nbsp;</p>]]></body>  <author>Liz Klipp</author>  <status>1</status>  <created>1334840194</created>  <gmt_created>2012-04-19 12:56:34</gmt_created>  <changed>1475896324</changed>  <gmt_changed>2016-10-08 03:12:04</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech receives Boeing 2011 Supplier of the Year award for outstanding performance as a strategic university.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech receives Boeing 2011 Supplier of the Year award for outstanding performance as a strategic university.]]></sentence>  <summary><![CDATA[<p>The Georgia Institute of Technology was honored by Boeing on April 18 for its exceptional performance and contributions to the company’s overall success during 2011.&nbsp;</p>]]></summary>  <dateline>2012-04-19T00:00:00-04:00</dateline>  <iso_dateline>2012-04-19T00:00:00-04:00</iso_dateline>  <gmt_dateline>2012-04-19 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[klipp@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><strong>Georgia Tech Media Relations</strong><br />Laura Diamond<br /><a 
href="mailto:laura.diamond@comm.gatech.edu">laura.diamond@comm.gatech.edu</a><br />404-894-6016<br />Jason Maderer<br /><a href="mailto:maderer@gatech.edu">maderer@gatech.edu</a><br />404-660-2926</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>125811</item>          <item>125361</item>      </media>  <hg_media>          <item>          <nid>125811</nid>          <type>image</type>          <title><![CDATA[Boeing 787]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[boeing_plane.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/boeing_plane_0.jpg]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/boeing_plane_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/boeing_plane_0.jpg?itok=Qx2OHAT1]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Boeing 787]]></image_alt>                    <created>1449178604</created>          <gmt_created>2015-12-03 21:36:44</gmt_created>          <changed>1475894577</changed>          <gmt_changed>2016-10-08 02:42:57</gmt_changed>      </item>          <item>          <nid>125361</nid>          <type>image</type>          <title><![CDATA[Boeing award]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[georgia_tech_soy.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/georgia_tech_soy_0.jpg]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/georgia_tech_soy_0.jpg]]></image_full_path>            
<image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/georgia_tech_soy_0.jpg?itok=o2I1g9kB]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Boeing award]]></image_alt>                    <created>1449178604</created>          <gmt_created>2015-12-03 21:36:44</gmt_created>          <changed>1475894749</changed>          <gmt_changed>2016-10-08 02:45:49</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.boeing.com/aboutus/supplier_of_the_year/soy2011_gallery.html]]></url>        <title><![CDATA[Video - Georgia Tech named Boeing 2011 Supplier of the Year]]></title>      </link>          <link>        <url><![CDATA[http://boeing.mediaroom.com/index.php?item=2227&amp;s=43]]></url>        <title><![CDATA[Boeing Honors 16 Suppliers of the Year for Exceptional Performance]]></title>      </link>          <link>        <url><![CDATA[http://www.marc.gatech.edu/]]></url>        <title><![CDATA[Manufacturing Research Center (MARC)]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1214"><![CDATA[News Room]]></group>      </groups>  <categories>          <category tid="129"><![CDATA[Institute and Campus]]></category>      </categories>  <news_terms>          <term tid="129"><![CDATA[Institute and Campus]]></term>      </news_terms>  <keywords>          <keyword tid="4358"><![CDATA[boeing]]></keyword>          <keyword tid="169486"><![CDATA[Steven Danyluk]]></keyword>          <keyword tid="171200"><![CDATA[Supplier of the Year award]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata>      <![CDATA[]]>  </userdata></node><node id="123531">  <title><![CDATA[Georgia Tech Celebrates National Robotics Week]]></title>  <uid>27560</uid>  
<body><![CDATA[<p>The Georgia Institute of Technology opened its doors to more than 400 middle school and high school students on Wednesday for the third annual Robotics Open House. Georgia Tech master’s students and Ph.D. candidates demonstrated more than 20 projects around campus, marking the Institute’s participation in National Robotics Week.</p><p>Students saw a variety of projects, including an autonomous race car, robotic submarines and <a href="http://www.simontherobot.com/">Simon</a> (click <a href="http://www.youtube.com/watch?v=C9r1mJrWfLs">here</a> for a video of the day’s events).</p><p>“You can see the students’ eyes light up when they’re watching our demos. They get really excited because they often have little knowledge that such projects exist, and they are in many cases not aware of the potential impact of new technology,” said Henrik Christensen, director of the Center for Robotics and Intelligent Machines. “That’s important because some of them arrive and think engineering is not very exciting. Then they see some of our robots and go back to school to tell others. We hope today inspires them.”</p><p>National Robotics Week was established by Congress in 2010. There are 150 events across all 50 states this year, the highest participation to date. 
National Robotics Week is intended to be a “national road-map” for robotics technology.</p>]]></body>  <author>Jason Maderer</author>  <status>1</status>  <created>1334166300</created>  <gmt_created>2012-04-11 17:45:00</gmt_created>  <changed>1475896320</changed>  <gmt_changed>2016-10-08 03:12:00</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech welcomed 400 students for its annual Robotics Open House, part of National Robotics Week.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech welcomed 400 students for its annual Robotics Open House, part of National Robotics Week.]]></sentence>  <summary><![CDATA[<p>The Georgia Institute of Technology opened its doors to more than 400 middle school and high school students on Wednesday for the third annual Robotics Open House.</p>]]></summary>  <dateline>2012-04-11T00:00:00-04:00</dateline>  <iso_dateline>2012-04-11T00:00:00-04:00</iso_dateline>  <gmt_dateline>2012-04-11 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[maderer@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Jason Maderer<br />Georgia Tech Media Relations<br />404-385-2966<br /><a href="mailto:maderer@gatech.edu">maderer@gatech.edu</a></p><p>&nbsp;</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>123511</item>          <item>123521</item>      </media>  <hg_media>          <item>          <nid>123511</nid>          <type>image</type>          <title><![CDATA[Simon]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[dscn0755.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/dscn0755_0.jpg]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/dscn0755_0.jpg]]></image_full_path>            
<image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/dscn0755_0.jpg?itok=bOYxM0MS]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Simon]]></image_alt>                    <created>1449178582</created>          <gmt_created>2015-12-03 21:36:22</gmt_created>          <changed>1475894746</changed>          <gmt_changed>2016-10-08 02:45:46</gmt_changed>      </item>          <item>          <nid>123521</nid>          <type>image</type>          <title><![CDATA[Robotics Open House Golem Krang]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[dscn0769.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/dscn0769_0.jpg]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/dscn0769_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/dscn0769_0.jpg?itok=KCdia0_t]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Robotics Open House Golem Krang]]></image_alt>                    <created>1449178582</created>          <gmt_created>2015-12-03 21:36:22</gmt_created>          <changed>1475894746</changed>          <gmt_changed>2016-10-08 02:45:46</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://robotics.gatech.edu/]]></url>        <title><![CDATA[Center for Robotics & Intelligent Machines]]></title>      </link>          <link>        <url><![CDATA[http://www.nationalroboticsweek.org/]]></url>        <title><![CDATA[National Robotics Week]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1220"><![CDATA[Digital Lounge]]></group>      </groups>  <categories>          <category tid="129"><![CDATA[Institute 
and Campus]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="129"><![CDATA[Institute and Campus]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata>      <![CDATA[]]>  </userdata></node><node id="121741">  <title><![CDATA[How to Talk With Your Personal Robot]]></title>  <uid>27556</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>Maya Cakmak (<em>Interactive Computing</em>) is working to make human-robot interaction easy to use. Along with other Georgia Tech researchers, she has recently identified the types of questions a robot can ask to get more information from a human so that it can learn a new task. <em>Source: Smart Planet</em></p>]]></body>  <author>Michaelanne Dye</author>  <status>1</status>  <created>1333446305</created>  <gmt_created>2012-04-03 09:45:05</gmt_created>  <changed>1475893532</changed>  <gmt_changed>2016-10-08 02:25:32</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Smart Planet]]></publication>  <article_dateline>2012-04-03T00:00:00-04:00</article_dateline>  <iso_article_dateline>2012-04-03T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2012-04-03T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.smartplanet.com/blog/thinking-tech/how-to-talk-with-your-personal-robot/11064]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category 
tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="11526"><![CDATA[Andrea Thomaz]]></keyword>          <keyword tid="26741"><![CDATA[Center for Robotics and Intelligent Machines]]></keyword>          <keyword tid="26421"><![CDATA[maya cakmak]]></keyword>          <keyword tid="12239"><![CDATA[RIM]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="29191"><![CDATA[train]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata>      <![CDATA[]]>  </userdata></node><node id="252871">  <title><![CDATA[Robots Unlimited: Ayanna Howard Reaches for Mars, the Arctic and Pediatrics]]></title>  <uid>27255</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<h1>&nbsp;</h1><p>&nbsp;</p>]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1384002507</created>  <gmt_created>2013-11-09 13:08:27</gmt_created>  <changed>1475893608</changed>  <gmt_changed>2016-10-08 02:26:48</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[corona]]></publication>  <article_dateline>2012-04-02T00:00:00-04:00</article_dateline>  <iso_article_dateline>2012-04-02T00:00:00-04:00</iso_article_dateline>  <gmt_article_dateline>2012-04-02T00:00:00-04:00</gmt_article_dateline>  <article_url><![CDATA[http://www.prism-magazine.org/apr12/upclose.cfm]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="134"><![CDATA[Student and Faculty]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="825"><![CDATA[Ayanna 
Howard]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata>      <![CDATA[]]>  </userdata></node><node id="252881">  <title><![CDATA[Rescue Robots that Mimic Snake Movement]]></title>  <uid>27255</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1384002941</created>  <gmt_created>2013-11-09 13:15:41</gmt_created>  <changed>1475893608</changed>  <gmt_changed>2016-10-08 02:26:48</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Aquatic Chemical Ecology Research Experience for Undergraduates]]></publication>  <article_dateline>2012-03-01T00:00:00-05:00</article_dateline>  <iso_article_dateline>2012-03-01T00:00:00-05:00</iso_article_dateline>  <gmt_article_dateline>2012-03-01T00:00:00-05:00</gmt_article_dateline>  <article_url><![CDATA[http://www.smartplanet.com/blog/smart-takes/rescue-robots-that-mimic-snake-movement/23589?tag=content;siu-container]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="134"><![CDATA[Student and Faculty]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="297"><![CDATA[David Hu]]></keyword>          <keyword tid="14594"><![CDATA[Hamid Marvi]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata>      <![CDATA[]]>  </userdata></node><node id="252891">  <title><![CDATA[How Technology Helping the Elderly is 
Turning into a Big Business Opportunity]]></title>  <uid>27255</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<h1 class="multi-line-title-1">&nbsp;</h1>]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1384003156</created>  <gmt_created>2013-11-09 13:19:16</gmt_created>  <changed>1475893608</changed>  <gmt_changed>2016-10-08 02:26:48</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Jacqueline Rohde]]></publication>  <article_dateline>2012-02-21T00:00:00-05:00</article_dateline>  <iso_article_dateline>2012-02-21T00:00:00-05:00</iso_article_dateline>  <gmt_article_dateline>2012-02-21T00:00:00-05:00</gmt_article_dateline>  <article_url><![CDATA[http://articles.economictimes.indiatimes.com/2012-02-21/news/31083135_1_willow-garage-mechanics-products]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="134"><![CDATA[Student and Faculty]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="79401"><![CDATA[Charles Kemp]]></keyword>          <keyword tid="1129"><![CDATA[healthcare]]></keyword>          <keyword tid="10488"><![CDATA[PR2]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata>      <![CDATA[]]>  </userdata></node><node id="107481">  <title><![CDATA[The Real Steel: Robotics Careers Ready to Boom]]></title>  <uid>27556</uid>  <summary><![CDATA[]]></summary>  <body><![CDATA[<p>The robotics industry is in a major growth mode but companies are 
having trouble finding high-quality employees. Henrik Christensen (<em>Interactive Computing</em>) discusses what individuals need to break into the booming robotics field. <em>Source: Today's Engineer</em></p>]]></body>  <author>Michaelanne Dye</author>  <status>1</status>  <created>1328697845</created>  <gmt_created>2012-02-08 10:44:05</gmt_created>  <changed>1475893523</changed>  <gmt_changed>2016-10-08 02:25:23</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Today's Engineer]]></publication>  <article_dateline>2012-02-09T00:00:00-05:00</article_dateline>  <iso_article_dateline>2012-02-09T00:00:00-05:00</iso_article_dateline>  <gmt_article_dateline>2012-02-09T00:00:00-05:00</gmt_article_dateline>  <article_url><![CDATA[http://www.todaysengineer.org/2012/Feb/career-focus.asp]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>          <group id="50876"><![CDATA[School of Interactive Computing]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="516"><![CDATA[engineering]]></keyword>          <keyword tid="11890"><![CDATA[henrik christensen]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="23371"><![CDATA[robotics careers]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata>      <![CDATA[]]>  </userdata></node><node id="106701">  <title><![CDATA[Why Kids Prefer Robots to Teachers and Parents]]></title>  <uid>27556</uid>  <summary><![CDATA[]]></summary>  
<body><![CDATA[<p>What would happen if robots were a part of your everyday life at school and beyond? Henrik Christensen (<em>Interactive Computing</em>) takes that projection a step further when he argues, “If we make conscious robots, they would want to have rights and they probably should.” <em>Source: Forbes</em></p>]]></body>  <author>Michaelanne Dye</author>  <status>1</status>  <created>1328532540</created>  <gmt_created>2012-02-06 12:49:00</gmt_created>  <changed>1475893523</changed>  <gmt_changed>2016-10-08 02:25:23</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>hgTechInTheNews</type>  <publication><![CDATA[Forbes]]></publication>  <article_dateline>2012-02-06T00:00:00-05:00</article_dateline>  <iso_article_dateline>2012-02-06T00:00:00-05:00</iso_article_dateline>  <gmt_article_dateline>2012-02-06T00:00:00-05:00</gmt_article_dateline>  <article_url><![CDATA[http://www.forbes.com/sites/jamesmarshallcrotty/2012/02/03/kids-prefer-robots-to-teachers-and-parents/]]></article_url>  <media>      </media>  <hg_media>      </hg_media>  <files>      </files>  <groups>          <group id="47223"><![CDATA[College of Computing]]></group>      </groups>  <categories>          <category tid="153"><![CDATA[Computer Science/Information Technology and Security]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <keywords>          <keyword tid="13169"><![CDATA[autonomous robots]]></keyword>          <keyword tid="11890"><![CDATA[henrik christensen]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>    <userdata>      <![CDATA[]]>  </userdata></node><node id="79331">  <title><![CDATA[Snakes Improve Search-and-Rescue Robots]]></title>  <uid>27560</uid>  <body><![CDATA[<p>Designing an all-terrain robot for search-and-rescue missions is an arduous 
task for scientists. The machine must be flexible enough to move over uneven surfaces, yet not so big that it’s restricted from tight spaces. It might also be required to climb slopes of varying inclines. Existing robots can do many of these things, but the majority require large amounts of energy and are prone to overheating. Georgia Tech researchers have designed a new machine by studying the locomotion of a certain type of flexible, efficient animal. </p><p>“By using their scales to control frictional properties, snakes are able to move large distances while exerting very little energy,” said Hamid Marvi, a Mechanical Engineering Ph.D. candidate at Georgia Tech. </p><p>While studying and videotaping the movements of 20 different species at Zoo Atlanta, Marvi developed Scalybot 2, a robot that replicates the rectilinear locomotion of snakes. He unveiled the robot this month at the Society for Integrative &amp; Comparative Biology (SICB) annual meeting in Charleston, S.C. </p><p>“During <a href="http://www.youtube.com/watch?v=TuyjtX0tdkU">rectilinear locomotion</a>, a snake doesn’t have to bend its body laterally to move,” explained Marvi. “Snakes lift their ventral scales and pull themselves forward by sending a muscular traveling wave from head to tail. Rectilinear locomotion is very efficient and is especially useful for crawling within crevices, an invaluable benefit for search-and-rescue robots.” </p><p>Scalybot 2 can automatically change the angle of its scales when it encounters different terrains and slopes. This adjustment allows the robot to either fight or generate friction. The two-link robot is remotely controlled by a joystick and can move forward and backward using four motors.</p><p>“Snakes are highly maligned creatures,” said Joe Mendelson, curator of herpetology at Zoo Atlanta. 
“I really like that Hamid’s research is showing the public that snakes can help people.” </p><p>Marvi’s advisor is David Hu, an assistant professor in the Schools of Mechanical Engineering and Biology. Hu and his research team are primarily focused on animal locomotion. They’ve studied how dogs and other animals shake water off their bodies and how mosquitoes fly through rainstorms. </p><p>This isn’t the first time Hu’s lab has looked at snake locomotion. Last summer the team developed Scalybot 1, a two-link climbing robot that replicates concertina locomotion. The push-and-pull, accordion-style movement features alternating scale activity. </p><p><em>This project is supported by the National Science Foundation (NSF) (Award No. PHY-0848894). The content is solely the responsibility of the principal investigators and does not necessarily represent the official views of the NSF.</em></p>]]></body>  <author>Jason Maderer</author>  <status>1</status>  <created>1326968252</created>  <gmt_created>2012-01-19 10:17:32</gmt_created>  <changed>1475896257</changed>  <gmt_changed>2016-10-08 03:10:57</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Georgia Tech researchers have designed a new machine by studying the locomotion of a certain type of flexible, efficient animal: snakes.]]></teaser>  <type>news</type>  <sentence><![CDATA[Georgia Tech researchers have designed a new machine by studying the locomotion of a certain type of flexible, efficient animal: snakes.]]></sentence>  <summary><![CDATA[<p>Designing an all-terrain robot for search-and-rescue missions is an arduous task for scientists. The machine must be flexible enough to move over uneven surfaces, yet not so big that it’s restricted from tight spaces. It might also be required to climb slopes of varying inclines. Existing robots can do many of these things, but the majority require large amounts of energy and are prone to overheating. 
Georgia Tech researchers have designed a new machine by studying the locomotion of snakes. </p>]]></summary>  <dateline>2012-01-19T00:00:00-05:00</dateline>  <iso_dateline>2012-01-19T00:00:00-05:00</iso_dateline>  <gmt_dateline>2012-01-19 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[New Robot is Designed to Use Less Energy]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[maderer@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Jason Maderer<br />Georgia Tech Media Relations<br />404-385-2966<br /><a href="mailto:maderer@gatech.edu">maderer@gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>79321</item>      </media>  <hg_media>          <item>          <nid>79321</nid>          <type>image</type>          <title><![CDATA[Scalybot 2 Photo]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[screen_shot_2012-01-13_at_9.47.24_am_0.png]]></image_name>            <image_path><![CDATA[/sites/default/files/images/screen_shot_2012-01-13_at_9.47.24_am_0_0.png]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/screen_shot_2012-01-13_at_9.47.24_am_0_0.png]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/screen_shot_2012-01-13_at_9.47.24_am_0_0.png?itok=7JnodZaq]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[Scalybot 2 Photo]]></image_alt>                    <created>1449178063</created>          <gmt_created>2015-12-03 21:27:43</gmt_created>          <changed>1475894693</changed>          <gmt_changed>2016-10-08 02:44:53</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.youtube.com/watch?v=kIHlRLKMG9M]]></url>        <title><![CDATA[Scalybot 2 
Demonstration]]></title>      </link>          <link>        <url><![CDATA[http://www.me.gatech.edu/]]></url>        <title><![CDATA[George W. Woodruff School of Mechanical Engineering]]></title>      </link>          <link>        <url><![CDATA[http://www.biology.gatech.edu/]]></url>        <title><![CDATA[School of Biology]]></title>      </link>          <link>        <url><![CDATA[http://www.zooatlanta.org/]]></url>        <title><![CDATA[Zoo Atlanta website]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1183"><![CDATA[Home]]></group>      </groups>  <categories>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="146"><![CDATA[Life Sciences and Biology]]></category>          <category tid="135"><![CDATA[Research]]></category>      </categories>  <news_terms>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="146"><![CDATA[Life Sciences and Biology]]></term>          <term tid="135"><![CDATA[Research]]></term>      </news_terms>  <keywords>          <keyword tid="277"><![CDATA[Biology]]></keyword>          <keyword tid="297"><![CDATA[David Hu]]></keyword>          <keyword tid="541"><![CDATA[Mechanical Engineering]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="169002"><![CDATA[Snakes]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata>      <![CDATA[]]>  </userdata></node></nodes>