<nodes> <node id="69170">  <title><![CDATA[Glove that Vibrates Fingertip Could Improve One's Sense of Touch]]></title>  <uid>27206</uid>  <body><![CDATA[<p>A little vibration can be a good thing for people who need a sensitive touch.</p><p>Researchers at the Georgia Institute of Technology have developed a glove with a special fingertip designed to improve the wearer's sense of touch. Applying a small vibration to the side of the fingertip improves tactile sensitivity and motor performance, according to their research results.</p><p>Previous research has shown that adding an appropriate amount of white noise -- a phenomenon called stochastic resonance -- can improve sight, hearing, balance control and touch, but the white noise had not been incorporated into a wearable device. The Georgia Tech prototype is believed to be the first wearable stochastic resonance device, attaching to the fingertip to improve the sense of touch.</p><p>"This device may one day be used to assist individuals whose jobs require high-precision manual dexterity or those with medical conditions that reduce their sense of touch," said Jun Ueda, an assistant professor in the George W. Woodruff School of Mechanical Engineering at Georgia Tech.</p><p>Ueda worked with Minoru Shinohara, an associate professor in the School of Applied Physiology at Georgia Tech, and visiting scholar Yuichi Kurita, to design the device and test its capabilities on a small group of healthy individuals.</p><p>Details of the device and preliminary test results were presented in May at the 2011 IEEE International Conference on Robotics and Automation in Shanghai.</p><p>The device uses an actuator made of a stack of lead zirconate titanate layers to generate high-frequency vibration. The ceramic layers are piezoelectric, which means they vibrate when an electrical charge is applied to them. 
The actuator is attached to the side of the fingertip so that the palm side of the finger remains free and the individual wearing the glove can continue to manipulate objects.</p><p>For this study, the researchers attached the device to 10 healthy adult volunteers who performed common sensory and motor skill tasks, including texture discrimination, two-point discrimination, single-point touch and grasp tests. The experimental results showed that the volunteers performed statistically better on all of the tasks when mechanical vibration was applied.</p><p>"All of the experimental results showed that some mechanical vibration was better than none at all, but the level of vibration that statistically improved sensorimotor functions varied by test," noted Ueda.</p><p>For each test, researchers attached the device to a volunteer's non-dominant index finger and subjected the finger to six randomized vibrations that ranged from 0 to 150 percent of that person's vibration amplitude threshold, a value that was determined by earlier testing. The threshold value was the magnitude of vibration required for a subject to feel that the device was vibrating.</p><p>In the two-point discrimination test, two sharp points were pressed against the fingertip and volunteers reported whether they could reliably distinguish two points touching their finger versus just one. The results showed that when individuals were subjected to vibrations equal to 75 and 100 percent of their thresholds, they could sense two points that were closer together.</p><p>The single-point touch experiment involved pressing a fiber strand against each individual's finger. Subjects were asked to report if they could feel filaments of different weights touching their fingers. The volunteers could feel lighter-weight filaments when exposed to vibrations up to their vibration amplitude threshold.</p><p>In the third experiment, pieces of sandpaper with different grits were glued on one side of a plastic board. 
Researchers then randomly selected a test piece of sandpaper and attached it to the other side of the board -- which the subjects could not see. Subjects touched the single piece of sandpaper and tried to select the matching piece from the nine samples on the other side of the board. At vibration levels of 50 and 100 percent of their thresholds, the subjects selected the correct piece of sandpaper 15 percent more often than when they were not exposed to any vibration.</p><p>For the grasping test, each subject pinched and held an object for three seconds with as small a force as possible without letting it slip. Statistically significant improvements in grasping were observed for cases of 50, 100 and 125 percent of threshold vibration.</p><p>All four sensing ability tests confirmed that the application of certain levels of mechanical vibration enhanced the tactile sensitivity of the fingertip. However, because the levels of vibration that created statistically significant results varied, the researchers are currently conducting experiments to determine the optimal amplitude and frequency characteristics of vibration and the influence of long-term exposure to vibrations. 
The researchers are also working on optimizing the design of the glove and testing the effect of attaching actuators to both sides of the fingertip or the fingernail.</p><p>"The future of this research may lead to the development of a novel orthopedic device that can help people with peripheral nerve damage resume their daily activities or improve the abilities of individuals with jobs that require skills in manipulation or texture discrimination," said Ueda.</p><p><strong>Research News &amp; Publications Office<br /> Georgia Institute of Technology<br /> 75 Fifth Street, N.W., Suite 314<br /> Atlanta, Georgia 30308 USA</strong></p><p><strong>Media Relations Contacts:</strong> Abby Robinson (abby@innovate.gatech.edu; 404-385-3364) or John Toon (jtoon@gatech.edu; 404-894-6986)</p><p><strong>Writer:</strong> Abby Robinson</p>]]></body>  <author>Abby Vogel Robinson</author>  <status>1</status>  <created>1312416000</created>  <gmt_created>2011-08-04 00:00:00</gmt_created>  <changed>1475896192</changed>  <gmt_changed>2016-10-08 03:09:52</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Vibrating fingertip improves tactile sensitivity and motor performance]]></teaser>  <type>news</type>  <sentence><![CDATA[Vibrating fingertip improves tactile sensitivity and motor performance]]></sentence>  <summary><![CDATA[<p>Georgia Tech researchers have developed a glove with a special fingertip designed to improve the wearer's sense of touch. 
Applying a small vibration to the side of the fingertip improves tactile sensitivity and motor performance, according to their research results.</p>]]></summary>  <dateline>2011-08-04T00:00:00-04:00</dateline>  <iso_dateline>2011-08-04T00:00:00-04:00</iso_dateline>  <gmt_dateline>2011-08-04 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[abby@innovate.gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p><strong>Abby Robinson</strong><br />Research News and Publications<br /><a href="http://www.gatech.edu/contact/index.html?id=avogel6">Contact Abby Robinson</a><br /><strong>404-385-3364</strong></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>69171</item>          <item>69172</item>          <item>69173</item>      </media>  <hg_media>          <item>          <nid>69171</nid>          <type>image</type>          <title><![CDATA[Jun Ueda and Minoru Shinohara]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177239</created>          <gmt_created>2015-12-03 21:13:59</gmt_created>          <changed>1475894604</changed>          <gmt_changed>2016-10-08 02:43:24</gmt_changed>      </item>          <item>          <nid>69172</nid>          <type>image</type>          <title><![CDATA[Sensory glove]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            
<image_alt><![CDATA[]]></image_alt>                    <created>1449177239</created>          <gmt_created>2015-12-03 21:13:59</gmt_created>          <changed>1475894604</changed>          <gmt_changed>2016-10-08 02:43:24</gmt_changed>      </item>          <item>          <nid>69173</nid>          <type>image</type>          <title><![CDATA[Jun Ueda]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[]]></image_name>            <image_path><![CDATA[]]></image_path>            <image_full_path><![CDATA[]]></image_full_path>            <image_740><![CDATA[]]></image_740>            <image_mime></image_mime>            <image_alt><![CDATA[]]></image_alt>                    <created>1449177239</created>          <gmt_created>2015-12-03 21:13:59</gmt_created>          <changed>1475894604</changed>          <gmt_changed>2016-10-08 02:43:24</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://www.me.gatech.edu/faculty/ueda.shtml]]></url>        <title><![CDATA[Jun Ueda]]></title>      </link>          <link>        <url><![CDATA[http://www.ap.gatech.edu/Shino/index.php]]></url>        <title><![CDATA[Minoru Shinohara]]></title>      </link>          <link>        <url><![CDATA[http://www.me.gatech.edu/]]></url>        <title><![CDATA[George W. 
Woodruff School of Mechanical Engineering]]></title>      </link>          <link>        <url><![CDATA[http://www.ap.gatech.edu/]]></url>        <title><![CDATA[Georgia Tech School of Applied Physiology]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="145"><![CDATA[Engineering]]></category>          <category tid="135"><![CDATA[Research]]></category>      </categories>  <news_terms>          <term tid="145"><![CDATA[Engineering]]></term>          <term tid="135"><![CDATA[Research]]></term>      </news_terms>  <keywords>          <keyword tid="13896"><![CDATA[Actuator]]></keyword>          <keyword tid="594"><![CDATA[college of engineering]]></keyword>          <keyword tid="4896"><![CDATA[College of Sciences]]></keyword>          <keyword tid="8382"><![CDATA[Glove]]></keyword>          <keyword tid="13887"><![CDATA[Jun Ueda]]></keyword>          <keyword tid="13888"><![CDATA[Minoru Shinohara]]></keyword>          <keyword tid="13891"><![CDATA[motor performance]]></keyword>      </keywords>  <core_research_areas>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata>      <![CDATA[]]>  </userdata></node><node id="138981">  <title><![CDATA[Robot Vision: Muscle-Like Action Allows Camera to Mimic Human Eye Movement]]></title>  <uid>27303</uid>  <body><![CDATA[<p>Using piezoelectric materials, researchers have replicated the muscle motion of the human eye to control camera systems in a way designed to improve the operation of robots. This new muscle-like action could help make robotic tools safer and more effective for MRI-guided surgery and robotic rehabilitation.</p><p>Key to the new control system is a piezoelectric cellular actuator that uses a novel biologically inspired technology that will allow a robot eye to move more like a real eye. 
This will be useful for research studies on human eye movement as well as for making video feeds from robots more intuitive. The research is being conducted by Ph.D. candidate Joshua Schultz under the direction of assistant professor <a href="http://www.me.gatech.edu/faculty/ueda">Jun Ueda</a>, both from the <a href="http://www.me.gatech.edu/">George W. Woodruff School of Mechanical Engineering</a> at the Georgia Institute of Technology.</p><p>“For a robot to be truly bio-inspired, it should possess actuation, or motion generators, with properties in common with the musculature of biological organisms,” said Schultz. “The actuators developed in our lab embody many properties in common with biological muscle, especially a cellular structure. Essentially, in the human eye, muscles are controlled by neural impulses. Eventually, the actuators we are developing will be used to capture the kinematics and performance of the human eye.”</p><p>Details of the research were presented June 25, 2012, at the IEEE International Conference on Biomedical Robotics and Biomechatronics in Rome, Italy. The research is funded by the National Science Foundation. Schultz also receives partial support from the Achievement Rewards for College Scientists (ARCS) Foundation.</p><p>Ueda, who leads the Georgia Tech Bio-Robotics and Human Modeling Laboratory in the School of Mechanical Engineering, said this novel technology will lay the groundwork for investigating research questions in systems that possess a large number of active units operating together. Applications range from industrial robots to medical and rehabilitation robots to intelligent assistive robots.</p><p>“Robustness against uncertainty of model and environment is crucial for robots physically interacting with humans and environments,” said Ueda. 
“Successful integration relies on the coordinated design of control, structure, actuators and sensors by considering the dynamic interaction among them.”</p><p>Piezoelectric materials expand or contract when electricity is applied to them, providing a way to transform input signals into motion. This principle is the basis for piezoelectric actuators that have been used in numerous applications, but use in robotics applications has been limited due to the minuscule displacement of piezoelectric ceramics.</p><p>The cellular actuator concept developed by the research team was inspired by biological muscle structure that connects many small actuator units in series or in parallel.</p><p>The Georgia Tech team has developed a lightweight, high-speed approach that includes a single-degree-of-freedom camera positioner that can be used to illustrate and understand the performance and control of biologically inspired actuator technology. This new technology uses less energy than traditional camera positioning mechanisms and is compliant, allowing more flexibility.</p><p>“Each muscle-like actuator has a piezoelectric material and a nested hierarchical set of strain amplifying mechanisms,” said Ueda. “We are presenting a mathematical concept that can be used to predict the performance as well as select the required geometry of nested structures. We use the design of the camera positioning mechanism’s actuators to demonstrate the concepts.”</p><p>The scientists’ research shows mechanisms that can scale up the displacement of piezoelectric stacks to the range of the ocular positioning system. In the past, the piezoelectric stacks available for this purpose have been too small.</p><p>“Our research shows a two-port network model that describes compliant strain amplification mechanisms that increase the stroke length of the stacks,” said Schultz. 
“Our findings make a contribution to the use of piezoelectric stack devices in robotics, modeling, design and simulation of compliant mechanisms. It also advances the control of systems using a large number of motor units for a given degree of freedom and the control of robotic actuators.”</p><p>In the study, the scientists sought to resolve a previous conundrum. A cable-driven eye could produce the eye’s kinematics, but rigid servomotors would not allow researchers to test hypotheses about the neurological basis of eye motion.</p><p>Some measure of flexibility could be achieved in software with traditional actuators, but it depended largely on having a continuously variable control signal and it could not show how flexibility could be maintained with quantized actuation corresponding to neural recruitment phenomena.</p><p>“Each muscle-like actuator consists of a piezoelectric material and a nested hierarchical set of strain amplifying mechanisms,” said Ueda. “Unlike traditional actuators, piezoelectric cellular actuators are governed by the working principles of muscles -- namely, motion results from discretely activating, or recruiting, sets of active fibers, called motor units.</p><p>“Motor units are linked by flexible tissue, which serves a two-fold function,” said Ueda. “It combines the action potential of each motor unit, and presents a compliant interface with the world, which is critical in unstructured environments.”</p><p>The Georgia Tech team has presented a camera positioner driven by a novel cellular actuator technology, using a contractile ceramic to generate motion. The team used 16 amplified piezoelectric stacks per side.</p><p>The use of multiple stacks addressed the need for more layers of amplification. The units were placed inside a rhomboidal mechanism. 
The work offers an analysis of the force-displacement tradeoffs involved in the actuator design and shows how to find geometry that meets the requirements of the camera positioner, said Schultz.</p><p>“The goal of scaling up piezoelectric ceramic stacks holds great potential to more accurately replicate human eye motion than previous actuators,” noted Schultz. “Future work in this area will involve implementation of this technology on a multi-degree-of-freedom device, applying open- and closed-loop control algorithms for positioning and analysis of co-contraction phenomena.”</p><p>Future research by the team will continue to focus on the development of a design framework for highly integrated robotic systems. This ranges from industrial robots to medical and rehabilitation robots to intelligent assistive robots. <br /><br /><strong>Research News &amp; Publications Office</strong><br /><strong>Georgia Institute of Technology</strong><br /><strong>75 Fifth Street, N.W., Suite 309</strong><br /><strong>Atlanta, Georgia&nbsp; 30308&nbsp; USA</strong><br /><br /><strong>Media Relations Contact</strong>: John Toon (404-894-6986)(<a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a>).<br /><strong>Writer</strong>: Sarah E. 
Goodwin</p>]]></body>  <author>John Toon</author>  <status>1</status>  <created>1341495522</created>  <gmt_created>2012-07-05 13:38:42</gmt_created>  <changed>1475896349</changed>  <gmt_changed>2016-10-08 03:12:29</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[Stacks of piezoelectric actuators that simulate the action of real muscles could give robots more human-like eyes.]]></teaser>  <type>news</type>  <sentence><![CDATA[Stacks of piezoelectric actuators that simulate the action of real muscles could give robots more human-like eyes.]]></sentence>  <summary><![CDATA[<p>Using piezoelectric materials, researchers have replicated the muscle motion of the human eye to control camera systems in a way designed to improve the operation of robots. This new muscle-like action could help make robotic tools safer and more effective for MRI-guided surgery and robotic rehabilitation.</p>]]></summary>  <dateline>2012-07-05T00:00:00-04:00</dateline>  <iso_dateline>2012-07-05T00:00:00-04:00</iso_dateline>  <gmt_dateline>2012-07-05 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[jtoon@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>John Toon</p><p>Research News &amp; Publications Office</p><p><a href="mailto:jtoon@gatech.edu">jtoon@gatech.edu</a></p><p>(404) 894-6986</p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>138951</item>          <item>138961</item>          <item>138971</item>      </media>  <hg_media>          <item>          <nid>138951</nid>          <type>image</type>          <title><![CDATA[Piezoelectric-vision1]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[piezoelectric-vision1.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/piezoelectric-vision1_0.jpg]]></image_path>            
<image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/piezoelectric-vision1_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/piezoelectric-vision1_0.jpg?itok=F81Tfwsj]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Piezoelectric-vision1]]></image_alt>                    <created>1449178698</created>          <gmt_created>2015-12-03 21:38:18</gmt_created>          <changed>1475894769</changed>          <gmt_changed>2016-10-08 02:46:09</gmt_changed>      </item>          <item>          <nid>138961</nid>          <type>image</type>          <title><![CDATA[Piezoelectric-vision2]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[piezoelectric-vision2.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/piezoelectric-vision2_0.jpg]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/piezoelectric-vision2_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/piezoelectric-vision2_0.jpg?itok=X2G1wwV7]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Piezoelectric-vision2]]></image_alt>                    <created>1449178698</created>          <gmt_created>2015-12-03 21:38:18</gmt_created>          <changed>1475894769</changed>          <gmt_changed>2016-10-08 02:46:09</gmt_changed>      </item>          <item>          <nid>138971</nid>          <type>image</type>          <title><![CDATA[Piezoelectric-vision4]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[piezoelectric-vision4.jpg]]></image_name>            
<image_path><![CDATA[/sites/default/files/images/piezoelectric-vision4_0.jpg]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/piezoelectric-vision4_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/piezoelectric-vision4_0.jpg?itok=q47qc-Gl]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Piezoelectric-vision4]]></image_alt>                    <created>1449178698</created>          <gmt_created>2015-12-03 21:38:18</gmt_created>          <changed>1475894769</changed>          <gmt_changed>2016-10-08 02:46:09</gmt_changed>      </item>      </hg_media>  <related>      </related>  <files>      </files>  <groups>          <group id="1188"><![CDATA[Research Horizons]]></group>      </groups>  <categories>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="13887"><![CDATA[Jun Ueda]]></keyword>          <keyword tid="7699"><![CDATA[piezoelectric]]></keyword>          <keyword tid="37861"><![CDATA[piezoelectric actuator]]></keyword>          <keyword tid="1356"><![CDATA[robot]]></keyword>          <keyword tid="167377"><![CDATA[School of Mechanical Engineering]]></keyword>          <keyword tid="820"><![CDATA[vision]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <files></files>  <related></related>  <userdata>      <![CDATA[]]>  </userdata></node><node id="236671">  <title><![CDATA[Georgia Tech Robotics Projects Receive More Than $2 Million in Funding]]></title>  <uid>27255</uid>  <body><![CDATA[<p>The National Science Foundation (NSF) awarded more than $2 million to fund 
projects led by Georgia Tech robotics researchers. The principal investigators (PIs) and co-PIs for these projects represent three of the Institute’s six colleges, illustrating the interdisciplinary collaboration that distinguishes Tech as a leader in the national initiative to accelerate the development and use of robots in the United States.</p><p>“Georgia Tech faculty have a strong tradition of exceptional research and a robust interdisciplinary focus,” said Henrik Christensen, KUKA Chair of Robotics and director of the Robotics &amp; Intelligent Machines Center (RIM), the flagship for the Institute’s robotics efforts. “I’m extremely proud of and continually impressed with the contributions our researchers make to advancing robotics.”</p><p>Three projects received NSF funding through the National Robotics Initiative program, which was unveiled by President Obama in June 2011, and is led by NSF with support from NASA, the National Institutes of Health, and the United States Department of Agriculture. Tech’s new projects focus on the development of the next generation of robotics and the advancement of the capability and usability of such systems in innovative application areas:</p><ul><li>“Learning from Demonstration for Cloud Robotics”—Led by School of Interactive Computing Associate Professor Andrea Thomaz, this project received $426K and aims to leverage cloud computing to enable robots to efficiently learn from remote human domain experts.</li><li>“Understanding Neuromuscular Adaptations in Human-Robot Physical Interaction for Adaptive Robot Coworkers”—Led by School of Mechanical Engineering Assistant Professor Jun Ueda, this research focuses on developing theories, methods, and tools to understand the mechanisms of neuromotor adaptation in human-robot physical interaction. 
Associate Professor Minoru Shinohara (School of Applied Physiology) and Assistant Professor Karen Feigh (School of Aerospace Engineering) serve as co-PIs on the project, which received almost $1.2M.</li><li>“Don't Read My Face: Tackling the Challenges of Facial Masking in Parkinson's Disease Rehabilitation through Co-Robot Mediators”—Led by College of Computing Associate Dean &amp; Regents' Professor Ronald Arkin, this project received almost $580K and has two primary goals: 1) developing a robotic architecture endowed with moral emotional control mechanisms, abstract moral reasoning, and theory of mind sensitive to human affect and ethics; and 2) creating a specific architecture for a robot to mediate communication barriers between caregivers and patients with Parkinson's disease who experience “facial masking,” or lack of recognizable emotion.</li></ul><p>The fourth project, “Bioinspired Collaborative Sensing with Novel Gliding Robotic Fish,” received more than $83K from the NSF’s Robust Intelligence (RI) program, which encompasses all aspects of the computational understanding and modeling of intelligence in complex, realistic contexts. Led by School of Electrical &amp; Computer Engineering Associate Professor Fumin Zhang, the research aims to establish a theoretical framework and provide an enabling technology for robust underwater collaborative sensing with small, inexpensive robots.</p><p>Robotics research at Tech attracts more than $35 million in sponsored research each year. Core research areas include mechanisms, control, perception, artificial intelligence, human interaction, and application technologies. 
The Institute continues to advance personal and everyday robotics through its research into the ways robots can learn from and interact with humans, and by exploring issues surrounding their governance and ethical use.</p><p><em>This research is supported by the National Science Foundation (NSF) under Awards IIS-1317926, IIS-1317718, IIS-1317214, and IIS-1319874. Any conclusions or opinions are those of the authors and do not necessarily represent the official views of the NSF.</em></p>]]></body>  <author>Josie Giles</author>  <status>1</status>  <created>1378919025</created>  <gmt_created>2013-09-11 17:03:45</gmt_created>  <changed>1475896493</changed>  <gmt_changed>2016-10-08 03:14:53</gmt_changed>  <promote>0</promote>  <sticky>0</sticky>  <teaser><![CDATA[]]></teaser>  <type>news</type>  <sentence><![CDATA[]]></sentence>  <summary><![CDATA[<p>The National Science Foundation (NSF) awarded funding for four robotics projects with applications in various innovative areas.&nbsp;</p>]]></summary>  <dateline>2013-09-11T00:00:00-04:00</dateline>  <iso_dateline>2013-09-11T00:00:00-04:00</iso_dateline>  <gmt_dateline>2013-09-11 00:00:00</gmt_dateline>  <subtitle>    <![CDATA[]]>  </subtitle>  <sidebar><![CDATA[]]></sidebar>  <email><![CDATA[josie@gatech.edu]]></email>  <location></location>  <contact><![CDATA[<p>Josie Giles<br />RIM Communications Officer<br /><a href="mailto:josie@gatech.edu">josie@gatech.edu</a></p>]]></contact>  <boilerplate></boilerplate>  <boilerplate_text><![CDATA[]]></boilerplate_text>  <media>          <item>236701</item>          <item>236741</item>          <item>236771</item>          <item>236751</item>          <item>236761</item>      </media>  <hg_media>          <item>          <nid>236701</nid>          <type>image</type>          <title><![CDATA[Robotics @ Tech]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[robots-at-tech.png]]></image_name>            
<image_path><![CDATA[/sites/default/files/images/robots-at-tech_0.png]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/robots-at-tech_0.png]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/robots-at-tech_0.png?itok=N3ntyFcP]]></image_740>            <image_mime>image/png</image_mime>            <image_alt><![CDATA[Robotics @ Tech]]></image_alt>                    <created>1449243659</created>          <gmt_created>2015-12-04 15:40:59</gmt_created>          <changed>1475894911</changed>          <gmt_changed>2016-10-08 02:48:31</gmt_changed>      </item>          <item>          <nid>236741</nid>          <type>image</type>          <title><![CDATA[Andrea Thomaz]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[andrea_thomaz.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/andrea_thomaz_0.jpg]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/andrea_thomaz_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/andrea_thomaz_0.jpg?itok=8mytZeXt]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Andrea Thomaz]]></image_alt>                    <created>1449243659</created>          <gmt_created>2015-12-04 15:40:59</gmt_created>          <changed>1475894911</changed>          <gmt_changed>2016-10-08 02:48:31</gmt_changed>      </item>          <item>          <nid>236771</nid>          <type>image</type>          <title><![CDATA[Jun Ueda]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[jun_ueda_0.jpg]]></image_name>            
<image_path><![CDATA[/sites/default/files/images/jun_ueda_0_0.jpg]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/jun_ueda_0_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/jun_ueda_0_0.jpg?itok=3H9Gw6-L]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Jun Ueda]]></image_alt>                    <created>1449243659</created>          <gmt_created>2015-12-04 15:40:59</gmt_created>          <changed>1475894911</changed>          <gmt_changed>2016-10-08 02:48:31</gmt_changed>      </item>          <item>          <nid>236751</nid>          <type>image</type>          <title><![CDATA[Ronald Arkin]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[ron_arkin.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/ron_arkin_0.jpg]]></image_path>            <image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/ron_arkin_0.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/ron_arkin_0.jpg?itok=wadMpWJf]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Ronald Arkin]]></image_alt>                    <created>1449243659</created>          <gmt_created>2015-12-04 15:40:59</gmt_created>          <changed>1475894911</changed>          <gmt_changed>2016-10-08 02:48:31</gmt_changed>      </item>          <item>          <nid>236761</nid>          <type>image</type>          <title><![CDATA[Fumin Zhang]]></title>          <body><![CDATA[]]></body>                      <image_name><![CDATA[fumin_zhang.jpg]]></image_name>            <image_path><![CDATA[/sites/default/files/images/fumin_zhang.jpg]]></image_path>            
<image_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/fumin_zhang.jpg]]></image_full_path>            <image_740><![CDATA[http://www.tlwarc.hg.gatech.edu/sites/default/files/styles/740xx_scale/public/sites/default/files/images/fumin_zhang.jpg?itok=meGdvI-g]]></image_740>            <image_mime>image/jpeg</image_mime>            <image_alt><![CDATA[Fumin Zhang]]></image_alt>                    <created>1449243659</created>          <gmt_created>2015-12-04 15:40:59</gmt_created>          <changed>1475894911</changed>          <gmt_changed>2016-10-08 02:48:31</gmt_changed>      </item>      </hg_media>  <related>          <link>        <url><![CDATA[http://gatech.edu/research/areas/robotics]]></url>        <title><![CDATA[Robotics Research @ Tech]]></title>      </link>          <link>        <url><![CDATA[http://robotics.gatech.edu/]]></url>        <title><![CDATA[Center for Robotics & Intelligent Machines]]></title>      </link>          <link>        <url><![CDATA[http://www.whitehouse.gov/blog/2011/06/24/developing-next-generation-robots]]></url>        <title><![CDATA[National Robotics Initiative]]></title>      </link>      </related>  <files>      </files>  <groups>          <group id="142761"><![CDATA[IRIM]]></group>      </groups>  <categories>          <category tid="135"><![CDATA[Research]]></category>          <category tid="152"><![CDATA[Robotics]]></category>      </categories>  <news_terms>          <term tid="135"><![CDATA[Research]]></term>          <term tid="152"><![CDATA[Robotics]]></term>      </news_terms>  <keywords>          <keyword tid="11526"><![CDATA[Andrea Thomaz]]></keyword>          <keyword tid="7045"><![CDATA[Fumin Zhang]]></keyword>          <keyword tid="13887"><![CDATA[Jun Ueda]]></keyword>          <keyword tid="11039"><![CDATA[Karen Feigh]]></keyword>          <keyword tid="13888"><![CDATA[Minoru Shinohara]]></keyword>          <keyword tid="362"><![CDATA[National Science Foundation]]></keyword>     
     <keyword tid="363"><![CDATA[NSF]]></keyword>          <keyword tid="667"><![CDATA[robotics]]></keyword>          <keyword tid="11106"><![CDATA[Ronald Arkin]]></keyword>      </keywords>  <core_research_areas>          <term tid="39521"><![CDATA[Robotics]]></term>      </core_research_areas>  <news_room_topics>      </news_room_topics>  <userdata>      <![CDATA[]]>  </userdata></node></nodes>