<node id="70408">
  <nid>70408</nid>
  <type>news</type>
  <uid>
    <user id="27174"><![CDATA[27174]]></user>
  </uid>
  <created>1317308368</created>
  <changed>1475896214</changed>
  <title><![CDATA[Georgia Tech Researchers Receive Three NSF Emerging Frontiers Awards]]></title>
  <body><![CDATA[<p>The National Science Foundation (NSF) has awarded $6 million to fund 
three projects involving researchers from the Georgia Institute of 
Technology. Each four-year, $2 million grant was awarded through the 
NSF's Division of Emerging Frontiers in Research and Innovation (EFRI). 
</p>
<p>"The EFRI research teams will probe some profound aspects of the 
interface of biology and engineering," said Sohi Rastegar, director of 
EFRI. "If they are successful, the principles and theories uncovered in 
their investigations could unlock many technological opportunities."</p>
<p>This year, 14 transformative, fundamental research projects were 
awarded EFRI grants in two emerging areas: technologies that build on 
understanding of biological signaling, and machines that can interact 
and cooperate with humans. 
</p>
<p>The three Georgia Tech projects include:</p>
<ul>
<li>Developing a "therapeutic robot" to help rehabilitate and improve motor skills in people with mobility problems;</li>
<li>Creating wearable sensors that allow blind people to "see" with their hands, bodies or faces;</li>
<li>Generating and rigorously testing quantitative models that describe spatial and temporal regulation of cell differentiation in tissues.</li>
</ul>
<p>The therapeutic robot could enhance, assist and improve motor skills 
in humans with varying motor capabilities and deficits. The goal of the 
project is to program a humanoid rehabilitation robot to perform a 
"partnered box step," which is a defined pattern of weight shifts and 
directional changes, solely based on interpreting movement cues from 
subtle changes in forces between the hands and arms of the robot and the
 person. 
</p>
<p>To do this, researchers at Georgia Tech and Emory University will 
study how humans use their muscles to walk, balance and generate force 
signals with the hands for guidance when moving in cooperation with 
another person. They will also study "rehabilitative partnered dance," 
which has been specifically adapted to help improve gait and balance in 
individuals with motor impairments. 
</p>
<p>"Our vision is to develop robots that will interact with humans as 
both assistants and movement therapists," explained principal 
investigator Lena Ting, an associate professor in the Wallace H. Coulter
 Department of Biomedical Engineering at Georgia Tech and Emory 
University. "We expect our project to have a long-term impact on quality
 of life of individuals with movement difficulties, such as those caused
 by Parkinson's disease, stroke and injury, by improving fitness, motor 
skills and social engagement."
</p>
<p>Working with Ting on the project are Emory University School of 
Medicine (geriatrics) assistant professor Madeleine Hackney, Coulter 
Department of Biomedical Engineering assistant professor Charlie Kemp 
and Georgia Tech School of Interactive Computing assistant professor 
Karen Liu.
</p>
<p>For the second project, researchers at Georgia Tech and The City 
College of New York will investigate devices for "alternative 
perception" and the principles underlying the human-machine interaction.
 Alternative perception combines electronics and the other senses to 
emulate vision. In addition to aiding the visually impaired, the 
findings are expected to have other applications, such as the 
development of intelligent robots. 
</p>
<p>The researchers plan to untangle how humans learn to coordinate input
 from their senses -- e.g. vision, touch -- with movements, like 
reaching for a glass or moving through a crowded room. They will then 
map out how machines, such as robots and computers, learn similar tasks,
 to model devices that can assist humans. 
</p>
<p>The team envisions a multifunctional array of sensors on the body and
 has already developed prototypes for some of the devices. The full 
complement of wearable sensors would help a sightless person navigate by
 conveying information about his or her surroundings. 
</p>
<p>The researchers hope their findings on perception, and the prototypes
 they develop, will spawn a raft of wearable electronic devices to help 
blind people "see" their environment at a distance through touch, 
hearing and other senses. The technology would also benefit sighted 
individuals who must navigate in poor visibility, such as firefighters 
and pilots.
</p>
<p>Principal investigator Zhigang Zhu, professor of computer science and
 computer engineering in City College's Grove School of Engineering, 
will collaborate with City College professor of psychology and director 
of the Program in Cognitive Neuroscience Tony Ro, City College professor
 of electrical engineering Ying Li Tian, Georgia Tech Woodruff School of
 Mechanical Engineering professor Kok-Meng Lee, and Georgia Tech School 
of Applied Physiology associate professor Boris Prilutsky.</p>
<p>The third project will address a fundamental question of 
developmental biology: what controls the spatial and temporal patterns 
of cell differentiation? Answering this question will lead to a better 
understanding of the basic principles of embryogenesis, explain origins 
of developmental disorders, and provide guidelines for tissue 
engineering and regenerative medicine. 
</p>
<p>The research will be conducted by principal investigator and 
Princeton University Department of Chemical and Biological Engineering 
associate professor Stanislav Shvartsman, Georgia Tech School of 
Chemical and Biomolecular Engineering associate professor Hang Lu, New 
York University Department of Biology professor Christine Rushlow, and 
University of Illinois at Urbana Champaign Department of Computer 
Science associate professor Saurabh Sinha.
</p>
<p>Scientists know that among an embryo's first major developments is 
the establishment of its dorsoventral axis, which runs from its back to 
its belly. The researchers plan to study how this axis development 
unfolds -- specifically the presence and location of proteins during the
 process, which give rise to muscle, nerve and skin tissues. 
</p>

<p>To enable large-scale quantitative analyses of protein positional 
information along the dorsoventral axis, Lu and Shvartsman will further 
develop a microfluidic device they previously designed to reliably and 
robustly orient several hundred embryos in just a few minutes.</p>
<p>"By understanding this system at a deeper, quantitative level, we 
will elucidate general principles underlying the operation of genetic 
and multicellular networks that drive development," said Lu.
</p>
<p><strong>Research News &amp; Publications Office<br />
Georgia Institute of Technology<br />
75 Fifth Street, N.W., Suite 314<br />
Atlanta, Georgia 30308 USA</strong>
</p>
<p><strong>Media Relations Contacts:</strong> Abby Robinson (abby@innovate.gatech.edu; 404-385-3364) or John Toon (jtoon@gatech.edu; 404-894-6986)
</p>]]></body>
  <field_subtitle>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_subtitle>
  <field_dateline>
    <item>
      <value>2011-09-29T00:00:00-04:00</value>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_dateline>
  <field_summary_sentence>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[<p>The National Science Foundation has awarded $6 million through its Division of Emerging Frontiers in Research and Innovation to fund three projects involving researchers from Georgia Tech, including Karen Liu and Charlie Kemp (<em>Interactive Computing</em>). <em>Source: GT Research News</em></p>]]></value>
    </item>
  </field_summary>
  <field_media>
          <item>
        <nid>
          <node id="70367">
            <nid>70367</nid>
            <type>image</type>
            <title><![CDATA[(L-R) Lena Ting, Karen Liu, Charlie Kemp and Madeleine Hackney]]></title>
            <body><![CDATA[]]></body>
                          <field_image>
                <item>
                  <fid>192951</fid>
                  <filename><![CDATA[tinggroup195.jpg]]></filename>
                  <filepath><![CDATA[/sites/default/files/images/tinggroup195_0.jpg]]></filepath>
                  <file_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/tinggroup195_0.jpg]]></file_full_path>
                  <filemime>image/jpeg</filemime>
                  <image_740><![CDATA[]]></image_740>
                  <image_alt><![CDATA[(L-R) Lena Ting, Karen Liu, Charlie Kemp and Madeleine Hackney]]></image_alt>
                </item>
              </field_image>
            
                      </node>
        </nid>
      </item>
      </field_media>
  <field_contact_email>
    <item>
      <email><![CDATA[]]></email>
    </item>
  </field_contact_email>
  <field_location>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_location>
  <field_contact>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_contact>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <!--  TO DO: correct to not conflate categories and news room topics  -->
  <field_categories>
      </field_categories>
  <core_research_areas>
      </core_research_areas>
  <field_news_room_topics>
      </field_news_room_topics>
  <links_related>
      </links_related>
  <files>
      </files>
  <og_groups>
          <item>47223</item>
      </og_groups>
  <og_groups_both>
          <item><![CDATA[College of Computing]]></item>
      </og_groups_both>
  <field_keywords>
          <item>
        <tid>1102</tid>
        <value><![CDATA[blind]]></value>
      </item>
          <item>
        <tid>14478</tid>
        <value><![CDATA[Boris Prilutsky]]></value>
      </item>
          <item>
        <tid>14480</tid>
        <value><![CDATA[cell differentiation]]></value>
      </item>
          <item>
        <tid>2157</tid>
        <value><![CDATA[Charlie Kemp]]></value>
      </item>
          <item>
        <tid>654</tid>
        <value><![CDATA[College of Computing]]></value>
      </item>
          <item>
        <tid>594</tid>
        <value><![CDATA[college of engineering]]></value>
      </item>
          <item>
        <tid>11533</tid>
        <value><![CDATA[Department of Biomedical Engineering]]></value>
      </item>
          <item>
        <tid>898</tid>
        <value><![CDATA[Hang Lu]]></value>
      </item>
          <item>
        <tid>2296</tid>
        <value><![CDATA[Karen Liu]]></value>
      </item>
          <item>
        <tid>14477</tid>
        <value><![CDATA[Kok-Meng Lee]]></value>
      </item>
          <item>
        <tid>2266</tid>
        <value><![CDATA[Lena Ting]]></value>
      </item>
          <item>
        <tid>7341</tid>
        <value><![CDATA[microfluidic]]></value>
      </item>
          <item>
        <tid>1482</tid>
        <value><![CDATA[mobility]]></value>
      </item>
          <item>
        <tid>1356</tid>
        <value><![CDATA[robot]]></value>
      </item>
          <item>
        <tid>167863</tid>
        <value><![CDATA[School of Applied Physiology]]></value>
      </item>
          <item>
        <tid>167445</tid>
        <value><![CDATA[School of Chemical and Biomolecular Engineering]]></value>
      </item>
          <item>
        <tid>166848</tid>
        <value><![CDATA[School of Interactive Computing]]></value>
      </item>
          <item>
        <tid>167377</tid>
        <value><![CDATA[School of Mechanical Engineering]]></value>
      </item>
          <item>
        <tid>167318</tid>
        <value><![CDATA[sensor]]></value>
      </item>
          <item>
        <tid>14479</tid>
        <value><![CDATA[therapeutic robot]]></value>
      </item>
      </field_keywords>
  <field_userdata>
      <![CDATA[]]>
  </field_userdata>
</node>
