<node id="625377">
  <nid>625377</nid>
  <type>news</type>
  <uid>
    <user id="28750"><![CDATA[28750]]></user>
  </uid>
  <created>1567178593</created>
  <changed>1567178593</changed>
  <title><![CDATA[Gil Weinberg and Robotic Musicianship lab awarded new grant from National Science Foundation]]></title>
  <body><![CDATA[<p>Georgia Tech Center for Music Technology director Gil Weinberg and Ph.D. candidate Richard Savery were awarded a new grant totaling $669,912 from the National Science Foundation. The purpose of the grant is to research trust and emotions between robots and humans by way of non-verbal communication.</p>

<p>As co-robots become prevalent in homes, workplaces, and public spaces, they need to become trustworthy and socially believable agents if they are to be integrated into and accepted by society. The research will utilize the latest developments in Artificial Intelligence to gain knowledge about the role of non-linguistic expressions in trust building. Findings from studies of non-linguistic emotional expressions such as prosody and gestures in music - one of the most emotionally meaningful human experiences - will be implemented in a group of newly developed personal robots. User experiments will be conducted to explore humans&#39; reactions to - and trust building with - these prosody-driven robots.</p>

<p>Results of the study will lead to novel approaches for creating open and meaningful interactions between groups of humans and robots. The research will advance national prosperity by increasing engagement, relatability, and trust in large-scale human-robot interactive scenarios such as personal robots in private and public spaces, workplace training, education, and combat. The project takes an interdisciplinary approach that will address fields such as cognitive science, communication, and music, while leading to progress in both science and engineering.</p>

<p>For more details, see the NSF award page: <a href="https://www.nsf.gov/awardsearch/showAward?AWD_ID=1925178&amp;HistoricalAwards=false">https://www.nsf.gov/awardsearch/showAward?AWD_ID=1925178&amp;HistoricalAwards=false</a></p>
]]></body>
  <field_subtitle>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_subtitle>
  <field_dateline>
    <item>
      <value>2019-08-30T00:00:00-04:00</value>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_dateline>
  <field_summary_sentence>
    <item>
      <value><![CDATA[Gil Weinberg and Richard Savery awarded new grant from the National Science Foundation to research trust between robots and humans]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[<p>Georgia Tech Center for Music Technology director Gil Weinberg and Ph.D. candidate Richard Savery were awarded a new grant totaling $669,912 from the National Science Foundation. The purpose of the grant is to research trust and emotions between robots and humans by way of non-verbal communication.</p>
]]></value>
    </item>
  </field_summary>
  <field_media>
          <item>
        <nid>
          <node id="625376">
            <nid>625376</nid>
            <type>image</type>
            <title><![CDATA[Gil Weinberg and Richard Savery pose with a Shimi robot.]]></title>
            <body><![CDATA[]]></body>
                          <field_image>
                <item>
                  <fid>238092</fid>
                  <filename><![CDATA[GilRichard.jpg]]></filename>
                  <filepath><![CDATA[/sites/default/files/images/GilRichard.jpg]]></filepath>
                  <file_full_path><![CDATA[http://www.tlwarc.hg.gatech.edu//sites/default/files/images/GilRichard.jpg]]></file_full_path>
                  <filemime>image/jpeg</filemime>
                  <image_740><![CDATA[]]></image_740>
                  <image_alt><![CDATA[Gil Weinberg and Richard Savery pose with a Shimi robot, one of the robots created by the Robotic Musicianship group.]]></image_alt>
                </item>
              </field_image>
            
                      </node>
        </nid>
      </item>
      </field_media>
  <field_contact_email>
    <item>
      <email><![CDATA[]]></email>
    </item>
  </field_contact_email>
  <field_location>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_location>
  <field_contact>
    <item>
      <value><![CDATA[<p>Joshua Smith</p>

<p>joshua.smith@design.gatech.edu</p>

<p>404-385-5593</p>
]]></value>
    </item>
  </field_contact>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <!--  TO DO: correct to not conflate categories and news room topics  -->
  <links_related> </links_related>
  <files> </files>
  <og_groups>
          <item>1227</item>
      </og_groups>
  <og_groups_both>
          <item>
        <![CDATA[Music and Music Technology]]>
      </item>
          <item>
        <![CDATA[Robotics]]>
      </item>
      </og_groups_both>
  <field_categories>
          <item>
        <tid>148</tid>
        <value><![CDATA[Music and Music Technology]]></value>
      </item>
          <item>
        <tid>152</tid>
        <value><![CDATA[Robotics]]></value>
      </item>
      </field_categories>
  <core_research_areas>
          <term tid="39521"><![CDATA[Robotics]]></term>
      </core_research_areas>
  <field_news_room_topics>
      </field_news_room_topics>
  <og_groups_both>
          <item><![CDATA[School of Music]]></item>
      </og_groups_both>
  <field_keywords>
          <item>
        <tid>667</tid>
        <value><![CDATA[robotics]]></value>
      </item>
          <item>
        <tid>1936</tid>
        <value><![CDATA[Center for Music Technology]]></value>
      </item>
          <item>
        <tid>2556</tid>
        <value><![CDATA[artificial intelligence]]></value>
      </item>
      </field_keywords>
  <field_userdata>
      <![CDATA[]]>
  </field_userdata>
</node>
