<node id="257451">
  <nid>257451</nid>
  <type>news</type>
  <uid>
    <user id="27924"><![CDATA[27924]]></user>
  </uid>
  <created>1385474303</created>
  <changed>1475896525</changed>
  <title><![CDATA[TechDebate on Lethal Autonomous Robots on YouTube]]></title>
  <body><![CDATA[<p><strong>TechDebate on Lethal Autonomous Robots published on <a href="http://youtu.be/nO1oFKc_-4A" target="_blank">YouTube</a>:</strong><br /> <a href="http://youtu.be/nO1oFKc_-4A" target="_blank">http://youtu.be/nO1oFKc_-4A</a><br /> <br /> Lethal Autonomous Robots, or “LARs” for short, are machines that can decide to take human life. Such a technology has the potential to revolutionize modern warfare and more. Opponents call LARs “killer robots” because they are deadly, or “lethal.” They are “autonomous” because they “can select and engage targets without further intervention by a human operator,” based on the data they process on the battlefield and on the algorithms that guide their behavior. Understanding LARs is essential to deciding whether their development and possible deployment should be regulated or banned. This TechDebate centers on the question: Are LARs ethical?<br /> <br /> <strong>Debaters:</strong><br /> <a href="http://www.cc.gatech.edu/aimosaic/faculty/arkin/" target="_blank">Ron Arkin</a>, Robotics Professor at Georgia Tech's College of Computing<br /> <a href="http://profiles.arts.monash.edu.au/rob-sparrow/" target="_blank">Rob Sparrow</a>, Philosophy Professor at Monash University in Australia and one of the founding members of the International Committee for Robot Arms-Control (icrac.net).<br /> <br /> The TechDebate took place on November 18, 2013.<br /> <br /> <strong>TechDebates on Emerging Technologies, presented by the <a href="http://www.ethics.gatech.edu/" target="_blank">Center for Ethics and Technology (CET)</a></strong><br /> The TechDebates pursue one goal: to stimulate reflection and deliberation on emerging technologies. What is the purpose of these technologies? What are the risks and ethical concerns? How will they change society and what it means to be human? We ask experts to help us, from their varying points of view, navigate the complexity of deliberations that are still in their infancy.<br /> <br /> Each TechDebate is a live event both on the Georgia Tech campus and on the Internet. In both settings, the audience can participate in a question-and-answer session.<br /> <br /> <strong>Ongoing public deliberation</strong><br /> We invite you to participate in public deliberation on emerging technologies in the <a href="http://agora.gatech.edu/release/English.html" target="_blank">AGORA-net</a>, a web-based, collaborative argument visualization tool. In the AGORA-net, go to “Technology Assessment” and select the technology you are interested in. Add further arguments or objections to existing argument maps, or create your own map. If you are new to the system, watch the video available on the login page.</p>]]></body>
  <field_subtitle>
    <item>
      <value><![CDATA[Military Robotics and Ethics]]></value>
    </item>
  </field_subtitle>
  <field_dateline>
    <item>
      <value>2013-11-26T00:00:00-05:00</value>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_dateline>
  <field_summary_sentence>
    <item>
      <value><![CDATA[The TechDebates pursue one goal: to stimulate reflection and deliberation on emerging technologies.]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[<p>Lethal Autonomous Robots, or “LARs” for short, are machines that can decide to take human life. Such a technology has the potential to revolutionize modern warfare and more. Opponents call LARs “killer robots” because they are deadly, or “lethal.” They are “autonomous” because they “can select and engage targets without further intervention by a human operator,” based on the data they process on the battlefield and on the algorithms that guide their behavior. Understanding LARs is essential to deciding whether their development and possible deployment should be regulated or banned. This TechDebate centers on the question: Are LARs ethical?</p>]]></value>
    </item>
  </field_summary>
  <field_media> </field_media>
  <field_contact_email>
    <item>
      <email><![CDATA[clark.bonilla@pubpolicy.gatech.edu]]></email>
    </item>
  </field_contact_email>
  <field_location>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_location>
  <field_contact>
    <item>
      <value><![CDATA[<p>Clark R. Bonilla, Director, Alumni and Career Services</p><p>School of Public Policy</p><p>Office Phone: 404-385-7220</p>]]></value>
    </item>
  </field_contact>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <field_categories>
      </field_categories>
  <core_research_areas>
          <term tid="39481"><![CDATA[National Security]]></term>
          <term tid="39521"><![CDATA[Robotics]]></term>
      </core_research_areas>
  <field_news_room_topics>
      </field_news_room_topics>
  <links_related>
      </links_related>
  <files>
      </files>
  <og_groups>
          <item>1289</item>
      </og_groups>
  <og_groups_both>
          <item><![CDATA[School of Public Policy]]></item>
      </og_groups_both>
  <field_keywords>
          <item>
        <tid>1496</tid>
        <value><![CDATA[Ethics]]></value>
      </item>
          <item>
        <tid>80941</tid>
        <value><![CDATA[military warfare]]></value>
      </item>
          <item>
        <tid>667</tid>
        <value><![CDATA[robotics]]></value>
      </item>
          <item>
        <tid>2352</tid>
        <value><![CDATA[robots]]></value>
      </item>
      </field_keywords>
  <field_userdata>
      <![CDATA[]]>
  </field_userdata>
</node>
