<node id="650228">
  <nid>650228</nid>
  <type>event</type>
  <uid>
    <user id="34868"><![CDATA[34868]]></user>
  </uid>
  <created>1630352570</created>
  <changed>1631041864</changed>
  <title><![CDATA[ISyE Seminar - Robert Nowak]]></title>
  <body><![CDATA[<p><strong>Title:</strong> What Kinds of Functions Do Neural Networks Learn?<br />
<br />
<strong>Abstract:</strong> Neural nets have made an amazing comeback during the past decade. Their empirical success has been truly phenomenal, but neural nets are poorly understood in a mathematical sense compared to classical methods like splines, kernels, and wavelets. This talk describes recent steps towards a mathematical theory of neural networks comparable to the foundations we have for classical nonparametric methods. Surprisingly, neural nets are minimax optimal in a wide variety of classical univariate function spaces, including those handled by splines and wavelets. In multivariate settings, neural nets are solutions to data-fitting problems cast in entirely new types of multivariate function spaces characterized through total variation (TV) measured in the Radon transform domain. And deep (multilayer) neural nets naturally represent compositions of functions in these Radon-BV (bounded variation) spaces. Remarkably, this theory provides novel explanations for many notable empirical discoveries in deep learning, including the benefits of &ldquo;skip connections&rdquo; and sparse and low-rank &ldquo;weight&rdquo; matrices. Radon-BV spaces set the stage for the nonparametric theory of neural nets.<br />
<br />
<strong>Bio:</strong> Rob holds the Nosbusch Professorship in Engineering at the University of Wisconsin-Madison. His research focuses on signal processing, machine learning, optimization, and statistics.</p>
]]></body>
  <field_summary_sentence>
    <item>
      <value><![CDATA[What Kinds of Functions Do Neural Networks Learn?]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[<p><strong>Abstract:</strong></p>

<p>Neural nets have made an amazing comeback during the past decade. Their empirical success has been truly phenomenal, but neural nets are poorly understood in a mathematical sense compared to classical methods like splines, kernels, and wavelets. This talk describes recent steps towards a mathematical theory of neural networks comparable to the foundations we have for classical nonparametric methods. Surprisingly, neural nets are minimax optimal in a wide variety of classical univariate function spaces, including those handled by splines and wavelets. In multivariate settings, neural nets are solutions to data-fitting problems cast in entirely new types of multivariate function spaces characterized through total variation (TV) measured in the Radon transform domain. And deep (multilayer) neural nets naturally represent compositions of functions in these Radon-BV (bounded variation) spaces. Remarkably, this theory provides novel explanations for many notable empirical discoveries in deep learning, including the benefits of &ldquo;skip connections&rdquo; and sparse and low-rank &ldquo;weight&rdquo; matrices. Radon-BV spaces set the stage for the nonparametric theory of neural nets.</p>
]]></value>
    </item>
  </field_summary>
  <field_time>
    <item>
      <value><![CDATA[2021-09-10T12:00:00-04:00]]></value>
      <value2><![CDATA[2021-09-10T13:00:00-04:00]]></value2>
      <rrule><![CDATA[]]></rrule>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_time>
  <field_fee>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_fee>
  <field_extras>
  </field_extras>
  <field_audience>
          <item>
        <value><![CDATA[Faculty/Staff]]></value>
      </item>
          <item>
        <value><![CDATA[Postdoc]]></value>
      </item>
          <item>
        <value><![CDATA[Public]]></value>
      </item>
          <item>
        <value><![CDATA[Graduate students]]></value>
      </item>
          <item>
        <value><![CDATA[Undergraduate students]]></value>
      </item>
      </field_audience>
  <field_media>
      </field_media>
  <field_contact>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_contact>
  <field_location>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_location>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_phone>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_phone>
  <field_url>
    <item>
      <url><![CDATA[https://bluejeans.com/594189735/7913]]></url>
      <title><![CDATA[Virtual Link]]></title>
            <attributes><![CDATA[]]></attributes>
    </item>
  </field_url>
  <field_email>
    <item>
      <email><![CDATA[]]></email>
    </item>
  </field_email>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <links_related>
      </links_related>
  <files>
      </files>
  <og_groups>
          <item>1242</item>
      </og_groups>
  <og_groups_both>
          <item><![CDATA[School of Industrial and Systems Engineering (ISYE)]]></item>
      </og_groups_both>
  <field_categories>
          <item>
        <tid>1795</tid>
        <value><![CDATA[Seminar/Lecture/Colloquium]]></value>
      </item>
      </field_categories>
  <field_keywords>
      </field_keywords>
  <userdata><![CDATA[]]></userdata>
</node>
