<node id="625118">
  <nid>625118</nid>
  <type>event</type>
  <uid>
    <user id="34963"><![CDATA[34963]]></user>
  </uid>
  <created>1566753865</created>
  <changed>1567083046</changed>
  <title><![CDATA[TRIAD Lecture Series by Yuxin Chen from Princeton (2/5)]]></title>
<body><![CDATA[<p>This is one of a series of talks given by Professor Chen. The full list of his talks is as follows:<br />
Wednesday, August 28, 2019; 11:00 am - 12:00 pm; Groseclose 402<br />
Thursday, August 29, 2019; 11:00 am - 12:00 pm; Groseclose 402<br />
Tuesday, September 3, 2019; 11:00 am - 12:00 pm; Main - Executive Education Room 228<br />
Wednesday, September 4, 2019; 11:00 am - 12:00 pm; Main - Executive Education Room 228<br />
Thursday, September 5, 2019; 11:00 am - 12:00 pm; Groseclose 402</p>

<p>Check https://triad.gatech.edu/events for more information.<br />
For location information, please check https://isye.gatech.edu/about/maps-directions/isye-building-complex</p>

<p>Title of this talk: Random initialization and implicit regularization in nonconvex statistical estimation</p>

<p>Abstract: Recent years have seen a flurry of activity in designing provably efficient nonconvex procedures for solving statistical estimation/learning problems. Due to the highly nonconvex nature of the empirical loss, state-of-the-art procedures often require suitable initialization and proper regularization (e.g., trimming, regularized cost, projection) in order to guarantee fast convergence. For vanilla procedures such as gradient descent, however, the prior theory is often either far from optimal or completely lacks theoretical guarantees.</p>

<p>This talk is concerned with a striking phenomenon arising in two nonconvex problems (namely, phase retrieval and matrix completion): even in the absence of careful initialization, proper saddle escaping, and/or explicit regularization, gradient descent converges to the optimal solution within a logarithmic number of iterations, thus achieving near-optimal statistical and computational guarantees at once. All of this is achieved by exploiting the statistical models in analyzing optimization algorithms, via a leave-one-out approach that enables the decoupling of certain statistical dependencies between the gradient descent iterates and the data. As a byproduct, for noisy matrix completion, we demonstrate that gradient descent achieves near-optimal entrywise error control.</p>

<p>This is joint work with Cong Ma, Kaizheng Wang, Yuejie Chi, and Jianqing Fan.</p>
]]></body>
  <field_summary_sentence>
    <item>
      <value><![CDATA[This is one of a series of talks given by Professor Chen.]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[<p>This is one of a series of talks given by Professor Chen. The full list of his talks is as follows:<br />
Wednesday, August 28, 2019; 11:00 am - 12:00 pm; Groseclose 402<br />
Thursday, August 29, 2019; 11:00 am - 12:00 pm; Groseclose 402<br />
Tuesday, September 3, 2019; 11:00 am - 12:00 pm; Main - Executive Education Room 228<br />
Wednesday, September 4, 2019; 11:00 am - 12:00 pm; Main - Executive Education Room 228<br />
Thursday, September 5, 2019; 11:00 am - 12:00 pm; Groseclose 402</p>

<p>Check https://triad.gatech.edu/events for more information.</p>
]]></value>
    </item>
  </field_summary>
  <field_time>
    <item>
      <value><![CDATA[2019-08-29T12:00:00-04:00]]></value>
      <value2><![CDATA[2019-08-29T13:00:00-04:00]]></value2>
      <rrule><![CDATA[]]></rrule>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_time>
  <field_fee>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_fee>
  <field_extras>
      </field_extras>
  <field_audience>
          <item>
        <value><![CDATA[Faculty/Staff]]></value>
      </item>
          <item>
        <value><![CDATA[Postdoc]]></value>
      </item>
          <item>
        <value><![CDATA[Public]]></value>
      </item>
          <item>
        <value><![CDATA[Graduate students]]></value>
      </item>
      </field_audience>
  <field_media>
      </field_media>
  <field_contact>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_contact>
  <field_location>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_location>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_phone>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_phone>
  <field_url>
    <item>
      <url><![CDATA[https://triad.gatech.edu/events]]></url>
      <title><![CDATA[Transdisciplinary Research Institute for Advancing Data Science]]></title>
            <attributes><![CDATA[]]></attributes>
    </item>
  </field_url>
  <field_email>
    <item>
      <email><![CDATA[]]></email>
    </item>
  </field_email>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <links_related>
          <item>
        <url>http://www.princeton.edu/~yc5/slides/random_init_slides.pdf</url>
        <link_title><![CDATA[Talk Slides at Speaker's web site]]></link_title>
      </item>
      </links_related>
  <files>
      </files>
  <og_groups>
          <item>602673</item>
      </og_groups>
  <og_groups_both>
          <item><![CDATA[TRIAD ]]></item>
      </og_groups_both>
  <field_categories>
          <item>
        <tid>1795</tid>
        <value><![CDATA[Seminar/Lecture/Colloquium]]></value>
      </item>
      </field_categories>
  <field_keywords>
          <item>
        <tid>92811</tid>
        <value><![CDATA[data science]]></value>
      </item>
      </field_keywords>
  <userdata><![CDATA[]]></userdata>
</node>
