<node id="587515">
  <nid>587515</nid>
  <type>event</type>
  <uid>
    <user id="32895"><![CDATA[32895]]></user>
  </uid>
  <created>1487255117</created>
  <changed>1492117968</changed>
  <title><![CDATA[Theoretical Neuroscience Day]]></title>
  <body><![CDATA[<p align="center"><strong><a href="http://arc.gatech.edu">Algorithms &amp; Randomness Center (ARC)</a> and </strong></p>

<p align="center"><strong><a href="http://neuro.gatech.edu/neural-engineering-center">GT Neural Engineering Center</a> present:</strong></p>

<p align="center"><strong>Theoretical Neuroscience Day</strong></p>

<p align="center"><strong>Wednesday, March 15, 2017</strong></p>

<p align="center"><strong>Marcus Nanotechnology Building 1116-1118, 2-5pm</strong></p>

<p>2pm:&nbsp;<a href="http://www.arc.gatech.edu/hg/item/587514">Distinguished Lecture by&nbsp;<strong><em>Bruno Olshausen</em></strong></a>&nbsp;(UC Berkeley)<br />
&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;Director,&nbsp;<a href="http://redwood.berkeley.edu/bruno/">Redwood Center for Theoretical Neuroscience</a><br />
&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;Helen Wills Neuroscience Institute and School of Optometry, UC Berkeley<br />
&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;Title:&nbsp;<a href="http://arc.gatech.edu/node/160#Bruno"><em>Neural computations for active perception</em></a></p>

<p>3:00-3:15pm: Coffee break</p>

<p>3:15pm:&nbsp;<a href="http://siplab.gatech.edu/rozell.html"><strong><em>Chris Rozell</em></strong></a>&nbsp;(GT ECE)<br />
&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;Title:&nbsp;<a href="http://arc.gatech.edu/node/160#Chris"><em>Optimal sensory coding theories for neural systems under biophysical constraints</em></a></p>

<p>3:45pm:&nbsp;<strong><a href="http://www.cc.gatech.edu/~vempala/"><em>Santosh Vempala</em></a></strong>&nbsp;(GT CS)<br />
&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;Title:&nbsp;<a href="http://arc.gatech.edu/node/160#Santosh"><em>A Computer Science View of the Brain</em></a></p>

<p>4:15pm: Reception</p>

<p><strong>Talk Abstracts:</strong></p>

<p><a id="Bruno" name="Bruno"></a>Bruno Olshausen<br />
<strong>Title:</strong>&nbsp;<em>Neural computations for active perception</em><br />
<strong>Abstract:</strong>&nbsp;<br />
The human visual system does not passively view the world, but actively moves its sensor array through eye, head, and body movements. How do neural circuits in the brain control and exploit these movements in order to build a scene representation that can guide useful behavior? Here we focus on three aspects of this problem: 1) how do we see in the presence of fixational eye movements? 2) what is the optimal spatial layout of the image sampling array for a visual system that must search via eye movements? and 3) how is information integrated across multiple fixations in order to form a holistic scene representation that allows for visual reasoning about compositional structure? We address these questions by optimizing model neural systems to perform active vision tasks. These model systems in turn provide us with new ways to think about structures found in biology, and they point to new experiments that explore the neural mechanisms enabling active vision.</p>

<p><a id="Chris" name="Chris"></a>Chris Rozell<br />
<strong>Title:</strong>&nbsp;<em>Optimal sensory coding theories for neural systems under biophysical constraints</em><br />
<strong>Abstract:</strong>&nbsp;<br />
The natural stimuli that biological vision must use to understand the world are extremely complex. Recent advances in machine learning have shown that low-dimensional geometric models (e.g., sparsity, manifolds) can capture much of the structure in complex natural images. I will describe our work building efficient neural coding models that optimally exploit this structure. These models incorporate the constraints of biophysical systems and the physical world by drawing on mathematical tools such as dynamical systems, optimization, unsupervised learning, randomized dimensionality reduction, and manifold learning. The results show that incorporating natural constraints can lead to theoretical models that account for a wide range of observed phenomena, including complex response properties of individual neurons, architectural features of the network (e.g., the makeup of different cell types), and reported perceptual results from human psychophysical experiments.</p>

<p><a id="Santosh" name="Santosh"></a>Santosh Vempala<br />
<strong>Title:</strong>&nbsp;<em>A Computer Science View of the Brain</em><br />
<strong>Abstract:</strong>&nbsp;<br />
Computational perspectives on scientific phenomena have often proven remarkably insightful. Rapid advances in computational neuroscience and the resulting plethora of data and models highlight the lack of an overarching theory for how the brain accomplishes perception and cognition (the mind). Taking the view that the answer must surely have a computational component, we present a few approachable questions for computer scientists, along with some recent work (with Christos Papadimitriou, Samantha Petti, and Wolfgang Maass) on mechanisms for the formation of memories, the creation of associations between memories, and the benefits of such associations.</p>

]]></body>
  <field_summary_sentence>
    <item>
      <value><![CDATA[Wednesday, March 15, 2017, 2-5pm, Marcus Nanotechnology Building 1116-1118]]></value>
    </item>
  </field_summary_sentence>
  <field_summary>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_summary>
  <field_time>
    <item>
      <value><![CDATA[2017-03-15T14:00:00-04:00]]></value>
      <value2><![CDATA[2017-03-15T17:00:00-04:00]]></value2>
      <rrule><![CDATA[]]></rrule>
      <timezone><![CDATA[America/New_York]]></timezone>
    </item>
  </field_time>
  <field_fee>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_fee>
  <field_extras>
      </field_extras>
  <field_audience>
          <item>
        <value><![CDATA[Faculty/Staff]]></value>
      </item>
          <item>
        <value><![CDATA[Public]]></value>
      </item>
          <item>
        <value><![CDATA[Undergraduate students]]></value>
      </item>
          <item>
        <value><![CDATA[Graduate students]]></value>
      </item>
      </field_audience>
  <field_media>
      </field_media>
  <field_contact>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_contact>
  <field_location>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_location>
  <field_sidebar>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_sidebar>
  <field_phone>
    <item>
      <value><![CDATA[]]></value>
    </item>
  </field_phone>
  <field_url>
    <item>
      <url><![CDATA[]]></url>
      <title><![CDATA[]]></title>
            <attributes><![CDATA[]]></attributes>
    </item>
  </field_url>
  <field_email>
    <item>
      <email><![CDATA[]]></email>
    </item>
  </field_email>
  <field_boilerplate>
    <item>
      <nid><![CDATA[]]></nid>
    </item>
  </field_boilerplate>
  <links_related>
      </links_related>
  <files>
      </files>
  <og_groups>
          <item>70263</item>
      </og_groups>
  <og_groups_both>
          <item><![CDATA[ARC]]></item>
      </og_groups_both>
  <field_categories>
      </field_categories>
  <field_keywords>
      </field_keywords>
  <userdata><![CDATA[]]></userdata>
</node>
