{"670885":{"#nid":"670885","#data":{"type":"event","title":"ISYE Statistics Seminar - Bodhisattva Sen","body":[{"value":"\u003Cp\u003E\u003Cstrong\u003ETitle:\u003C\/strong\u003E\u0026nbsp;Extending the Scope of Nonparametric Empirical Bayes\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cstrong\u003EAbstract:\u0026nbsp;\u003C\/strong\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003EIn this talk we will describe two applications of empirical Bayes (EB) methodology. EB procedures estimate the prior probability distribution (in a Bayesian statistical model) from the data. In the first part we study the (Gaussian) signal-plus-noise model with multivariate, heteroscedastic errors. This model arises in many large-scale denoising problems (e.g., in astronomy). We consider the nonparametric maximum likelihood estimator (NPMLE) in this setting. We study the characterization, uniqueness, and computation of the NPMLE, which estimates the unknown (arbitrary) prior by solving an infinite-dimensional convex optimization problem. The EB posterior means based on the NPMLE have low regret, meaning they closely target the oracle posterior means one would compute with the true prior in hand. We demonstrate the adaptive and near-optimal properties of the NPMLE for density estimation, denoising, and deconvolution.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EIn the second half of the talk, we consider the problem of Bayesian high-dimensional regression where the regression coefficients are drawn i.i.d. from an unknown prior. To estimate this prior distribution, we propose and study a \u0022variational empirical Bayes\u0022 approach, which combines EB inference with a variational approximation (VA). The idea is to approximate the intractable marginal log-likelihood of the response vector (also known as the \u0022evidence\u0022) by the evidence lower bound (ELBO) obtained from a naive mean field (NMF) approximation. 
We then maximize this lower bound over a suitable class of prior distributions in a computationally feasible way. We show that the marginal log-likelihood function can be (uniformly) approximated by its mean field counterpart. More importantly, under suitable conditions, we establish that this strategy leads to consistent approximation of the true posterior and provides asymptotically valid posterior inference for the regression coefficients.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cstrong\u003EBio:\u0026nbsp;\u003C\/strong\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003EBodhi Sen is a Professor of Statistics at Columbia University, New York. He completed his Ph.D. in Statistics at the University of Michigan, Ann Arbor, in 2008. Prior to that, he was a student at the Indian Statistical Institute, Kolkata, where he received his Bachelor's (2002) and Master's (2004) degrees in Statistics. His core statistical research centers on nonparametrics: function estimation (with special emphasis on shape-constrained estimation), the theory of optimal transport and its applications to statistics, empirical Bayes procedures, kernel methods, and likelihood- and bootstrap-based inference. He is also actively involved in interdisciplinary research, especially in astronomy.\u003Cbr \/\u003E\r\n\u003Cbr \/\u003E\r\nHis honors include the NSF CAREER Award (2012) and the Young Statistical Scientist Award (YSSA) in the Theory and Methods category from the International Indian Statistical Association (IISA). 
He is an elected fellow of the Institute of Mathematical Statistics (IMS).\u003C\/p\u003E\r\n","summary":"","format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003E\u003Cstrong\u003EAbstract:\u0026nbsp;\u003C\/strong\u003E\u003C\/p\u003E\r\n\r\n\u003Cp\u003EIn this talk we will describe two applications of empirical Bayes (EB) methodology. EB procedures estimate the prior probability distribution (in a Bayesian statistical model) from the data. In the first part we study the (Gaussian) signal-plus-noise model with multivariate, heteroscedastic errors. This model arises in many large-scale denoising problems (e.g., in astronomy). We consider the nonparametric maximum likelihood estimator (NPMLE) in this setting. We study the characterization, uniqueness, and computation of the NPMLE, which estimates the unknown (arbitrary) prior by solving an infinite-dimensional convex optimization problem. The EB posterior means based on the NPMLE have low regret, meaning they closely target the oracle posterior means one would compute with the true prior in hand. We demonstrate the adaptive and near-optimal properties of the NPMLE for density estimation, denoising, and deconvolution.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EIn the second half of the talk, we consider the problem of Bayesian high-dimensional regression where the regression coefficients are drawn i.i.d. from an unknown prior. To estimate this prior distribution, we propose and study a \u0022variational empirical Bayes\u0022 approach, which combines EB inference with a variational approximation (VA). The idea is to approximate the intractable marginal log-likelihood of the response vector (also known as the \u0022evidence\u0022) by the evidence lower bound (ELBO) obtained from a naive mean field (NMF) approximation. We then maximize this lower bound over a suitable class of prior distributions in a computationally feasible way. 
We show that the marginal log-likelihood function can be (uniformly) approximated by its mean field counterpart. More importantly, under suitable conditions, we establish that this strategy leads to consistent approximation of the true posterior and provides asymptotically valid posterior inference for the regression coefficients.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\r\n","format":"limited_html"}],"field_summary_sentence":[{"value":"Extending the Scope of Nonparametric Empirical Bayes"}],"uid":"36433","created_gmt":"2023-11-03 20:29:52","changed_gmt":"2023-11-03 20:29:52","author":"mrussell89","boilerplate_text":"","field_publication":"","field_article_url":"","field_event_time":{"event_time_start":"2023-11-07T11:00:00-05:00","event_time_end":"2023-11-07T12:00:00-05:00","event_time_end_last":"2023-11-07T12:00:00-05:00","gmt_time_start":"2023-11-07 16:00:00","gmt_time_end":"2023-11-07 17:00:00","gmt_time_end_last":"2023-11-07 17:00:00","rrule":null,"timezone":"America\/New_York"},"location":"Groseclose 402","extras":["free_food"],"groups":[{"id":"1242","name":"School of Industrial and Systems Engineering (ISYE)"}],"categories":[],"keywords":[],"core_research_areas":[],"news_room_topics":[],"event_categories":[{"id":"1795","name":"Seminar\/Lecture\/Colloquium"}],"invited_audience":[{"id":"78761","name":"Faculty\/Staff"},{"id":"177814","name":"Postdoc"},{"id":"78771","name":"Public"},{"id":"174045","name":"Graduate students"},{"id":"78751","name":"Undergraduate students"}],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[],"email":[],"slides":[],"orientation":[],"userdata":""}}}