{"619923":{"#nid":"619923","#data":{"type":"event","title":"ISyE Statistics Seminar - Andrew Brown","body":[{"value":"\u003Ch3\u003E\u003Cstrong\u003ETitle: \u003C\/strong\u003E\u003C\/h3\u003E\r\n\r\n\u003Cp\u003ELow Rank Independence Samplers in Hierarchical Bayesian Inverse Problems\u003Cbr \/\u003E\r\nAndrew Brown\u003Cbr \/\u003E\r\nSchool of Mathematical and Statistical Sciences\u003Cbr \/\u003E\r\nClemson University\u003C\/p\u003E\r\n\r\n\u003Ch3\u003E\u003Cstrong\u003EAbstract:\u003C\/strong\u003E\u003C\/h3\u003E\r\n\r\n\u003Cp\u003EIn Bayesian inverse problems, the posterior distribution is used to quantify uncertainty about the reconstructed solution. In fully Bayesian approaches in which prior parameters are assigned hyperpriors, Markov chain Monte Carlo (MCMC) algorithms are often used to approximate samples from the posterior. However, implementations of such algorithms can be computationally expensive. In this talk, I will present a computationally efficient scheme for sampling high-dimensional Gaussian distributions in ill-posed Bayesian linear inverse problems. The approach uses Metropolis-Hastings independence sampling with a proposal distribution based on a low-rank approximation of the prior-preconditioned Hessian. I will present results obtained when using the proposed approach with Metropolis-Hastings-within-Gibbs sampling in numerical experiments in image deblurring and computerized tomography. Time permitting, I will also briefly discuss applying the low-rank approximation idea in marginalization-based MCMC algorithms to improve the mixing behavior when compared to standard block Gibbs sampling.\u003C\/p\u003E\r\n\r\n\u003Ch3\u003E\u003Cstrong\u003EBio: \u003C\/strong\u003E\u003C\/h3\u003E\r\n\r\n\u003Cp\u003EAndrew Brown earned his B.S. in Applied Mathematics from Georgia Tech in 2006. 
After briefly working for Porsche Cars North America in Atlanta, he went to the University of Georgia to earn his MS and PhD in Statistics under the direction of Nicole Lazar and Gauri Datta. He then joined the School of Mathematical and Statistical Sciences at Clemson University, where he is currently an Assistant Professor. He spent Spring of 2016 as a Visiting Research Fellow at the Statistical and Applied Mathematical Sciences Institute. His research interests include high-dimensional Bayesian modeling and computation, neuroimaging data analysis (particularly functional and structural MRI), computer experiments, and uncertainty quantification.\u003C\/p\u003E\r\n","summary":null,"format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Ch3\u003E\u003Cstrong\u003EAbstract:\u003C\/strong\u003E\u003C\/h3\u003E\r\n\r\n\u003Cp\u003EIn Bayesian inverse problems, the posterior distribution is used to quantify uncertainty about the reconstructed solution. In fully Bayesian approaches in which prior parameters are assigned hyperpriors, Markov chain Monte Carlo (MCMC) algorithms are often used to approximate samples from the posterior. However, implementations of such algorithms can be computationally expensive. In this talk, I will present a computationally efficient scheme for sampling high-dimensional Gaussian distributions in ill-posed Bayesian linear inverse problems. The approach uses Metropolis-Hastings independence sampling with a proposal distribution based on a low-rank approximation of the prior-preconditioned Hessian. I will present results obtained when using the proposed approach with Metropolis-Hastings-within-Gibbs sampling in numerical experiments in image deblurring and computerized tomography. 
Time permitting, I will also briefly discuss applying the low-rank approximation idea in marginalization-based MCMC algorithms to improve the mixing behavior when compared to standard block Gibbs sampling.\u003C\/p\u003E\r\n","format":"limited_html"}],"field_summary_sentence":[{"value":"Low Rank Independence Samplers in Hierarchical Bayesian Inverse Problems"}],"uid":"34977","created_gmt":"2019-04-01 18:11:03","changed_gmt":"2019-04-01 18:11:43","author":"Julie Smith","boilerplate_text":"","field_publication":"","field_article_url":"","field_event_time":{"event_time_start":"2019-04-08T15:00:00-04:00","event_time_end":"2019-04-08T16:00:00-04:00","event_time_end_last":"2019-04-08T16:00:00-04:00","gmt_time_start":"2019-04-08 19:00:00","gmt_time_end":"2019-04-08 20:00:00","gmt_time_end_last":"2019-04-08 20:00:00","rrule":null,"timezone":"America\/New_York"},"extras":[],"groups":[{"id":"1242","name":"School of Industrial and Systems Engineering (ISYE)"}],"categories":[],"keywords":[],"core_research_areas":[],"news_room_topics":[],"event_categories":[{"id":"1795","name":"Seminar\/Lecture\/Colloquium"}],"invited_audience":[{"id":"78761","name":"Faculty\/Staff"},{"id":"177814","name":"Postdoc"},{"id":"78771","name":"Public"},{"id":"174045","name":"Graduate students"},{"id":"78751","name":"Undergraduate students"}],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[],"email":[],"slides":[],"orientation":[],"userdata":""}}}