Trends and Advances in Monte Carlo Sampling Algorithms

Location

This workshop was held at the Penn Pavilion (Level 2), Duke University, Durham, NC.

Description

This was the second workshop in the SAMSI Program on Quasi-Monte Carlo and High-Dimensional Sampling Methods for Applied Mathematics. It focused on Monte Carlo sampling methods, an important class of computational algorithms for estimating high-dimensional distributions. Monte Carlo sampling is widely used in physics, chemistry, mathematics, and statistics, and is most useful when other methods fail due to the high dimensionality of the problem. Because Monte Carlo sampling is applied so extensively across disciplines, breakthroughs in one discipline can lead to advances in others.
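To make the idea concrete, here is a minimal sketch (not taken from any workshop materials): plain Monte Carlo approximates a high-dimensional expectation by averaging a function over independent random draws, with an error that shrinks like 1/sqrt(n) regardless of the dimension. The dimension, test function, and sample size below are illustrative choices.

```python
import random

def mc_estimate(f, sample, n):
    """Plain Monte Carlo: average f over n i.i.d. draws from sample()."""
    return sum(f(sample()) for _ in range(n)) / n

d = 50                       # dimension (illustrative choice)
random.seed(0)               # fixed seed for reproducibility
sample = lambda: [random.gauss(0.0, 1.0) for _ in range(d)]
f = lambda x: sum(xi * xi for xi in x)   # E[f(X)] = d when X ~ N(0, I_d)

est = mc_estimate(f, sample, 2000)
print(est)  # close to 50; the 1/sqrt(n) error rate does not degrade with d
```

Deterministic quadrature in 50 dimensions would be infeasible here, which is the sense in which Monte Carlo succeeds "when other methods fail."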

This SAMSI workshop brought together experts from applied mathematics, statistics, and machine learning to exchange ideas and advance the broad area of sampling algorithms.


Schedule and Supporting Media

Printable Schedule
Printable Map
Participant List

Lecturers

Speakers for this event were:

Monday, December 11, 2017
Penn Pavilion, West Campus, Duke University

Introductions and Welcome – Ilse Ipsen, SAMSI Associate Director
Component-wise Markov Chain Monte Carlo – Galin Jones, University of Minnesota
Approximate MCMC in Theory and Practice – James Johndrow, Stanford University
Sparse Polynomial Approximation via Compressed Sensing of High Dimension Functions – Hoang Tran, Oak Ridge National Laboratory
Principled Variational Learning and Inference for Deep Generative Neural Networks – Larry Carin, Duke University
Stochastic Gradient MCMC for Independent and Correlated Data – Yian Ma, University of Washington
Parallel Markov Chain Monte Carlo – Scott Schmidler, Duke University

Tuesday, December 12, 2017

Multiscale Implementation of Infinite-Swap Replica Exchange Molecular Dynamics – Eric Vanden-Eijnden, New York University
Lecture to be Announced – Jonathan Mattingly, Duke University
Discontinuous Hamiltonian Monte Carlo for Sampling Discrete Parameters – Akihiko Nishimura, University of California, Los Angeles
Measuring Sample Quality with Kernels – Lester Mackey, Microsoft Research
A Stein Variational Approach for Deep Probabilistic Modeling – Qiang Liu, Dartmouth College
When to Stop Sampling: Answers and Further Questions – Fred Hickernell, Illinois Institute of Technology
Poster Session and Reception

Wednesday, December 13, 2017

Variance Reduction via Taylor Approximation of High-dimensional Parameter-to-output Maps Governed by Expensive-to-solve PDEs: Applications to optimal control under uncertainty – Omar Ghattas, University of Texas
Approximate Bayesian Computation for Mechanistic Network Models – Antonietta Mira, Università della Svizzera Italiana and Università dell’Insubria
Self-adjusted Mixture Sampling and Locally Weighted Histogram Analysis Methods – Zhiqiang Tan, Rutgers University

Thursday, December 14, 2017

Sequential Inference via Low-dimensional Couplings – Youssef Marzouk, MIT
About Infinite-Dimensional Geometric MCMC – Shiwei Lan, Caltech
Automated Scalable Bayesian Inference via Data Summarization – Tamara Broderick, MIT
Expediting Monte Carlo Sampling via Multi-fidelity Information Fusion – Paris Perdikaris, MIT
Regularization and Computation with High-dimensional Spike-and-slab Posterior Distributions – Yves Atchade, University of Michigan
Jittered Sampling: Bounds and Problems – Stefan Steinerberger, Yale University

Friday, December 15, 2017

Importance Sampling the Union of Rare Events with an Application to Power Systems Analysis – Art Owen, Stanford University
Stratification for Markov Chain Monte Carlo Simulation – Jonathan Weare, University of Chicago
Robust MCMC Sampling with Non-Gaussian and Hierarchical Priors in High Dimensions – Matthew Dunlop, Caltech
Discussion and Wrap-Up

Questions: email [email protected]