Abstract
The problem of sampling from constrained continuous distributions arises frequently in machine learning and statistical models. Many Markov chain Monte Carlo (MCMC) sampling methods have been adapted to handle different types of constraints on the random variables. Among these methods, Hamiltonian Monte Carlo (HMC) and related approaches have shown significant advantages in computational efficiency over their counterparts. In this article, we first review HMC and several extended sampling methods, and then concretely explain three constrained HMC-based sampling methods: reflection, reformulation, and spherical HMC. For illustration, we apply these methods to three well-known constrained sampling problems: truncated multivariate normal distributions, Bayesian regularized regression, and nonparametric density estimation. We also connect constrained sampling with a related problem in the statistical design of experiments with a constrained design space.

This article is categorized under:
- Applications of Computational Statistics > Computational Mathematics
- Statistical and Graphical Methods of Data Analysis > Bayesian Methods and Theory
- Statistical and Graphical Methods of Data Analysis > Markov Chain Monte Carlo (MCMC)
- Statistical and Graphical Methods of Data Analysis > Sampling
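To make the reflection idea concrete, below is a minimal sketch (not the authors' implementation) of HMC with boundary reflection applied to the first example problem in the abstract, a multivariate normal truncated to the positive orthant. The function name `reflective_hmc`, the zero-mean target, and the step size and leapfrog settings are illustrative assumptions, not taken from the article.

```python
import numpy as np

def reflective_hmc(n_samples, x0, Sigma_inv, step=0.1, n_leapfrog=20, seed=0):
    """HMC for N(0, Sigma) truncated to x >= 0, with reflection at the boundary."""
    rng = np.random.default_rng(seed)
    d = x0.size
    x = x0.copy()
    samples = []
    grad = lambda z: Sigma_inv @ z               # gradient of potential U(z) = z' Sigma^{-1} z / 2
    logp = lambda z: -0.5 * z @ Sigma_inv @ z    # unnormalized log density inside the orthant
    for _ in range(n_samples):
        p = rng.standard_normal(d)               # resample momentum
        x_new, p_new = x.copy(), p.copy()
        # Leapfrog integration; whenever a coordinate crosses x_i = 0,
        # reflect the position and flip the corresponding momentum.
        p_new -= 0.5 * step * grad(x_new)
        for i in range(n_leapfrog):
            x_new = x_new + step * p_new
            hit = x_new < 0
            x_new[hit] = -x_new[hit]
            p_new[hit] = -p_new[hit]
            if i < n_leapfrog - 1:
                p_new -= step * grad(x_new)
        p_new -= 0.5 * step * grad(x_new)
        # Metropolis correction keeps the truncated Gaussian invariant
        log_accept = (logp(x_new) - 0.5 * p_new @ p_new) - (logp(x) - 0.5 * p @ p)
        if np.log(rng.uniform()) < log_accept:
            x = x_new
        samples.append(x.copy())
    return np.array(samples)

# Example: 2D standard normal truncated to the positive quadrant
samples = reflective_hmc(2000, x0=np.ones(2), Sigma_inv=np.eye(2))
print(samples.mean(axis=0))
```

Because the reflection step is volume-preserving and reversible, the usual Metropolis accept/reject step still leaves the truncated target invariant; the reformulation and spherical-HMC variants discussed in the article handle the constraint through a change of variables instead.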
| Original language | English (US) |
| --- | --- |
| Article number | e1608 |
| Journal | Wiley Interdisciplinary Reviews: Computational Statistics |
| Volume | 15 |
| Issue number | 6 |
| DOIs | |
| State | Published - Nov 1 2023 |
Keywords
- Hamiltonian Monte Carlo
- Riemannian Monte Carlo
- constrained sampling
- regularized regression
- truncated multivariate Gaussian
ASJC Scopus subject areas
- Statistics and Probability