Abstract: In this talk I will introduce some extensions to proximal Markov chain Monte Carlo (Proximal MCMC), a flexible and general Bayesian inference framework for constrained or regularized parametric estimation. The basic idea of Proximal MCMC is to approximate nonsmooth regularization terms via the Moreau-Yosida envelope. Initial Proximal MCMC strategies, however, fixed nuisance and regularization parameters as constants and relied on the Langevin algorithm for posterior sampling. We extend Proximal MCMC to a fully Bayesian framework in which all parameters, including the regularization parameters, are modeled and estimated adaptively from the data. More efficient sampling algorithms, such as Hamiltonian Monte Carlo, are employed to scale Proximal MCMC to high-dimensional problems. Our proposed Proximal MCMC offers a versatile and modularized procedure for inference in constrained and nonsmooth problems that is largely free of tuning parameters. We illustrate its utility on various statistical estimation and machine learning tasks.
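To make the envelope idea concrete, below is a minimal NumPy sketch of a Moreau-Yosida regularized unadjusted Langevin sampler (in the spirit of MYULA, not the speaker's implementation) for a lasso-type posterior: the nonsmooth L1 penalty is replaced by its Moreau-Yosida envelope, whose gradient is computed from the soft-thresholding proximal operator. The model, parameter values, and function names are all illustrative assumptions.

```python
import numpy as np

# Sketch of proximal unadjusted Langevin sampling for a lasso-type target:
#   -log p(x | y) = ||y - A x||^2 / (2 sigma^2) + alpha * ||x||_1 + const.
# The nonsmooth L1 term is smoothed by its Moreau-Yosida envelope.

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def myula_sample(y, A, sigma, alpha, lam, gamma, n_iter, rng):
    """Run the proximal Langevin chain and return all iterates.

    lam   : Moreau-Yosida smoothing parameter
    gamma : Langevin step size
    """
    n = A.shape[1]
    x = np.zeros(n)
    samples = np.empty((n_iter, n))
    for k in range(n_iter):
        # Gradient of the smooth negative log-likelihood part.
        grad_smooth = A.T @ (A @ x - y) / sigma**2
        # Gradient of the Moreau-Yosida envelope of alpha * ||.||_1:
        # (x - prox_{lam * alpha * ||.||_1}(x)) / lam.
        grad_env = (x - soft_threshold(x, lam * alpha)) / lam
        # Euler-Maruyama step for the Langevin diffusion.
        x = x - gamma * (grad_smooth + grad_env) \
            + np.sqrt(2.0 * gamma) * rng.standard_normal(n)
        samples[k] = x
    return samples

# Illustrative synthetic example with a sparse ground truth.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [3.0, -2.0, 1.5]
y = A @ x_true + 0.5 * rng.standard_normal(50)
chain = myula_sample(y, A, sigma=0.5, alpha=5.0,
                     lam=0.01, gamma=1e-4, n_iter=5000, rng=rng)
print("posterior mean (first 5 coords):", chain[2000:].mean(axis=0)[:5])
```

In this sketch the smoothing parameter lam and step size gamma are fixed by hand; the extensions described in the talk instead treat such regularization and nuisance parameters as unknowns within the Bayesian model and replace the Langevin update with more efficient samplers such as Hamiltonian Monte Carlo.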