Stochastic zeroth-order optimization concerns optimizing a function using only noisy function evaluations. Typically, the oracle complexity of stochastic zeroth-order optimization algorithms depends at least linearly on the dimensionality (up to the noise variance). In this talk, we will discuss stochastic zeroth-order optimization under structural assumptions that help overcome this dimensionality dependence. In the first part of the talk, we consider the case when the function being optimized satisfies a certain sparsity structure. We show that in this case the dimension dependence of the stochastic zeroth-order gradient algorithm is linear in the sparsity parameter and only poly-logarithmic in the ambient dimension. In the second part of the talk, we consider the case when the function being optimized is defined over a Riemannian submanifold embedded in a Euclidean space. We propose novel estimators of the gradient and Hessian in this setting based on noisy function evaluations. We then show that the oracle complexities in this case are independent of the ambient Euclidean dimension and depend only linearly on the Riemannian dimension.
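To make the setting concrete, the sketch below shows the classical Gaussian-smoothing two-point gradient estimator that underlies many stochastic zeroth-order methods: the gradient of f at x is approximated from finite differences of (possibly noisy) function values along random Gaussian directions. This is a generic illustration of the oracle model, not the specific estimators proposed in the talk; the function and parameter names are hypothetical.

```python
import numpy as np

def zo_gradient_estimate(f, x, mu=1e-4, num_samples=100, rng=None):
    """Two-point Gaussian-smoothing zeroth-order gradient estimator.

    Approximates grad f(x) by averaging (f(x + mu*u) - f(x)) / mu * u
    over random directions u ~ N(0, I). Only function values are used,
    matching the zeroth-order (derivative-free) oracle model.
    """
    rng = np.random.default_rng(rng)
    d = x.shape[0]
    g = np.zeros(d)
    for _ in range(num_samples):
        u = rng.standard_normal(d)          # random search direction
        g += (f(x + mu * u) - f(x)) / mu * u  # finite-difference along u
    return g / num_samples

# Example: estimate the gradient of f(x) = ||x||^2 at x = (1, 1, 1),
# whose true gradient is 2x. The estimate is random, so it is only
# close to 2x up to Monte Carlo error.
f = lambda x: float(np.sum(x ** 2))
x = np.ones(3)
g_hat = zo_gradient_estimate(f, x, mu=1e-4, num_samples=2000, rng=0)
```

The variance of this estimator grows with the ambient dimension d, which is exactly the linear dimension dependence in the oracle complexity that the structural assumptions (sparsity, or a low-dimensional Riemannian submanifold) are meant to overcome.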
Krishna Balasubramanian is an assistant professor in the Department of Statistics, University of California, Davis. His recent research interests include stochastic optimization and sampling, theory and computation with tensors, higher-order interaction analysis, and theoretical analysis of kernel methods. His research has been supported by a Facebook PhD fellowship and by CeDAR and NSF grants.