TITLE: Scalable Kernel Methods via Doubly Stochastic Gradients
SPEAKER: Le Song
ABSTRACT:
Kernel methods are widely used in many machine learning models, such as support vector machines and Gaussian processes. However, the general perception is that kernel methods are not scalable, and that neural networks are the methods of choice for large-scale nonlinear learning problems. Or have we simply not tried hard enough to scale up kernel methods? Here we propose an approach that scales up kernel methods using a novel concept called "doubly stochastic functional gradients". Our approach relies on the fact that many kernel methods can be expressed as convex optimization problems, which we solve by making two unbiased stochastic approximations to the functional gradient: one using random training points and another using random functions associated with a positive definite kernel. We then descend along this noisy functional gradient. We show that the function estimated by this procedure after t iterations converges to the optimal function at rate O(t^-1) and achieves a generalization guarantee of O(t^-1/2). The double stochasticity also allows us to avoid keeping support vectors and to implement the algorithm with a small memory footprint that is linear in the number of iterations and independent of the data dimension. Our approach can readily scale kernel methods up to regimes dominated by neural networks. We show that our method achieves performance competitive with neural networks on problems such as classifying 8 million handwritten digits from MNIST, regressing the energies of 2.3 million materials from MolecularSpace, and categorizing 1 million photos from ImageNet.
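To make the idea concrete, the following is a minimal sketch of doubly stochastic functional gradient descent for squared-loss (kernel ridge) regression with an RBF kernel approximated by random Fourier features. At each iteration it samples one random training point and one random feature, then takes a functional gradient step. All hyperparameters (step-size schedule, bandwidth, regularizer) are illustrative, and for simplicity the random feature directions are stored explicitly rather than regenerated from pseudo-random seeds, so this sketch does not reproduce the dimension-independent memory footprint of the full method.

```python
import numpy as np

def doubly_sgd_regression(X, y, n_iter=1500, sigma=0.2, lam=1e-4, seed=0):
    """Sketch of doubly stochastic functional gradient descent for
    squared-loss regression with an RBF kernel, approximated via
    random Fourier features. Hyperparameters are illustrative."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # One random Fourier feature (w_t, b_t) is drawn per iteration; the
    # learned function is f(x) = sum_t alpha_t * phi_{w_t}(x), so no
    # support vectors need to be stored.
    W = rng.normal(scale=1.0 / sigma, size=(n_iter, d))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_iter)
    alpha = np.zeros(n_iter)

    def feats(x, T):
        # phi_{w_t}(x) = sqrt(2) * cos(w_t . x + b_t) for t < T
        return np.sqrt(2.0) * np.cos(W[:T] @ x + b[:T])

    for t in range(n_iter):
        i = rng.integers(n)                       # random training point
        x_t, y_t = X[i], y[i]
        f_xt = alpha[:t] @ feats(x_t, t)          # current prediction
        gamma = 1.0 / (1.0 + t)                   # decaying step size
        alpha[:t] *= 1.0 - gamma * lam            # shrink old coefficients
        # Squared-loss functional gradient step on the new feature:
        alpha[t] = -gamma * (f_xt - y_t) * np.sqrt(2.0) * np.cos(W[t] @ x_t + b[t])

    def predict(x):
        return alpha @ feats(x, n_iter)
    return predict
```

Because each iteration touches only one data point and one random feature, both the per-step cost and the number of stored coefficients grow linearly with the iteration count rather than with the dataset size.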
Short Bio:
Le Song is an assistant professor in the Department of Computational Science and Engineering, College of Computing, Georgia Institute of Technology. He received his Ph.D. in Computer Science from the University of Sydney and NICTA in 2008, and then conducted postdoctoral research in the School of Computer Science at Carnegie Mellon University from 2008 to 2011. Before joining Georgia Institute of Technology, he worked briefly as a research scientist at Google. His principal research interests lie in nonparametric and kernel methods, probabilistic graphical models, the spatial/temporal dynamics of networked processes, and applications of machine learning to interdisciplinary problems. He is the recipient of the NSF CAREER Award 2014, the IPDPS'15 Best Paper Award, the NIPS'13 Outstanding Paper Award, and the ICML'10 Best Paper Award.