Bio: Ming Yuan is Professor of Statistics at Columbia University. He was previously Senior Investigator in Virology at the Morgridge Institute for Research and Professor of Statistics at the University of Wisconsin–Madison, and prior to that Coca-Cola Junior Professor of Industrial and Systems Engineering at the Georgia Institute of Technology. His research and teaching interests lie broadly in statistics and its interface with other quantitative and computational fields such as optimization, machine learning, computational biology, and financial engineering. He has over 100 scientific publications in applied mathematics, computer science, electrical engineering, financial econometrics, medical informatics, optimization, and statistics, among other fields.
He has served as the program secretary of the Institute of Mathematical Statistics (IMS) and was a member of the advisory board for the Quality, Statistics and Reliability (QSR) section of the Institute for Operations Research and the Management Sciences (INFORMS). He is a co-Editor of The Annals of Statistics and has served on numerous editorial boards. He was named a Medallion Lecturer of the IMS in 2018, and is a recipient of the John van Ryzin Award (2004; International Biometric Society), the CAREER Award (2009; US National Science Foundation), the Guy Medal in Bronze (2014; Royal Statistical Society), and the Leo Breiman Junior Researcher Award (2017; Statistical Learning and Data Mining section of the American Statistical Association).
Abstract: Matrix perturbation bounds developed by Weyl, Davis, Kahan, Wedin, and others play a central role in many statistical and machine learning problems. I shall discuss some of the recent progress in developing similar bounds for higher-order tensors. I will highlight the intriguing differences from matrices and explore their implications in spectral learning problems.
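As a concrete illustration of the matrix-case bounds the abstract refers to, the following is a minimal numerical sketch (the specific matrices are arbitrary toy choices, not from the talk) of the Davis–Kahan sin Θ theorem in the form of Yu, Wang, and Samworth: the angle between the leading eigenvectors of a symmetric matrix A and its perturbation A + E satisfies sin θ ≤ 2‖E‖ / (λ₁ − λ₂).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50

# Symmetric matrix A with a large gap between the top two eigenvalues
eigvals = np.concatenate(([10.0], np.linspace(5.0, 1.0, n - 1)))
A = np.diag(eigvals)

# Small random symmetric perturbation E
E = rng.normal(scale=0.05, size=(n, n))
E = (E + E.T) / 2

# Leading eigenvectors of A and of the perturbed matrix A + E
# (np.linalg.eigh returns eigenvalues in ascending order)
u = np.linalg.eigh(A)[1][:, -1]
u_hat = np.linalg.eigh(A + E)[1][:, -1]

# Sine of the principal angle between the two eigenvectors
# (abs() removes the sign ambiguity of eigenvectors)
cos_theta = min(1.0, abs(u @ u_hat))
sin_theta = np.sqrt(1.0 - cos_theta**2)

# Davis-Kahan bound: sin(theta) <= 2 * ||E||_op / (lambda_1 - lambda_2)
gap = eigvals[0] - eigvals[1]
bound = 2 * np.linalg.norm(E, 2) / gap

print(f"sin(theta) = {sin_theta:.4f}  <=  bound = {bound:.4f}")
assert sin_theta <= bound
```

One intriguing difference highlighted in the talk is that for higher-order tensors no analogous bound follows from a simple eigendecomposition, since even computing the best low-rank approximation of a tensor is NP-hard in general.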