Title:

Multivariate Analysis of Large-Scale Network Series

 

Abstract:

Networks are an increasingly common representation of real-world phenomena, able to succinctly describe social dynamics, communications infrastructure, genetic mechanisms, and more. In many applications, multiple views of the same network structure are available, each of which captures a different aspect of the same underlying phenomenon. For example, in multi-subject neuroimaging, independently estimated functional networks can be combined to identify common and generalizable patterns in the brain's response to stimuli. The scope and scale of these and other types of networks give rise to a host of computational and statistical challenges, taxing classical approaches with ultra-high dimensionality, small sample sizes, and expensive computation. In the first half of this talk, I develop a novel framework for principal components analysis of a population of networks based on a new class of semi-symmetric tensor decompositions. This Network PCA framework allows us to identify and isolate core patterns that capture network dynamics in a significantly reduced space, enabling more efficient computation and improved statistical estimation in downstream tasks. I also develop a new proof technique for tensors and higher-order power iterations to establish statistical consistency for these challenging non-convex optimization problems. I demonstrate the utility of this framework through applications to trend identification, variance analysis, and changepoint detection in an extended analysis of voting dynamics at the US Supreme Court. In the second half of this talk, I discuss several related problems in unsupervised statistical learning for multiple networks, highlighting my current and future research directions.
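
As a rough illustration only (the notation below is a placeholder sketch, not necessarily the formulation used in the talk), one can picture the setup as stacking T observed symmetric adjacency matrices into a third-order tensor that is symmetric in its first two modes and seeking a low-rank semi-symmetric decomposition:

\[
\mathcal{A} \in \mathbb{R}^{n \times n \times T}, \qquad
\mathcal{A} \;\approx\; \sum_{k=1}^{K} \lambda_k \, v_k \circ v_k \circ u_k,
\]

where each \(v_k \in \mathbb{R}^n\) plays the role of a shared network pattern and each \(u_k \in \mathbb{R}^T\) records how strongly that pattern appears in each network; the symbols \(\mathcal{A}\), \(\lambda_k\), \(v_k\), \(u_k\), and \(K\) are illustrative and assumed for this sketch.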

 

Bio:

Michael Weylandt is currently an Intelligence Community Postdoctoral Fellow, working with George Michailidis at the University of Florida. His work focuses on statistical machine learning methodology and computation for highly structured data, with a particular focus on network data and time series. His work has been recognized with best paper awards from the American Statistical Association Sections on Statistical Learning and Data Science and on Business & Economic Statistics. He has served as a mentor in the Google Summer of Code program for seven years on behalf of the R Foundation for Statistical Computing and previously held an NSF Graduate Research Fellowship. Prior to beginning his Ph.D. studies, he worked at Morgan Stanley as a quantitative analyst, focusing on derivatives pricing and financial risk management. He received a Bachelor of Science in Engineering from Princeton University in 2008 and a Ph.D. in Statistics from Rice University in 2020.