Tengyu's work brings together techniques from theoretical computer science, applied mathematics, statistics, probability, and information theory to answer the twin questions of how to design successful nonlinear models and how to efficiently optimize the nonconvex training objectives for those models. Several of his publications develop mathematical tools to characterize the optimization landscape of various machine learning problems, including dictionary learning, matrix completion, tensor decomposition, and linearized (recurrent) neural nets; some of these results have been published in Transactions of the Association for Computational Linguistics and the Journal of Machine Learning Research. Tengyu has also worked on sum-of-squares algorithms and on statistical and communication trade-offs in machine learning, both areas with open technical and conceptual problems that he intends to continue investigating.
Jure Leskovec is an Associate Professor of Computer Science at Stanford University. His research focuses on mining and modeling large social and information networks, their evolution, and the diffusion of information and influence over them. The problems he investigates are motivated by large-scale data, the web, and online media.
Christopher Manning is the inaugural Thomas M. Siebel Professor in Machine Learning in the Departments of Computer Science and Linguistics at Stanford University. His research goal is computers that can intelligently process, understand, and generate human language material. Manning is a leader in applying Deep Learning to Natural Language Processing, with well-known research on Tree Recursive Neural Networks, sentiment analysis, neural network dependency parsing, the GloVe model of word vectors, neural machine translation, and deep language understanding.