Yangyang Xu
Dr. Yangyang Xu earned a bachelor's degree in Computational Mathematics from Nanjing University, a master's degree from the Institute of Applied Mathematics at the Chinese Academy of Sciences, and his Ph.D. from the Department of Computational and Applied Mathematics at Rice University in 2014. Before joining RPI, Dr. Xu was an assistant professor at the University of Alabama. He also spent one year as a postdoctoral fellow at the University of Waterloo and another year as an NSF postdoc at the University of Minnesota.
Dr. Xu's broad research interests are in optimization theory and methods and their applications in machine learning, statistics, and signal processing. He has worked on developing algorithms for compressed sensing, matrix completion, and tensor factorization and learning. Recently, his research has focused on first-order methods, operator splitting, stochastic optimization methods, and high-performance parallel computing. This work is motivated by very large-scale problems arising in machine learning and image processing.
Education

Ph.D., Computational & Applied Mathematics
Rice University, 2014
M.S., Operations Research
Chinese Academy of Sciences, 2010
B.S., Computational Mathematics
Nanjing University, 2007
Selected Publications
Y. Ouyang and Y. Xu. Lower complexity bounds of first-order methods for convex-concave bilinear saddle-point problems. Mathematical Programming, Series A, 185, pp. 1–35, 2021.
Y. Xu. Primal-dual stochastic gradient method for convex programs with many functional constraints. SIAM Journal on Optimization, 30(2), pp. 1664–1692, 2020.
Y. Xu. Hybrid Jacobian and Gauss–Seidel proximal block coordinate update methods for linearly constrained convex programming. SIAM Journal on Optimization, 28(1), pp. 646–670, 2018.
Y. Xu. Accelerated first-order primal-dual proximal methods for linearly constrained composite convex programming. SIAM Journal on Optimization, 27(3), pp. 1459–1484, 2017.
Z. Peng, Y. Xu, M. Yan and W. Yin. ARock: an algorithmic framework for asynchronous parallel coordinate updates. SIAM Journal on Scientific Computing, 38(5), pp. A2851–A2879, 2016.
N. Zhou, Y. Xu, H. Cheng, J. Fang and W. Pedrycz. Global and local structure preserving sparse subspace learning: an iterative approach to unsupervised feature selection. Pattern Recognition, 53, pp. 87–101, 2016.
Y. Xu and W. Yin. Block stochastic gradient iteration for convex and nonconvex optimization. SIAM Journal on Optimization, 25(3), pp. 1686–1716, 2015.
Y. Xu and W. Yin. A block coordinate descent method for regularized multiconvex optimization with applications to nonnegative tensor factorization and completion. SIAM Journal on Imaging Sciences, 6(3), pp. 1758–1789, 2013.
M. Lai, Y. Xu and W. Yin. Improved iteratively reweighted least squares for unconstrained smoothed Lq minimization. SIAM Journal on Numerical Analysis, 51(2), pp. 927–957, 2013.
Y. Xu, W. Yin, Z. Wen and Y. Zhang. An alternating direction algorithm for matrix completion with nonnegative factors. Frontiers of Mathematics in China, Special Issue on Computational Mathematics (Springer), pp. 365–384, 2011.