# Fast Sparse Regression with Guarantees but not the Bias

Wotao Yin

Professor

Department of Mathematics, UCLA

Abstract: We introduce a new sparse regression method that enjoys a number of desirable theoretical and computational properties but avoids the bias found in LASSO and other L1-like sparse regression methods. As has been known since Jianqing Fan and Runze Li's 2001 publication, points on a LASSO path are biased. To avoid this bias, instead of the convex L1 energy used in LASSO, one must minimize a nonconvex energy and thus sacrifice the computational advantages of convex minimization. Our new method generates a regularization path by evolving an ordinary differential inclusion that involves the subdifferential of the L1 energy. We show that there exists a point on the generated path that is an unbiased estimate of the true signal and whose entries have signs consistent with those of the true signal. All of this is achieved without any debiasing post-processing; in fact, the method works better than LASSO combined with debiasing. We also show how to compute our path efficiently, either exactly or, much faster, approximately. The exact path can be computed in finitely many steps at a low cost per step. For problems with terabytes of data, we generate an approximate regularization path by the so-called Linearized Bregman iteration, which is fast and easy to parallelize and still retains the sign-consistency property, although it is slightly biased. This is joint work with Stanley Osher (UCLA), Feng Ruan (Stanford), Jiechao Xiong (PKU), Ming Yan (Michigan State), and Yuan Yao (PKU).
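The abstract does not spell out the Linearized Bregman iteration it refers to, but its widely used form alternates a gradient step on a dual variable with soft-thresholding. Below is a minimal NumPy sketch of that standard iteration on a synthetic sparse-recovery problem; the problem sizes, the parameter `kappa`, the step-size rule, and the random data are all illustrative choices, not taken from the talk.

```python
import numpy as np

def shrink(z, mu=1.0):
    # Soft-thresholding: the proximal map of mu * ||.||_1.
    return np.sign(z) * np.maximum(np.abs(z) - mu, 0.0)

def linearized_bregman(A, b, kappa=10.0, n_iter=5000):
    """Standard Linearized Bregman iteration for min ||x||_1 s.t. Ax = b
    (more precisely, its elastic-net regularization with parameter kappa)."""
    m, n = A.shape
    # Step size chosen so that alpha * kappa * ||A||_2^2 = 1 (a safe, common choice).
    alpha = 1.0 / (kappa * np.linalg.norm(A, 2) ** 2)
    z = np.zeros(n)          # dual / accumulator variable
    x = np.zeros(n)          # primal (sparse) iterate
    for _ in range(n_iter):
        z -= alpha * (A.T @ (A @ x - b))  # gradient step on the residual
        x = kappa * shrink(z, 1.0)        # primal update by soft-thresholding
    return x

# Illustrative synthetic problem: recover a 10-sparse signal from 100 measurements.
rng = np.random.default_rng(0)
m, n, s = 100, 200, 10
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
idx = rng.choice(n, s, replace=False)
x_true[idx] = rng.choice([-1.0, 1.0], s)
b = A @ x_true

x_hat = linearized_bregman(A, b)
```

Each iteration costs only two matrix-vector products and a thresholding, which is what makes the method easy to parallelize and suitable for very large data.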

Bio: Wotao Yin is a professor in the Department of Mathematics at UCLA. His research interests lie in computational optimization and its applications in image processing, machine learning, and other inverse problems. He received his B.S. in mathematics from Nanjing University in 2001, and his M.S. and Ph.D. in operations research from Columbia University in 2003 and 2006, respectively. From 2006 to 2013, he was with Rice University. He won an NSF CAREER award in 2008 and an Alfred P. Sloan Research Fellowship in 2009. His recent work has been in optimization algorithms for large-scale and distributed signal processing and machine learning problems.