Semismooth Newton Augmented Lagrangian Algorithm for Adaptive Lasso Penalized Least Squares in Semiparametric Regression
This paper is concerned with a partially linear semiparametric regression model with an unknown regression coefficient, an unknown nonparametric function for the nonlinear component, and an unobservable Gaussian-distributed random error. We consider applications of least squares to semiparametric regression and, in particular, present an adaptive lasso penalized least squares (PLS) procedure to select the regression coefficients. Unlike most numerical methods in the previous literature, this paper concentrates on the corresponding dual problem. We observe that the dual problem consists of a smooth strongly convex function and an indicator function, so it can be solved by the semismooth Newton augmented Lagrangian (SSNAL) algorithm. At each iteration, a strongly semismooth nonlinear system must be solved, which we do with a semismooth Newton method that fully exploits the structure of the proximal mappings. We show that the implemented algorithm offers a notable computational advantage in statistical regression inference and that the sequence generated by the method admits a fast local convergence rate under mild assumptions. Numerical experiments on simulated and real data are conducted, including performance comparisons with ADMM that demonstrate the effectiveness of PLS and the advantages of SSNAL.
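To make the role of the proximal mapping concrete, the following is a minimal sketch (not the paper's SSNAL algorithm) of the adaptive lasso PLS objective solved by plain proximal gradient descent; the key ingredient, the proximal mapping of the weighted l1 penalty, is componentwise soft-thresholding, the same mapping whose structure the SSNAL inner semismooth Newton step exploits. All function names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def prox_weighted_l1(v, t):
    # Proximal mapping of the weighted l1 norm: componentwise soft-thresholding.
    # t may be a scalar or a per-coordinate vector of thresholds.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def adaptive_lasso_pg(X, y, lam, weights, n_iter=500):
    """Proximal-gradient sketch (illustrative, not SSNAL) for
    min_b 0.5*||X b - y||^2 + lam * sum_i weights_i * |b_i|."""
    step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant of the gradient
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)
        b = prox_weighted_l1(b - step * grad, step * lam * weights)
    return b
```

With an orthogonal design (X the identity), the iteration reduces to a single soft-thresholding of y, which makes the sketch easy to sanity-check.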