PENALIZED LEAST SQUARE IN SPARSE SETTING WITH CONVEX PENALTY AND NON GAUSSIAN ERRORS

  • Doualeh ABDILLAHI-ALI,
  • Nourddine AZZAOUI,
  • Arnaud GUILLIN,
  • Guillaume LE MAILLOUX,
  • Tomoko MATSUI
  • 1. Laboratoire de Mathématiques Blaise Pascal, CNRS-UMR 6620, Université Clermont-Auvergne (UCA), 63000 Clermont-Ferrand, France;
    2. Department of Statistical Modeling, Institute of Statistical Mathematics, 10-3 Midori-cho, Tachikawa-shi, Tokyo 190-8562, Japan

Received date: 2021-05-26

  Revised date: 2021-10-13

  Online published: 2021-12-27

Supported by

This work has been (partially) supported by the Project EFI ANR-17-CE40-0030 of the French National Research Agency.

Abstract

This paper considers penalized least squares estimators with convex penalties or regularization norms. We provide sparsity oracle inequalities for the prediction error for a general convex penalty, and for the particular cases of the Lasso and Group Lasso estimators, in a regression setting. The main contribution is that our oracle inequalities are established in the more general case where the observation noise is drawn from probability measures satisfying a weak spectral gap (or Poincaré) inequality, rather than from Gaussian distributions. We illustrate our results on a heavy-tailed example and a sub-Gaussian one; in particular, we give explicit bounds for the oracle inequalities in these two special cases.
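For orientation, the penalized least squares estimator discussed in the abstract can be written in standard notation as follows. This display is an illustrative sketch under the usual regression conventions, not an equation taken from the paper:

```latex
% Observations Y = X\beta^* + \xi, with design matrix X \in \mathbb{R}^{n \times p}
% and noise vector \xi; \|\cdot\| denotes a convex penalty norm.
\hat{\beta} \in \operatorname*{arg\,min}_{\beta \in \mathbb{R}^p}
  \left\{ \frac{1}{n}\,\lVert Y - X\beta \rVert_2^2 \;+\; \lambda \,\lVert \beta \rVert \right\}
```

Here $\lambda > 0$ is a tuning parameter, and the choice of the norm $\lVert \cdot \rVert$ recovers the special cases treated in the paper: $\lVert \beta \rVert_1$ gives the Lasso, while $\sum_{g} \lVert \beta_g \rVert_2$, summing over a partition of the coordinates into groups $g$, gives the Group Lasso.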

Cite this article

Doualeh ABDILLAHI-ALI, Nourddine AZZAOUI, Arnaud GUILLIN, Guillaume LE MAILLOUX, Tomoko MATSUI. PENALIZED LEAST SQUARE IN SPARSE SETTING WITH CONVEX PENALTY AND NON GAUSSIAN ERRORS[J]. Acta Mathematica Scientia, Series B, 2021, 41(6): 2198-2216. DOI: 10.1007/s10473-021-0624-0
