[1] Bellec P, Lecué G, Tsybakov A. Towards the study of least squares estimators with convex penalty. arXiv preprint, 2017
[2] Barthe F, Cattiaux P, Roberto C. Concentration for independent random variables with heavy tails. Applied Mathematics Research eXpress, 2005, 2005(2):39-60
[3] Bellec P, Lecué G, Tsybakov A. Slope meets Lasso: improved oracle bounds and optimality. Annals of Statistics, 2018, 46(6B):3603-3642
[4] Bellec P, Tsybakov A. Bounds on the prediction error of penalized least squares estimators with convex penalty//International Conference on Modern Problems of Stochastic Analysis and Statistics. Springer, 2016:315-333
[5] Bickel P J, Ritov Y, Tsybakov A. Simultaneous analysis of Lasso and Dantzig selector. Annals of Statistics, 2009, 37(4):1705-1732
[6] Bühlmann P, van de Geer S. Statistics for High-Dimensional Data: Methods, Theory and Applications. Springer Science & Business Media, 2011
[7] Cattiaux P, Guillin A. On the Poincaré constant of log-concave measures//Geometric Aspects of Functional Analysis. Springer, 2020:171-217
[8] van de Geer S. Estimation and Testing under Sparsity. Springer, 2016
[9] Giraud C. Graphical models//Introduction to High-Dimensional Statistics. Chapman and Hall/CRC, 2014:157-180
[10] Koltchinskii V, Mendelson S. Bounding the smallest singular value of a random matrix without concentration. International Mathematics Research Notices, 2015, 2015(23):12991-13008
[11] Lounici K, Pontil M, van de Geer S, Tsybakov A. Oracle inequalities and optimal inference under group sparsity. Annals of Statistics, 2011, 39(4):2164-2204
[12] Mendelson S. Learning without concentration//Conference on Learning Theory. PMLR, 2014:25-39
[13] Negahban S, Ravikumar P, Wainwright M, Yu B. A unified framework for high-dimensional analysis of M-estimators with decomposable regularizers. Statistical Science, 2012, 27(4):538-557
[14] Selesnick I. Sparse regularization via convex analysis. IEEE Transactions on Signal Processing, 2017, 65(17):4481-4494
[15] Taylor J. The geometry of least squares in the 21st century. Bernoulli, 2013, 19(4):1449-1464