Prediction Consistency of Lasso Regression Does Not Need Normal Errors

Hlaváčková-Schindler, Kateřina (2016) Prediction Consistency of Lasso Regression Does Not Need Normal Errors. British Journal of Mathematics & Computer Science, 19(4), pp. 1-7. ISSN 2231-0851

Abstract

In 2014, Sourav Chatterjee proved the prediction consistency of the ordinary least squares (OLS) estimator with a Lasso penalty under the assumptions that the observations are bounded from above and that the errors are normally distributed, independent of the observations, with zero mean and finite variance. Reviewing his elegant proof, we conclude that the prediction consistency of OLS with Lasso can be proven under fewer assumptions, i.e., without assuming normality of the errors, knowing only that they have zero mean and finite variance. We give an upper bound on the convergence rate of the OLS-Lasso estimator for such errors. This upper bound is non-asymptotic and depends on both the number of regressors and the size of the data set. Knowing the number of regressors in a regression problem, one can therefore estimate how large a data set is needed to keep the prediction error under a given value, and, in contrast to the cited work, without solving the parameter estimation problem of fitting the errors to a normal distribution. The result may encourage practitioners to use OLS-Lasso as a convergent prediction algorithm for errors that are not normal but satisfy these milder conditions.
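For orientation, non-asymptotic "slow-rate" bounds in this line of work typically take the following generic form; the paper's exact constants and conditions differ, so this is only an illustration of the shape, with n the sample size, p the number of regressors, β* the true coefficient vector, and β̂ the OLS-Lasso fit:

\[
\frac{1}{n}\,\mathbb{E}\,\bigl\|X\hat{\beta} - X\beta^{*}\bigr\|_{2}^{2} \;\le\; C\,\sqrt{\frac{\log p}{n}},
\]

where the constant C collects the bound on the observations, the ℓ1-norm of β*, and the error variance. A bound of this shape makes the abstract's sample-size claim concrete: to push the expected prediction error below a target ε, one needs roughly n ≳ C² log p / ε² observations, with no reference to normality of the errors.

The claim that only zero mean and finite variance of the errors matter can also be checked empirically. The following minimal sketch (not from the paper; the design, coefficients, and penalty choice are illustrative assumptions) fits scikit-learn's Lasso on data with centered exponential noise, which is non-normal but has zero mean and finite variance, and prints the mean squared prediction error as n grows:

```python
# Minimal sketch, assuming a bounded uniform design and a sparse true beta;
# the lambda ~ sqrt(log p / n) choice is a common heuristic, not the paper's.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
p = 50                      # number of regressors
beta = np.zeros(p)
beta[:5] = 1.0              # sparse "true" coefficient vector (assumed)

for n in (100, 1000, 10000):
    X = rng.uniform(-1.0, 1.0, size=(n, p))   # bounded observations
    eps = rng.exponential(1.0, size=n) - 1.0  # non-normal errors: zero mean, finite variance
    y = X @ beta + eps
    lam = np.sqrt(np.log(p) / n)              # penalty shrinking with n
    model = Lasso(alpha=lam).fit(X, y)
    mspe = np.mean((model.predict(X) - X @ beta) ** 2)
    print(f"n={n:6d}  mean squared prediction error = {mspe:.4f}")
```

The printed error should decrease toward zero as n grows, consistent with prediction consistency under these milder error conditions.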

Item Type: Article
Subjects: Open Article Repository > Mathematical Science
Date Deposited: 12 Jun 2023 04:37
Last Modified: 20 Apr 2024 13:31
URI: http://journal.251news.co.in/id/eprint/1507
