Statistical Properties of the Method of Regularization With Periodic Gaussian Reproducing Kernel

Discipline

Statistics and Probability

Subject

asymptotic minimax risk
Gaussian reproducing kernel
nonparametric estimation
rate of convergence
Sobolev spaces
white noise model

Abstract

The method of regularization with the Gaussian reproducing kernel is popular in the machine learning literature and successful in many practical applications. In this paper we consider the periodic version of the Gaussian kernel regularization. We show, in the white noise model setting, that in function spaces of very smooth functions, such as the infinite-order Sobolev space and the space of analytic functions, the method under consideration is asymptotically minimax; in finite-order Sobolev spaces, the method is rate optimal, and its efficiency in terms of the constant, when compared with the minimax estimator, is reasonably high. The smoothing parameters in the periodic Gaussian regularization can be chosen adaptively without loss of asymptotic efficiency. The results derived in this paper give a partial explanation of the success of the Gaussian reproducing kernel in practice. Simulations are carried out to study the finite sample properties of the periodic Gaussian regularization.
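
For readers unfamiliar with the estimator the abstract refers to, the following is a minimal illustrative sketch, not code from the paper: the method of regularization solves a penalized least-squares problem over the reproducing kernel Hilbert space of the kernel, and by the representer theorem the solution is a kernel expansion over the observed data points. The sketch assumes a periodized Gaussian kernel obtained by wrapping the ordinary Gaussian kernel around the unit interval; the function names, the kernel width omega, and the smoothing parameter lam are illustrative choices, and the paper's precise definition of the periodic Gaussian kernel may differ.

    import numpy as np

    def periodic_gaussian_kernel(s, t, omega=0.1, n_wrap=5):
        # Periodized ("wrapped") Gaussian kernel on [0, 1):
        # K(s, t) = sum_m exp(-(s - t + m)^2 / (2 * omega^2)), m = -n_wrap, ..., n_wrap.
        d = s[:, None] - t[None, :]
        return sum(np.exp(-(d + m) ** 2 / (2.0 * omega ** 2))
                   for m in range(-n_wrap, n_wrap + 1))

    def regularized_fit(x, y, lam=1e-3, omega=0.1):
        # Method of regularization (kernel ridge regression):
        #   minimize (1/n) * sum_i (y_i - f(x_i))^2 + lam * ||f||_K^2 over the RKHS.
        # By the representer theorem the minimizer is f(t) = sum_i c_i K(t, x_i),
        # with coefficients c = (K + n * lam * I)^{-1} y.
        n = len(x)
        K = periodic_gaussian_kernel(x, x, omega)
        c = np.linalg.solve(K + n * lam * np.eye(n), y)
        return lambda t: periodic_gaussian_kernel(np.atleast_1d(t), x, omega) @ c

    # Toy usage: noisy samples of a 1-periodic signal observed on [0, 1).
    rng = np.random.default_rng(0)
    x = rng.uniform(0.0, 1.0, 200)
    y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(200)
    fhat = regularized_fit(x, y, lam=1e-3, omega=0.1)
    print(fhat(np.array([0.0, 0.25, 0.5, 0.75])))

Summing the Gaussian bump over integer shifts is one standard way to make a translation-invariant kernel periodic; the smoothing parameters lam and omega play the role of the tuning parameters that the abstract says can be chosen adaptively.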

Publication date

2004-01-01

Journal title

Annals of Statistics
