Year
2025
Authors
LECUE Guillaume, GAVRILOPOULOS Georgios, SHANG Zong
Abstract
We obtain upper bounds on the estimation error of Kernel Ridge Regression (KRR) for all non-negative regularization parameters, offering a geometric perspective on various phenomena in KRR. As applications: 1. We address the Multiple Descents problem, unifying the proofs of [47] and [33] for polynomial kernels in the non-asymptotic regime, and we establish Multiple Descents for the generalization error of KRR with polynomial kernels under sub-Gaussian design in asymptotic regimes. 2. In the non-asymptotic regime, we establish a one-sided isomorphic version of the Gaussian Equivalent Conjecture for sub-Gaussian design vectors. 3. We offer a novel perspective on the linearization of kernel matrices of non-linear kernels, extending it to the power regime for polynomial kernels. 4. Our theory applies to data-dependent kernels, providing a convenient and accurate tool for the feature learning regime in deep learning theory. 5. Our theory extends the results of [72] under weak moment assumptions. Our proof is based on three mathematical tools developed in this paper that may be of independent interest: 1. A Dvoretzky-Milman theorem for ellipsoids under (very) weak moment assumptions. 2. A Restricted Isomorphic Property in Reproducing Kernel Hilbert Spaces under embedding index conditions. 3. A concentration inequality for finite-degree polynomial kernel functions.
GAVRILOPOULOS, G., LECUE, G. and SHANG, Z. (2025). A Geometrical Analysis of Kernel Ridge Regression and its Applications. Annals of Statistics, Forthcoming.