Authors
Kefan Dong, Tengyu Ma
Publication date
2023/7/12
Conference
The Thirty-Sixth Annual Conference on Learning Theory (COLT 2023)
Pages
2877-2918
Publisher
PMLR
Description
Many machine learning applications require learning a function with a small worst-case error over the entire input domain, that is, the $L_\infty$-error, whereas most existing theoretical works only guarantee recovery in average errors such as the $L_2$-error. $L_\infty$-recovery from polynomially many samples is even impossible for seemingly simple function classes such as constant-norm infinite-width two-layer neural nets. This paper makes some initial steps beyond the impossibility results by leveraging the randomness in the ground-truth functions. We prove a polynomial sample complexity bound for random ground-truth functions drawn from Gaussian random fields. Our key technical novelty is to prove that the degree-$k$ spherical harmonics components of a function from a Gaussian random field cannot be spiky, in that their $L_\infty/L_2$ ratios are upper bounded by $O(\sqrt{d \ln k})$ with high probability. In contrast, the worst-case $L_\infty/L_2$ ratio for degree-$k$ spherical harmonics is on the order of $d^{\Omega(k)}$.
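The contrast in the description can be sanity-checked numerically. Below is a minimal sketch, not from the paper: the dimension d, degree k, and sample sizes are illustrative assumptions, and the degree-k component of an isotropic Gaussian random field is approximated by a finite sum of Gegenbauer features (whose covariance is proportional to the degree-k Gegenbauer kernel). It compares a Monte-Carlo estimate of the L_inf/L_2 ratio of this random harmonic against the zonal harmonic, the spiky worst-case direction.

import numpy as np
from scipy.special import eval_gegenbauer

# Illustrative parameters (assumptions, not values from the paper).
d, k = 20, 4                   # sphere S^{d-1} and harmonic degree
alpha = (d - 2) / 2            # Gegenbauer index associated with S^{d-1}
n_feat, n_pts = 1000, 20000    # feature directions / evaluation points
rng = np.random.default_rng(0)

def sphere(n):
    """n points drawn uniformly from the unit sphere S^{d-1}."""
    x = rng.standard_normal((n, d))
    return x / np.linalg.norm(x, axis=1, keepdims=True)

x = sphere(n_pts)

# Random degree-k harmonic: f(x) = sum_j g_j * C_k^alpha(<u_j, x>) with
# g_j iid N(0,1). Its covariance is proportional to C_k^alpha(<x, y>), so
# for large n_feat it approximates the degree-k component of an isotropic
# Gaussian random field.
u = sphere(n_feat)
g = rng.standard_normal(n_feat)
f = eval_gegenbauer(k, alpha, x @ u.T) @ g

# Zonal harmonic z(x) = C_k^alpha(<e1, x>); its sup over the sphere is
# attained at x = e1, i.e. it equals C_k^alpha(1).
z = eval_gegenbauer(k, alpha, x[:, 0])
z_sup = eval_gegenbauer(k, alpha, 1.0)

# L_2 norms by Monte Carlo; the max over sampled points underestimates the
# true sup of f, which is acceptable here since the claim is that it is small.
ratio_random = np.abs(f).max() / np.sqrt((f ** 2).mean())
ratio_zonal = z_sup / np.sqrt((z ** 2).mean())
print(f"random degree-{k} harmonic: L_inf/L_2 ~ {ratio_random:.1f}")
print(f"zonal (worst-case):         L_inf/L_2 ~ {ratio_zonal:.1f}")

With these settings the random harmonic's ratio should come out an order of magnitude below the zonal one, in line with the non-spikiness claim.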