Report: Predicting Learning Curves without the Ground Truth Hypothesis
Abstract: Upper bounds for the deviation between the test error and the training error of a learning machine are derived in the case where no probability distribution generating the examples is assumed to exist. The bounds are data-dependent and algorithm-dependent. The result justifies the concept of a data-dependent and algorithm-dependent VC-dimension.
Léon Bottou, Yann Le Cun and Vladimir Vapnik: Report: Predicting Learning Curves without the Ground Truth Hypothesis, May 1999.
nogroundtruth-1999.djvu nogroundtruth-1999.pdf nogroundtruth-1999.ps.gz
@misc{bottou-lecun-vapnik-1999,
author = {Bottou, L\'{e}on and {Le Cun}, Yann and Vapnik, Vladimir},
title = {Report: Predicting Learning Curves without the Ground Truth Hypothesis},
year = {1999},
month = {May},
note = {Available on http://leon.bottou.org/papers},
url = {http://leon.bottou.org/papers/bottou-lecun-vapnik-1999},
}
Notes
This is a slightly modernized rewrite of (Bottou et al., 1994), with a stronger focus on the absence of independence assumptions; the bounds themselves were already present in the 1994 paper. Please refer to the notes associated with that earlier text.
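For orientation, it may help to contrast this with the classical result that does assume a generating distribution. Under the usual assumption that the \ell training examples are drawn i.i.d. from some distribution (precisely the assumption this report dispenses with), Vapnik's theory gives a bound that, up to minor variants, takes the following form: with probability at least 1 - \eta,

% Classical VC bound, shown for contrast only; it is not the bound
% derived in this report, which assumes no generating distribution.
% h = VC-dimension of the learning machine, \ell = number of training
% examples, \eta = confidence parameter.
\[
  R(\alpha) \;\le\; R_{\mathrm{emp}}(\alpha)
  \;+\; \sqrt{\frac{h\left(\ln\frac{2\ell}{h} + 1\right) + \ln\frac{4}{\eta}}{\ell}}
\]

Here R(\alpha) denotes the test error and R_{\mathrm{emp}}(\alpha) the training error. The report's point is that the fixed capacity term h above gets replaced by data-dependent and algorithm-dependent quantities, without assuming that any distribution exists, which is what the abstract means by a data-dependent and algorithm-dependent VC-dimension.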
