Abstract
We propose an incremental SVM approach to regularization. Support vectors are added iteratively during training, and for each new vector the kernel parameters are set according to an extended, chained version of the Nadaraya-Watson estimator. We show that this approach minimizes the expected risk and leads to an efficient learning procedure.
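For reference, the classical Nadaraya-Watson estimator, of which the extended chained version is a variant, can be sketched as follows; the symbols below ($x_i$, $y_i$, the kernel $K$, bandwidth $h$, and sample size $n$) are standard textbook notation and are not taken from this paper:

\[
\hat{f}(x) \;=\; \frac{\sum_{i=1}^{n} K_h(x - x_i)\, y_i}{\sum_{i=1}^{n} K_h(x - x_i)},
\qquad
K_h(u) \;=\; \frac{1}{h}\, K\!\left(\frac{u}{h}\right).
\]

In this form, each prediction is a kernel-weighted average of the observed responses, which hints at how kernel parameters might be adjusted as new support vectors are introduced.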