
Seminar Details

Date 21-12-2006
Time 14:15
Room/Location DISI, Sala Conferenze 322, 3rd floor
Title Online gradient descent learning algorithms
Speaker Dr. Yiming Ying
Affiliation Department of Computer Science, University College London, UK
Link http://www.cs.ucl.ac.uk/
Abstract We consider the least-squares online gradient descent algorithm in a reproducing kernel Hilbert space (RKHS) with two different types of step sizes. One is a universal polynomially decaying sequence; the other is in the spirit of early stopping, an implicit form of regularization. We present a novel capacity-independent approach to deriving generalization bounds for this algorithm. The essential element in our analysis is the relation between the generalization error and a weighted cumulative error, which we define in the paper. For both types of step sizes, we show that, although the algorithm involves no explicit RKHS regularization term, choosing the step sizes appropriately yields error rates competitive with those of both offline and online regularization algorithms in the literature. In particular, we show that our error rates are optimal, up to a logarithmic factor, in the capacity-independent sense. This is joint work with Massimiliano Pontil.
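To give a concrete picture of the algorithm the abstract describes, the following is a minimal sketch of least-squares online gradient descent in an RKHS with a polynomially decaying step size eta_t = eta0 * t^(-theta). The Gaussian kernel, the bandwidth, and the values of eta0 and theta are illustrative assumptions, not choices made in the talk; the sketch only illustrates the update rule, not the generalization analysis.

```python
import numpy as np

def gaussian_kernel(x, z, sigma=1.0):
    # RBF kernel; the bandwidth sigma is an illustrative choice.
    return np.exp(-np.sum((np.asarray(x) - np.asarray(z)) ** 2) / (2 * sigma ** 2))

def online_kernel_gd(X, y, eta0=0.5, theta=0.5, kernel=gaussian_kernel):
    """Least-squares online gradient descent in an RKHS.

    Each sample (x_t, y_t) is seen once; the hypothesis is the kernel
    expansion f(x) = sum_i alpha_i * K(x_i, x).  The update
        f_{t+1} = f_t - eta_t * (f_t(x_t) - y_t) * K(x_t, .)
    adds one new kernel term per step, with no explicit regularization term.
    Returns the support points and expansion coefficients.
    """
    alphas, support = [], []
    for t, (x_t, y_t) in enumerate(zip(X, y), start=1):
        # Evaluate the current hypothesis f_t at x_t.
        f_xt = sum(a * kernel(s, x_t) for a, s in zip(alphas, support))
        eta_t = eta0 * t ** (-theta)  # polynomially decaying step size
        # Gradient of the pointwise squared loss adds one kernel term.
        alphas.append(-eta_t * (f_xt - y_t))
        support.append(x_t)
    return support, alphas

def predict(support, alphas, x, kernel=gaussian_kernel):
    # Evaluate the learned hypothesis at a new point x.
    return sum(a * kernel(s, x) for a, s in zip(alphas, support))
```

Because the step sizes decay polynomially but are not summable, repeated passes over consistent data drive the pointwise residual toward zero; how fast, and how that translates into generalization error, is exactly what the bounds in the talk address.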