In this talk I present some recent developments in learning theory, a fairly new discipline at the border between computer science and statistics. Some special features differentiate learning theory from more classical statistical approaches, for example a systematic, quantitative assessment of the role played by random sampling, or the study of the whole distribution of errors rather than just the expectation. From the computational point of view, on the other hand, the effort goes into developing efficient nonparametric techniques able to handle modern high-dimensional data (such as high-throughput genomic technologies or high-resolution images, to name a few).
Besides the more classical approach based on empirical process theory (Vapnik '96), the work of Girosi and Poggio and, more recently, Cucker and Smale suggested a functional-analytic approach which emphasizes the connections between learning theory and other branches of mathematics such as approximation theory, sampling theory, signal processing, and inverse problems.
The latter connection has been a main focus of our study here in Genova, and in this talk, after a general introduction, I'll summarize the main outcomes of our work from both a theoretical and an algorithmic point of view.
The work I'll present was done over the years in collaboration with Yuan Yao, Alessandro Verri, Michele Piana, Sergei Pereverzev, Francesca Odone, Sofia Mosci, Laura Lo Gerfo, Ernesto De Vito, Christine De Mol, Andrea Caponnetto, and Frank Bauer.