Abstract
A method is given to learn and represent similarity with linear operators
in kernel-induced Hilbert spaces. Transferring error bounds for vector-valued
large-margin classifiers to the setting of Hilbert-Schmidt operators
leads to dimension-free bounds on a risk functional for linear
representations and motivates a regularized objective functional.
The objective is minimized by stochastic gradient descent.
The resulting representations are tested on transfer problems in image
processing, involving plane and spatial geometric invariants, handwritten
characters, and face recognition.
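As a hedged sketch of the kind of objective meant here (the loss $\ell$, the regularization weight $\lambda$, the similarity score $s_T$, and the pairwise encoding of similarity below are illustrative assumptions, not the paper's exact definitions), a regularized large-margin objective over Hilbert-Schmidt operators may take the form
\[
\min_{T \in \mathrm{HS}(H)} \; \frac{1}{m} \sum_{i=1}^{m} \ell\!\left( y_i \, s_T(x_i, x_i') \right) \; + \; \lambda \, \|T\|_{\mathrm{HS}}^{2},
\]
where $H$ is the kernel-induced Hilbert space, the pairs $(x_i, x_i')$ carry labels $y_i \in \{\pm 1\}$ marking them as similar or dissimilar, $s_T$ is an operator-dependent similarity score such as $\langle T x_i, T x_i' \rangle$, $\ell$ is a margin loss such as the hinge loss $\ell(t) = \max(0, 1 - t)$, and $\|T\|_{\mathrm{HS}}$ is the Hilbert-Schmidt norm whose control underlies the dimension-free bounds.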