
Conference Proceeding

Rights Management

Copyright 2011 by the authors.

Abstract or Description

Training structured predictors often requires considerable time selecting features or tweaking the kernel. Multiple kernel learning (MKL) sidesteps this issue by embedding kernel learning into the training procedure. Despite recent progress toward efficient MKL algorithms, the structured-output case remains an open research front. We propose a family of online algorithms able to tackle variants of MKL and group-LASSO, for which we show regret, convergence, and generalization bounds. Experiments on handwriting recognition and dependency parsing attest to the success of the approach.
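The paper's specific algorithms are not reproduced here, but a standard building block in online methods for group-LASSO and MKL is the group soft-thresholding proximal operator, which zeroes out entire groups (e.g., entire kernels) whose weight-block norm is small. A minimal sketch, with the step rule, names, and step size `eta` being illustrative assumptions rather than the authors' exact method:

```python
import numpy as np

def prox_group_lasso(w, groups, lam):
    """Block soft-thresholding: the proximal operator of the
    group-LASSO penalty lam * sum_g ||w_g||_2.
    `groups` is a list of index arrays, one per group
    (in MKL, one block of weights per kernel)."""
    w = w.copy()
    for g in groups:
        norm = np.linalg.norm(w[g])
        # Shrink the whole group toward zero; drop it entirely when
        # its norm is below lam -- this is what induces kernel sparsity.
        w[g] *= max(0.0, 1.0 - lam / norm) if norm > 0 else 0.0
    return w

def online_step(w, grad, groups, lam, eta):
    """One online proximal-(sub)gradient step: a gradient step on the
    current example's loss, followed by the group prox (hypothetical
    update rule for illustration)."""
    return prox_group_lasso(w - eta * grad, groups, eta * lam)
```

For example, with `w = [3, 4, 0.1, 0]`, groups `{0,1}` and `{2,3}`, and `lam = 1`, the first group (norm 5) is scaled by `1 - 1/5 = 0.8` while the second group (norm 0.1) is zeroed out entirely.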



Published In

Journal of Machine Learning Research: Workshop and Conference Proceedings, 15, 507-515.