Date of Original Version

10-2010

Type

Conference Proceeding

Rights Management

© 2010 Association for Computational Linguistics

Abstract or Description

We present a unified view of two state-of-the-art non-projective dependency parsers, both approximate: the loopy belief propagation parser of Smith and Eisner (2008) and the relaxed linear program of Martins et al. (2009). By representing the model assumptions with a factor graph, we shed light on the optimization problems tackled in each method. We also propose a new aggressive online algorithm to learn the model parameters, which makes use of the underlying variational representation. The algorithm does not require a learning rate parameter and provides a single framework for a wide family of convex loss functions, including CRFs and structured SVMs. Experiments show state-of-the-art performance for 14 languages.

Creative Commons


This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 4.0 License.

Published In

Proceedings of the 2010 Conference on Empirical Methods in Natural Language Processing, 34-44.