Date of Original Version

7-2011

Type

Article

Journal Title

Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP)

First Page

238

Last Page

249

Rights Management

Copyright 2011 ACL

Abstract or Description

Dual decomposition has recently been proposed as a way of combining complementary models, with a boost in predictive power. However, in cases where lightweight decompositions are not readily available (e.g., due to the presence of rich features or logical constraints), the original subgradient algorithm is inefficient. We sidestep that difficulty by adopting an augmented Lagrangian method that accelerates model consensus by regularizing towards the averaged votes. We show how first-order logical constraints can be handled efficiently, even though the corresponding subproblems are no longer combinatorial, and report experiments in dependency parsing, with state-of-the-art results.
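The core idea in the abstract, accelerating consensus among subproblems by regularizing each one toward the averaged votes, can be sketched with a toy consensus ADMM loop (an augmented Lagrangian method). The quadratic local objectives f_i(x) = (x - a_i)^2 / 2 below are illustrative stand-ins, not the paper's combinatorial subproblems, and the step size `rho` and iteration count are arbitrary choices:

```python
def consensus_admm(targets, rho=1.0, iters=100):
    """Toy consensus ADMM: minimize sum_i (x - targets[i])^2 / 2
    subject to all local copies agreeing on a single value."""
    n = len(targets)
    z = 0.0            # global consensus variable (the "averaged vote")
    x = [0.0] * n      # local copies, one per subproblem
    u = [0.0] * n      # scaled dual variables enforcing agreement
    for _ in range(iters):
        # Local step: each subproblem minimizes its own objective plus a
        # quadratic penalty pulling it toward the current consensus.
        for i in range(n):
            x[i] = (targets[i] + rho * (z - u[i])) / (1.0 + rho)
        # Consensus step: average the dual-corrected local votes.
        z = sum(x[i] + u[i] for i in range(n)) / n
        # Dual step: accumulate the remaining disagreement.
        for i in range(n):
            u[i] += x[i] - z
    return z

print(consensus_admm([1.0, 3.0]))  # converges to the average, 2.0
```

The quadratic penalty in the local step is what distinguishes this from plain subgradient-based dual decomposition: local solutions are pulled toward the running average at every iteration rather than only through slowly-updated multipliers.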

Creative Commons License

Creative Commons Attribution-Noncommercial-Share Alike 3.0 License
This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 License.

Published In

Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP), 238-249.