Date of Original Version

7-2010

Type

Conference Proceeding

Journal Title

Proceedings of the Annual Meeting of the Association for Computational Linguistics

First Page

1502

Last Page

1511

Rights Management

Copyright 2010 ACL

Abstract or Description

We consider the search for a maximum likelihood assignment of hidden derivations and grammar weights for a probabilistic context-free grammar, the problem approximately solved by “Viterbi training.” We show that solving and even approximating Viterbi training for PCFGs is NP-hard. We motivate the use of uniform-at-random initialization for Viterbi EM as an optimal initializer in the absence of further information about the correct model parameters, providing an approximate bound on the log-likelihood.

Creative Commons License

Creative Commons Attribution-Noncommercial-Share Alike 3.0 License
This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 License.

Published In

Proceedings of the Annual Meeting of the Association for Computational Linguistics, 1502-1511.