Date of Original Version

12-1999

Type

Conference Proceeding

Abstract or Description

Whole-sentence exponential language models directly model the probability of an entire sentence using arbitrary computable properties of that sentence. We present an interactive methodology for feature induction, and demonstrate it in the simple but common case of a trigram baseline, focusing on features that capture the linguistic notion of semantic coherence. We then show how parametric regression can be used in this setup to efficiently estimate the model's parameters, and how non-parametric regression can be used to construct more powerful exponential models from the raw features.
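
The abstract does not spell out the model form; as a rough illustration only, the sketch below (Python, with invented feature and weight values) scores a whole sentence by adding weighted sentence-level features to a baseline trigram log-probability, following the standard whole-sentence exponential form p(s) proportional to p0(s) exp(sum_i lambda_i f_i(s)). Every name and number here is hypothetical, not taken from the paper.

    # Hypothetical whole-sentence feature f_i(s): maps a sentence (a list
    # of words) to a real value, e.g. a crude proxy for semantic coherence
    # via repeated content words. Purely illustrative.
    def count_repeated_content_words(sentence):
        content = [w for w in sentence if len(w) > 3]
        return sum(1 for w in set(content) if content.count(w) > 1)

    features = [count_repeated_content_words]   # illustrative feature set
    lambdas  = [0.7]                            # one invented weight per feature

    def log_score(sentence, baseline_logprob):
        # Unnormalized log-probability of the whole sentence:
        # log p0(s) + sum_i lambda_i * f_i(s).
        # The global normalizer Z is not computed; it cancels when
        # candidate sentences are compared or reranked.
        return baseline_logprob + sum(l * f(sentence)
                                      for l, f in zip(lambdas, features))

    # Usage: rerank recognizer hypotheses, assuming each comes with a
    # log-probability from the baseline trigram model.
    candidates = [(["the", "cat", "sat"], -12.3),
                  (["the", "cat", "cat"], -13.1)]
    best = max(candidates, key=lambda c: log_score(*c))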

Published In

Proceedings of the IEEE Workshop on Automatic Speech Recognition and Understanding.