Date of Original Version

8-2013

Type

Conference Proceeding

Rights Management

© 2013 Association for Computational Linguistics

Abstract or Description

We present vector space semantic parsing (VSSP), a framework for learning compositional models of vector space semantics. Our framework uses Combinatory Categorial Grammar (CCG) to define a correspondence between syntactic categories and semantic representations, which are vectors and functions on vectors. The complete correspondence is a direct consequence of minimal assumptions about the semantic representations of basic syntactic categories (e.g., nouns are vectors), and CCG’s tight coupling of syntax and semantics. Furthermore, this correspondence permits nonuniform semantic representations and more expressive composition operations than previous work. VSSP builds a CCG semantic parser respecting this correspondence; this semantic parser parses text into lambda calculus formulas that evaluate to vector space representations. In these formulas, the meanings of words are represented by parameters that can be trained in a task-specific fashion. We present experiments using noun-verb-noun and adverb-adjective-noun phrases which demonstrate that VSSP can learn composition operations that RNN (Socher et al., 2011) and MV-RNN (Socher et al., 2012) cannot.
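As an illustrative sketch of the kind of category-to-representation correspondence the abstract describes (not the paper's actual parameterization), one can picture nouns as vectors, adjectives (category N/N) as matrices acting on noun vectors, and transitive verbs ((S\N)/N) as bilinear maps over subject and object vectors. All names, dimensions, and the random parameters below are hypothetical:

```python
import numpy as np

np.random.seed(0)
d = 4  # illustrative dimensionality of noun vectors

# Nouns (category N) are vectors.
dog = np.random.randn(d)
ball = np.random.randn(d)

# An adjective (N/N) is a function from vectors to vectors;
# here sketched as a matrix whose entries would be trained parameters.
big = np.random.randn(d, d)

def apply_adjective(adj, noun):
    # (N/N) applied to N yields N: a vector of the same dimension.
    return adj @ noun

# A transitive verb ((S\N)/N) maps two noun vectors to a sentence vector;
# here sketched as a bilinear map given by a rank-3 parameter tensor.
chases = np.random.randn(d, d, d)

def apply_verb(verb, subj, obj):
    # sentence[i] = sum over j, k of verb[i, j, k] * subj[j] * obj[k]
    return np.einsum('ijk,j,k->i', verb, subj, obj)

# "big dog chases ball": compose per the CCG derivation order.
sentence = apply_verb(chases, apply_adjective(big, dog), ball)
```

In VSSP these word-level parameters (the matrix and tensor above) would be represented inside lambda calculus formulas and trained in a task-specific fashion rather than fixed at random.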

Published In

Proceedings of the Workshop on Continuous Vector Space Models and their Compositionality, 1-10.