Technical Report


All Rights Reserved

Abstract

Quite a few interesting experiments have been done applying neural networks to natural language tasks. Without detracting from the value of these early investigations, this paper argues that current neural network architectures are too weak to solve anything but toy language problems. Their downfall is the need for "dynamic inference," in which several pieces of information not previously seen together are dynamically combined to derive the meaning of a novel input. The first half of the paper defines a hierarchy of classes of connectionist models, from categorizers and associative memories to pattern transformers and dynamic inferencers. Some well-known connectionist models that deal with natural language are shown to be either categorizers or pattern transformers. The second half examines in detail a particular natural language problem: prepositional phrase attachment. Attaching a PP to an NP changes its meaning, thereby influencing other attachments. So PP attachment requires compositional semantics, and compositionality in non-toy domains requires dynamic inference. Mere pattern transformers cannot learn the PP attachment task without an exponential training set. Connectionist-style computation still has many valuable ideas to offer, so this is not an indictment of connectionism's potential. It is an argument for a more sophisticated and more symbolic connectionist approach to language.