Abstract
This paper describes a system for extracting typed dependency parses of English sentences from phrase structure parses. In order to capture inherent relations occurring in corpus texts that can be critical in real-world applications, many NP relations are included in the set of grammatical relations used. We provide a comparison of our system with Minipar and the Link parser. The typed dependency extraction facility described here is integrated in the Stanford Parser, available for download.
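Because the abstract notes that the extraction facility is bundled with the Stanford Parser, the sketch below shows one way to drive it from Java. This is an illustrative sketch, not part of the paper: the class and method names (LexicalizedParser.loadModel, SentenceUtils.toCoreLabelList, typedDependenciesCCprocessed, the englishPCFG.ser.gz model path, and the TypedDepsDemo class itself) follow recent 3.x Stanford Parser releases and may differ in other versions.

```java
import java.util.Collection;
import java.util.List;

import edu.stanford.nlp.ling.CoreLabel;
import edu.stanford.nlp.ling.SentenceUtils;
import edu.stanford.nlp.parser.lexparser.LexicalizedParser;
import edu.stanford.nlp.trees.GrammaticalStructure;
import edu.stanford.nlp.trees.GrammaticalStructureFactory;
import edu.stanford.nlp.trees.Tree;
import edu.stanford.nlp.trees.TreebankLanguagePack;
import edu.stanford.nlp.trees.TypedDependency;

public class TypedDepsDemo {
  public static void main(String[] args) {
    // Load the English PCFG grammar shipped with the parser distribution.
    LexicalizedParser lp =
        LexicalizedParser.loadModel("edu/stanford/nlp/models/lexparser/englishPCFG.ser.gz");

    // Produce a phrase structure parse for a pre-tokenized sentence.
    String[] sent = {"Bell", "makes", "and", "distributes", "electronic", "products", "."};
    List<CoreLabel> words = SentenceUtils.toCoreLabelList(sent);
    Tree phraseStructure = lp.apply(words);

    // Convert the phrase structure tree into typed dependencies,
    // with coordination collapsed and propagated (the "CCprocessed" variant).
    TreebankLanguagePack tlp = lp.treebankLanguagePack();
    GrammaticalStructureFactory gsf = tlp.grammaticalStructureFactory();
    GrammaticalStructure gs = gsf.newGrammaticalStructure(phraseStructure);
    Collection<TypedDependency> deps = gs.typedDependenciesCCprocessed();

    // Each dependency prints as relation(governor-index, dependent-index),
    // e.g. nsubj(makes-2, Bell-1).
    for (TypedDependency td : deps) {
      System.out.println(td);
    }
  }
}
```

The distribution's command-line front end exposes the same output through the -outputFormat "typedDependencies" option, which can be more convenient for batch processing of plain-text files.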
Publication Info
- Year: 2006
- Type: article
- Pages: 449-454