Abstract

This paper addresses the problem of mapping natural language sentences to lambda-calculus encodings of their meaning. We describe a learning algorithm that takes as input a training set of sentences labeled with expressions in the lambda calculus. The algorithm induces a grammar for the problem, along with a log-linear model that represents a distribution over syntactic and semantic analyses conditioned on the input sentence. We apply the method to the task of learning natural language interfaces to databases and show that the learned parsers outperform previous methods in two benchmark database domains.
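The log-linear model mentioned in the abstract defines a conditional distribution over candidate syntactic/semantic analyses of a sentence. As a minimal sketch (not the authors' implementation), the score of each candidate is a weighted sum of features, normalized with a softmax; the feature vectors and weights below are illustrative stand-ins:

```python
import math

# Toy log-linear model over candidate analyses y for a sentence x:
#   p(y | x) = exp(theta . f(x, y)) / sum_y' exp(theta . f(x, y'))
# Feature vectors f(x, y) and weights theta are hypothetical.

def log_linear_dist(feature_vectors, theta):
    """Return normalized probabilities for candidate analyses."""
    scores = [sum(t * f for t, f in zip(theta, fv)) for fv in feature_vectors]
    m = max(scores)                                # subtract max for stability
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

# Two hypothetical candidate parses, each with two features.
candidates = [[1.0, 0.0], [0.0, 1.0]]
theta = [2.0, 1.0]
probs = log_linear_dist(candidates, theta)         # higher-scoring parse gets more mass
```

Training would adjust `theta` so that, for each sentence in the labeled data, the analysis yielding the annotated lambda-calculus expression receives high probability.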

Keywords

Combinatory categorial grammar, Computer science, Natural language processing, Artificial intelligence, Categorial grammar, Parsing, Sentence, Natural language, Logical form, Probabilistic logic, Rule-based machine translation, Task (project management), Benchmark (surveying), Grammar induction, Link grammar, Grammar, Set (abstract data type), Definite clause grammar, Context-sensitive grammar, Attribute grammar, Mildly context-sensitive grammar formalism, Context-free grammar, Programming language, Linguistics, Generative grammar, Emergent grammar


Publication Info

Year: 2012
Type: article
Pages: 658-666
Citations: 791
Access: Closed


Citation Metrics

791 citations (OpenAlex)

Cite This

Luke Zettlemoyer, Michael J. Collins (2012). Learning to Map Sentences to Logical Form: Structured Classification with Probabilistic Categorial Grammars. arXiv (Cornell University), 658-666. https://doi.org/10.48550/arxiv.1207.1420

Identifiers

DOI
10.48550/arxiv.1207.1420