Abstract
Semantic vector models have proven their worth in a number of natural language applications whose goals can be accomplished by modelling individual semantic concepts and measuring similarities between them. By comparison, the area of semantic compositionality in these models has so far remained underdeveloped. This will be a crucial hurdle for semantic vector models: in order to play a fuller part in the modelling of human language, these models will need some way of modelling the way in which single concepts are put together to form more complex conceptual structures. This paper explores some of the opportunities for using vector product operations to model compositional phenomena in natural language. These vector operations are all well-known and used in mathematics and physics, particularly in quantum mechanics. Instead of designing new vector composition operators, this paper gathers a list of existing operators, and a list of typical composition operations in natural language, and describes two small experiments that begin to investigate the use of certain vector operators to model certain language phenomena. Though preliminary, our results are encouraging. It is our hope that these results, and the gathering of other untested semantic and vector compositional challenges into a single paper, will stimulate further research in this area.
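The vector composition operators surveyed in this line of work typically include addition, pointwise (Hadamard) multiplication, and the tensor (outer) product. As a minimal sketch, assuming small toy word vectors in a hypothetical 3-dimensional semantic space (the vectors and words here are illustrative, not from the paper's experiments):

```python
import numpy as np

# Hypothetical toy word vectors in a 3-dimensional semantic space.
dog = np.array([0.8, 0.1, 0.3])
runs = np.array([0.2, 0.9, 0.4])

# Vector addition: a simple, order-insensitive composition.
added = dog + runs

# Pointwise (Hadamard) product: emphasizes dimensions shared by both words.
pointwise = dog * runs

# Tensor (outer) product: preserves word order and relational structure,
# at the cost of higher dimensionality (3x3 here instead of 3).
tensor = np.outer(dog, runs)

def cosine(a, b):
    """Cosine similarity, the standard way to compare composed vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(added)         # a 3-dimensional composed vector
print(pointwise)     # a 3-dimensional composed vector
print(tensor.shape)  # (3, 3)
```

Note the trade-off the sketch makes visible: addition and pointwise multiplication keep the dimensionality fixed but lose word order, while the tensor product distinguishes "dog runs" from "runs dog" at the cost of quadratic growth in dimensions.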
Publication Info
- Year: 2008
- Type: article
- Citations: 96
- Access: Closed