Deriving Boolean Structures from Distributional Vectors
Abstract
Corpus-based distributional semantic models capture degrees of semantic relatedness among the words of very large vocabularies, but have problems with logical phenomena such as entailment, which are instead elegantly handled by model-theoretic approaches; the latter, in turn, do not scale up.
We combine the advantages of the two views by inducing a mapping from distributional vectors of words (or sentences) into a Boolean structure of the kind in which natural language terms are assumed to denote. We evaluate this Boolean Distributional Semantic Model (BDSM) on recognizing entailment between words and sentences. The method achieves results comparable to a state-of-the-art SVM, degrades more gracefully when less training data is available, and displays interesting qualitative properties.
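To make the core idea concrete, the following is a minimal sketch, not the paper's actual model: it assumes a (here untrained, randomly initialized) linear map `W` that sends distributional vectors into a low-dimensional space, thresholds the result to obtain Boolean vectors, and tests entailment as feature inclusion. The map, dimensions, and inclusion convention are illustrative assumptions; in the paper the mapping is learned from entailment data.

```python
import numpy as np

def to_boolean(v, W, threshold=0.0):
    """Map a distributional vector to a Boolean vector via a linear map
    followed by thresholding (illustrative assumption, not the paper's
    exact training objective)."""
    return (W @ v) > threshold

def entails(p, q):
    """One natural inclusion convention: p entails q iff every active
    feature of q is also active in p (bitwise q => p)."""
    return bool(np.all(~q | p))

# Toy example with a hypothetical random (untrained) map.
rng = np.random.default_rng(0)
W = rng.standard_normal((8, 50))       # 50-dim vectors -> 8 Boolean features
dog = rng.standard_normal(50)
animal = dog + 0.1 * rng.standard_normal(50)  # a vector close to 'dog'

b_dog, b_animal = to_boolean(dog, W), to_boolean(animal, W)
print(entails(b_dog, b_animal))
```

With a trained map, related word pairs would land on Boolean vectors whose inclusion pattern mirrors lexical entailment; with the random map above, the output is of course arbitrary.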
Full Text:
PDF (presented at EMNLP 2015)
Copyright (c) 2015 Association for Computational Linguistics

This work is licensed under a Creative Commons Attribution 4.0 International License.