Statements of natural language describe a world, be it real or imaginary. Besides referring to things, actions, and so on, words also evoke a concept, a shorthand summary of preceding descriptions. Predicate logic captures the denoting content of words, whereas the vector semantics of information retrieval aims at the conceptual content.
This talk gives a rather naive account of how the two semantics interact.
The first step is a formulation of predicate logic in the category of finite-dimensional vector spaces over the field of real numbers. The elements and sets of predicate logic become vectors, and the predicates become linear maps. The resulting logic is functional and thus compositional. Pregroup grammars provide an efficient algorithm that computes the meaning of a grammatical string from the meanings of the words in the string.
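As an illustration of the compositional idea (a sketch, not the talk's own construction), a transitive sentence can be evaluated by contracting a verb matrix with the subject and object noun vectors. The vocabulary, dimensions, and numbers below are invented for the example:

```python
# Hypothetical 2-dimensional noun space with basis {animal, human}.
# Noun meanings are vectors; a transitive verb is a linear map,
# here a matrix combining subject and object into a sentence value.

def mat_vec(m, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(row[i] * v[i] for i in range(len(v))) for row in m]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

dogs = [1.0, 0.0]     # basis vector 'animal'
cats = [1.0, 0.0]
chase = [[0.9, 0.1],  # invented matrix entries: how strongly
         [0.2, 0.0]]  # row-subjects chase column-objects

# Pregroup reduction of n (n^r s n^l) n: the verb first consumes
# the object vector, then pairs with the subject vector.
sentence_value = dot(dogs, mat_vec(chase, cats))
print(sentence_value)  # -> 0.9
```

The pregroup reduction determines which contractions to perform; the linear algebra then computes the sentence meaning from the word meanings.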
The basis of a concept space consists of the mutually exclusive Boolean combinations of an arbitrarily chosen set of 'primitive concepts'. Vectors with components in the real interval [0,1] stand for concepts. The propositional connectives and a consequence relation are defined by algebraic operators.
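A minimal sketch of such a basis, assuming pointwise operators for the connectives and the componentwise order for consequence (the talk's exact operators may differ):

```python
from itertools import product

# Two invented primitive concepts; the basis consists of the four
# mutually exclusive conjunctions p&q, p&~q, ~p&q, ~p&~q.
primitives = ["animate", "human"]
atoms = list(product([True, False], repeat=len(primitives)))

# A concept is a vector over the atoms with components in [0, 1].
animate = [1.0 if a[0] else 0.0 for a in atoms]
human   = [1.0 if a[1] else 0.0 for a in atoms]

# Assumed algebraic operators for the connectives (pointwise):
def conj(u, v): return [min(a, b) for a, b in zip(u, v)]
def disj(u, v): return [max(a, b) for a, b in zip(u, v)]
def neg(u):     return [1.0 - a for a in u]

# Assumed consequence relation: componentwise order.
def entails(u, v): return all(a <= b for a, b in zip(u, v))

print(entails(conj(animate, human), animate))  # -> True
print(entails(animate, human))                 # -> False
```

With n primitive concepts the basis has 2^n atoms, which is why the primitives can be chosen arbitrarily: every Boolean combination of them is expressible over the atoms.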
The vectors and linear maps associated to words are mapped to concept vectors via the canonical frequency distribution associated to linear predicates. Each concept vector associated to a word thus has a probabilistic interpretation. The map reflects the consequence relation, and under certain conditions it also preserves the consequence relation and the connectives.
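One way to read the frequency distribution, assumed here purely for illustration, is to normalise a 0/1 predicate vector over a finite universe of elements into a probability vector:

```python
# Illustration (assumed semantics): a predicate over a finite universe
# is a 0/1 vector marking which elements satisfy it; dividing by the
# number of satisfying elements yields a probability distribution, so
# the resulting concept vector has a probabilistic reading.

def to_concept(pred):
    total = sum(pred)
    return [x / total for x in pred] if total else list(pred)

bird = [1, 1, 1, 0]      # three of four elements satisfy 'bird'
concept = to_concept(bird)
print(concept)           # components sum to 1
```

Under this reading, a component of the concept vector is the probability that a randomly chosen element satisfying the predicate falls in the corresponding basis concept.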
To conclude, I show that the algebraic consequence relation coincides with the geometrical consequence relation of quantum logic on projectors, and relate conceptual reasoning in natural language semantics to probabilistic reasoning.
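For diagonal projectors the geometrical consequence relation of quantum logic, P ≤ Q iff PQ = P, reduces to inclusion of supports; a small check of this standard formulation (the restriction to diagonal projectors is an assumption made for the example):

```python
# Diagonal projectors represented by their 0/1 diagonals.
def compose(p, q):   # matrix product of two diagonal projectors
    return [a * b for a, b in zip(p, q)]

def leq(p, q):       # quantum-logic order: P <= Q iff PQ = P
    return compose(p, q) == p

P = [1, 0, 0, 0]     # projector onto the first axis
Q = [1, 1, 0, 0]     # projector onto the first two axes
print(leq(P, Q))     # -> True
print(leq(Q, P))     # -> False
```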