This article describes a method for building semantic representations of composite expressions compositionally, using WordNet relations to represent word meaning. The meaning of a target word is modelled as a vector in which its semantically related words are assigned weights according to both the type of relation and its distance from the target word.
Word vectors are combined compositionally according to syntactic dependencies. Each syntactic dependency triggers two complementary compositional functions, named the head function and the dependent function. The experiments show that
the proposed compositional method performs on a par with the state of the art for subject-verb expressions, and clearly outperforms the best system for transitive subject-verb-object constructions.
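As a minimal sketch of the idea described above (not the authors' implementation), the snippet below builds a word vector over semantically related words, weighting each by an assumed relation-type weight divided by its distance, and combines two vectors with a simple additive compositional function. The relation graph, the weight table, and the additive combination are all illustrative assumptions.

```python
from collections import defaultdict

# Assumed weights per relation type (illustrative values only).
REL_WEIGHT = {"synonym": 1.0, "hypernym": 0.8, "hyponym": 0.7}

# Toy relation graph standing in for WordNet:
# word -> list of (related_word, relation_type, distance_to_target)
RELATIONS = {
    "dog":  [("canine", "hypernym", 1), ("puppy", "hyponym", 1),
             ("animal", "hypernym", 2)],
    "bark": [("yelp", "synonym", 1), ("vocalize", "hypernym", 1)],
}

def word_vector(word):
    """Vector over related words: relation-type weight decayed by distance."""
    vec = defaultdict(float)
    for rel_word, rel_type, dist in RELATIONS.get(word, []):
        vec[rel_word] += REL_WEIGHT[rel_type] / dist
    return dict(vec)

def compose(head_vec, dep_vec):
    """A hypothetical compositional function: additive combination over
    the union of dimensions (the paper's head/dependent functions are
    more refined; this only illustrates the vector-combination step)."""
    dims = set(head_vec) | set(dep_vec)
    return {d: head_vec.get(d, 0.0) + dep_vec.get(d, 0.0) for d in dims}

# Example: compose a subject-verb pair such as "dog bark".
sv = compose(word_vector("dog"), word_vector("bark"))
```

Under these toy weights, `word_vector("dog")` assigns `canine` a weight of 0.8 and `animal` 0.4 (the hypernym weight halved by distance 2), so nearer relations dominate the representation.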
Keywords: distributional similarity, compositional semantics, WordNet, dependencies, natural language processing