Congress
Author/s
  • Pablo Gamallo, Manuel Prada Corral, Marcos Garcia
Source
  • 13th International Conference on Agents and Artificial Intelligence, Portugal, 2021

Comparing Dependency-based Compositional Models with Contextualized Word Embeddings

In this article, we compare two strategies for contextualizing the meaning of words in a sentence: distributional models that rely on syntax-based methods following the Principle of Compositionality, and Transformer architectures such as BERT-like models. As the former methods require controlled syntactic structures, the two approaches are compared on datasets of syntactically fixed sentences, namely subject-predicate and subject-predicate-object expressions. The results show that syntax-based compositional approaches working with syntactic dependencies are competitive with neural Transformer models, and may have greater potential when trained and developed with the same resources.
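
The following is a minimal, illustrative sketch (not the paper's actual models or datasets) of the two strategies on a subject-predicate expression. It assumes the HuggingFace transformers library and the bert-base-uncased checkpoint; the compositional side is approximated here by an element-wise product of static (non-contextual) input embeddings, standing in for the dependency-based composition functions evaluated in the paper.

import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()
cos = torch.nn.CosineSimilarity(dim=0)

def contextual_vector(sentence, target):
    """Last-layer contextualized embedding of `target` within `sentence`."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state.squeeze(0)  # (tokens, dim)
    target_ids = tokenizer(target, add_special_tokens=False)["input_ids"]
    tokens = enc["input_ids"].squeeze(0).tolist()
    for i in range(len(tokens) - len(target_ids) + 1):
        if tokens[i:i + len(target_ids)] == target_ids:
            # Average over the target's subword pieces.
            return hidden[i:i + len(target_ids)].mean(dim=0)
    raise ValueError(f"{target!r} not found in {sentence!r}")

def static_vector(word):
    """Non-contextual input embedding of `word` (averaged over subwords)."""
    ids = tokenizer(word, add_special_tokens=False, return_tensors="pt")["input_ids"]
    with torch.no_grad():
        return model.get_input_embeddings()(ids).squeeze(0).mean(dim=0)

# Transformer strategy: the verb's vector is contextualized by its subject.
v_bird = contextual_vector("The bird flies.", "flies")
v_plane = contextual_vector("The plane flies.", "flies")
print(f"Contextualized 'flies' similarity: {cos(v_bird, v_plane).item():.3f}")

# Toy compositional strategy: combine static subject and predicate vectors
# (the paper's models instead use dependency-specific composition functions).
c_bird = static_vector("bird") * static_vector("flies")
c_plane = static_vector("plane") * static_vector("flies")
print(f"Composed subject-predicate similarity: {cos(c_bird, c_plane).item():.3f}")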
Keywords: Compositional Distributional Models, Contextualized Word Embeddings, Transformers, Compositionality, Dependency-based Parsing.