Conference paper · Year: 2022

About Time: Do Transformers Learn Temporal Verbal Aspect?

Abstract

Aspect is a linguistic concept that describes how an action, event, or state of a verb phrase is situated in time. In this paper, we explore whether different transformer models are capable of identifying aspectual features. We focus on two specific aspectual features: telicity and duration. Telicity marks whether the verb's action or state has an endpoint or not (telic/atelic), and duration denotes whether a verb expresses an action (dynamic) or a state (stative). These features are integral to the interpretation of natural language, but also hard to annotate and identify with NLP methods. We perform experiments in English and French, and our results show that transformer models adequately capture information on telicity and duration in their vectors, even in their non-finetuned forms, but are somewhat biased with regard to verb tense and word order.
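A common way to test whether a feature such as telicity is "captured in the vectors" is a linear probing classifier: freeze the model, extract contextual verb embeddings, and train a simple classifier on them. The sketch below illustrates the idea only; it is not the authors' exact setup. Random vectors with an injected linear signal stand in for real transformer embeddings, and the telic/atelic labels are synthetic.

```python
# Minimal probing-classifier sketch (illustrative, not the paper's pipeline).
# Assumption: contextual verb embeddings have already been extracted from a
# transformer; here, random 768-dim vectors with a weak planted linear
# signal stand in for them.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in "embeddings" and binary telicity labels (1 = telic, 0 = atelic).
n, dim = 200, 768
labels = rng.integers(0, 2, size=n)
direction = rng.normal(size=dim)          # planted separating direction
embeddings = rng.normal(size=(n, dim)) + 2.0 * labels[:, None] * direction

X_train, X_test, y_train, y_test = train_test_split(
    embeddings, labels, test_size=0.25, random_state=0)

# A linear probe: if it beats chance on held-out data, the feature is
# linearly recoverable from the representation.
probe = LogisticRegression(max_iter=1000).fit(X_train, y_train)
accuracy = probe.score(X_test, y_test)
```

With real data, the same recipe applies per feature (telicity, duration) and per layer; comparing probe accuracy across layers and across fine-tuned vs. non-fine-tuned models is the standard diagnostic.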
Main file: Hathout-2022-CMCL.pdf (473.62 KB)
Origin: Publisher files authorized on an open archive

Dates and versions

hal-03699336, version 1 (20-06-2022)


Cite

Eleni Metheniti, Tim van de Cruys, Nabil Hathout. About Time: Do Transformers Learn Temporal Verbal Aspect?. 12th Workshop on Cognitive Modeling and Computational Linguistics (CMCL 2022), May 2022, Dublin, Ireland. pp.88-101, ⟨10.18653/v1/2022.cmcl-1.10⟩. ⟨hal-03699336⟩
117 views
65 downloads
