Cross-Language Transformer Adaptation for Frequently Asked Questions

Document record

Date: 3 September 2021

Collection: OpenEdition Books

Organisation: OpenEdition

Licences: https://www.openedition.org/12554, info:eu-repo/semantics/openAccess




Cite this document

Luca Di Liello et al., "Cross-Language Transformer Adaptation for Frequently Asked Questions", Accademia University Press, ID: 10.4000/books.aaccademia.8463



Abstract

Transfer learning has proven to be effective, especially when data for the target domain/task is scarce. Sometimes data for a similar task is available only in another language, because the task may be very domain-specific. In this paper, we explore the use of machine-translated data to transfer models to a related domain. Specifically, we transfer models from the question duplication task (QDT) to similar FAQ selection tasks. The source domain is the well-known English Quora dataset, while the target domain is a collection of small Italian datasets from real-world scenarios, consisting of FAQ groups retrieved by pivoting on common answers. Our results show large improvements in the zero-shot learning setting and modest improvements with the standard transfer approach for direct in-domain adaptation.
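The abstract outlines the overall recipe: fine-tune a transformer on English Quora question-duplication pairs, then apply it, zero-shot or with further in-domain fine-tuning, to FAQ selection over machine-translated data. Below is a minimal sketch of that pipeline using a Hugging Face cross-encoder; the multilingual backbone, example pairs, and hyper-parameters are illustrative assumptions, not the authors' actual configuration.

# Hypothetical sketch: train a sentence-pair classifier on the Quora
# question-duplication task, then score machine-translated FAQ pairs zero-shot.
# Backbone, examples, and hyper-parameters are placeholders.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "bert-base-multilingual-cased"  # assumed backbone
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

def encode_pairs(first, second):
    # Tokenize sentence pairs for the cross-encoder.
    return tokenizer(first, second, truncation=True, padding=True, return_tensors="pt")

# Source domain: Quora duplicate-question pairs (label 1 = duplicate).
quora_batch = encode_pairs(
    ["How do I learn Python quickly?"],
    ["What is the fastest way to learn Python?"],
)
labels = torch.tensor([1])

model.train()
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
optimizer.zero_grad()
loss = model(**quora_batch, labels=labels).loss
loss.backward()
optimizer.step()

# Target domain (zero-shot): a machine-translated user query paired with an
# FAQ question that shares the same answer.
model.eval()
with torch.no_grad():
    faq_batch = encode_pairs(
        ["How can I reset my account password?"],       # translated user query
        ["What should I do if I forgot my password?"],  # candidate FAQ question
    )
    scores = model(**faq_batch).logits.softmax(dim=-1)[:, 1]
print("FAQ match probability:", scores.tolist())

In practice the trained model would rank all FAQ questions in a group for each incoming query and return the highest-scoring match; the single forward/backward pass above only illustrates the data flow.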
