Analogy Training Multilingual Encoders

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


  • Nicolas Garneau
  • Mareike Hartmann
  • Anders Sandholm
  • Sebastian Ruder
  • Ivan Vulić
  • Anders Søgaard
Language encoders encode words and phrases in ways that capture their local semantic relatedness, but are known to be globally inconsistent. Global inconsistency can seemingly be corrected for, in part, by leveraging signals from knowledge bases, but previous results are partial and limited to monolingual English encoders. We extract a large-scale multilingual, multi-word analogy dataset from Wikidata for diagnosing and correcting for global inconsistencies, and then implement a four-way Siamese BERT architecture for grounding multilingual BERT (mBERT) in Wikidata through analogy training. We show that analogy training not only improves the global consistency of mBERT and the isomorphism of language-specific subspaces, but also leads to consistent gains on downstream tasks such as bilingual dictionary induction and sentence retrieval.
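
The abstract mentions a four-way Siamese BERT architecture trained on Wikidata analogies. As a rough illustration of what such a setup can look like, the sketch below encodes the four members of an analogy quadruple (a : b :: c : d) with a single shared mBERT instance and penalizes the mismatch between the relation offsets (b - a) and (d - c). The mean-pooling, the MSE offset loss, and the class and function names are assumptions made for this sketch, not details taken from the paper.

```python
# Illustrative sketch of a four-way Siamese analogy objective over mBERT.
# The pooling strategy and the offset-matching MSE loss are assumptions
# for this example, not the paper's exact implementation.
from torch import nn
from transformers import AutoModel, AutoTokenizer

class SiameseAnalogyEncoder(nn.Module):
    def __init__(self, model_name: str = "bert-base-multilingual-cased"):
        super().__init__()
        # One shared encoder serves as all four "towers" of the Siamese setup.
        self.encoder = AutoModel.from_pretrained(model_name)

    def embed(self, batch):
        # Mean-pool token states over the attention mask -> one vector per phrase.
        hidden = self.encoder(**batch).last_hidden_state
        mask = batch["attention_mask"].unsqueeze(-1).float()
        return (hidden * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

    def forward(self, a, b, c, d):
        ea, eb, ec, ed = (self.embed(x) for x in (a, b, c, d))
        # Analogy consistency: the a->b offset should match the c->d offset.
        return nn.functional.mse_loss(eb - ea, ed - ec)

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = SiameseAnalogyEncoder()

def encode(phrases):
    return tokenizer(phrases, padding=True, return_tensors="pt")

# Toy quadruple: Paris : France :: Berlin : Germany
loss = model(encode(["Paris"]), encode(["France"]),
             encode(["Berlin"]), encode(["Germany"]))
loss.backward()  # gradients flow into the shared mBERT weights
```

Sharing the same encoder weights across all four inputs is what makes the arrangement Siamese: gradients from every phrase in the quadruple update mBERT itself rather than a separate projection layer.
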
Original language: English
Title of host publication: Proceedings of the 35th AAAI Conference on Artificial Intelligence
Number of pages: 10
Publisher: AAAI Press
Publication date: 2021
Pages: 12884-12892
ISBN (Electronic): 978-1-57735-866-4
Publication status: Published - 2021
Event: 35th AAAI Conference on Artificial Intelligence - Virtual
Duration: 2 Feb 2021 - 9 Feb 2021

Conference

Conference: 35th AAAI Conference on Artificial Intelligence
City: Virtual
Period: 02/02/2021 - 09/02/2021
Series: Proceedings of the AAAI Conference on Artificial Intelligence
Number: 14
Volume: 35
ISSN: 2374-3468


ID: 300671526