Title: Enriching the Transformer with Linguistic and Semantic Factors for Low-Resource Machine Translation

Abstract: Introducing factors, i.e., word features such as linguistic information associated with the source tokens, is known to improve the results of neural machine translation systems in certain settings, typically in recurrent architectures. This study proposes enhancing the current state-of-the-art neural machine translation architecture, the Transformer, so that it can incorporate external knowledge. In particular, our proposed modification, the Factored Transformer, uses factors, either linguistic or semantic, to inject additional knowledge into the machine translation system. Apart from using different kinds of features, we study the effect of different architectural configurations. Specifically, we analyze the performance of combining words and features at the embedding level or at the encoder level, and we experiment with two different combination strategies. With the best-found configuration, we show improvements of 0.8 BLEU over the baseline Transformer in the IWSLT German-to-English task. Moreover, we experiment with the more challenging FLoRes English-to-Nepali benchmark, which includes both an extremely low-resourced and a very distant language pair, and obtain an improvement of 1.2 BLEU. These improvements are achieved with linguistic, not semantic, information.
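The abstract mentions combining words and features via two different combination strategies but does not name them here; a minimal sketch below illustrates two commonly used options for merging per-token word and factor embeddings, summation and concatenation. The function name and dimensions are hypothetical and not taken from the paper.

```python
import numpy as np


def combine_embeddings(word_emb, factor_emb, strategy="concat"):
    """Combine per-token word and factor embeddings (illustrative sketch).

    word_emb:   (seq_len, d_word) array of word embeddings
    factor_emb: (seq_len, d_factor) array of factor embeddings,
                e.g. embeddings of POS tags or semantic classes
    strategy:   "concat" joins the two vectors per token;
                "sum" adds them (requires d_word == d_factor)
    """
    if strategy == "concat":
        return np.concatenate([word_emb, factor_emb], axis=-1)
    if strategy == "sum":
        if word_emb.shape != factor_emb.shape:
            raise ValueError("sum strategy needs matching dimensions")
        return word_emb + factor_emb
    raise ValueError(f"unknown strategy: {strategy}")


# Toy example: 3 tokens, 4-dim word embeddings, 2-dim factor embeddings.
rng = np.random.default_rng(0)
words = rng.normal(size=(3, 4))
factors = rng.normal(size=(3, 2))
print(combine_embeddings(words, factors, "concat").shape)  # (3, 6)
```

Concatenation keeps the word and factor subspaces separate at the cost of a larger model dimension, while summation keeps the dimension fixed but forces both signals to share the same space; which trade-off wins is exactly the kind of configuration question the paper studies empirically.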
Subjects: Computation and Language (cs.CL)
ACM classes: I.2.7
Cite as: arXiv:2004.08053 [cs.CL]
  (or arXiv:2004.08053v1 [cs.CL] for this version)

Submission history

From: Marta R. Costa-jussà [view email]
[v1] Fri, 17 Apr 2020 03:40:13 UTC (93 KB)