On November 11th, Melissa Ailem, Researcher at the Lab, presented Lingua Custodia’s submission to the WMT21 shared task on machine translation using terminologies. This is the third time the Lab has contributed to the conference.
WMT is a major event where the best MT players, both academic and industrial, present their latest findings.
This paper describes Lingua Custodia’s submission to the WMT21 shared task on machine translation using terminologies. We consider three directions, namely English to French, Russian, and Chinese. We rely on a Transformer-based architecture as a building block, and we explore a method that introduces two main changes to the standard procedure to handle terminologies. The first consists of augmenting the training data in such a way as to encourage the model to learn a copy behavior when it encounters terminology constraint terms. The second is constraint token masking, whose purpose is to ease copy behavior learning and to improve model generalization. Empirical results show that our method satisfies most terminology constraints while maintaining high translation quality.
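The two changes described in the abstract can be illustrated with a minimal sketch. The function below, `augment_with_terminology`, and the `<t>`/`</t>` tags and `<mask>` token are hypothetical names chosen for illustration, not the paper's actual implementation: it appends each matched term's target-side translation inline (so the model can learn to copy it into the output) and, with some probability, masks the source-side constraint token.

```python
# Illustrative sketch of terminology-aware data augmentation with
# constraint token masking. All names and tags are assumptions, not
# taken from the paper.
import random

MASK = "<mask>"  # placeholder used for constraint token masking

def augment_with_terminology(src_tokens, terminology, mask_prob=0.3):
    """For each source token with a terminology entry, append its
    target translation inline, delimited by <t> ... </t> tags, to
    encourage a copy behavior. With probability `mask_prob`, replace
    the source term itself with a mask token, which is meant to ease
    copy learning and improve generalization."""
    out = []
    for tok in src_tokens:
        if tok in terminology:
            src_term = MASK if random.random() < mask_prob else tok
            out += [src_term, "<t>", terminology[tok], "</t>"]
        else:
            out.append(tok)
    return out

# Illustrative English-to-French terminology entry
terminology = {"liquidity": "liquidité"}
print(augment_with_terminology("the liquidity ratio".split(),
                               terminology, mask_prob=0.0))
# → ['the', 'liquidity', '<t>', 'liquidité', '</t>', 'ratio']
```

At training time, the model sees these augmented source sentences paired with the unchanged references, so producing the tagged target term amounts to copying it from the input.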