Research Talks ➤ 052 · 17 May 2023

Multi-Domain Low Resource Language Neural Machine Translation

Velayuthan Menan
Slides Video

Machine translation systems have become an integral part of our lives: they have lowered language barriers and made knowledge sharing easier than ever, transforming local communities into a truly global community. A significant part of this transformation can be attributed to the success of Neural Machine Translation (NMT) systems. However, it is important to note that while NMT systems perform well on general machine translation, they may not perform as well on domain-specific tasks; hence, domain adaptation is an active research field. Research in this area, however, remains limited for low-resource languages. Domain adaptation for low-resource languages presents a dual challenge, involving both domain adaptation itself and overcoming resource scarcity. To tackle these challenges, we plan to leverage the datasets we are developing for the Google-funded project, as well as other available benchmark datasets, and introduce a novel method for domain adaptation that utilizes Large Language Models (LLMs) with knowledge distillation governed by curriculum learning.
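To make the two ingredients mentioned above concrete, the sketch below illustrates (a) a standard token-level knowledge-distillation loss, where a student NMT model is trained to match the softened output distribution of a teacher model, and (b) a simple curriculum schedule that releases training examples from easy to hard (here, by source-sentence length). This is only a minimal illustration of the general techniques; it is not the method presented in the talk, and all names, the difficulty measure, and the random tensors standing in for model outputs are hypothetical.

```python
# Hedged sketch: token-level knowledge distillation plus a length-based
# curriculum schedule. Not the talk's actual method; all names are placeholders.

import torch
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between softened teacher and student distributions."""
    t = temperature
    teacher_probs = F.softmax(teacher_logits / t, dim=-1)
    student_log_probs = F.log_softmax(student_logits / t, dim=-1)
    # Scale by t^2 so gradient magnitudes stay comparable to a hard-label loss.
    return F.kl_div(student_log_probs, teacher_probs, reduction="batchmean") * (t * t)


def curriculum_schedule(examples, num_stages=3, difficulty=lambda ex: len(ex["src"])):
    """Sort examples from easy to hard and release them in cumulative stages."""
    ordered = sorted(examples, key=difficulty)
    stage_size = max(1, len(ordered) // num_stages)
    for stage in range(1, num_stages + 1):
        # Each stage trains on all examples up to the current difficulty cut-off.
        yield ordered[: stage * stage_size]


if __name__ == "__main__":
    # Toy usage: random tensors stand in for the teacher (LLM) and the
    # student (domain-adapted NMT model) output distributions.
    vocab_size = 32
    examples = [{"src": "a" * n, "tgt": "b" * n} for n in range(1, 13)]
    for stage_idx, subset in enumerate(curriculum_schedule(examples), start=1):
        teacher_logits = torch.randn(len(subset), vocab_size)
        student_logits = torch.randn(len(subset), vocab_size, requires_grad=True)
        loss = distillation_loss(student_logits, teacher_logits)
        loss.backward()
        print(f"stage {stage_idx}: {len(subset)} examples, distillation loss {loss.item():.3f}")
```

In a real pipeline the teacher logits (or teacher-generated translations, for sequence-level distillation) would come from the LLM, and the curriculum could instead be governed by teacher confidence or domain similarity rather than sentence length.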
