Open Access

Hyper-parameter optimization in neural-based translation systems: A case study



Machine translation (MT) is an important use case in natural language processing (NLP) in which text in a source language is automatically converted into a target language. Modern intelligent systems, i.e., artificial intelligence (AI), rely on machine learning, whereby the machine acquires its ability from datasets. Nowadays, in the MT domain, neural machine translation (NMT) systems have almost entirely replaced statistical machine translation (SMT) systems. NMT systems are implemented using deep learning frameworks, and achieving higher accuracy during training of an NMT model requires extensive hyper-parameter tuning. This paper highlights the significance of hyper-parameter tuning in various machine learning algorithms. As a case study, in-house experiments were conducted on the low-resource English–Bangla language pair: an NMT system was designed, the significance of various hyper-parameter optimizations was analyzed, and performance was evaluated with the automatic metric BLEU. The BLEU scores obtained for the first, second, and third randomly picked test sentences are 4.1, 3.2, and 3.01, respectively.
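To make the tuning-plus-evaluation workflow described in the abstract concrete, below is a minimal Python sketch of a grid search over a few common NMT hyper-parameters, scored with smoothed sentence-level BLEU from NLTK. The search space, the `train_and_translate()` stand-in, and the toy test data are all assumptions for illustration, not the authors' actual setup.

```python
# Minimal sketch: grid-search hyper-parameter tuning for an NMT model,
# evaluated with sentence-level BLEU (NLTK). train_and_translate() is a
# hypothetical placeholder for the real encoder-decoder training loop.
from itertools import product
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

# Hypothetical search space (values chosen only for illustration).
SEARCH_SPACE = {
    "learning_rate": [1e-3, 1e-4],
    "batch_size": [32, 64],
    "hidden_size": [256, 512],
}

def train_and_translate(config, test_sentences):
    """Placeholder: train an NMT model with `config`, return translations.

    A real experiment would build and train the network here; this stub
    simply echoes the source so the script runs end to end.
    """
    return list(test_sentences)

def average_bleu(hypotheses, references):
    """Average smoothed sentence-level BLEU over the test set."""
    smooth = SmoothingFunction().method1
    scores = [
        sentence_bleu([ref.split()], hyp.split(), smoothing_function=smooth)
        for hyp, ref in zip(hypotheses, references)
    ]
    return sum(scores) / len(scores)

if __name__ == "__main__":
    # Toy test data standing in for the held-out English-Bangla test set.
    test_src = ["this is a test sentence"]
    test_ref = ["this is a test sentence"]

    best_score, best_config = -1.0, None
    keys = list(SEARCH_SPACE)
    for values in product(*(SEARCH_SPACE[k] for k in keys)):
        config = dict(zip(keys, values))
        hyps = train_and_translate(config, test_src)
        score = average_bleu(hyps, test_ref)
        if score > best_score:
            best_score, best_config = score, config
    print(f"best config: {best_config}, BLEU: {best_score:.3f}")
```

Grid search is only one option; the same loop structure applies to random search or Bayesian optimization by swapping how candidate configurations are generated.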

eISSN:
1178-5608
Language:
English
Publication frequency:
Volume Open
Journal subjects:
Engineering, Introductions and Overviews, other