TY - JOUR
T1 - A novel and robust approach for pro-drop language translation
AU - Wang, Longyue
AU - Tu, Zhaopeng
AU - Zhang, Xiaojun
AU - Liu, Siyou
AU - Li, Hang
AU - Way, Andy
AU - Liu, Qun
N1 - Publisher Copyright:
© 2017, The Author(s).
PY - 2017/6/1
Y1 - 2017/6/1
N2 - A significant challenge for machine translation (MT) is the phenomenon of dropped pronouns (DPs), where certain classes of pronouns are frequently dropped in the source language but should be retained in the target language. In response to this common problem, we propose a semi-supervised approach with a universal framework to recall missing pronouns in translation. Firstly, we build training data for DP generation in which the DPs are automatically labelled according to the alignment information from a parallel corpus. Secondly, we build a deep learning-based DP generator for input sentences in decoding when no corresponding references exist. More specifically, the generation has two phases: (1) DP position detection, which is modelled as a sequential labelling task with recurrent neural networks; and (2) DP prediction, which employs a multilayer perceptron with rich features. Finally, we integrate the above outputs into our statistical MT (SMT) system to recall missing pronouns by both extracting rules from the DP-labelled training data and translating the DP-generated input sentences. To validate the robustness of our approach, we evaluate it on both Chinese–English and Japanese–English corpora extracted from movie subtitles. Compared with an SMT baseline system, experimental results show that our approach achieves a significant improvement of +1.58 BLEU points in translation performance with 66% F-score for DP generation accuracy for Chinese–English, and nearly +1 BLEU point with 58% F-score for Japanese–English. We believe that this work could help both MT researchers and industry to boost the performance of MT systems between pro-drop and non-pro-drop languages.
AB - A significant challenge for machine translation (MT) is the phenomenon of dropped pronouns (DPs), where certain classes of pronouns are frequently dropped in the source language but should be retained in the target language. In response to this common problem, we propose a semi-supervised approach with a universal framework to recall missing pronouns in translation. Firstly, we build training data for DP generation in which the DPs are automatically labelled according to the alignment information from a parallel corpus. Secondly, we build a deep learning-based DP generator for input sentences in decoding when no corresponding references exist. More specifically, the generation has two phases: (1) DP position detection, which is modelled as a sequential labelling task with recurrent neural networks; and (2) DP prediction, which employs a multilayer perceptron with rich features. Finally, we integrate the above outputs into our statistical MT (SMT) system to recall missing pronouns by both extracting rules from the DP-labelled training data and translating the DP-generated input sentences. To validate the robustness of our approach, we evaluate it on both Chinese–English and Japanese–English corpora extracted from movie subtitles. Compared with an SMT baseline system, experimental results show that our approach achieves a significant improvement of +1.58 BLEU points in translation performance with 66% F-score for DP generation accuracy for Chinese–English, and nearly +1 BLEU point with 58% F-score for Japanese–English. We believe that this work could help both MT researchers and industry to boost the performance of MT systems between pro-drop and non-pro-drop languages.
KW - Dropped pronoun annotation
KW - Dropped pronoun generation
KW - Machine translation
KW - Multilayer perceptron
KW - Pro-drop language
KW - Recurrent neural networks
KW - Semi-supervised approach
UR - http://www.scopus.com/inward/record.url?scp=85010748085&partnerID=8YFLogxK
U2 - 10.1007/s10590-016-9184-9
DO - 10.1007/s10590-016-9184-9
M3 - Article
AN - SCOPUS:85010748085
SN - 0922-6567
VL - 31
SP - 65
EP - 87
JO - Machine Translation
JF - Machine Translation
IS - 1-2
ER -