The Workshop on Automatic Simultaneous Translation (2021)
Proceedings of the Second Workshop on Automatic Simultaneous Translation
Hua Wu | Colin Cherry | Liang Huang | Zhongjun He | Qun Liu | Maha Elbayad | Mark Liberman | Haifeng Wang | Mingbo Ma | Ruiqing Zhang
ICT’s System for AutoSimTrans 2021: Robust Char-Level Simultaneous Translation
Shaolei Zhang | Yang Feng
Simultaneous translation (ST) outputs the translation while still reading the input sentence, and is an important component of simultaneous interpretation. In this paper, we describe our submitted ST system, which won first place in the streaming transcription input track of the Chinese-English translation task of AutoSimTrans 2021. Aiming at the robustness of ST, we first propose char-level simultaneous translation and apply the wait-k policy on it. Meanwhile, we apply two data processing methods and combine two training methods for domain adaptation. Our method gives the ST model stronger robustness and domain adaptability. Experiments on streaming transcription show that our method outperforms the baseline at all latency levels; at low latency in particular, the proposed method improves by about 6 BLEU. Besides, the ablation studies we conduct verify the effectiveness of each module in the proposed method.
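For readers unfamiliar with the wait-k policy mentioned in the abstract, the sketch below illustrates the general read/write schedule: read k source units (here, characters), then alternate emitting one target token and reading one more source unit. This is a minimal illustration of the generic policy, not the paper's implementation; the translate_step callable and its interface are hypothetical.

```python
def wait_k_decode(source_stream, translate_step, k=3):
    """Minimal sketch of a wait-k simultaneous decoding loop.

    source_stream: iterator over incoming source units (e.g. characters).
    translate_step: hypothetical callable that, given the source prefix
                    read so far and the target prefix emitted so far,
                    returns the next target token or None when done.
    """
    src_prefix, tgt_prefix = [], []
    source_exhausted = False

    while True:
        # READ: stay k source units ahead of the target, until the stream ends.
        while not source_exhausted and len(src_prefix) < len(tgt_prefix) + k:
            try:
                src_prefix.append(next(source_stream))
            except StopIteration:
                source_exhausted = True

        # WRITE: emit one target token conditioned on the current prefixes.
        token = translate_step(src_prefix, tgt_prefix)
        if token is None:
            break
        tgt_prefix.append(token)
        yield token
```

A larger k trades higher latency for more source context per emitted token, which is why results in this track are typically reported across several latency levels.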
XMU’s Simultaneous Translation System at NAACL 2021
Shuangtao Li | Jinming Hu | Boli Wang | Xiaodong Shi | Yidong Chen
This paper describes the two systems we submitted to the simultaneous translation evaluation at the 2nd Workshop on Automatic Simultaneous Translation.