A Bidirectional Recurrent Neural Language Model for Machine Translation

Álvaro Peris, Francisco Casacuberta

Abstract


A language model based on continuous word representations is presented and applied to a statistical machine translation task. The model is implemented as a bidirectional recurrent neural network, which takes into account both the past and the future context of a word when making predictions. Because training such a model is computationally expensive, an instance selection algorithm is used to obtain relevant training data, aiming to capture the information most useful for translating a given test set. The results show that the neural model trained on the selected data outperforms an n-gram language model.
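To illustrate the core idea of conditioning a word prediction on both past and future context, the following is a minimal sketch of a bidirectional recurrent language model in NumPy. It is not the paper's implementation: all dimensions, parameter names, and the plain-tanh recurrence are illustrative assumptions, and the weights are random and untrained.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class BiRNNLM:
    """Sketch of a bidirectional RNN language model.

    Hyperparameters (vocabulary size V, embedding size E, hidden size H)
    are illustrative and do not come from the paper.
    """
    def __init__(self, V, E, H, seed=0):
        rng = np.random.default_rng(seed)
        s = 0.1
        self.emb = rng.normal(0, s, (V, E))            # word embeddings
        self.Wf = rng.normal(0, s, (H, E))             # forward RNN input weights
        self.Uf = rng.normal(0, s, (H, H))             # forward RNN recurrent weights
        self.Wb = rng.normal(0, s, (H, E))             # backward RNN input weights
        self.Ub = rng.normal(0, s, (H, H))             # backward RNN recurrent weights
        self.Wo = rng.normal(0, s, (V, 2 * H))         # output layer over both states
        self.H = H

    def predict(self, words, t):
        """Distribution P(w_t | words[:t], words[t+1:])."""
        h = np.zeros(self.H)
        for w in words[:t]:                            # left-to-right pass over the past
            h = np.tanh(self.Wf @ self.emb[w] + self.Uf @ h)
        g = np.zeros(self.H)
        for w in reversed(words[t + 1:]):              # right-to-left pass over the future
            g = np.tanh(self.Wb @ self.emb[w] + self.Ub @ g)
        # combine both context summaries to predict the word in the middle
        return softmax(self.Wo @ np.concatenate([h, g]))

lm = BiRNNLM(V=10, E=4, H=8)
p = lm.predict([1, 2, 3, 4, 5], t=2)  # probability distribution over word at position 2
```

The key design point the abstract describes is visible in `predict`: unlike a standard left-to-right language model, the output layer sees a summary of the words on both sides of position `t`.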

Full text:

PDF