Authors
Patrik Lambert, Rafael E Banchs
Publication date
2006/11/27
Conference
Proceedings of the International Workshop on Spoken Language Translation
Pages
190-196
Description
Most statistical machine translation systems are combinations of various models, and tuning the scaling factors is an important step. However, this optimisation problem is hard because the objective function has many local minima and the available algorithms cannot guarantee a global optimum. Consequently, optimisations starting from different initial settings can converge to fairly different solutions. We present tuning experiments with the Simultaneous Perturbation Stochastic Approximation (SPSA) algorithm and compare them to tuning with the widely used downhill simplex method. With the IWSLT 2006 Chinese-English data, both methods showed similar performance, but SPSA was more robust to the choice of initial settings.
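The abstract's key idea, estimating a gradient from only two objective evaluations per iteration via a random simultaneous perturbation, can be sketched generically. This is not the paper's implementation; function names, gain constants (`a`, `c`, `alpha`, `gamma`), and the toy quadratic objective standing in for an MT error metric are all illustrative assumptions.

```python
import random

def spsa_minimize(f, theta, iterations=200, a=0.1, c=0.1,
                  alpha=0.602, gamma=0.101, seed=0):
    """Minimise f with a generic SPSA sketch (hypothetical parameters).

    Each iteration probes f at theta +/- c_k * delta, where delta is a
    random +/-1 (Bernoulli) vector, so the full gradient estimate costs
    two evaluations regardless of the number of scaling factors.
    """
    rng = random.Random(seed)
    theta = list(theta)
    for k in range(1, iterations + 1):
        a_k = a / k ** alpha  # decaying step size
        c_k = c / k ** gamma  # decaying perturbation size
        delta = [rng.choice((-1.0, 1.0)) for _ in theta]
        plus = [t + c_k * d for t, d in zip(theta, delta)]
        minus = [t - c_k * d for t, d in zip(theta, delta)]
        diff = f(plus) - f(minus)
        # Same scalar difference, divided per coordinate by its perturbation.
        theta = [t - a_k * diff / (2.0 * c_k * d)
                 for t, d in zip(theta, delta)]
    return theta

# Toy stand-in for a translation error surface over three scaling factors.
def quadratic(w):
    return sum((x - 1.0) ** 2 for x in w)

best = spsa_minimize(quadratic, [0.0, 0.0, 0.0])
```

The two-evaluation cost per step is what makes SPSA attractive when each objective evaluation requires decoding a full development set.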
Total citations
Cited each year from 2007 through 2023 (per-year citation chart not recoverable).