Think of things backwards: How does Google Translate work?

Machine translation can be thought of as a simple model: we input words or sentences, the machine analyzes and transforms the input, and then generates output words or sentences in another language. Before neural networks were applied to machine translation, three architectures that do not involve probability and statistics were common: direct, transfer, and interlingua. In fact, "GNMT did not create its own universal interlingua but rather aimed at finding the commonality between many languages using insights from psychology and linguistics" (McDonald, 2017).

Google Translate originally belonged to statistical MT. For statistical MT, "All statistical translation models are based on the idea of a word alignment. A word alignment is a mapping between the source words and the target words in a set of parallel sentences," according to Speech and Language Processing (Jurafsky & Martin, 2000).

Google Translate's neural system uses a bidirectional RNN to align the input and output. First, an encoder RNN for the input language encodes the input sentence into vectors. Then the system searches among the output-language word vectors for the ones that best map onto, or align with, those encoded vectors (the words at this stage are still vectors). This is just what Speech and Language Processing suggests: "think of things backwards." The task is to find the hidden output vectors that could have generated the input vectors, and since the model learns this mapping from parallel sentence pairs, it is supervised machine learning. After the alignment is found, the matched vectors are decoded into words of the output language.
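The word-alignment idea in the quoted passage can be sketched with IBM Model 1, the simplest statistical alignment model discussed in Speech and Language Processing: an EM loop that re-estimates word translation probabilities from a parallel corpus. This is an illustrative toy, not Google's actual system; the tiny German-English corpus and all names below are made up for the example.

```python
from collections import defaultdict

# Toy parallel corpus of (source, target) sentence pairs (illustrative only).
corpus = [
    (["das", "haus"], ["the", "house"]),
    (["das", "buch"], ["the", "book"]),
    (["ein", "buch"], ["a", "book"]),
]

def train_ibm_model1(corpus, iterations=20):
    """EM training of IBM Model 1 translation probabilities t(target | source)."""
    src_vocab = {s for src, _ in corpus for s in src}
    tgt_vocab = {t for _, tgt in corpus for t in tgt}
    # Start from a uniform distribution over target words.
    t = {(tw, sw): 1.0 / len(tgt_vocab) for tw in tgt_vocab for sw in src_vocab}

    for _ in range(iterations):
        count = defaultdict(float)  # expected co-occurrence counts c(tw, sw)
        total = defaultdict(float)  # expected counts per source word
        # E-step: distribute each target word's probability mass over the
        # source words it could align to in that sentence pair.
        for src, tgt in corpus:
            for tw in tgt:
                z = sum(t[(tw, sw)] for sw in src)  # normalization
                for sw in src:
                    frac = t[(tw, sw)] / z
                    count[(tw, sw)] += frac
                    total[sw] += frac
        # M-step: re-estimate t(tw | sw) from the expected counts.
        t = {(tw, sw): count[(tw, sw)] / total[sw]
             for (tw, sw) in t if total[sw] > 0}
    return t

t = train_ibm_model1(corpus)
```

After a few EM iterations, "haus" aligns most strongly to "house" and "das" to "the", even though "das" and "the" also co-occur with other words: the alignments emerge purely from co-occurrence statistics, with no syntax or dictionary involved.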

Questions:

Does Google Translate use English as a bridge to link other languages, like Chinese and Japanese?

How does statistical machine translation deal with syntax and semantics? Does it use probability and statistics to sidestep these kinds of problems?

References

A Neural Network for Machine Translation, at Production Scale. (n.d.). Google AI Blog. Retrieved March 9, 2021, from http://ai.googleblog.com/2016/09/a-neural-network-for-machine.html

CS Dojo Community. (2019, February 14). How Google Translate Works—The Machine Learning Algorithm Explained! https://www.youtube.com/watch?v=AIpXjFwVdIE

Jurafsky, D., & Martin, J. H. (2000). Speech and language processing: An introduction to natural language processing, computational linguistics, and speech recognition. Prentice Hall.

McDonald, C. (2017, January 7). Ok slow down. Medium. https://medium.com/@chrismcdonald_94568/ok-slow-down-516f93f83ac8