Google Introduces Huge Universal Language Translation Model: 103 Languages Trained on Over 25 Billion Examples


Publication Title
Synced
Publication/Creation Date
October 31 2019
Creators/Contributors
Graham Neubig (creator)
Carnegie Mellon University (contributor)
Persuasive Intent
Information
Description
Neural machine translation (NMT) is a machine translation approach that uses an artificial neural network to predict the likelihood of a sequence of words. With the development of deep learning, NMT has come to play a major role in machine translation and has been adopted by Google, Microsoft, IBM and other tech giants. Synced invited Graham Neubig, an Assistant Professor at the Language Technologies Institute of Carnegie Mellon University who works on natural language processing, specifically multi-lingual models and natural language interfaces, to share his thoughts on the universal NMT system.
HCI Platform
Carryables
Location on Body
Hand
Marketing Keywords
Google, IBM, Microsoft
Source
https://syncedreview.com/2019/10/31/google-introduces-huge-universal-language-translation-model-103-languages-trained-on-over-25-billion-examples/

Date archived
May 3 2020
Last edited
May 22 2020
How to cite this entry
Graham Neubig. (October 31 2019). "Google Introduces Huge Universal Language Translation Model: 103 Languages Trained on Over 25 Billion Examples". Synced. Fabric of Digital Life. https://fabricofdigitallife.com/index.php/Detail/objects/4520