Wired discusses Google’s announcement that it is open sourcing its TensorFlow machine learning system – noting the system uses GPUs to both train and run artificial intelligence services at the company.
Inside Google, when tackling tasks like image recognition, speech recognition, and language translation, TensorFlow depends on machines equipped with GPUs, chips originally designed to render graphics for games and the like but that have also proven adept at other tasks. (See the TensorFlow whitepaper for details of its programming model and implementation.) According to Jeff Dean, Google depends on these chips more than the larger tech universe realizes.
The article goes on to note that companies like Facebook, Microsoft, and Baidu are also taking advantage of NVIDIA GPUs for deep learning, because the chips can process lots of little pieces of data in parallel.
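Why does that parallelism matter for deep learning? The workhorse operation of a neural network layer is a matrix-vector multiply, which breaks down into many small, independent multiply-accumulate operations. Here is a minimal illustrative sketch in plain Python (not how TensorFlow is implemented; on a GPU, each of these independent dot products would be spread across thousands of cores at once):

```python
def dense_layer(weights, inputs):
    """Compute one layer's pre-activation outputs.

    Each output is the dot product of one weight row with the inputs.
    The rows are independent of one another, which is exactly the kind
    of work a GPU can fan out across its many small cores in parallel.
    """
    return [sum(w * x for w, x in zip(row, inputs)) for row in weights]

# Toy layer: 3 outputs, 2 inputs (values are arbitrary for illustration).
W = [[1.0, 0.0],
     [0.0, 2.0],
     [1.0, 1.0]]
x = [3.0, 4.0]

print(dense_layer(W, x))  # three independent dot products: [3.0, 8.0, 7.0]
```

In this toy example the three dot products run one after another; the point is only that nothing forces them to, which is what makes GPUs such a natural fit for training and serving these models.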
At Google, deep learning is used not only to identify photos, recognize spoken words, and translate between languages, but also to boost search results. Other companies are pushing the same technology into ad targeting, computer security, and even applications that understand natural language. Doing all of this will take a large number of GPUs.
Read the entire article on Wired >>