Google AI Algorithm Masters Ancient Game of Go

Features, News, Machine Learning & Artificial Intelligence

Nadeem Mohammad, posted Jan 28 2016

For the first time, a computer has beaten a human professional at the game of Go — an ancient board game that has long been viewed as one of the greatest challenges for Artificial Intelligence. Google DeepMind’s GPU-accelerated AlphaGo program beat Fan Hui, the European Go champion, five times out of five in tournament conditions.

Read more

Vulkan Shader Resource Binding

GameWorks, GameWorks Expert Developer, Vulkan

Christoph Kubisch, posted Jan 28 2016

In this blog post we will go into further detail on one of the most common state changes in scene rendering: binding shader resources such as uniform or storage buffers, images, and samplers.
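
As a rough illustration of what binding a shader resource looks like in Vulkan, here is a minimal sketch in C using descriptor sets, one of the mechanisms the post examines. The helper name `bind_uniform_buffer` is illustrative, the handles passed in are assumed to be created elsewhere, and error checking is omitted.

```c
/* Minimal sketch (C, Vulkan 1.0): exposing a uniform buffer to a shader
 * at binding 0 via a descriptor set. Assumes `device`, `pool`, `buffer`,
 * `cmd`, `pipelineLayout`, and `layout` are valid handles created
 * elsewhere; error checking is omitted for brevity. */
#include <vulkan/vulkan.h>

void bind_uniform_buffer(VkDevice device, VkDescriptorPool pool,
                         VkBuffer buffer, VkCommandBuffer cmd,
                         VkPipelineLayout pipelineLayout,
                         VkDescriptorSetLayout layout)
{
    /* Allocate one descriptor set from the pool. */
    VkDescriptorSetAllocateInfo alloc = {
        .sType              = VK_STRUCTURE_TYPE_DESCRIPTOR_SET_ALLOCATE_INFO,
        .descriptorPool     = pool,
        .descriptorSetCount = 1,
        .pSetLayouts        = &layout,
    };
    VkDescriptorSet set;
    vkAllocateDescriptorSets(device, &alloc, &set);

    /* Point binding 0 of the set at our uniform buffer. */
    VkDescriptorBufferInfo bufInfo = { buffer, 0, VK_WHOLE_SIZE };
    VkWriteDescriptorSet write = {
        .sType           = VK_STRUCTURE_TYPE_WRITE_DESCRIPTOR_SET,
        .dstSet          = set,
        .dstBinding      = 0,
        .descriptorCount = 1,
        .descriptorType  = VK_DESCRIPTOR_TYPE_UNIFORM_BUFFER,
        .pBufferInfo     = &bufInfo,
    };
    vkUpdateDescriptorSets(device, 1, &write, 0, NULL);

    /* Bind the set so subsequent draws on this command buffer see it. */
    vkCmdBindDescriptorSets(cmd, VK_PIPELINE_BIND_POINT_GRAPHICS,
                            pipelineLayout, 0, 1, &set, 0, NULL);
}
```

How often such a bind happens per frame, and how descriptor sets compare with the API's other update paths, is exactly the kind of cost trade-off the post explores.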

Read more

Saving Endangered Birds with Deep Learning and GPUs

Research, Big Data & Data Mining, GeForce, Machine Learning & Artificial Intelligence

Nadeem Mohammad, posted Jan 26 2016

Ornithologists study every aspect of birds, including bird songs, flight patterns, physical appearance, and migration patterns – and to do so, they use acoustic sensors and cameras placed in remote areas. Conservation Metrics, a California-based company, is using deep learning accelerated with NVIDIA GPUs to help capture the immense amounts of data that would be…

Read more

Weekly Social Roundup

News, Big Data & Data Mining, Computer Vision, CUDA, DIGITS, GeForce, Machine Learning & Artificial Intelligence, Tesla

Nadeem Mohammad, posted Jan 22 2016

We love seeing all of the social media posts from developers using NVIDIA GPUs – here are a few highlights from the week:

"Should be titled everyone partners with nvidia on AI development. Cuda is controlling the market. https://t.co/75ElB6ZhO2" — Geoffrey Papilion (@gpapilion), January 20, 2016

"OMG! — I need this yesterday NVIDIA® DIGITS™ DevBox" …

Read more

EGL Eye: OpenGL Visualization without an X Server

Features, In-situ, OpenGL, Visualization

Nadeem Mohammad, posted Jan 21 2016

If you’re like me, you have a GPU-accelerated in-situ visualization toolkit that you need to run on the latest-generation supercomputer. Or maybe you have a fantastic OpenGL application that you want to deploy on a server farm for offline rendering. Even though you have access to all that amazing GPU power, you’re often out of…
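
As a rough illustration of the headless approach, here is a minimal sketch in C that creates a desktop OpenGL context through EGL with no X server, rendering into an off-screen pbuffer. For brevity it uses the default display and omits error checking; the full post also covers selecting among multiple GPUs. A driver whose EGL implementation supports desktop OpenGL (such as recent NVIDIA drivers) is assumed.

```c
/* Minimal sketch (C): a headless OpenGL context via EGL, no X server.
 * The 640x480 pbuffer size is arbitrary; all error checks are elided. */
#include <EGL/egl.h>

int main(void)
{
    EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    eglInitialize(dpy, NULL, NULL);

    /* Choose a config suitable for off-screen (pbuffer) GL rendering. */
    EGLint cfg_attribs[] = {
        EGL_SURFACE_TYPE,    EGL_PBUFFER_BIT,
        EGL_RENDERABLE_TYPE, EGL_OPENGL_BIT,
        EGL_NONE
    };
    EGLConfig cfg;
    EGLint n;
    eglChooseConfig(dpy, cfg_attribs, &cfg, 1, &n);

    /* Request desktop OpenGL rather than OpenGL ES. */
    eglBindAPI(EGL_OPENGL_API);

    EGLint pbuf_attribs[] = { EGL_WIDTH, 640, EGL_HEIGHT, 480, EGL_NONE };
    EGLSurface surf = eglCreatePbufferSurface(dpy, cfg, pbuf_attribs);

    EGLContext ctx = eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, NULL);
    eglMakeCurrent(dpy, surf, surf, ctx);

    /* ... issue OpenGL calls and read pixels back here ... */

    eglTerminate(dpy);
    return 0;
}
```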

Read more