SONY Breaks ResNet-50 Training Record with NVIDIA V100 Tensor Core GPUs

Accelerated Computing, Artificial Intelligence, Features, Computer Vision & Machine Vision, CUDA, cuDNN, Development Tools & Libraries, Machine Learning & Artificial Intelligence

Nadeem Mohammad, posted Nov 13 2018

Researchers from SONY today announced a new speed record for training ResNet-50 on ImageNet: just 224 seconds (three minutes and 44 seconds) at 75 percent accuracy, using 2,100 NVIDIA Tesla V100 Tensor Core GPUs.

Read more

AI Research Detects Glaucoma with 94 Percent Accuracy

Artificial Intelligence, Computer Vision & Machine Vision, CUDA, Healthcare & Life Sciences, Machine Learning & Artificial Intelligence, Tesla

Nadeem Mohammad, posted Nov 08 2018

Glaucoma affects more than 2.7 million people in the U.S. and is one of the leading causes of blindness in the world.

Read more

AI Study Predicts Alzheimer’s Six Years Before Diagnosis

Artificial Intelligence, Features, Computer Vision & Machine Vision, CUDA, cuDNN, GeForce, Healthcare & Life Sciences, Machine Learning & Artificial Intelligence

Nadeem Mohammad, posted Nov 07 2018

A new study published in Radiology describes how deep learning can improve the ability of brain imaging to predict Alzheimer’s disease years before an actual diagnosis.

Read more

CUDA on Turing Opens New GPU Compute Possibilities

Accelerated Computing, CUDA, Game Development, RTX 2070, trie, Turing, Volta

Nadeem Mohammad, posted Nov 07 2018

The Turing architecture introduces so many cool new features that it’s easy to miss the quiet revolution in GPU programming that it also represents: all of the features introduced with Volta now exist in a GeForce product.

Read more

Accelerated Ray Tracing in One Weekend in CUDA

Accelerated Computing, Design & Visualization, C++, CUDA, Game Development, ray tracing

Nadeem Mohammad, posted Nov 05 2018

Recent announcements of NVIDIA’s new Turing GPUs, RTX technology, and Microsoft’s DirectX Ray Tracing have spurred a renewed interest in ray tracing. These technologies vastly simplify writing ray tracing applications.
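At the heart of the "Ray Tracing in One Weekend" approach is a ray-sphere intersection test, which the CUDA version runs in parallel, one thread per pixel. As a rough illustration of that core test (a minimal CPU-only sketch in Python; the function and parameter names here are illustrative, not taken from the post):

```python
import math

def hit_sphere(center, radius, origin, direction):
    """Return the nearest positive ray parameter t at which the ray
    origin + t*direction hits the sphere, or None if it misses.
    Solves the quadratic |origin + t*direction - center|^2 = radius^2."""
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    half_b = sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    discriminant = half_b * half_b - a * c
    if discriminant < 0:
        return None  # ray misses the sphere entirely
    t = (-half_b - math.sqrt(discriminant)) / a  # nearer root
    return t if t > 0 else None

# A ray fired straight at a sphere of radius 0.5 centered one unit away
# hits at t = 0.5; a ray fired sideways misses.
print(hit_sphere((0, 0, -1), 0.5, (0, 0, 0), (0, 0, -1)))  # 0.5
print(hit_sphere((0, 0, -1), 0.5, (0, 0, 0), (0, 1, 0)))   # None
```

The CUDA port in the post keeps this same per-ray math but dispatches one ray per GPU thread, which is where the acceleration comes from.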

Read more

Real-Time Noise Suppression Using Deep Learning

Artificial Intelligence, 2hz.ai, Cloud Services, CUDA, Deep Learning, edge computing, machine learning and AI, noise suppression, telecoms

Nadeem Mohammad, posted Oct 31 2018

Imagine waiting for your flight at the airport. Suddenly, an important business call with a high profile customer lights up your phone.

Read more

Visualizing Star Polymers in Record Time

Accelerated Computing, Design & Visualization, Cluster/Supercomputing, Computer Graphics & Visualization, CUDA, GeForce, Higher Education/Academia, Visualization

Nadeem Mohammad, posted Oct 11 2018

In the last five minutes, you have probably come into contact with more polymers than you can count. In fact, they are everywhere: in grocery bags, water bottles, phones, computers, food packaging, auto parts, tires, airplanes, and toys.

Read more

Using MATLAB and TensorRT on NVIDIA GPUs

Artificial Intelligence, Computer Vision & Machine Vision, CUDA, Development Tools & Libraries, Machine Learning & Artificial Intelligence, MATLAB, TensorRT

Nadeem Mohammad, posted Oct 09 2018

MathWorks recently released MATLAB R2018b which integrates with NVIDIA TensorRT through GPU Coder. With this integration, scientists and engineers can achieve faster inference performance on GPUs from within MATLAB.

Read more

PyTorch 1.0 Accelerated On NVIDIA GPUs

Artificial Intelligence, Computer Vision & Machine Vision, CUDA, cuDNN, Development Tools & Libraries, Machine Learning & Artificial Intelligence, Python

Nadeem Mohammad, posted Oct 02 2018

Facebook announced the availability of the PyTorch 1.0 preview release today at the PyTorch Developer Conference, an event for the PyTorch developer community. PyTorch is one of the deep learning frameworks most widely used by researchers and developers.

Read more

NVIDIA Turing SDKs Now Available

Accelerated Computing, Artificial Intelligence, Features, Game Development, Computer Vision & Machine Vision, CUDA, Machine Learning & Artificial Intelligence, Turing

Nadeem Mohammad, posted Sep 20 2018

NVIDIA’s Turing architecture is one of the biggest leaps in computer graphics in 20 years. Here’s a look at the latest developer software releases to take advantage of this cutting-edge GPU.

Read more