NVIDIA GPUs power the world’s fastest supercomputer, along with 20 of the 100 most powerful supercomputing clusters in the world. If you follow NVIDIA closely, this probably comes as no surprise, but a new article published this week in Nature, the International Journal of Science, explains why so many researchers and developers are turning to NVIDIA GPUs to accelerate their work.
“Over the past decade or so, GPUs have been challenging the conventional workhorse of computing, the central processing unit (CPU), for dominance in computationally intensive work. As a result, chips that were designed to make video games look better are now being deployed to power everything from virtual reality to self-driving cars,” the Nature writers stated in their article. “Put simply, GPUs can perform vastly more calculations simultaneously than CPUs.”
The article describes how scientists in fields including molecular dynamics, astrophysics, and machine learning have adopted GPUs to speed up their research. One of them, Evan Schneider, an astrophysicist at Princeton University, explains how GPUs have enabled her to run complex astrophysical models that wouldn’t otherwise have been possible. Thanks to the technology, she can simulate regions of the Galaxy in ten times as much detail, she says.
“As a result of the increase in resolution…the entire model now works differently — for example, giving new insights into how gas behaves on the outskirts of galaxies,” Schneider explained.
The article also describes CUDA, the parallel computing platform and application programming interface developed by NVIDIA, which many of the researchers interviewed prefer for its user-friendliness.
“Matthew Liska, an astrophysicist at the University of Amsterdam, prefers CUDA for its user-friendliness. Liska wrote GPU-accelerated code simulating black holes as part of a research project; the GPUs accelerated that code by at least an order of magnitude, he says. Other scientists who spoke to Nature also said CUDA was easier to use, with plenty of code libraries and support available,” the Nature team stated in the article.
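For readers curious what writing for the GPU actually looks like, below is a minimal CUDA sketch (illustrative only; this is not code from the Nature article or from Liska’s black-hole simulations). It adds two large vectors, with each GPU thread handling a single element, so roughly a million additions are dispatched at once: the “vastly more calculations simultaneously” that the article describes.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Kernel: each thread adds one pair of elements.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];   // guard threads past the end of the array
}

int main() {
    const int n = 1 << 20;                   // one million elements
    const size_t bytes = n * sizeof(float);

    float *a, *b, *c;
    cudaMallocManaged(&a, bytes);            // unified memory, visible to CPU and GPU
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    const int threads = 256;
    const int blocks  = (n + threads - 1) / threads;  // 4,096 blocks for this n
    vecAdd<<<blocks, threads>>>(a, b, c, n);          // launch ~1M threads at once
    cudaDeviceSynchronize();                          // wait for the GPU to finish

    printf("c[0] = %.1f\n", c[0]);           // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

The same data-parallel pattern, one thread per data element, is what lets simulations like Schneider’s scale: the work maps onto thousands of GPU cores rather than a handful of CPU cores.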
Read the full article, Supercharge your Research with a GPU, in Nature.
Oct 03, 2018