Scaling Data by 10⁹x and Compute for Deep-Learning Applications

John Taylor, DST/CSIRO | Pablo Rozas Larraondo, Australian National University

GTC 2020

We'll explore scalable applications of artificial intelligence to massive datasets. First, we'll cover how we developed and optimized highly parallelized implementations of deep-learning (DL) algorithms and tested them on HPC GPU clusters. Then we'll demonstrate how to build models that run over large high-resolution datasets, identifying the spatial and temporal relationships between physical parameters in global-scale high-resolution numerical weather prediction models.
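To illustrate the kind of multi-GPU scaling the abstract describes, below is a minimal, hypothetical sketch of data-parallel training on an HPC GPU cluster using PyTorch DistributedDataParallel. The framework, model, and synthetic data are assumptions for illustration only; the talk does not specify the authors' actual implementation.

```python
# Illustrative sketch (not from the talk): data-parallel DL training across
# multiple GPUs with PyTorch DistributedDataParallel. Model and data are
# placeholders standing in for gridded weather fields.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler


def main():
    # One process per GPU; launch e.g. with `torchrun --nproc_per_node=<gpus>`.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Synthetic placeholder dataset (samples x features).
    x = torch.randn(10_000, 64)
    y = torch.randn(10_000, 1)
    dataset = TensorDataset(x, y)
    sampler = DistributedSampler(dataset)        # shards data across ranks
    loader = DataLoader(dataset, batch_size=256, sampler=sampler)

    model = torch.nn.Sequential(
        torch.nn.Linear(64, 128), torch.nn.ReLU(), torch.nn.Linear(128, 1)
    ).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])  # gradients averaged across GPUs

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = torch.nn.MSELoss()

    for epoch in range(5):
        sampler.set_epoch(epoch)                 # reshuffle shards each epoch
        for xb, yb in loader:
            xb, yb = xb.cuda(local_rank), yb.cuda(local_rank)
            optimizer.zero_grad()
            loss_fn(model(xb), yb).backward()    # gradient all-reduce happens here
            optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```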



