A Mixed-Precision Machine Learning Approach to Accelerate Geostatistical Simulations and Prediction on GPUs

Sameh Abdulah, KAUST

GTC 2020

Geostatistics represents one of the most challenging classes of scientific applications due to the desire to incorporate an ever-increasing number of geospatial locations to accurately model and predict environmental phenomena. Geostatistical modeling involves solving systems of linear equations on a given covariance matrix, which makes the model training phase prohibitively expensive at large scale. We'll present a mixed-precision numerical solver to accelerate geostatistics applications while maintaining appropriate prediction accuracy during the inference phase. Our algorithm not only achieves an average 1.9x speedup on CPU/GPU heterogeneous systems, but also reduces the data movement cost. The algorithm can also benefit from half-precision computations using NVIDIA Tensor Cores to further boost performance. Prediction accuracy of the new mixed-precision model shows promising results on synthetic and real environmental datasets.
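The session itself does not include code. As an illustration only, the minimal sketch below emulates the general idea of lowering the precision of weakly correlated covariance entries before solving the linear system that arises in geostatistical modeling. The exponential kernel, the 3x-range distance threshold, and the emulation of FP16 by rounding are all assumptions for this toy example, not the presenter's algorithm.

```python
# Hypothetical sketch (not the presenter's code): emulate a mixed-precision
# covariance solve on a toy exponential kernel with synthetic 2D locations.
import numpy as np

rng = np.random.default_rng(0)
n = 500
locs = rng.random((n, 2))                      # synthetic locations in [0, 1]^2

# Exponential covariance: C_ij = exp(-||x_i - x_j|| / range), plus a small nugget.
dists = np.linalg.norm(locs[:, None, :] - locs[None, :, :], axis=-1)
range_param = 0.1
C = np.exp(-dists / range_param) + 1e-6 * np.eye(n)

y = rng.standard_normal(n)                     # synthetic observations

# Reference: full double-precision solve of C z = y.
z_fp64 = np.linalg.solve(C, y)

# Mixed-precision variant: covariance entries for well-separated location pairs
# carry little information, so round them to half precision before the solve.
# On a GPU, this is where FP16 / Tensor Core kernels would be applied; here we
# only emulate the precision loss on the CPU.
C_mixed = C.copy()
far = dists > 3.0 * range_param                # assumed "far field" threshold
C_mixed[far] = C[far].astype(np.float16).astype(np.float64)

z_mixed = np.linalg.solve(C_mixed, y)

rel_err = np.linalg.norm(z_mixed - z_fp64) / np.linalg.norm(z_fp64)
print(f"relative error of mixed-precision solve: {rel_err:.2e}")
```

Running the sketch shows that the lower-precision far-field entries perturb the solution only slightly, which is the intuition behind trading precision for speed and reduced data movement in the full-scale solver described in the talk.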



