Thorsten Kurth

Thorsten Kurth works at NVIDIA on optimizing scientific codes for GPU-based supercomputers. His focus is on providing optimized deep learning applications for HPC systems, including MLPerf HPC benchmark applications. This work involves end-to-end optimizations such as input pipeline and I/O tuning as well as distributed training. In 2018, he was awarded the Gordon Bell Prize for the first deep learning application to achieve more than 1 exaop of peak performance, on the OLCF Summit HPC system. In 2020, he was awarded the Gordon Bell Special Prize for HPC-based COVID-19 research, for efficiently generating large ensembles of scientifically relevant spike trimer conformations using an AI-driven molecular dynamics simulation workflow.

Posts by Thorsten Kurth

Simulation / Modeling / Design

Modeling Earth’s Atmosphere with Spherical Fourier Neural Operators

Machine learning-based weather prediction has emerged as a promising complement to traditional numerical weather prediction (NWP) models. Models such as NVIDIA...