Improving Geophysical Turbulence Models with Machine Learning

Peetak Mitra, Los Alamos National Laboratory | Gavin Portwood, Los Alamos National Laboratory

GTC 2020

In large-scale geophysical flows, the relatively small-scale processes of turbulence and mixing can have a leading-order impact on the prediction of, for instance, ocean circulation and global energy budgets. Such predictions are critical components of weather and climate simulation — geophysical problems where small-scale models help offset the otherwise prohibitively expensive computational cost of simulation. These turbulence closure models attempt to capture dynamics that have complex functional dependence on a potentially broad range of large-scale flow parameters. However, such models and frameworks are often phenomenological and heuristic in nature, so robust model calibration against simulation, observational, and experimental data is a challenge. We'll explain how a non-intrusive, supervised, GPU-driven machine-learning framework, such as neural ODEs, can help improve the state of turbulence models in canonical geophysical flows. We'll also discuss the interpretability of these machine-learning models and provide a roadmap to create more general frameworks for modeling such physics.
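To make the neural-ODE idea concrete, here is a minimal, hypothetical sketch in PyTorch of the general pattern: an MLP parameterizes the time derivative of a reduced turbulence state, the ODE is integrated with a simple explicit scheme, and the parameters are trained supervised against reference trajectories. The model architecture, integrator, and synthetic "reference" data below are illustrative assumptions, not the speakers' actual implementation or dataset.

```python
# Hypothetical neural-ODE-style closure sketch (not the talk's code).
import torch
import torch.nn as nn

class ClosureRHS(nn.Module):
    """MLP giving d(state)/dt as a function of the current turbulence state."""
    def __init__(self, state_dim=2, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, state_dim),
        )

    def forward(self, state):
        return self.net(state)

def integrate(rhs, state0, dt, n_steps):
    """Explicit Euler integration of the learned ODE (kept deliberately simple)."""
    states = [state0]
    state = state0
    for _ in range(n_steps):
        state = state + dt * rhs(state)
        states.append(state)
    return torch.stack(states, dim=0)

# Synthetic stand-in for reference (e.g., DNS) trajectories: two decaying quantities.
dt, n_steps = 0.01, 50
t = torch.arange(n_steps + 1, dtype=torch.float32) * dt
reference = torch.stack([torch.exp(-t), 0.5 * torch.exp(-2.0 * t)], dim=-1)

rhs = ClosureRHS()
opt = torch.optim.Adam(rhs.parameters(), lr=1e-3)
for epoch in range(200):
    opt.zero_grad()
    pred = integrate(rhs, reference[0], dt, n_steps)
    loss = ((pred - reference) ** 2).mean()   # supervised trajectory-matching loss
    loss.backward()                           # backprop through the time integration
    opt.step()
```

The key design point this sketch tries to show is that the learned closure is trained on whole trajectories rather than pointwise derivatives, so the optimization sees the accumulated effect of the model through the time integrator; swapping the Euler step for a higher-order or adaptive solver would follow the same pattern.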



