Researchers from Google, along with collaborators from academia, announced today that they have developed a deep learning-based system for identifying protein crystallization, achieving 94 percent accuracy.
Physicists from more than a dozen institutions used the GPU-accelerated Titan supercomputer at Oak Ridge National Laboratory to tackle the subatomic-scale problem of calculating the lifetime of the neutron.
Scientists used an extremely high-resolution transmission electron microscope to capture 2D projections of the nanoparticle’s structure, then used an algorithm to stitch those projections together into a 3D reconstruction.
Thomas Cheatham, professor of Medicinal Chemistry and director of research computing at the University of Utah, shares how his team is using the GPU-accelerated Blue Waters supercomputer and NVLink to compute the interactions of atoms that can lead to drug discoveries.
Gil Speyer, Senior Postdoctoral Fellow at the Translational Genomics Research Institute (TGen), shares how NVIDIA technology is accelerating the processing of transcriptomes from thousands of cells gleaned from patient tumor samples.
The Facebook Artificial Intelligence Research (FAIR) lab announced a new Research Partnership Program to spur advances in artificial intelligence and machine learning — Facebook will be giving out 25 GPU-powered servers, free of charge.