Jeff Pool

Jeff Pool is a senior architect on the Deep Learning architecture team investigating efficient DL methods. Since joining NVIDIA in 2012, he has contributed to many areas of several architectures; most recently, he has been working on sparse neural networks. Jeff holds a PhD in computer science with a focus on efficient graphics hardware.

Posts by Jeff Pool

Technical Walkthrough

Accelerating Inference with Sparsity Using the NVIDIA Ampere Architecture and NVIDIA TensorRT

TensorRT is an SDK for high-performance deep learning inference, and TensorRT 8.0 introduces support for sparsity that uses the sparse tensor cores on NVIDIA Ampere architecture GPUs. It can accelerate networks by skipping computation on the zeros present in the GEMM operations of neural networks. By following the steps in this post, you can obtain a performance gain compared to dense networks. 8 MIN READ
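The 2:4 ("two out of four") structured sparsity pattern that the Ampere sparse tensor cores accelerate can be illustrated with a small NumPy sketch: in every group of four consecutive weights, at most two may be nonzero. This is an illustrative magnitude-based pruning only; in practice the pruning and acceleration are handled by NVIDIA's tools (such as TensorRT and the ASP pruning library), not by hand-written code like this.

```python
import numpy as np

def prune_2_4(weights: np.ndarray) -> np.ndarray:
    """Illustrative 2:4 structured pruning: in every group of four
    consecutive weights along the last axis, keep the two entries
    with the largest magnitude and zero out the other two."""
    w = weights.reshape(-1, 4)
    # Indices of the two smallest-magnitude entries in each group of four
    drop = np.argsort(np.abs(w), axis=1)[:, :2]
    pruned = w.copy()
    np.put_along_axis(pruned, drop, 0.0, axis=1)
    return pruned.reshape(weights.shape)

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 16))
W_sparse = prune_2_4(W)
# Each group of four weights now contains at most two nonzeros,
# which is the pattern the sparse tensor cores exploit in GEMMs.
```

After pruning to this pattern, the network is typically fine-tuned to recover accuracy before the sparse weights are deployed for inference.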
Technical Walkthrough

Exploiting NVIDIA Ampere Structured Sparsity with cuSPARSELt

Deep neural networks achieve outstanding performance in a variety of fields, such as computer vision, speech recognition, and natural language processing. 9 MIN READ