NVIDIA TensorRT Inference Server Boosts Deep Learning Inference


Nadeem Mohammad, posted Sep 12 2018

You’ve built, trained, tweaked, and tuned your model, and you finally have a TensorRT, TensorFlow, or ONNX model that meets your requirements. Now you need an inference solution that you can deploy to a data center or to the cloud.


Docker Compatibility with Singularity for HPC


Nadeem Mohammad, posted Aug 15 2018

Bare-metal installations of HPC applications on a shared system require system administrators to build environment modules for hundreds of applications, which is complicated, high-maintenance, and time-consuming.
