The NVIDIA Triton Inference Server, previously known as the TensorRT Inference Server, is now available from NVIDIA NGC and on GitHub.