Nsight Deep Learning Designer
Nsight DL Designer is an integrated development environment that helps developers efficiently design and develop deep neural networks for in-app inference.
End-to-End Support for Deep Learning Development
DL development for in-app inference is a highly iterative process: a change to the model, the training parameters, the training data, or even the user-based evaluation process can restart the development cycle. Nsight DL Designer streamlines this process by providing developers with effective support at every stage.
Highly Interactive UI for Model Design and Optimization
Nsight DL Designer is a GUI-based tool that makes model construction and modification visible and intuitive. The integrated GPU-metric profiler gives developers an early gauge of their model’s performance envelope.
Easy and Flexible Inference Deployment
Nsight DL Designer provides flexible, fully automated options for deploying models across multiple platforms and technologies (ONNX, Windows DirectML, and CUDA). For developers working with CUDA in particular, Nsight DL Designer generates C++ inference code that runs high-performance NVIDIA kernels via the NvNeural inference engine.
GUI and NvNeural inference engine for model design
Nsight DL Designer is a GUI-based tool: developers create a model simply by dragging and dropping neural network layers, and templates help them assemble large models quickly. Nsight DL Designer ships with a built-in set of high-level neural network layers implemented in the NvNeural inference engine. NvNeural is designed to be fully extensible: with the included NvNeural SDK, developers can implement their own layers as external plugins that are fully operable within Nsight DL Designer. The layers currently shipped with NvNeural are geared toward building computer-vision-centric neural network models.
Inference performance profiling with GPU metrics
Nsight DL Designer allows developers to profile a network’s inference performance as soon as its model graph is constructed. This gives developers early insight into whether the model can meet the timing budget set for in-app inference, before they invest significant effort in training. The profiling data are based on common GPU metrics such as SM utilization, Tensor Core utilization, and SM occupancy, and can help developers optimize their network models to make better use of the GPU’s compute resources.
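To make the timing-budget idea concrete, here is a minimal pure-Python sketch of the kind of check a developer might do against profiler output. The layer names, timings, and the 25% inference share of the frame budget are illustrative assumptions, not values produced by the tool.

```python
# Hypothetical per-layer inference timings (milliseconds), e.g. as read
# from a profiler report; layer names and numbers are illustrative only.
layer_times_ms = {
    "conv1": 0.42,
    "conv2": 0.65,
    "upsample": 0.18,
    "conv_out": 0.31,
}

def fits_budget(times_ms, target_fps=60, inference_share=0.25):
    """Check whether total inference time fits the slice of the frame
    budget reserved for DL inference (the rest is left to the host app,
    e.g. for rendering). The 25% share is an assumed split."""
    frame_budget_ms = 1000.0 / target_fps            # ~16.67 ms at 60 FPS
    inference_budget_ms = frame_budget_ms * inference_share
    total_ms = sum(times_ms.values())
    return total_ms <= inference_budget_ms, total_ms, inference_budget_ms

ok, total, budget = fits_budget(layer_times_ms)
print(f"total={total:.2f} ms, budget={budget:.2f} ms, fits={ok}")
```

A check like this, run against early profiling data, tells a developer whether a model is even in the right performance ballpark before training begins.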
Interface with the PyTorch training framework
Nsight DL Designer ships with convenient Python scripts that automatically convert an Nsight DL Designer model into a PyTorch model, which can easily be added to a developer’s training loop.
Interactive visual analysis of the inference process
Nsight DL Designer enables developers to visually examine inference results and dive deep into the inference process to inspect feature maps at every layer. Developers can also augment their network models with analysis layers whose dynamically created UI elements let them interactively control the inference process. These capabilities help developers gain valuable insights into network behavior, diagnose quality problems more efficiently, and test different model variations.
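As a rough illustration of what feature-map inspection surfaces, the sketch below computes per-channel statistics of a tiny feature map in pure Python. The data and the "dead channel" diagnosis are made-up examples; this is not the tool’s API, just the kind of signal one looks for.

```python
# Illustrative only: per-channel feature-map statistics of the kind one
# inspects when diagnosing quality problems. A feature map is modeled
# as channels x height x width nested lists; all values are made up.
feature_map = [
    [[0.0, 0.5], [1.0, 0.25]],   # channel 0: healthy activations
    [[0.0, 0.0], [0.0, 0.0]],    # channel 1: all zeros ("dead" channel)
]

def channel_stats(fm):
    """Return (min, max, mean) per channel. An all-zero channel often
    hints at a dead ReLU or a broken weight import."""
    stats = []
    for channel in fm:
        values = [v for row in channel for v in row]
        stats.append((min(values), max(values), sum(values) / len(values)))
    return stats

for i, (lo, hi, mean) in enumerate(channel_stats(feature_map)):
    print(f"channel {i}: min={lo} max={hi} mean={mean:.4f}")
```

Spotting a dead channel like channel 1 above is exactly the kind of insight that inspecting per-layer feature maps makes quick.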
Automated model export and code generation for deployment
Nsight DL Designer provides flexible options to support model deployment. A network model created using DL Designer can be exported as an ONNX file for platform-independent deployment. To deploy a model for the Windows platform, developers can use Nsight DL Designer to generate C++ inference code that runs on top of the DirectML API. For developers working with the CUDA ecosystem, Nsight DL Designer can also generate C++ inference code that runs on top of the NvNeural inference engine. The export and code generation process is fully automated.
Q: Which operating systems does Nsight DL Designer support?
A: We currently support both Windows and Linux.
Q: Does Nsight DL Designer require an NVIDIA GPU?
A: Yes, Nsight DL Designer does require a local NVIDIA GPU to run.
Q: Which training frameworks can DL Designer models be exported to?
A: We currently only support exporting DL Designer models to PyTorch. Support for other frameworks is being considered.
Q: Can I import an existing model from another framework?
A: Nsight DL Designer does not yet support direct import of an existing model from PyTorch, TensorFlow, or ONNX.
Q: Where can I run Nsight DL Designer?
A: Nsight DL Designer is a desktop application. It should work anywhere that has a GPU and supports a GUI (either natively or via redirection).
Q: Is there a recommended GPU?
A: We do recommend a recent-generation GPU with Tensor Core support.
Q: Who is Nsight DL Designer best suited for?
A: Nsight DL Designer is best suited for developers who want to add a DL-based image/video processing feature to a host application that has strict performance requirements for the DL inference workload.
Q: What kinds of models can I build with the built-in layers?
A: The high-level neural network layers in Nsight DL Designer are currently computer-vision centric. They should be sufficient to build network models for common computer-vision tasks.
Ready to download NVIDIA Deep Learning Designer?