Running Yocto-built Apps on Target

  1. Run CUDA applications:

    CUDA samples are packaged at /home/root/samples/cuda.

    Note: Export LD_LIBRARY_PATH=/usr/local/cuda-<CUDA-VERSION>/targets/aarch64-linux/lib before running the CUDA sample apps.

    ./<sample-application>
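    The step above can be sketched as a small shell snippet. The CUDA version (11.4) and the sample name (deviceQuery) are assumptions; substitute the version and binaries actually present on your target:

```shell
# Assumed CUDA version; replace with the version installed on the target.
CUDA_VERSION=11.4

# CUDA sample apps need the target CUDA libraries on the loader path.
export LD_LIBRARY_PATH=/usr/local/cuda-${CUDA_VERSION}/targets/aarch64-linux/lib${LD_LIBRARY_PATH:+:${LD_LIBRARY_PATH}}

# Run a sample from the packaged directory. deviceQuery is a placeholder;
# the guard makes the snippet a no-op on machines without the samples.
if [ -d /home/root/samples/cuda ]; then
    cd /home/root/samples/cuda && ./deviceQuery
fi
```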
  2. Run TensorRT applications:

    TensorRT samples are packaged at /home/root/samples/tensorrt.

    The following sample applications are available:

    sample_char_rnn
    sample_googlenet
    sample_movielens
    sample_fasterRCNN

    sample_fasterRCNN requires the TensorRT model file to be present in the data/faster-rcnn/ subfolder. Download the model file here:

    https://dl.dropboxusercontent.com/s/o6ii098bu51d139/faster_rcnn_models.tgz?dl=0

    Note: Export LD_LIBRARY_PATH=/usr/local/cuda-<CUDA-VERSION>/targets/aarch64-linux/lib before running the TensorRT sample apps as well; they link against the CUDA libraries.
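    The model download for sample_fasterRCNN can be scripted as below. The use of wget and unpacking the archive directly into data/faster-rcnn/ are assumptions; adjust the layout to match what the sample actually expects:

```shell
# Packaged TensorRT sample location on the target (from the docs above).
TRT_SAMPLES=/home/root/samples/tensorrt
MODEL_URL='https://dl.dropboxusercontent.com/s/o6ii098bu51d139/faster_rcnn_models.tgz?dl=0'
MODEL_DIR=${TRT_SAMPLES}/data/faster-rcnn

# Fetch and unpack the Faster R-CNN model data only when the samples
# directory exists (i.e. when running on the target); no-op elsewhere.
if [ -d "${TRT_SAMPLES}" ]; then
    mkdir -p "${MODEL_DIR}"
    wget -O - "${MODEL_URL}" | tar -xz -C "${MODEL_DIR}"
    cd "${TRT_SAMPLES}" && ./sample_fasterRCNN
fi
```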
  3. Run the NVIDIA DriveWorks applications:

    The NVIDIA DriveWorks 'core' samples are packaged at /usr/local/driveworks-<DW-VERSION>/bin.

    ./<sample-application>

    If DriveWorks samples fail, export LD_LIBRARY_PATH:

    export LD_LIBRARY_PATH=/usr/local/driveworks-<DW-VERSION>/lib:/usr/local/cuda-<CUDA-VERSION>/targets/aarch64-linux/lib

    Note: The test data files required by some of the DriveWorks samples are not packaged in the rootfs by default, due to their large size. Samples from DriveWorks add-on packages are also not packaged.

    To run the DriveWorks sample apps that require test data files, use the scp utility to push the required files onto the target at /usr/local/driveworks-<DW-VERSION>/data/.

    Additional DriveWorks samples may likewise be pushed via scp onto the target at /usr/local/driveworks-<DW-VERSION>/bin/.
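
    Putting the notes above together, a host-side push plus the target-side library-path export might look like the sketch below. The board IP address, the DriveWorks and CUDA versions, and the local directory names are all placeholders; the scp lines are commented out so the snippet is safe to run anywhere:

```shell
# --- On the host: push test data and extra samples (placeholders). ---
TARGET_IP=192.168.0.10        # assumed board address
DW_VERSION=5.10               # assumed DriveWorks version
CUDA_VERSION=11.4             # assumed CUDA version
DW_ROOT=/usr/local/driveworks-${DW_VERSION}

# Uncomment on a host that can reach the target:
# scp -r ./data/. root@${TARGET_IP}:${DW_ROOT}/data/
# scp ./sample_extra root@${TARGET_IP}:${DW_ROOT}/bin/

# --- On the target: set the loader path if a sample fails to start. ---
export LD_LIBRARY_PATH=${DW_ROOT}/lib:/usr/local/cuda-${CUDA_VERSION}/targets/aarch64-linux/lib
```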

  4. To run any OpenGL ES-based application on the target, load the nvidia-drm kernel module with insmod on the target device:
    insmod /lib/modules/<kernel_version>-rt63-tegra/extra/opensrc-disp/nvidia-drm.ko modeset=1
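
    A defensive version of the insmod step is sketched below. It assumes the running kernel's uname -r string matches the <kernel_version>-rt63-tegra directory name under /lib/modules:

```shell
# Module path for the running kernel; on the DRIVE target, `uname -r`
# is assumed to already include the -rt63-tegra suffix.
KVER=$(uname -r)
MODULE=/lib/modules/${KVER}/extra/opensrc-disp/nvidia-drm.ko

# Load with modesetting enabled, but only if the module file exists and
# is not already loaded (lsmod reports the loaded name as nvidia_drm).
if [ -e "${MODULE}" ] && ! lsmod | grep -q '^nvidia_drm'; then
    insmod "${MODULE}" modeset=1
fi
```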