Running Yocto-Built Apps on the Target
Export LD_LIBRARY_PATH before running CUDA sample apps:

export LD_LIBRARY_PATH=/usr/local/cuda-<CUDA-VERSION>/<MACHINE>/targets/aarch64-linux/lib

Note
<MACHINE> is orin or thor, depending on the platform.
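As a concrete sketch, with placeholder values for <CUDA-VERSION> and <MACHINE> (substitute the CUDA version actually installed on your image and the machine name for your platform):

```shell
# Placeholder values for illustration only; use the CUDA version present
# on your image and 'orin' or 'thor' for your platform.
CUDA_VERSION=11.4
MACHINE=orin
export LD_LIBRARY_PATH=/usr/local/cuda-${CUDA_VERSION}/${MACHINE}/targets/aarch64-linux/lib
echo "$LD_LIBRARY_PATH"
```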
Run CUDA applications:

The CUDA samples are packaged at /home/root/samples/cuda:

./<sample-application>
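A guarded sketch of the step above; vectorAdd is a placeholder standing in for whichever sample binary is actually packaged in the directory:

```shell
# Run a packaged CUDA sample if the samples directory exists
# (the directory is only present on the target device).
CUDA_SAMPLES=/home/root/samples/cuda
APP=vectorAdd   # placeholder: substitute a sample binary from the directory
if [ -d "$CUDA_SAMPLES" ]; then
    (cd "$CUDA_SAMPLES" && ./"$APP")
else
    echo "CUDA samples not found at $CUDA_SAMPLES; run this on the target"
fi
```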
Run TensorRT applications:

The TensorRT samples are packaged at /home/root/samples/tensorrt. The following sample applications are available:

sample_char_rnn
sample_fasterRCNN

sample_fasterRCNN requires the TensorRT model file to be present in the data/faster-rcnn/ subfolder.
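Since sample_fasterRCNN needs its model data in place, a defensive sketch of the run step checks the data subfolder first (paths exist only on the target device):

```shell
# Run sample_fasterRCNN only if its model data is present.
TRT_SAMPLES=/home/root/samples/tensorrt
DATA_DIR=${TRT_SAMPLES}/data/faster-rcnn
if [ -d "$DATA_DIR" ]; then
    (cd "$TRT_SAMPLES" && ./sample_fasterRCNN)
else
    echo "model data missing at $DATA_DIR; place the TensorRT model file there first"
fi
```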
Note
Export LD_LIBRARY_PATH=/usr/local/cuda-<CUDA-VERSION>/<MACHINE>/targets/aarch64-linux/lib before running the sample apps, where <MACHINE> is orin or thor, depending on the platform.
Run the NVIDIA DriveWorks applications:

The NVIDIA DriveWorks 'core' samples are packaged at /usr/local/driveworks-<DW-VERSION>/bin:

./<sample-application>

If DriveWorks samples fail, export LD_LIBRARY_PATH:

export LD_LIBRARY_PATH=/usr/local/driveworks/lib:/usr/local/cuda-<CUDA-VERSION>/<MACHINE>/targets/aarch64-linux/lib
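The two steps above can be sketched together; DW_VERSION, CUDA_VERSION, MACHINE, and the sample name are placeholder values to be replaced with those on your image:

```shell
# Placeholder versions; substitute the ones installed on your image.
DW_VERSION=5.10
CUDA_VERSION=11.4
MACHINE=orin
DW_BIN=/usr/local/driveworks-${DW_VERSION}/bin
# Fallback library path in case a sample fails to find its shared libraries:
export LD_LIBRARY_PATH=/usr/local/driveworks/lib:/usr/local/cuda-${CUDA_VERSION}/${MACHINE}/targets/aarch64-linux/lib
if [ -d "$DW_BIN" ]; then
    (cd "$DW_BIN" && ./sample_hello_world)   # placeholder sample name
else
    echo "DriveWorks bin directory not found at $DW_BIN; run this on the target"
fi
```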
Note
The test data files required by some of the DriveWorks samples are not packaged in the Root File System by default, due to their large size. Samples from DriveWorks add-on packages are also not packaged.

Note
<MACHINE> is orin or thor, depending on the platform.
To run the DriveWorks sample apps that require test data files, use the scp utility to push the required files to the target at /usr/local/driveworks-<DW-VERSION>/data/.

The additional DriveWorks samples may likewise be pushed to the target via scp, at /usr/local/driveworks-<DW-VERSION>/bin/.
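A host-side sketch of the push; the target address, DriveWorks version, and local data directory are all placeholder assumptions, and the command is only echoed here (drop the echo to actually copy):

```shell
# Compose the scp command that pushes test data to the target.
TARGET=root@192.168.0.10     # placeholder target address
DW_VERSION=5.10              # placeholder DriveWorks version
SRC=./driveworks-test-data   # placeholder local directory holding the files
DEST=/usr/local/driveworks-${DW_VERSION}/data/
CMD="scp -r $SRC ${TARGET}:${DEST}"
echo "$CMD"
```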
To run OpenGLES-based applications on the target, load the nvidia_drm kernel module using insmod on the target device:

insmod /lib/modules/<kernel_version>-rt63-tegra/extra/opensrc-disp/nvidia-drm.ko modeset=1
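A minimal idempotent sketch of the load step, taking the kernel version from uname -r and skipping the insmod when the module is already present (run as root on the target; on other machines the module file is simply absent):

```shell
# Load nvidia-drm with modesetting only if it is not already loaded.
KVER=$(uname -r)   # on the target this resolves to <kernel_version>-rt63-tegra
MOD=/lib/modules/${KVER}/extra/opensrc-disp/nvidia-drm.ko
if grep -q '^nvidia_drm ' /proc/modules 2>/dev/null; then
    echo "nvidia_drm already loaded"
elif [ -f "$MOD" ]; then
    insmod "$MOD" modeset=1
else
    echo "nvidia-drm.ko not found at $MOD"
fi
```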