Artificial intelligence startup Sightengine developed image analysis technology that detects whether images and videos contain offensive material such as nudity, adult content, or suggestive scenes.
Using CUDA, Tesla K80s and cuDNN with the TensorFlow deep learning framework, the startup trained its deep learning models to determine whether an image contains safe content and is appropriate to use as a profile image on social networking sites, much as a human moderator would.
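To make the training setup concrete, below is a minimal sketch of a binary safe/unsafe image classifier in TensorFlow. The architecture, input size, and dataset layout are illustrative assumptions for this example, not Sightengine's actual model; on a machine with an NVIDIA GPU, TensorFlow dispatches the convolutions to cuDNN automatically.

```python
# Minimal sketch of a binary "safe / not safe" image classifier in TensorFlow.
# The layer choices, input size, and dataset paths are illustrative assumptions,
# not Sightengine's production model.
import tensorflow as tf

IMG_SIZE = (224, 224)
BATCH_SIZE = 32

# Assumes a directory with two subfolders, e.g. "safe/" and "unsafe/",
# each containing training images.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train",
    image_size=IMG_SIZE,
    batch_size=BATCH_SIZE,
    label_mode="binary",
)

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (3,)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(128, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # probability the image is unsafe
])

model.compile(
    optimizer="adam",
    loss="binary_crossentropy",
    metrics=["accuracy"],
)

# Training runs on an available NVIDIA GPU (via CUDA/cuDNN) without code changes.
model.fit(train_ds, epochs=10)
```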
Developers and businesses can use Sightengine’s content moderation API, which achieves 99% accuracy and can handle millions of images per day.
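As a rough illustration of how such an API is consumed, here is a sketch of a moderation request in Python. The endpoint, parameter names, model name, and response fields below are assumptions made for the example; consult Sightengine's API documentation for the actual interface and credentials.

```python
# Illustrative sketch of calling a content-moderation REST API.
# Endpoint, parameters, and response shape are assumptions for this example.
import requests

API_URL = "https://api.sightengine.com/1.0/check.json"  # assumed endpoint

params = {
    "url": "https://example.com/profile-photo.jpg",  # image to moderate
    "models": "nudity",                              # assumed model name
    "api_user": "YOUR_API_USER",
    "api_secret": "YOUR_API_SECRET",
}

response = requests.get(API_URL, params=params)
result = response.json()

# Assumed response shape: a "safe" score between 0 (explicit) and 1 (safe).
if result.get("nudity", {}).get("safe", 0) > 0.85:
    print("Image is likely appropriate as a profile picture.")
else:
    print("Image flagged for human review.")
```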
Read more >