NVIDIA Achieves 4X Speedup on BERT Neural Network
Among the many knotty problems that AI can help solve, speech and natural language processing (NLP) represent areas poised for significant growth in the coming years. Recently, Google Research described a new language representation model called BERT (Bidirectional Encoder Representations from Transformers) in a paper published on arXiv.
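BERT's defining trait is that it is trained to predict a masked word from both its left and right context. As a rough illustration of that bidirectional idea only, the toy sketch below scores mask candidates by trigram co-occurrence counts over an invented mini-corpus; the corpus, scoring scheme, and function names are all hypothetical, and real BERT uses a deep bidirectional Transformer, not counts.

```python
# Toy illustration of masked-word prediction from BOTH neighbors.
# Everything here (corpus, scoring) is invented for demonstration;
# it is NOT how BERT itself is implemented.
from collections import Counter

corpus = [
    "the model reads the sentence",
    "the model encodes the sentence",
    "the network reads the input",
]

# Count (left_word, candidate, right_word) trigrams.
trigrams = Counter()
for line in corpus:
    toks = line.split()
    for i in range(1, len(toks) - 1):
        trigrams[(toks[i - 1], toks[i], toks[i + 1])] += 1

def predict_masked(left, right):
    """Pick the candidate word best supported by both neighbors."""
    scores = Counter()
    for (l, mid, r), n in trigrams.items():
        if l == left and r == right:
            scores[mid] += n
    return scores.most_common(1)[0][0] if scores else None

# "the network [MASK] the input" -> only "reads" fits this context.
print(predict_masked("network", "the"))  # → reads
```

The point of the sketch is that the masked position is filled using evidence from the words on both sides simultaneously, which is the property the BERT paper contrasts with left-to-right language models.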
The post NVIDIA Achieves 4X Speedup on BERT Neural Network appeared first on NVIDIA Developer News Center.