We love seeing all of the NVIDIA GPU-related tweets – here are some that we came across this week:
Thank you @nvidia, #GTC17 was amazing. I learned a lot about GPUs and #deeplearning and their applications and met so many great people!
— Beat Buesser (@BeatBuesser) May 11, 2017
#NIPS paper submitted. Good luck to everyone putting on those finishing touches! As for me, I'm taking my GPUs and going home ;) pic.twitter.com/ZeNom8FzMV
— Smerity (@Smerity) May 19, 2017
finally got one of my own to play around @nvidia #jetson #tx2 @l33tdawg pic.twitter.com/ME2CLyIF2b
— Clarence Chio (@cchio) May 13, 2017
Dinner conversation today: @AlecRad revealing his tips and tricks for distinguishing CNN/RNN/RL training by the sound the GPU makes.
— Andrej Karpathy (@karpathy) May 17, 2017
but Nvidia's CUDA development environment provides a comprehensive set of well-developed tools for Deep Learning that AMD cannot match.
— Thinker86 (@gibson861) May 17, 2017
Started using a Pascal based GPU for ML a couple months ago. WOW, this thing is a fast cooker. I love my Titan X! #ML #AI #deeplearning https://t.co/BM0XTyMsxe
— Blaine Christian (@joltcola) May 11, 2017
Training a basic Keras example LSTM https://t.co/42Y7ibYrLj took 4.3x longer (69min) on my Macbook Pro CPU than Google Cloud Nvidia K80 GPU.
— Joseph Nelson (@josephofiowa) May 12, 2017
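For context, the "basic Keras example LSTM" behind the shortened link isn't reproduced here, but a minimal sketch along the lines of the stock Keras IMDB sentiment LSTM example (an assumption on our part about which example was used) looks like this. With a GPU-enabled TensorFlow backend installed, Keras places the model on the GPU automatically:

```python
# Minimal sketch of a basic Keras LSTM benchmark, modeled on the canonical
# Keras IMDB example (assumed; the tweet's exact script isn't confirmed).
from keras.datasets import imdb
from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense
from keras.preprocessing import sequence

max_features = 20000   # vocabulary size
maxlen = 80            # truncate/pad reviews to 80 tokens

# Load and pad the IMDB sentiment dataset
(x_train, y_train), (x_test, y_test) = imdb.load_data(num_words=max_features)
x_train = sequence.pad_sequences(x_train, maxlen=maxlen)
x_test = sequence.pad_sequences(x_test, maxlen=maxlen)

# Embedding -> LSTM -> sigmoid binary classifier
model = Sequential()
model.add(Embedding(max_features, 128))
model.add(LSTM(128, dropout=0.2, recurrent_dropout=0.2))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam',
              metrics=['accuracy'])

# On a GPU build of TensorFlow this runs on the GPU with no code changes;
# the same script on CPU is what produces the gap the tweet describes.
model.fit(x_train, y_train, batch_size=32, epochs=15,
          validation_data=(x_test, y_test))
```

Running the same script with and without a GPU visible (e.g., with CUDA_VISIBLE_DEVICES="") is a quick way to reproduce the kind of CPU-vs-GPU comparison in the tweet.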
I like when command line tools give you encouraging feedback. "Sweet!" #GPUcomputing pic.twitter.com/o5CXxywghR
— Ryan O Schenck (@research_junkie) May 18, 2017
On Twitter? Follow @GPUComputing and @mention us so we can keep track of what you’re up to.