We love seeing all of the NVIDIA GPU-related tweets – here are some we came across this week:
"Unless you've not been living under a rock, you know what this is ( pointing to a Titan GPU)". @MRAguy #deeplearning
— Raghav{endra}| ರಾಘವೇಂದ್ರ (@raghavian) May 19, 2017
My current dev box for #selfdriving #AI on #GPUs: triple boot, triple GPU, triple monitor #h2oai #deeplearning #machinelearning #python pic.twitter.com/dgERuViWqT
— Arno Candel (@ArnoCandel) May 25, 2017
Now I hate Apple for not including CUDA enabled GPU on Macbook Pro 2016
— Jussi Kujala (@jukujala) May 19, 2017
What u do when your #ML model is training for endless hours? U prepare more models for other stuff to keep #GPU busy. THIS is parallelization
— Santi PdP (@santty128) May 23, 2017
https://twitter.com/soldni/status/864821370374557696
Attending NVIDIA Deep Learning Day ^^ pic.twitter.com/nTnsBsIrlD
— CT2017_김진만 (@DM_SMU) May 25, 2017
may all your neural networks be GPU rich & threadbare! A massive structure of threads hang over NN like a great wanton beast of computation!
— Paul Tulloch (@ptullochott) May 24, 2017
So lucky to be able to play with such toys. Arrived just in time for the long weekend :) ML in the fog tnx to @NVIDIAEmbedded #JetsonTX2 pic.twitter.com/B5SLhjkB5e
— Vlado Handziski (@vlahan) May 24, 2017
https://twitter.com/KrAbhinavGupta/status/865818919847710725
On Twitter? Follow @GPUComputing and @mention us so we can keep track of what you're up to.