We love seeing all of the NVIDIA GPU-related tweets – here are some that we came across this week:
Forgot how fast GPUs are. Finished training a model, had to run for train. Downloaded model to run eval. Likely home before it finishes :'(
— Smerity (@Smerity) July 8, 2017
https://twitter.com/ZitaPatai/status/885842727279226882
https://twitter.com/JLGalache/status/883376778730852352
Thank you @nvidia for providing us the resources for our research. Your contribution to the ML community is greatly appreciated! pic.twitter.com/D49VbO6omX
— Jimit Mistry (@MistryJimit) July 7, 2017
A complete #protein #modeling workflow on #gpu from #ai propensity prediction to #visualization | Blown away by @NVIDIAGeForceUK #titanxp pic.twitter.com/9RYbpu8Awy
— Kamil Tamiola, PhD (@KamilTamiola) July 12, 2017
Full load even in summer on our servers for our #datascience and #AI experiments @LboroScience @LboroCDS #GPU #nvidia pic.twitter.com/ouNV1CE3Cv
— Andrea Soltoggio (@asoltoggio) July 7, 2017
Profitable time strategising our ERC grant with large-scale ABM and GPU acceleration. @LostFrontiersBD pic.twitter.com/OHBUWG4e96
— Eugene Ch'ng (@drecuk) July 12, 2017
On Twitter? Follow our new developer accounts, @NVIDIAAIDev and @NVIDIAHPCDev.