Jim1348
Veteran Cruncher
USA
Joined: Jul 13, 2009
Post Count: 1066
Nvidia GPUs for data science, analytics, and distributed machine learning

"There are tons of problems where scaling out is very natural, and provides almost no performance penalty. There are critical applications in our society that genuinely require large scaling to make progress (think genomics, climate change, imaging). Ideally the tools we use on small/medium data on a single machine can be the same tools we use on thousand-node clusters."
https://www.zdnet.com/article/nvidia-gpus-for...g-using-python-with-dask/
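The linked article is about Dask, whose pitch matches the quote: the code you write for small data on one machine is the same code you run on a cluster. A minimal sketch of that idea using Dask's `dask.delayed` task API (the cluster part is an assumption noted in the comments, not shown running here):

```python
# Sketch of Dask's "same tools, any scale" idea with dask.delayed.
# This runs on the local thread scheduler; the identical code would
# run on a multi-node cluster after connecting a
# dask.distributed.Client (assumed, not shown here).
import dask

@dask.delayed
def square(x):
    # Each call builds a lazy task node in Dask's task graph
    # instead of executing immediately.
    return x * x

tasks = [square(i) for i in range(5)]

# compute() executes the whole task graph; locally it uses threads
# by default, while on a cluster it would fan tasks out to workers.
results = dask.compute(*tasks)
print(results)  # (0, 1, 4, 9, 16)
```

The point of the sketch is that nothing in the user-facing code names a machine or a worker; swapping the scheduler is what moves it from a laptop to a thousand-node cluster.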
[Mar 23, 2019 1:40:57 PM]