Former Member
Cruncher
Joined: May 22, 2018
Post Count: 0
Status: Offline
The economy of GPGPU computing

I ran the numbers and found this interesting, so I thought I'd share. :)

My peak processor wattage is 125W and my graphics card is 200W by spec (I haven't measured actual consumption, so this math is an estimate). I'm running 5 of my 6 CPU cores to support the GPU WUs.

So I should consume about 105W (125/6 × 5) peak for those cores at 100% compute time, or 83W at the 80% CPU time I've configured (ignoring what the chip draws during the idle 20%). One HCC WU takes roughly 3 hours on this machine, I think.. it's been a while. So my cost for 5 results should be about 249Wh (83W for 3 hours), or about 50Wh per result. My 3-hour estimate might be wrong, so I'll drop it to 2 hours just to be safe: 33Wh per result at a minimum on this machine.
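
In case anyone wants to check my arithmetic, here it is as a quick Python sketch. The wattages are spec-sheet numbers, not measurements, and the 2-3 hour runtime is from memory:

# Back-of-the-envelope CPU cost per result.
# Assumptions (mine, from the spec sheet and memory): 125W peak CPU,
# 5 of 6 cores crunching, 80% CPU time, one WU per core, 2-3h per WU.
cpu_peak_w = 125.0
cores_total = 6
cores_used = 5
duty_cycle = 0.8

cpu_draw_w = cpu_peak_w / cores_total * cores_used * duty_cycle  # ~83 W

for hours_per_wu in (3.0, 2.0):
    # 5 WUs finish together, so the energy is split 5 ways.
    wh_per_result = cpu_draw_w * hours_per_wu / cores_used
    print(f"{hours_per_wu:.0f}h per WU -> {wh_per_result:.0f} Wh per result")

# prints: 3h per WU -> 50 Wh per result
#         2h per WU -> 33 Wh per result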

On the GPU I'll estimate high and assume peak wattage for both the CPU and GPU, or 305W total, which works out to 305Wh per hour of runtime. At 60 results per hour with this particular card, the maximum cost per result should be about 5Wh. I haven't been able to maintain full utilization of the CPU and GPU simultaneously, so it should be even less.
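
And the GPU side the same way, again assuming worst-case draw (the 33-50Wh CPU figures come from the sketch above):

# GPU cost per result, estimating high: full 105W CPU draw plus the
# card's 200W spec, and 60 results per hour.
gpu_system_w = 105.0 + 200.0     # 305 W for the whole pair
results_per_hour = 60

gpu_wh_per_result = gpu_system_w / results_per_hour   # ~5.1 Wh
print(f"GPU: {gpu_wh_per_result:.1f} Wh per result")

# Against the 33-50 Wh CPU figures above:
print(f"advantage: {33 / gpu_wh_per_result:.1f}x to {50 / gpu_wh_per_result:.1f}x")
# roughly 6.5x to 9.8x, noticeably better than the 4x I'd heard quoted

So by my spec-sheet numbers the gap comes out better than 4x, not worse.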

I saw a presentation that said GPU processing can achieve 4x more computation per watt than a CPU, so maybe my math is wrong, or I'm forgetting something, or the actual doesn't agree with the theoretical.. or all of the above :)
[Feb 26, 2013 3:53:02 PM]
twilyth
Master Cruncher
US
Joined: Mar 30, 2007
Post Count: 2130
Status: Offline
Re: The economy of GPGPU computing

I've never really paid attention to the science behind it, but isn't what the GPU does a lot like a game of FoldIt? If so, what you're really talking about is sophisticated massively parallel pattern matching, or something like it. GPUs, just like the human brain, have a tremendous advantage there, so I'd be surprised if the wattage advantage is a measly 4x.
[Feb 26, 2013 4:07:54 PM]
Former Member
Cruncher
Joined: May 22, 2018
Post Count: 0
Status: Offline
Re: The economy of GPGPU computing

twilyth wrote:
I've never really paid attention to the science behind it, but isn't what the GPU does a lot like a game of FoldIt? If so, what you're really talking about is sophisticated massively parallel pattern matching, or something like it. GPUs, just like the human brain, have a tremendous advantage there, so I'd be surprised if the wattage advantage is a measly 4x.


I think it's because the GPU processors are.. dumber. They're specialized and don't need to accomplish everything a general-purpose CPU does, so there's less overhead. The pro is that the processors are therefore smaller and more efficient, which is why so many can be crammed onto a chip. The con is that what you can do on a GPU is pretty limited.. stuff I take for granted every day :)
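
A loose software analogy, if you'll forgive the hand-waving (numpy here just stands in for the idea and runs on the CPU, it's not actually a GPU): one simple uniform operation applied across a huge array beats a flexible general-purpose loop that pays overhead per element.

import numpy as np
import time

# One million data points; the math is identical in both versions.
data = np.random.rand(1_000_000)

t0 = time.perf_counter()
vectorized = data * 2.0 + 1.0              # one uniform op across everything
t1 = time.perf_counter()
looped = [x * 2.0 + 1.0 for x in data]     # flexible, but per-element overhead
t2 = time.perf_counter()

print(f"vectorized: {t1 - t0:.4f}s  loop: {t2 - t1:.4f}s")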
[Feb 26, 2013 5:27:32 PM]