| World Community Grid Forums
Thread Status: Active Total posts in this thread: 23
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
Yet another reason to go GPU is better explained here: http://www.isgtw.org/?pid=1002557
Which made me curious. If parallel programming offered greater support for GPUs, might it also be used in the same way as the WCG app that sends lots of micro WUs out and back -- provided it were easy to do, and broadband speeds increased and latency decreased as a result of the need for speed in cloud computing?
sk..
Master Cruncher | Joined: Mar 22, 2007 | Post Count: 2324 | Status: Offline
Good article. It suggests CPUs are heading towards parallel programming in a big way. I've been banging on about the direction of computing for a few years now; basically saying more and more cores will be used, while clock speeds will not improve significantly. The article explains how these will be exploited generally. Presently most CPU BOINC projects run several separate (independent) tasks simultaneously, matched to core/thread count, but some projects could benefit enormously from parallelism. To some extent the present situation is DIY parallel processing using serial programming, with the scientists collating the results. Parallel programming might call for more ordered projects, but the scientists would also benefit from a more automated system, as is the case with mature GPU projects.

Core counts are continuing to rise, but it will be when frequencies start to drop significantly that there might be a natural drive towards parallel programming. This has already begun, and is most noticeable in well-designed IBM systems (built for purpose, efficiency and ergonomics), in modern servers with high core counts but low frequencies and hence lower power consumption, and in laptops. While leading desktop computers have a need for speed, this is often driven by gaming, which has diversified to the extent that only a few are prepared to pay for top specs, and those that are know a top GPU is more important than a top CPU. Few games actually need top CPUs, and in the near future some may well require many cores to support high GPU specs; gaming programmers will be among the first to use such parallel programming techniques. Hopefully many BOINC projects will follow, and I do think bandwidth would be slightly reduced relative to work done, reducing overhead.
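The "several separate (independent) tasks, matched to core/thread count" model described above can be sketched in a few lines. This is an illustrative sketch only: `work_unit` is a hypothetical stand-in for one science task, not real WCG or BOINC code, and real volunteer clients run each task as a separate OS process rather than through a Python pool.

```python
# Sketch of the "DIY parallel processing" model: several independent serial
# work units run side by side, one per core, with no communication between
# them, and the results are collated afterwards.
from multiprocessing import Pool

def work_unit(seed):
    # Hypothetical stand-in for one independent science task:
    # a small deterministic CPU-bound computation.
    total = 0
    for i in range(1, 10001):
        total += (seed * i) % 97
    return total

def crunch(seeds, cores=4):
    # Run the independent tasks in parallel, matched to core count,
    # then collect the results in order for collation.
    with Pool(processes=cores) as pool:
        return pool.map(work_unit, seeds)

if __name__ == "__main__":
    print(crunch([1, 2, 3, 4]))
```

Because the tasks share nothing, the parallel result is identical to running them one after another; that independence is exactly what lets projects scale across volunteers' cores without the "more ordered" coordination that true parallel programming would require.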
walter_hamilton
Cruncher | Joined: Oct 8, 2007 | Post Count: 8 | Status: Offline
Folding at Home has GPU support. Is there cooperation between WCG and FAH? Is the software compatible? Today I have two quad-processor machines and use WCG on 6 processors, FAH on 2, and FAH GPU on the video cards. I wouldn't mind adding extra video cards if WCG added its own GPU tasks.
Walter Hamilton
Sekerob
Ace Cruncher | Joined: Jul 24, 2005 | Post Count: 20043 | Status: Offline
FAH (Folding at Home), not to be confused with FAAH (FightAIDS@Home), is not compatible. I have no idea what BOINC agent porting discussions have ever taken place over at FAH.

As for running BOINC and the FAH agent on the same machine simultaneously: I think they can run at the same time, but I'm not at all aware of how the OS and the agents coexist so that the science processes don't occupy the same cores.
WCG
----------------------------------------
Please help to make the Forums an enjoyable experience for All!
[Edit 1 times, last edit by Sekerob at Jun 26, 2010 3:12:38 PM]
sk..
Master Cruncher | Joined: Mar 22, 2007 | Post Count: 2324 | Status: Offline
Quoting walter_hamilton: "Folding at Home has GPU. Is there cooperation between WCG and FAH? Is the software compatible? Today I have two quad processor machines and use WCG on 6 processors, FAH on 2 and FAH GPU on the video cards. I wouldn't mind adding extra video cards if WCG added their own GPU tasks."

WCG does not use GPUs, so there is no competition on that front. As for Folding, I have also run it (crunching GPU tasks) on a system that has BOINC running WCG tasks (for both ATI and NVidia tasks, separately). In the past there were discussions between BOINC and Folding about uniting under one application manager and so on, but they broke down.
[Edit 1 times, last edit by skgiven at Jun 27, 2010 11:29:51 AM]
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
"Adding a GPU forum implies all kinds of deadlines and we're not ready to deal with that yet." But GPU-based crunching discussions need a home. There has been a ton of discussion about GPU-based crunching that would have gone under a GPU forum heading, instead of being scattered throughout the other forums. I promise not to pressure WCG to make a GPU-aware WU (frankly, though, I'm not sure that would be a good thing for WCG -- see bbover3's WCG Admin post of Jun 2, 2010 6:11:14 PM). So, how about that GPU forum now? Please?
KWSN - A Shrubbery
Master Cruncher | Joined: Jan 8, 2006 | Post Count: 1585 | Status: Offline
If nothing else, it will (hopefully) reduce the plethora of threads asking when/why for GPUs.
----------------------------------------
Failing that, it will give me a heading I can completely ignore and thereby avoid all of the needless GPU-related redundancy.
Distributed computing volunteer since September 27, 2000
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
I've seen just the one project on the forums that has long talked about getting a GPU implementation. As bbover3 wrote, creating a GPU forum before a GPU launch would send the signal that such a project exists when it does not, but making one with a sticky header post would work nicely as a honeypot. Till then, if folk can't type GPU, NVidia, GPU+NVidia or ATI+GPU to refine what's wanted in the search or advanced search box, well, that's the reality of today.
:)
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
The plot thickens.
On the one hand, there are hints here and there throughout the WCG forum about WCG coming up with a GPU-aware WU in the near future -- relatively good news for GPU fans compared with the "precession cycle" of the early days. On the other hand, there is still no GPU forum, even as a home for the "plethora of threads asking when/why for GPUs", not to mention the "honeypot" remark, as if such a forum would only attract bad guys WCG does not want. Nobody is afraid of a GPU takeover -- or is someone? So frightened that even talk of GPUs, let alone a forum for them, seems scary? GPUs will not displace CPUs as the weapon of choice for crunching -- or will they? If the direction is towards eventual WCG adoption of GPU tech for crunching, then "honeypot" would apply only in a positive sense: an opportunity to gather ideas from GPU fans that could guide WCG in shaping policies and operational and maintenance standards for a GPU-based crunching environment alongside the CPU-based one. And that should be done before actual implementation of a GPU-based project, not after, at, or just before its start. All bets are off should WCG choose to keep a GPU-free grid.
KWSN - A Shrubbery
Master Cruncher | Joined: Jan 8, 2006 | Post Count: 1585 | Status: Offline
I would like to point out that I'm not anti-GPU. Having said that, I do often put forth the argument that GPUs are not the ultimate solution to every problem, and there is a lot more involved in making projects GPU-compatible than many of their aficionados are willing to admit.
----------------------------------------
Yes, GPUs can put out amazing quantities of results when the stars align, but there is a very heavy investment required in making everything work. For a lot of sciences, this is more effort than the potential reward. As long as people kept that in mind and didn't expect miraculous results, I'd be much more supportive of the arguments.
Distributed computing volunteer since September 27, 2000