Former Member
Cruncher
Using the GPU of my graphics card

Hi,

wouldn't it be better to use the shaders on my NVIDIA GeForce GPU for floating-point calculations? I haven't seen the source code of any of the BOINC clients/agents, but I think it could be interesting to run some operations on the GPU instead of the CPU.
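
For illustration, here is a minimal sketch of the idea in CUDA, NVIDIA's general-purpose GPU toolkit (which did not yet exist when this thread was written): a simple floating-point kernel computing y = a*x + y, with each GPU thread handling one array element. All names and sizes are illustrative, not anything from BOINC.

    // saxpy.cu - minimal sketch: offloading a floating-point loop to the GPU.
    // Assumes a CUDA-capable NVIDIA card and the CUDA toolkit (nvcc saxpy.cu).
    #include <cstdio>
    #include <cuda_runtime.h>

    // Each GPU thread computes one element of y = a*x + y.
    __global__ void saxpy(int n, float a, const float *x, float *y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;
        float *x, *y;
        // Unified memory keeps the sketch short; real codes manage copies explicitly.
        cudaMallocManaged(&x, n * sizeof(float));
        cudaMallocManaged(&y, n * sizeof(float));
        for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

        saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
        cudaDeviceSynchronize();

        printf("y[0] = %f (expected 4.0)\n", y[0]);
        cudaFree(x);
        cudaFree(y);
        return 0;
    }
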
[May 9, 2006 3:08:11 PM]
Former Member
Cruncher
Re: Using the GPU of my graphics card

We've covered this before.

The bottom line is, it can't be done. If you want a detailed discussion, try searching the forums.
[May 9, 2006 3:18:50 PM]
Former Member
Cruncher
Re: Using the GPU of my graphics card

There has been some work done on using GPUs for useful scientific calculations; I can't remember the name of the project right now, though. It is true that they are somewhat limited in what they can do, but as cards become more and more programmable it may become more feasible. I'm not sure what the DX10 feature set will bring for this type of work; maybe that will improve things.

One thing that did occur to me the other day: what about using a physics card for some scientific work? It might be useful for something like atomic-level modelling for HPF, seeing as these cards are designed to calculate the interactions between large numbers of interacting objects. I suspect the physics in a current PPU is probably too simplistic for anything useful at this stage, but I don't know if anyone has sat down and looked at it yet. It sounds like another area that may show promise in time, though.
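
To make the workload concrete, here is a hedged sketch in modern CUDA (which postdates this thread) of the kind of all-pairs interaction computation a physics processor targets. The inverse-square "force" and all names here are placeholders for illustration, not HPF's actual model.

    // nbody.cu - illustrative all-pairs interaction kernel, the workload a
    // physics processor is built for. The softened inverse-square "force"
    // is a placeholder, not any project's real force field.
    #include <cstdio>
    #include <cuda_runtime.h>

    struct Body { float x, y, z, m; };

    // One thread per body: sum the force every other body exerts on it.
    __global__ void forces(int n, const Body *b, float3 *f) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;
        float fx = 0.0f, fy = 0.0f, fz = 0.0f;
        for (int j = 0; j < n; ++j) {
            if (j == i) continue;
            float dx = b[j].x - b[i].x;
            float dy = b[j].y - b[i].y;
            float dz = b[j].z - b[i].z;
            float r2 = dx*dx + dy*dy + dz*dz + 1e-9f; // softened distance^2
            float invr = rsqrtf(r2);
            float s = b[j].m * invr * invr * invr;    // ~ m / r^3
            fx += s * dx; fy += s * dy; fz += s * dz;
        }
        f[i] = make_float3(fx, fy, fz);
    }

    int main() {
        const int n = 4096;
        Body *b; float3 *f;
        cudaMallocManaged(&b, n * sizeof(Body));
        cudaMallocManaged(&f, n * sizeof(float3));
        for (int i = 0; i < n; ++i) b[i] = { (float)i, 0.0f, 0.0f, 1.0f };
        forces<<<(n + 255) / 256, 256>>>(n, b, f);
        cudaDeviceSynchronize();
        printf("force on body 0: (%g, %g, %g)\n", f[0].x, f[0].y, f[0].z);
        cudaFree(b);
        cudaFree(f);
        return 0;
    }
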
[May 9, 2006 4:47:07 PM]
Former Member
Cruncher
Re: Using the GPU of my graphics card

There is a thread about using the GPU over on Folding@Home ( http://folding.stanford.edu/ ). It begins in 2002. In 2004 they began coding. I have not looked at the thread since Fall 2005. They signed a nondisclosure agreement with the GPU companies in order to receive aid from them, so they were not saying much.

Summing it up:
A) It isn't easy or fast to program for a GPU.
B) The project scientists would have to do the programming. It is not something that could be handed off to the WCG.

Lawrence
[May 9, 2006 5:53:33 PM]
Former Member
Cruncher
Re: Using the GPU of my graphics card

I just kinda jumped in after briefly reading the posts. Seems like déjà vu.

I thought this was discussed somewhere on here at some point in the past...maybe it was on another distributed computing website.

Anyway, here is the link I remembered from that discussion: http://www.gpgpu.org/

I believe there is a PowerPoint slide deck there explaining the theory and usage.

"GPGPU stands for General-Purpose computation on GPUs. With the increasing programmability of commodity graphics processing units (GPUs), these chips are capable of performing more than the specific graphics computations for which they were designed. They are now capable coprocessors, and their high speed makes them useful for a variety of applications. The goal of this page is to catalog the current and historical use of GPUs for general-purpose computation. "
[May 10, 2006 5:22:31 AM]
Former Member
Cruncher
Re: Using the GPU of my graphics card

Here is the link to the PowerPoint presentation on "Linear Algebra on GPUs":

http://www.gpgpu.org/s2004/slides/krueger.LinearAlgebra.slides.ppt
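
As a rough modern companion to those slides, here is a minimal sketch of one linear-algebra kernel in CUDA (which postdates this thread): a dense matrix-vector multiply with one GPU thread per output row. The names and sizes are illustrative only, not taken from the presentation.

    // matvec.cu - sketch of dense matrix-vector multiply, y = A x,
    // with one GPU thread per output row.
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void matvec(int rows, int cols, const float *A,
                           const float *x, float *y) {
        int r = blockIdx.x * blockDim.x + threadIdx.x;
        if (r >= rows) return;
        float sum = 0.0f;
        for (int c = 0; c < cols; ++c)
            sum += A[r * cols + c] * x[c];  // A stored row-major
        y[r] = sum;
    }

    int main() {
        const int rows = 512, cols = 512;
        float *A, *x, *y;
        cudaMallocManaged(&A, rows * cols * sizeof(float));
        cudaMallocManaged(&x, cols * sizeof(float));
        cudaMallocManaged(&y, rows * sizeof(float));
        for (int i = 0; i < rows * cols; ++i) A[i] = 1.0f;
        for (int i = 0; i < cols; ++i) x[i] = 2.0f;
        matvec<<<(rows + 255) / 256, 256>>>(rows, cols, A, x, y);
        cudaDeviceSynchronize();
        printf("y[0] = %f (expected %f)\n", y[0], 2.0f * cols);
        cudaFree(A);
        cudaFree(x);
        cudaFree(y);
        return 0;
    }
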

Hope this helps clear things up.
[May 10, 2006 5:28:10 AM]
SlumLord
Cruncher
Joined: Jan 23, 2006
Post Count: 2
Re: Using the GPU of my graphics card

Here is the forum thread on Folding@Home about using GPUs to fold.

Also, the Folding@Home FAQ has a recent update about their work on GPUs.
[May 10, 2006 12:07:38 PM]