Thread Status: Active
Total posts in this thread: 21
This topic has been viewed 2736 times and has 20 replies.
Former Member
Cruncher
Joined: May 22, 2018
Post Count: 0
Status: Offline
Re: jerbenn - this ones for you

With all due respect, how do you know that any distributed computing project is actually doing what they say they do?

None of them are open source, as far as I am aware. This is to prevent the data from being corrupted by malicious or malformed results: keeping the client closed protects the integrity of data that cannot be verified by any means other than duplication of effort.

Is the project compelling enough to overcome your doubts? At some point we all need to ask ourselves whether there is enough inherent trust in the organization to donate our resources. With organizations like the WHO, Mayo Clinic, IBM, the EPA, the UN Development Programme, CIT, the Argonne National Laboratory, and the NSF on the Grid's advisory board, I am inclined to believe that our results will be used for peaceful purposes and for the good of mankind.

Given that distributed computing will be rapidly evolving over the next few years, it will be interesting to see how projects address this issue. In reality, though, for those of us who are a bit more jumpy, you really don't know whether you are modeling proteins or modeling nuclear yield effects; searching for "ET" or searching databases for terrorists. I prefer to believe that we're modeling proteins here.

Incidentally, although the smallpox results did go to the DoD, the results were for a smallpox cure. Again, it depends on where your belief lies.

Semper Paratus
[Nov 23, 2004 3:22:30 PM]
Former Member
Cruncher
Joined: May 22, 2018
Post Count: 0
Status: Offline
Re: jerbenn - this ones for you

I went to UD.com, the people who sell this grid engine technology. Yeah, they're brave, but in my mind they have a niche market.

From what I gleaned, setup-wise, a grid is not supposed to replace a supercomputer; it is supposed to support it.

You have a supercomputer, and you have everyone's office computer thrown on a grid at company XYZ. The supercomputer gets the top priority stuff, and the grid gets all the leftovers.

So why not buy a second supercomputer at XYZ? A grid is easier to upgrade in most cases, benefiting from Moore's law as opposed to dying by it. The question I have for UD then is, how much do you have to upgrade a grid in order to feel a computational difference?

For this organization, World Community Grid, I believe the decision to go with a grid is more a way to make the results shared, and a mechanism for ensuring they stay that way. Put all these projects on a supercomputer and a CEO from XYZ will eventually get greedy and stop sharing the results. Using a grid in this specific instance ensures that will never happen more than once, as most users would immediately quit.

~B
[Nov 23, 2004 5:28:47 PM]
Former Member
Cruncher
Joined: May 22, 2018
Post Count: 0
Status: Offline
Re: jerbenn - this ones for you

If these are precached protein shapes, they must have a lot of them. I've been watching for repeats and have yet to see one. Considering ALL of your processing power right now is going toward calculating their shape, adding a few colored spheres as placeholders every 2 minutes or so is HARDLY a taxing process for your processor.

This is like the ID argument that evolution is impossible because it's too complex, which raises the question: if this is too complex to be random, then why is it so simple? Do you think this is the best their graphics design department could pre-create? If I were one of the 3D artists for the project, I would have made some slick procedural animation of snazzy-looking molecules, not these incomprehensible blobs of primary colors.
[Nov 23, 2004 6:00:17 PM]
Former Member
Cruncher
Joined: May 22, 2018
Post Count: 0
Status: Offline
Re: jerbenn - this ones for you

The information on the graphic within the application is nicely explained at:

http://www.worldcommunitygrid.org/forums/wcg/viewthread?thread=602

In this current thread I had mentioned my perception that the graphic was a placeholder, because it was not specified ANYWHERE that I could find, despite looking and asking extensively. If the answer given is correct, then I wasn't too far off: the graphic is a placeholder, or more accurately a snapshot taken at random(?) intervals and rotated for an unspecified period of time until the next snapshot occurs.

I am curious to know where this is referenced:

adding a few colored spheres as place holders every 2 minutes or so is HARDLY a taxing process on your processor.



▒▲▲▲▲▲▒
«NsmileMAD»
▒—☼—☼—▒
----------------------------------------
[Edit 1 times, last edit by Former Member at Nov 23, 2004 7:46:13 PM]
[Nov 23, 2004 7:45:12 PM]
Former Member
Cruncher
Joined: May 22, 2018
Post Count: 0
Status: Offline
Re: jerbenn - this ones for you

Experimentation is how I can solve this. I just used a MAXScript to create something that looks like a complex molecule on demand, based on a few simple variables. These weren't just simple little colored spheres; I decided I wanted an actual nice presentation. Each sphere had about 1000 faces, and each "molecule" was 3000 atoms. Total processing time was less than 4 seconds on an Athlon 2600, including rendering the first frame. With spheres of 100 faces, processing time was approximately 2 seconds. If we are in fact producing protein shape calculations, it seems simple enough that they could format that into a visual representation based on a simple vertex array.
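Out of curiosity, the mesh sizes implied by those numbers can be roughed out; the atom and face counts below are the figures from this post, and the rest is plain arithmetic, not anything measured from the WCG client:

```python
# Rough mesh-size estimate for the test "molecule" described above.
atoms_per_molecule = 3000
faces_per_sphere_high = 1000   # the "nice presentation" spheres
faces_per_sphere_low = 100     # the cheaper variant also tried

total_faces_high = atoms_per_molecule * faces_per_sphere_high
total_faces_low = atoms_per_molecule * faces_per_sphere_low

print(total_faces_high)  # 3000000
print(total_faces_low)   # 300000
```

Three million faces rendering in under 4 seconds on a 2004-era desktop is consistent with the claim that a far simpler display is no real burden.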

Now, as to the processing cost.

Right now it looks like the picture updates every minute or so, which means we are only seeing a loss of about 3% of our processing power. That's not to say we wouldn't avoid that loss if the shapes were already predefined; even if the shapes were generated by a predefined algorithm, they would incur about the same loss.

So do we KNOW that they are not showing us what is actually being processed? No. Is it feasible that they are actually showing us some of our work? Yes.
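The 3% figure above works out as a one-line calculation; the roughly 2-second render time and one-minute update interval are the assumptions from this post, not values published by the project:

```python
# Back-of-the-envelope estimate of the display graphic's CPU overhead.
# Assumptions (from this post, not from the WCG client itself):
#   - regenerating the molecule snapshot takes about 2 seconds
#   - the graphic updates roughly once every 60 seconds
render_seconds = 2.0
update_interval_seconds = 60.0

# Fraction of CPU time spent on the graphic instead of crunching.
overhead = render_seconds / update_interval_seconds
print(f"estimated display overhead: {overhead:.1%}")  # about 3%
```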
[Nov 23, 2004 8:33:44 PM]
Former Member
Cruncher
Joined: May 22, 2018
Post Count: 0
Status: Offline
Re: im_thatoneguy

Well the process has already been explained by following the link I provided a few posts up in this thread.

It caught my attention after you mentioned this -

If these are precached protein shapes they must havea lot of them. I've been watching for repeats and have as of yet seen none.


which seemed to imply that you had processed more than just the two units that your statistics would otherwise indicate.

Then, when you stated specific update times and parameters, it appeared that you might have a link to information not provided anywhere else.

However, the specifications of the graphic are apparently irrelevant with regard to processing. It just seemed appropriate that the application should clearly state that the graphic is representative of the protein under analysis, AND that the graphic is only updated every ? minutes as a measure to limit unnecessary processing time, AND that processing the graphic takes an additional ?% of resources while it is displayed.

An interesting thing to add would be a slider with which a user could adjust how often the protein view is updated, the level of detail, or the type of view.
However, this all goes back to the cost-effectiveness equation, which I addressed to a small degree in another post.


▒▲▲▲▲▲▒
«NsmileMAD»
▒—☼—☼—▒
[Nov 23, 2004 10:13:39 PM]
joatmon
Senior Cruncher
Joined: Nov 17, 2004
Post Count: 185
Status: Offline
Re: im_thatoneguy


which seemed to refer that you have processed more than just the two units that your statistics would otherwise seem to indicate.



When I was in the reset mode, I was downloading a new workunit every hour or so, but I didn't look at the graphic or notice any specific workunit serial number. I've had a lot of workunits myself, but only a couple completed. Maybe that's the case for other users as well. ;)
[Nov 23, 2004 10:19:26 PM]
Former Member
Cruncher
Joined: May 22, 2018
Post Count: 0
Status: Offline
Is the general public just a bunch of mindless sheep...

I, for one, will remove myself from the WCG if I suspect that the beneficiaries of these projects are anything but the general public.



If grids become successful enough, it leaves supercomputers free to work on other things, such as weapons testing.
The projects being worked on via the grids would just be an excuse to say "why replicate the effort?"... even while knowing that a supercomputer could complete the task in a fraction of the overall time.
Just remember that (due to overhead and cross-testing) a grid operating at 10 teraflops is far less efficient than a supercomputer operating at 10 teraflops. The ratio will never be better than 2 to 1.

Many projects are put on grids because people don't want to waste their supercomputer processing resources on things that will not be of definite benefit, such as searching for ET.
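One way to motivate a 2:1 figure, purely as a sketch (the post does not state where its ratio comes from): if every workunit is sent to at least two machines so results can be cross-checked, half the grid's raw throughput is spent on duplication before any scheduling or network overhead is even counted:

```python
# Sketch: effect of redundant verification on a grid's effective throughput.
# The redundancy factor of 2 is an assumption for illustration; real grids
# may use more copies per workunit, plus scheduling and network overhead.
peak_tflops = 10.0
redundancy = 2  # copies of each workunit, cross-checked against each other

effective_tflops = peak_tflops / redundancy
print(effective_tflops)  # 5.0
```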
[Nov 25, 2004 1:38:51 AM]
Former Member
Cruncher
Joined: May 22, 2018
Post Count: 0
Status: Offline
Re: Is the general public just a bunch of mindless sheep...

If grids become successful enough then it leaves supercomputers time to work on other things- such as weapons testing.

The projects being worked on via the grids would be just an excuse to say "why replicate the effort"... even while knowing that the supercomputer could complete the task in a fraction of the overall time.


With that you are assuming that, if there were no grid, these supercomputers would be used for the same projects that run on the grid. Do you think that the corporations and governments interested in weapons testing etc. would use the supercomputers they control for humanitarian projects, just because there is a need for them?
And do you think that these people need an excuse like "why replicate the effort"? Projects that are going to run on the grid have not been supported by these people until now, and will not be supported by them in the future, whether there is a functioning WCG or not.
[Nov 25, 2004 2:02:19 AM]
Former Member
Cruncher
Joined: May 22, 2018
Post Count: 0
Status: Offline
Re: Is the general public just a bunch of mindless sheep...

Many projects are generally put on grids because people don't want to waste their supercomputer processing resources on things that will not be of definite benefit -such as searching for ET.

But that's the main benefit and goal of a grid like this, right? That worthy projects that otherwise would not happen can be done.

Is that 2:1 ratio your personal assumption, or is it supported by verifiable facts?
[Nov 25, 2004 2:10:50 AM]