World Community Grid Forums
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
> After your computer processes the information, the results are sent by World Community Grid to The Scripps Research Institute, where they are analyzed by the Scripps research team. The process takes an enormous amount of computing time.

I'm new to WCG; I just started 2 days ago. Looking around regarding this project, I read that there are a million possibilities for how the drug molecules could fit into the pockets of Human Immunodeficiency Virus protease. Checking them all requires enormous time, and WCG reduces that time by distributing the task among members, who work on it simultaneously. My questions are:

1. Is the task given to each member exactly the same (e.g., same drug and same docking possibility)?
2. If one member completes a task (1 of 1,000,000 possibilities), is that possibility eliminated, whether it turns out to be a bad one or a good one, so that others can try a different one?

Thanks a lot...
Sekerob
Ace Cruncher | Joined: Jul 24, 2005 | Post Count: 20043 | Status: Offline
Hi JOSEPH421 and welcome to WCG.
Question one: The problem has been split into millions of pieces. For each piece, 3 copies are sent out to random members. When the 3 results are returned, they are compared and must match (the validation process that determines whether there is a quorum). After that, they go through an assimilation process and are moved to a central library.

Question two: The scientists want to have both the good and the bad results, as both will help them to guide and optimize future research. Also, today's knowledge might not provide full comprehension of data that looks bad; future analysis and technology could reveal new facts. A classic case is seismogeological data: oil fields were completely overlooked under old interpretations made with lesser computing power, while today, with 3D imaging, things are seen in the curves that were never seen before. In other words, it would be bad practice to dispose of old data.
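A minimal sketch of that replicate-and-compare idea, assuming made-up member and work-unit names (the real WCG scheduler and validator are far more involved than this):

```python
import random

def dispatch(workunit, members, copies=3):
    """Send identical copies of one work unit to distinct random members.

    Hypothetical illustration of WCG-style redundancy, not the real scheduler.
    """
    return random.sample(members, copies)

def has_quorum(results, copies=3):
    """Validation: all returned copies must agree before the result
    is assimilated and moved to the central library."""
    return len(results) >= copies and all(r == results[0] for r in results)

# Example: three members return matching results -> quorum reached
members = ["alice", "bob", "carol", "dave"]
assigned = dispatch("piece-0001", members)
results = ["docking-score: -7.2"] * 3   # pretend all three copies agree
if has_quorum(results):
    print("quorum reached; result moved to the central library")
```

The point of the triple redundancy is that a single faulty or dishonest machine cannot corrupt the library: a result only counts once independent copies agree.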
WCG
----------------------------------------
Please help to make the Forums an enjoyable experience for All!
[Edit 2 times, last edit by Sekerob at Aug 29, 2006 3:30:22 PM]
we45dfa35gh3476
Advanced Cruncher | Joined: Apr 19, 2006 | Post Count: 57 | Status: Offline
> Question one: The problem has been split into millions of pieces. For each piece, 3 copies are sent out to random members. When the 3 results are returned, they are compared and must match (the validation process that determines whether there is a quorum). After that, they go through an assimilation process and are moved to a central library.

Good analogy with the seismological data, Sekerob. Just to clarify a little further: each piece is one potential drug molecule against one variant of HIV protease. There are various numbers of compounds (http://fightaidsathome.scripps.edu/), and in this project there are 270 different mutations of HIV protease.

[Edit 4 times, last edit by we45dfa35gh3476 at Aug 29, 2006 4:41:35 PM]
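A toy sketch of how that compound-times-variant pairing multiplies into work units; the compound names are hypothetical, and only the 270-variant figure comes from this thread:

```python
from itertools import product

# Hypothetical compound library; the 270-variant count is from the thread.
compounds = ["cmpd-A", "cmpd-B", "cmpd-C"]
variants = [f"protease-mut-{i}" for i in range(1, 271)]

# Each work unit ("piece") pairs one compound with one protease variant.
work_units = list(product(compounds, variants))
print(len(work_units))  # 3 compounds x 270 variants = 810 pieces
```

With a real library of thousands of compounds, the cross product runs into the millions of pieces Sekerob mentioned, which is why distributing them across members matters.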
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
> in this project there are 270 different mutations of HIV protease.

Thanks for the welcome, @Sekerob, and thanks for the info (Sekerob & we45dfa35gh3476); now I get the real picture of this project. It would be a great waste of time if many members kept processing the same data whose results are already known. One drug won't fit perfectly on a specific variant but may fit better on another... and validation is done in triplicate... I hope I can convince my friends to join WCG... Thanks again.

[Edit 1 times, last edit by Former Member at Aug 29, 2006 5:33:21 PM]