pravoslavnie
Cruncher
Joined: Nov 8, 2007
Post Count: 6
Status: Offline
hypothetical - using a supercomputer vs. current methodology

What if I "rented" Titan (http://top500.org/system/177975) to crunch? (let's just say it would be possible)

There would have to be some way to run the BOINC software on it as well, of course.

Would that be faster or slower than the current system with distributed computing?

Thank you! :)
[Feb 10, 2013 8:27:31 PM]
BladeD
Ace Cruncher
USA
Joined: Nov 17, 2004
Post Count: 28976
Status: Offline
Re: hypothetical - using a supercomputer vs. current methodology

Or 'borrow' this one...


[Feb 10, 2013 9:01:15 PM]
Former Member
Cruncher
Joined: May 22, 2018
Post Count: 0
Status: Offline
Re: hypothetical - using a supercomputer vs. current methodology

Hello pravoslavnie,
First, look at Statistics at https://secure.worldcommunitygrid.org/stat/viewGlobal.do. Right now, Yesterday shows Points Generated 911,550,164. You would have to dig into the explanations in Help to find that, for historical reasons, 7 WCG points = 1 BOINC credit (WCG did not use BOINC at the start). So yesterday we computed 130,221,452 BOINC credits. The BOINC site defines a credit at https://boinc.berkeley.edu/wiki/Computation_credit: a machine sustaining 1 GFlop/s earns 200 credits per day. So yesterday WCG computed for one day on the equivalent of a 651,107.26 GFlop/s computer, i.e. roughly a 651 TFlop/s machine. That is a bit less than two-thirds of a petaflop/s, sustained for a whole day.
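
A minimal Python sketch of that arithmetic, using the figures quoted above (the 7 points = 1 credit and 200-credits-per-GFlop/s-day ratios come from the WCG Help pages and the BOINC wiki):

# Convert one day of WCG points into an equivalent sustained TFlop/s figure.
wcg_points = 911_550_164            # "Points Generated" yesterday
boinc_credits = wcg_points / 7      # 7 WCG points = 1 BOINC credit
gflops = boinc_credits / 200        # a 1 GFlop/s machine earns 200 credits per day
print(f"{boinc_credits:,.0f} credits ~= {gflops / 1000:,.0f} TFlop/s sustained for a day")
# -> 130,221,452 credits ~= 651 TFlop/s sustained for a day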

Hope I did not make an arithmetic error. That should help you with your comparisons.

:D
Lawrence
[Feb 10, 2013 10:45:46 PM]
Hypernova
Master Cruncher
Audaces Fortuna Juvat ! Vaud - Switzerland
Joined: Dec 16, 2008
Post Count: 1908
Status: Offline
Re: hypothetical - using a supercomputer vs. current methodology

(quoting pravoslavnie's original question above)

Supposing that special software were developed to use Titan optimally, and that enough work units (WUs) could be supplied, it would be at least 20 times as fast as the whole WCG grid.
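
A rough check of that factor (a sketch, assuming Titan's ~17,590 TFlop/s Linpack Rmax and the ~651 TFlop/s WCG estimate above):

titan_tflops = 17590   # Titan's Nov 2012 TOP500 Rmax (Linpack) - assumed figure
wcg_tflops = 651       # WCG's one-day equivalent estimated earlier in the thread
print(f"Titan / WCG ~= {titan_tflops / wcg_tflops:.0f}x")   # -> ~27x, so "at least 20 times" holds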
[Feb 10, 2013 11:16:00 PM]
rilian
Veteran Cruncher
Ukraine - we rule!
Joined: Jun 17, 2007
Post Count: 1460
Status: Offline
Re: hypothetical - using a supercomputer vs. current methodology

Let's kickstart renting it for one day? :)
[Feb 11, 2013 4:02:41 PM]
pravoslavnie
Cruncher
Joined: Nov 8, 2007
Post Count: 6
Status: Offline
Re: hypothetical - using a supercomputer vs. current methodology

(quoting Lawrence's calculation above)


Thank you.

Ok so Titan is a ~17,000 TFlop/s computer.
WCG is a ~650 TFlop/s computer.

That is pretty amazing; I didn't know WCG was that powerful. "Leaps and bounds" really wouldn't be made if we rented Titan even for a week; we wouldn't see that much time saved.

Maybe if we dedicated those 17k TFlop/s to one particular cause (e.g., cancer or AIDS research), I could see a big difference in time saved.

Anyway, thanks for the math. :)
[Feb 11, 2013 8:43:49 PM]
Former Member
Cruncher
Joined: May 22, 2018
Post Count: 0
Status: Offline
Re: hypothetical - using a supercomputer vs. current methodology

I've sometimes thought about this too. But let's put it this way: what if they donated 4% of Titan's computing power to running WCG? That could roughly double the amount of work done.
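
For scale, a quick check of that 4% figure (a sketch, again assuming Titan at ~17,590 TFlop/s and WCG at ~651 TFlop/s as estimated above):

titan_tflops = 17590                  # Titan's TOP500 Rmax (Linpack) - assumed figure
wcg_tflops = 651                      # WCG's one-day equivalent from earlier in the thread
donated_tflops = 0.04 * titan_tflops  # a 4% share of Titan
print(f"4% of Titan ~= {donated_tflops:.0f} TFlop/s vs WCG ~= {wcg_tflops} TFlop/s")
# -> ~704 TFlop/s, i.e. roughly another whole WCG grid, so output would indeed about double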

Good points everyone! Another aspect is efficiency, which is a little harder to compare.
[Feb 14, 2013 8:59:45 PM]
Hypernova
Master Cruncher
Audaces Fortuna Juvat ! Vaud - Switzerland
Joined: Dec 16, 2008
Post Count: 1908
Status: Offline
Re: hypothetical - using a supercomputer vs. current methodology

(quoting Lawrence's calculation and pravoslavnie's reply above)


Do not forget one small, very small detail: the WCG grid is, to put it simply, free of charge to the scientists. Renting Titan for WCG's use would cost millions of dollars.

This is the beauty of grid computing. You get very respectable processing power at "no" cost; the cost is shared by hundreds of thousands of crunchers who each donate essentially free CPU cycles (idle time, screensaver time, etc.) to the grid at minimal individual cost.
[Feb 15, 2013 8:08:30 AM]
Former Member
Cruncher
Joined: May 22, 2018
Post Count: 0
Status: Offline
Re: hypothetical - using a supercomputer vs. current methodology

Another way to help the cause would be to donate some time using Amazon's EC2 as your cruncher. As an experiment, I just set up BOINC on one of their spot instances, which are significantly cheaper than dedicated instances. The cost is $0.018 per instance-hour for a dual-core 2.17 GHz-equivalent machine, so it comes out to $157.68/year plus any add-ons you choose, such as increased storage. There is a tutorial here:
http://www.boinc-wiki.info/Installing_The_BOI...are_on_Amazon_EC2_(Linux)
I'll report back when I have some hard numbers on points and results per day. I'd be interested to hear whether anyone else here has tried this as a way to replace their full-time crunching. For those of us without a lot of room to spare for a server farm, this could be an option.
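
A quick cost sketch using the spot price quoted above (actual spot prices fluctuate, and storage or bandwidth add-ons are extra):

price_per_hour = 0.018          # quoted spot price per instance-hour
hours_per_year = 24 * 365
print(f"~${price_per_hour * hours_per_year:.2f}/year")   # -> ~$157.68/year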
[Feb 19, 2013 8:17:17 PM]
branjo
Master Cruncher
Slovakia
Joined: Jun 29, 2012
Post Count: 1892
Status: Offline
Re: hypothetical - using a supercomputer vs. current methodology

Signed up and am testing the "Free tier" (Micro) instance. If it somehow works, I will also try some of the "High-CPU Spot Instances".

NI!
----------------------------------------

Crunching@Home since January 13 2000. Shrubbing@Home since January 5 2006

[Feb 20, 2013 8:54:27 AM]