Thread Status: Active · Total posts in this thread: 363 · Pages: 37
This topic has been viewed 586,579 times and has 362 replies.
OldChap
Veteran Cruncher
UK
Joined: Jun 5, 2009
Post Count: 978
Status: Offline
Re: HCC GPU APP HAS LAUNCHED.

Some results data that is very light on detail, but which may give users a feel for the direction this is going in regard to card efficiency:



So far this takes no account of CPU performance, but I will endeavor to add info as I get it.
----------------------------------------

----------------------------------------
[Edit 1 times, last edit by OldChap at Oct 13, 2012 2:57:55 PM]
[Oct 13, 2012 2:31:01 PM]
Crystal Pellet
Veteran Cruncher
Joined: May 21, 2008
Post Count: 1316
Status: Offline
Re: HCC GPU APP HAS LAUNCHED.

I would be curious to know whether anyone has seen an improvement in the duration estimates over the past 6-8 hours, now that these statistics should be much more accurate. This should hopefully address some of the issues with non-GPU workunits being very badly estimated and with excessive downloads occurring.

The estimates for my 2 machines without an app_info are much better now.
That's putting it mildly: they're fantastic now, within 15%.

Good job, Kevin!
----------------------------------------

[Oct 13, 2012 2:35:44 PM]
Crystal Pellet
Veteran Cruncher
Joined: May 21, 2008
Post Count: 1316
Status: Offline
Re: HCC GPU APP HAS LAUNCHED.

Some results data that is very light on detail, but which may give users a feel for the direction this is going in regard to card efficiency:

..image..

So far this takes no account of CPU performance, but I will endeavor to add info as I get it.

Very nice, OldChap. If you want to add one more column about efficiency:

How about Watts/WU?

Edit: e.g. my i7-2600 pulls 120 W from the socket with 8 cores running at 100% (CPU tasks only).
When one task is replaced by HCC GPU on my ATI 7770, it needs 30 W more.
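A Watts-per-WU column could be derived from the two numbers people are already reporting: the extra wall-socket draw and the throughput. A minimal sketch (the function name is mine; the 30 W and 40 WU/hour figures are just the examples mentioned in this thread):

```python
def watt_hours_per_wu(extra_watts, wu_per_hour):
    """Extra energy drawn from the wall per GPU work unit, in watt-hours."""
    return extra_watts / wu_per_hour

# Example: 30 W extra draw while crunching ~40 work units per hour
print(watt_hours_per_wu(30, 40))  # 0.75 Wh per work unit
```

A lower number means a more efficient card, independent of how long any single work unit takes.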
----------------------------------------

----------------------------------------
[Edit 2 times, last edit by Crystal Pellet at Oct 13, 2012 3:01:43 PM]
[Oct 13, 2012 2:45:05 PM]
OldChap
Veteran Cruncher
UK
Joined: Jun 5, 2009
Post Count: 978
Status: Offline
Re: HCC GPU APP HAS LAUNCHED.

Anyone who wants to add to this info: all I need is the card type and the total run time in minutes and seconds. Please add CPU type and speed as well.

Watts??? Anyone who uses a Kill A Watt and can provide more details?

Do you have thoughts on how best to show that, Crystal Pellet?
----------------------------------------

[Oct 13, 2012 2:53:39 PM]
mmstick
Senior Cruncher
Joined: Aug 19, 2010
Post Count: 151
Status: Offline
Re: HCC GPU APP HAS LAUNCHED.

Watts alone isn't a good measure, due to the discrepancy between machines. It's not just about which CPU and GPU you use, but also how efficient the power supply and motherboard are at powering those devices. If you want power consumption, you should simply use the official numbers.

Running a 7950 overclocked to 1100 MHz core and 1575 MHz memory with a 4 GHz FX-8120: ~40 work units per hour while also encoding x264. Not 100% confirmed that this is the max speed, since I am x264 encoding at the same time (with some extreme settings: High 10 profile at 10-bit depth, placebo preset, subme 11, etc.).

Update: After setting x264 to use four cores, and having the 7950 process four work units with one core each at 85% max GPU utilization, each work unit completes in 46.25 seconds (3 minutes 5 seconds divided by four work units), i.e. about 78 work units per hour. The GPU is only used half the time, so I will now go up to 8 work units at the same time.

Update 2: With one core doing x264 and the other 7 processing 8 GPU work units, GPU usage is 92% for 3 minutes, followed by 1.5 minutes idle waiting for the CPU. Overall it takes about 290 seconds to complete 8 work units, which is one work unit every 36.25 seconds, or 99.31 work units per hour. I'm certain that if you pause and resume a couple of work units to stagger them, you can raise the potential maximum by keeping the GPU busy, since while the GPU is running only half of a single core is used per work unit on my processor.
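The batch arithmetic above can be sketched as a small helper (the function name is mine; the figures are the ones from this post):

```python
def wu_per_hour(batch_seconds, batch_size):
    """Sustained work units per hour, given the wall time for one batch."""
    return 3600.0 * batch_size / batch_seconds

# 4 WUs in 3 min 5 s (185 s), then 8 WUs in ~290 s
print(round(wu_per_hour(185, 4), 2))  # ~77.84 WU/hour
print(round(wu_per_hour(290, 8), 2))  # 99.31 WU/hour
```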

Sample app_info.xml
Required Files (kernel + ATI APP)
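For orientation, a BOINC app_info.xml for running several GPU tasks per card generally follows the shape below. This is only a sketch, not the actual HCC files linked above: the app name, file name, version number, and plan class here are illustrative placeholders. The key idea is that a fractional `<count>` under `<coproc>` is what lets multiple work units share one GPU.

```xml
<app_info>
    <app>
        <name>hcc1</name>
    </app>
    <!-- One <file_info> per required file (executable, GPU kernel, libraries) -->
    <file_info>
        <name>hcc_ati_example.exe</name>
        <executable/>
    </file_info>
    <app_version>
        <app_name>hcc1</app_name>
        <version_num>700</version_num>
        <avg_ncpus>0.5</avg_ncpus>
        <plan_class>ati_example</plan_class>
        <coproc>
            <type>ATI</type>
            <count>0.25</count> <!-- 0.25 of a GPU each: four work units share one card -->
        </coproc>
        <file_ref>
            <file_name>hcc_ati_example.exe</file_name>
            <main_program/>
        </file_ref>
    </app_version>
</app_info>
```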
----------------------------------------
[Edit 12 times, last edit by mmstick at Oct 13, 2012 5:17:32 PM]
[Oct 13, 2012 2:59:14 PM]
Crystal Pellet
Veteran Cruncher
Joined: May 21, 2008
Post Count: 1316
Status: Offline
Re: HCC GPU APP HAS LAUNCHED.

Watts??? Anyone who uses a Kill A Watt and can provide more details?

Do you have thoughts on how best to show that, Crystal Pellet?

I agree with mmstick that the total watts a machine pulls isn't a good figure; what matters is how many extra watts the machine pulls when running HCC on the GPU.

This should be measured carefully and over a longer period, because the GPU idles half the time due to the CPU-only start and end phases of each task.

In the edit to my previous post I mentioned 30 W, but that is while the GPU is really busy.
----------------------------------------

[Oct 13, 2012 3:09:52 PM]
captainjack
Advanced Cruncher
Joined: Apr 14, 2008
Post Count: 144
Status: Offline
Re: HCC GPU APP HAS LAUNCHED.

Kevin,

Yes, my run time estimates are much more accurate now.

Thanks for all your effort.

CaptainJack
[Oct 13, 2012 3:38:23 PM]
LUFTY
Cruncher
Joined: Apr 27, 2007
Post Count: 25
Status: Offline
Re: HCC GPU APP HAS LAUNCHED.

Hi nanoprobe

Many thanks for your config file.

On my i7 with an ATI 7970 it takes about 2 minutes to complete each work unit running 4 together, which works out at 30 seconds per work unit.

Fantastic and thanks once again


Lufty
[Oct 13, 2012 3:40:58 PM]
Former Member
Cruncher
Joined: May 22, 2018
Post Count: 0
Status: Offline
Re: HCC GPU APP HAS LAUNCHED.

On my Q9550 and an ATI 6950 at default clocks, it takes 3 minutes and 30 seconds.

Great job!
----------------------------------------
[Edit 1 times, last edit by Former Member at Oct 14, 2012 11:55:17 AM]
[Oct 13, 2012 8:29:20 PM]
Bearcat
Master Cruncher
USA
Joined: Jan 6, 2007
Post Count: 2803
Status: Offline
Re: HCC GPU APP HAS LAUNCHED.

I don't think you can trust what BOINC shows for crunch time. I monitored a few and saw that after the completion time hit zero, it stayed on that WU a little longer. BOINC shows 3 min 30 sec (varies per WU), but when I add up the GPU time and the CPU time, it's around 5 minutes.
----------------------------------------
Crunching for humanity since 2007!

[Oct 13, 2012 9:00:23 PM]