World Community Grid Forums
Thread Status: Active | Total posts in this thread: 12
----------------------------------------
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
Hello, I have faah5017_1hv1_4phv_00_3 running with a short deadline: 6 am on 15th August (Friday morning). So far it has 47 hours of CPU time completed, is 41% done, and shows 51 hours remaining. It is running at high priority, full throttle, and it is not going to complete in time.
Is there any way to change the reporting deadline? I have a P4 HT running at 3.2 GHz with 2 GB of memory, and two WCG tasks run at the same time. If I were to suspend one and refuse new work, would this speed up the processing of this file enough to perhaps meet the deadline? It seems an awful waste of work otherwise. Many thanks, Mark
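For anyone who wants to repeat the arithmetic behind that conclusion, here is a minimal sketch in Python; the hours-to-deadline figure is an assumed example value, not something stated in the post.

    # Back-of-the-envelope check of whether the work unit can finish in time,
    # assuming progress stays roughly linear in CPU time. The 47 h / 41%
    # figures come from the post above; hours_to_deadline is assumed.
    cpu_hours_so_far = 47.0
    fraction_done = 0.41
    hours_to_deadline = 18.0   # assumed for illustration

    projected_total = cpu_hours_so_far / fraction_done    # about 115 CPU-hours
    remaining = projected_total - cpu_hours_so_far        # about 68 CPU-hours

    print(f"Projected total runtime:  {projected_total:.0f} h")
    print(f"Estimated time remaining: {remaining:.0f} h")
    print("Looks like it will make the deadline."
          if remaining <= hours_to_deadline
          else "Looks like it will miss the deadline on this projection.")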
----------------------------------------
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
This is probably an extreme example of the large work units from a recent batch. It has probably failed to complete on a few other computers already.
Please continue working on it. Even if it misses the deadline, the techs can still accept it (and award credit, of course). You may want to check that this huge work unit won't hit the built-in CPU time limit; knreed posted instructions on how to raise that limit manually. On a hyperthreaded computer, suspending the other task will give a significant improvement. Go for it.
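knreed's original instructions are not quoted in this thread, so as a rough illustration only, and assuming the limit in question is the <rsc_fpops_bound> value the BOINC client stores per work unit in client_state.xml, a sketch like the following can at least show the current bounds. The file path is a guess for a typical Linux install, and the client should be stopped before that file is ever edited by hand.

    # Rough sketch: list each work unit's <rsc_fpops_bound> from the BOINC
    # client_state.xml. Assumption: this is the figure behind the built-in
    # CPU time limit mentioned above; knreed's instructions remain the
    # authoritative procedure. Adjust STATE_FILE to your BOINC data directory.
    import re

    STATE_FILE = "/var/lib/boinc-client/client_state.xml"  # assumed location

    in_workunit = False
    wu_name = None
    with open(STATE_FILE) as f:
        for line in f:
            if "<workunit>" in line:
                in_workunit, wu_name = True, None
            elif "</workunit>" in line:
                in_workunit = False
            elif in_workunit:
                m = re.search(r"<name>(.*?)</name>", line)
                if m and wu_name is None:
                    wu_name = m.group(1)
                m = re.search(r"<rsc_fpops_bound>(.*?)</rsc_fpops_bound>", line)
                if m:
                    print(f"{wu_name}: rsc_fpops_bound = {m.group(1)}")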
----------------------------------------
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
Hi Didactylos,
Thanks for the reply. I'll try suspending all other work. However, the CPU time limit instructions may not work here, as I am running Linux (SuSE); hopefully there is no CPU limit issue. I'll let you know what happens.
Edit: I've checked the Linux system monitor. It still shows this particular work thread taking only 50 percent of the CPU power, i.e. there is no change in the percentage of use whether one or both work units are running. Any thoughts? Thanks, Mark
[Edited 1 time; last edit by Former Member at Aug 14, 2008 1:49:04 PM]
----------------------------------------
JmBoullier
Former Community Advisor | Normandy - France | Joined: Jan 26, 2007 | Post Count: 3716 | Status: Offline
Quoting Mark: "I've checked the Linux system monitor. It still shows this particular work thread taking only 50 percent of the CPU power, i.e. there is no change in the percentage of use whether one or both work units are running. Any thoughts?"
Mark, if Linux reports percentage of use for a P4 HT the same way Windows does, then what you see is normal. What you must actually watch is whether the CPU runtime increases faster than before, and whether the time to completion decreases faster than before at the same time. I would bet that this is the case, and that your WU will complete about twice as fast as expected now that you have suspended the second WU. Cheers. Jean.
----------------------------------------
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
Mark,
Your work unit is actually part of a batch that was sent out with really long run times at the beginning of the month. I've made a note of your work unit. Please continue to run it if you can, and we will do the best we can to provide you with an appropriate amount of credit.
----------------------------------------
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
Mark,
Just out of curiosity, when did this WU arrive in your work queue? Do you know the date and time you got it?
----------------------------------------
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
Hi Barney, Nelsoc, and Jean,
Many thanks for the replies.
(1) Barney, the file arrived mid-day Monday 11th August. It started running at high priority shortly thereafter. However, it then stopped and was replaced by another, much shorter file, also with a 15th August deadline, which started at high priority. It restarted in the evening of 12th August. I am hands-off with these tasks, so I don't know why it started, stopped and restarted.
(2) Nelsoc, I'll let it run itself out. Many thanks. See my comment to Jean: I might be surprised by tomorrow if it gets even close to completing.
(3) Jean, I noted no difference in performance from letting only one file run. I've been at work about 10 hours, and that is about the amount of CPU time completed since this morning. I could be wrong. It now has 27 hours to go, with 70 hours completed and 11 hours to the reporting deadline.
(4) Barney, Nelsoc, and Jean, even if it does not complete on time, I understand that the work is not wasted, because there is built-in redundancy and someone else will provide the results, so all is not lost. It is the case of Dumas' three musketeers: un pour tous, tous pour un (one for all, all for one). ;)
Thanks all for the very kind and helpful responses. I'll let you know tomorrow how close I came to completing this task. Kind regards, Mark
----------------------------------------
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
Mark, the CPU time won't help with showing whether running a single task is helping; it will still show the same CPU time as before.
Hyperthreading is a complicated thing. Lawrence wrote an excellent explanation, but I'll give you the short version, just in case you are interested. Hyperthreading simulates a second CPU by using spare clock cycles on the real CPU. For some general-purpose computing there are a lot of spare clock cycles (usually the CPU is waiting for memory accesses), so the virtual CPU can squeeze some extra performance out of your CPU. For other types of computing the CPU is running at nearly its maximum, and most of the memory it needs is cached on the CPU. In that situation, if you use hyperthreading, neither task will get full use of the CPU.
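For anyone curious about the numbers, a toy model of that trade-off might look like the sketch below; the hyperthreading gain factor is an assumption for illustration, and the real figure depends heavily on the workload.

    # Toy model of hyperthreading for CPU-bound work: two tasks together get
    # a modest boost in total throughput, but each task individually runs
    # much slower than it would with the whole core to itself.
    ht_gain = 0.15   # assumed: HT adds roughly 15% total throughput here

    single_task_speed = 1.0                 # one task with the core to itself
    two_task_total = 1.0 + ht_gain          # combined throughput of both tasks
    per_task_speed = two_task_total / 2     # what each task sees individually

    print(f"Combined throughput with two tasks: {two_task_total:.2f}x a single core")
    print(f"Each task's individual speed:       {per_task_speed:.2f}x (vs 1.00x alone)")
    print(f"Speed-up from suspending one task:  {single_task_speed / per_task_speed:.1f}x")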
----------------------------------------
JmBoullier
Former Community Advisor | Normandy - France | Joined: Jan 26, 2007 | Post Count: 3716 | Status: Offline
Quoting Mark: "(3) Jean, I noted no difference in performance from letting only one file run. I've been at work about 10 hours, and that is about the amount of CPU time completed since this morning. I could be wrong. It now has 27 hours to go, with 70 hours completed and 11 hours to the reporting deadline."
About 10 hours of CPU time in about 10 hours of real time is what I expected you to see when running only one WU. What I think (and which you might not have noticed when you did not have to care about deadlines) is that each of the two WUs running on your HT machine would have been getting only 5 hours of CPU during the same 10 hours, i.e. would have progressed half as fast toward completion. But in some HT configurations this may not always be true. Cheers. Jean.
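The arithmetic behind that expectation, as a small sketch, with Jean's own caveat that some HT configurations do not behave this way:

    # CPU time one task gains over a stretch of wall-clock time if N tasks
    # share the processor equally -- the assumption behind Jean's estimate.
    def cpu_hours_gained(wall_hours, tasks_sharing):
        return wall_hours / tasks_sharing

    print(cpu_hours_gained(10, 1))   # this WU alone: ~10 CPU-hours in 10 hours
    print(cpu_hours_gained(10, 2))   # two WUs sharing: ~5 CPU-hours each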
----------------------------------------
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
Hello, Didactylos and Jean,
I did a very rough calculation by observation when I let the one work unit run by itself, compared with two units running at the same time. On my machine there was virtually no difference in speed. While running two units, I clocked one work unit which said it had 1 hour left, and it completed that work in about one hour. More data would be needed to reach a firm conclusion, but realistically I saw no difference. And I noticed that the monitoring programme reported no change in the CPU usage of the 90-hour work unit after I left it to run by itself. I wonder if there is a time-slice tweak I can do in SuSE Linux, in the way I used to tweak IBM's OS/2 Warp. At any rate, the unit completed its work and it was uploaded. Whether the data will be considered useful I do not know. Many thanks for the informative discussion. Kind regards, Mark