World Community Grid Forums
Thread Status: Active. Total posts in this thread: 315
Former Member
Cruncher Joined: May 22, 2018 Post Count: 0 Status: Offline
The BOINC client's default transfer limit is 2 simultaneous transfers per project (e.g. WCG) and 8 simultaneous transfers in total. I wonder if manually increasing this would help with the congestion. From what I have measured, though, the more files uploading at once, the worse the performance and the user experience. They could smarten BOINC up on the upload front by prioritizing small files, so that the reporting of everything else is not held up.
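For anyone who wants to experiment, those limits can be overridden in cc_config.xml in the BOINC data directory and picked up with the Manager's "Read config files" command (or a client restart). A minimal sketch using the standard client options; the values are only an illustration, not a recommendation:

<cc_config>
   <options>
      <!-- total simultaneous file transfers, default 8 -->
      <max_file_xfers>4</max_file_xfers>
      <!-- simultaneous transfers per project, default 2 -->
      <max_file_xfers_per_project>1</max_file_xfers_per_project>
   </options>
</cc_config>

Dropping the per-project value to 1 is an easy way to test whether fewer concurrent uploads actually move results through faster during the congestion.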
Yarensc
Advanced Cruncher USA Joined: Sep 24, 2011 Post Count: 136 Status: Offline
I've got an odd issue that I can't figure out. Not sure if it's the beta's fault or BOINC's. I was running one unit for this new Beta:
6/18/2019 10:03:37 AM | World Community Grid | Starting task BETA_ARP1_0000370_004_1
Then 6 hours later I got another one (yay!). But it looks like this paused the first unit, since it's stuck at 6 hours of processing time. New unit:
6/18/2019 4:10:22 PM | World Community Grid | Starting task BETA_ARP1_0000526_003_0
I thought there might have been some strange concurrency issue, so I waited. The second one has since finished and the first is still in "Waiting to Run". I had some FAAH2 units which have short deadlines, so I tried suspending everything else; still "Waiting to Run". I clicked "No new work" and waited for everything else to finish. There was then just the beta and 6 HST units (12-core Ryzen system), and it was still "Waiting to Run". Finally I figured "let's turn it off and on again", exited BOINC, and started it again. Then it started running.
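For what it's worth, next time it may be worth asking the client what it thinks is going on before restarting it. A rough sketch with the stock boinccmd tool (the task name and the grep window are just taken from this example):

$ boinccmd --get_tasks | grep -A 15 BETA_ARP1_0000370_004_1
   (check the "state", "scheduler state" and "active_task_state" lines for the stuck unit)
$ boinccmd --task http://www.worldcommunitygrid.org/ BETA_ARP1_0000370_004_1 resume
   (forces a resume in case the task was left suspended)

If it still sits at "Waiting to Run" after that, the event log with the cpu_sched_debug log flag enabled in cc_config.xml usually shows why the scheduler keeps skipping it.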
nanoprobe
Master Cruncher Classified Joined: Aug 29, 2008 Post Count: 2998 Status: Offline
"They could smarten BOINC up on the upload front by prioritizing small files, so that the reporting of everything else is not held up."
IIRC, further development of BOINC ended in 2018. 7.14.2 is the last version.
In 1969 I took an oath to defend and protect the U S Constitution against all enemies, both foreign and Domestic. There was no expiration date.
hchc
Veteran Cruncher USA Joined: Aug 15, 2006 Post Count: 865 Status: Offline
@nanoprobe, check out the BOINC/boinc repository on GitHub: https://github.com/BOINC/boinc/projects
Looks like 7.16 could be the next public release of the client. There's also a BOINC conference in Chicago in July. The repository is fairly active with development work.
Yarensc
Advanced Cruncher USA Joined: Sep 24, 2011 Post Count: 136 Status: Offline
The official support for BOINC, with dedicated people maintaining it, ended (the funding didn't get renewed, I believe), but there are still volunteers maintaining it; that just leads to slower development cycles.
Former Member
Cruncher Joined: May 22, 2018 Post Count: 0 Status: Offline
All that changed was that BOINC became a true community-based development project, although Dr. Anderson still seems to reign supreme over his brainchild (and he did get funding from somewhere, I don't want to know where, IIRC).
Returning to the Beta: I got an ARP1 that is predicted to run 2.5 days. Why, I'm not sure, but maybe BOINC or the servers are confused by the short SCC1 and long ARP1 betas running at the same time. The credit should be glorious for a change. [Edit 2 times, last edit by Former Member at Jun 20, 2019 1:30:32 PM]
nanoprobe
Master Cruncher Classified Joined: Aug 29, 2008 Post Count: 2998 Status: Offline
"@nanoprobe, check out the BOINC/boinc repository on GitHub: https://github.com/BOINC/boinc/projects Looks like 7.16 could be the next public release of the client. There's also a BOINC conference in Chicago in July. The repository is fairly active with development work."
There is also a download archive on the berkeley.edu website that shows a list of every version ever released. 7.14.2 is the latest one shown there. https://boinc.berkeley.edu/dl/
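To see which build a given machine is actually running, the command-line tools report it directly (standard boinc/boinccmd options):

$ boinccmd --client_version
$ boinc --version

Either should print something like 7.14.2 on a machine using the current public release.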
In 1969 I took an oath to defend and protect the U S Constitution against all enemies, both foreign and Domestic. There was no expiration date.
Former Member
Cruncher Joined: May 22, 2018 Post Count: 0 Status: Offline
Is doubling the checkpoints to every 3 simulated hours feasible?

Very few people seem to understand that the data analyzed is not from a 6-hour period, but is the data of one fixed time, e.g. 06:00 UTC or 12:00 UTC, and not the period from 06 to 12. So checkpointing more often will be very hard, or one would have to run this on a virtual machine where one could take snapshots more often during the analysis.

I don't believe that's true, based on what I was seeing in the output while the WU was running for the 3 domains. It is simulating a 48-hour period but checkpointing every 6 hours. I don't have one running right now to pull the output.

I did not understand the CP answer, but earlier it was explained by one of the techs that, let's say, the weather/climate data analysis standard is that measurements are offered as data points taken every 6 hours. Besides, I'm not looking for 1 GB of data being intermediately stored in the hope that the model can be reloaded without bytes going the wrong way. My machine is already sweating with uploads limited to 1 at a time. The last upload took so long that 4 other task results were backed up by the time it finished.

Going back and re-reading the post, I can see where CP may have been referring to the input data versus the simulation period. If that is the case, I don't have any reason to dispute his claim, as I haven't looked at the raw input data. If true, that seems strange to me, as upper-air data is usually collected at 12-hour intervals whereas surface data is normally collected at 1-hour periods (and sometimes more often). I send my data to the ingest system in Boulder, Colorado every 15 minutes. It seems like 6-hour data periods would make the model less accurate. It may be a trade-off between jobs running for weeks vs. days and not necessarily needing pinpoint accuracy.
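One thing worth separating out here: the only client-side knob is the "request tasks to checkpoint at most every N seconds" preference, and that is wall-clock time, not simulated hours. Whether ARP1 can actually offer a checkpoint more often than every 6 simulated hours is entirely up to the application. A minimal global_prefs_override.xml sketch, assuming the standard BOINC preference tag (the value is only an example):

<global_preferences>
   <!-- request that applications checkpoint at most once every 600 wall-clock seconds -->
   <disk_interval>600</disk_interval>
</global_preferences>

So changing that number cannot create extra checkpoints in ARP1; it can only permit or throttle the ones the science application already offers.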
Former Member
Cruncher Joined: May 22, 2018 Post Count: 0 Status: Offline
I have RHEL 6.10 running on a 64-bit machine, so I get mostly x86_64 programs to run. However, my ClimatePrediction stuff is all x86 (although x86_64 programs will be coming out in the future). For WCG stuff, I have some programs still lying around and, as far as I know, they have all worked. Here are a few:

wcgrid_beta11_7.00_i686-pc-linux-gnu: ELF 32-bit LSB executable, Intel 80386, version 1 (SYSV), statically linked, for GNU/Linux 2.2.5, stripped
wcgrid_beta11_qchemB_prod_linux.x86.7.00: ELF 32-bit LSB executable, Intel 80386, version 1 (SYSV), statically linked, for GNU/Linux 2.2.5, stripped
wcgrid_beta17_7.36_i686-pc-linux-gnu: ELF 32-bit LSB executable, Intel 80386, version 1 (GNU/Linux), statically linked, for GNU/Linux 2.6.18, stripped
wcgrid_beta17_gfx_7.43_x86_64-pc-linux-gnu: ELF 64-bit LSB executable, x86-64, version 1 (SYSV), dynamically linked (uses shared libs), for GNU/Linux 2.6.32, stripped
wcgrid_beta17_map_7.43_x86_64-pc-linux-gnu: ELF 64-bit LSB executable, x86-64, version 1 (GNU/Linux), statically linked, for GNU/Linux 2.6.32, stripped
wcgrid_beta19_7.21_x86_64-pc-linux-gnu: ELF 64-bit LSB executable, x86-64, version 1 (GNU/Linux), statically linked, for GNU/Linux 2.6.18, stripped

Note that the 32-bit ones are statically linked, so it does not matter what libraries my machine has. It happens that I have the compatibility library for the C++ stuff, but it is not always used. On the other hand, it appears that compatibility libraries are no longer required. Here is a ClimatePrediction example:

$ ldd hadam4_8.09_i686-pc-linux-gnu
	linux-gate.so.1 => (0x00ba9000)
	libpthread.so.0 => /lib/libpthread.so.0 (0x007c3000)
	libdl.so.2 => /lib/libdl.so.2 (0x007eb000)
	libstdc++.so.6 => /usr/lib/libstdc++.so.6 (0x00c44000)
	libm.so.6 => /lib/libm.so.6 (0x00aaf000)
	libgcc_s.so.1 => /lib/libgcc_s.so.1 (0x0013b000)
	libc.so.6 => /lib/libc.so.6 (0x0062a000)
	/lib/ld-linux.so.2 (0x56617000)

$ ls -l /usr/lib/libstdc++.so.6
Jun 19 2018 /usr/lib/libstdc++.so.6 -> libstdc++.so.6.0.13
$ rpm -qf /usr/lib/libstdc++.so.6
libstdc++-4.4.7-23.el6.i686
$ rpm -qf /usr/lib/libstdc++.so.6.0.13
libstdc++-4.4.7-23.el6.i686

On my machine, the compatibility library is:

$ rpm -qf libstdc++-3-libc6.2-2-2.10.0.so
compat-libstdc++-296-2.96-144.el6.i686

And it does not seem to be needed anymore, at least for ClimatePrediction programs.

The support isn't going away until October of this year, but I may go ahead and remove the 32-bit architecture from my Ubuntu systems and see what fails. It was added way back in the CEP days as that was the only way to get those WUs. It may not be needed anymore.
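In case anyone else wants to try the same experiment, this is roughly the sequence for dropping 32-bit support on an Ubuntu box (standard dpkg/apt commands; review the package list before purging, since it shows exactly what would break):

$ dpkg --print-foreign-architectures      # prints i386 if the 32-bit architecture is enabled
$ dpkg -l | grep :i386                    # list the installed 32-bit packages first
$ sudo apt-get purge ".*:i386"            # remove all installed i386 packages
$ sudo dpkg --remove-architecture i386    # dpkg refuses this while any :i386 packages remain
$ sudo apt-get update

Note, per the file output above, that several of the 32-bit WCG binaries are statically linked, so they may keep running even with no i386 libraries installed.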
Jim1348
Veteran Cruncher USA Joined: Jul 13, 2009 Post Count: 1066 Status: Offline
"The support isn't going away until October of this year, but I may go ahead and remove the 32-bit architecture from my Ubuntu systems and see what fails."
I was thinking of that too. How do you remove it (Ubuntu 16.04/18.04)?