World Community Grid Forums
TonyEllis
Senior Cruncher, Australia | Joined: Jul 9, 2008 | Post Count: 286
Thanks adriverhoef and twilyth
Updated from wcgresults version 40-5 to 47-5 and modified my command lines... more results shown.
----------------------------------------
Run Time Stats https://grassmere-productions.no-ip.biz/
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0
> Hopefully I can return part of the favor by noting that setting &limit=0 will give you all of the current wu's.

Wow - what a great tip! Thanks for sharing that. I wonder why that's not documented anywhere official? I just finished writing a script that downloads data using limit and offset! I too am interested in how folks are putting their data into SQL databases.

Best,
-mark
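For anyone following along, here is a rough sketch of both approaches in bash with curl and jq. This is an illustration, not official usage: the API base URL, the `code` verification parameter, and the `ResultsReturned` counter field are assumptions from memory, while the `limit`/`offset` parameters and the `limit=0` trick are the ones discussed in this thread.

```bash
#!/usr/bin/env bash
# Sketch: downloading results page by page with limit/offset,
# versus a single call with limit=0.
set -euo pipefail

MEMBER="your_member_name"        # hypothetical placeholder
CODE="your_verification_code"    # hypothetical placeholder
BASE="https://www.worldcommunitygrid.org/api/members/${MEMBER}/results"

# Paged approach: fetch 250 records at a time until a page comes back empty.
offset=0
: > results_pages.json
while :; do
    page=$(curl -s "${BASE}?code=${CODE}&format=json&limit=250&offset=${offset}")
    count=$(printf '%s' "$page" | jq '.ResultsStatus.ResultsReturned')
    if [ "$count" -eq 0 ]; then
        break
    fi
    printf '%s\n' "$page" >> results_pages.json
    offset=$((offset + count))
done

# One-shot approach, per the tip above: limit=0 returns everything.
curl -s "${BASE}?code=${CODE}&format=json&limit=0" > results_all.json
```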
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0
> Hopefully I can return part of the favor by noting that setting &limit=0 will give you all of the current wu's.

Even if there are more than 500 records on the Result Status page? That would be huge for big crunchers if the call restriction was removed. TIA
twilyth
Master Cruncher, US | Joined: Mar 30, 2007 | Post Count: 2130
lavaflow - not sure if this will answer your question. Screen grab from listing current results in Firefox, using a plugin to format the XML:

[Screenshot: formatted XML listing of current results]
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0
It looks like there is no longer a 500 record limit.
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0
> lavaflow - not sure if this will answer your question. Screen grab from listing current results in Firefox, using a plugin to format the XML

That's indeed super huge. In effect, there is no longer a risk of missing results when cycling through 250-500-record pages in a live environment. It would be good if the techs could confirm this formally, so it doesn't turn out to be a programmatic slip that causes confusion and disappointment later, when the back-stop is put on again.
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0
> I'm thinking about doing a project to import the API data for my devices into a MYSQL database. I haven't worked with a db in many years so I may be in over my head but it will be interesting to see how far I can get.

Hi twilyth,

When I saw your post at the end of March, I was starting almost the same project! To that end, I've put together a small script that uses the WCG API to download current workunits, reformats that output to CSV, and then loads it into a MySQL database. The code can be found on GitHub here: https://github.com/msellan/wcg_bash

The script contains a function to create a table with appropriate schema definitions. So far it's been running for a week without issue. I'm happy to answer questions and am interested in feedback or suggestions from anyone.

Best,
-mark

[Edit 3 times, last edit by Former Member at Apr 13, 2019 9:45:48 PM]
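Not the actual wcg_bash script (follow the GitHub link above for that), but a minimal sketch of the same download-to-CSV-to-MySQL pipeline, to make the flow concrete. The table name, column list, and all JSON field names other than ReceivedTime are illustrative assumptions:

```bash
#!/usr/bin/env bash
# Sketch of a WCG download -> CSV -> MySQL pipeline.
set -euo pipefail

MEMBER="your_member_name"        # hypothetical placeholder
CODE="your_verification_code"    # hypothetical placeholder
CSV="results.csv"

# limit=0 fetches all current results in one call (see earlier in the thread).
# ReceivedTime is absent until a workunit has been reported back, so an
# epoch placeholder is substituted (mirroring the approach discussed below).
curl -s "https://www.worldcommunitygrid.org/api/members/${MEMBER}/results?code=${CODE}&format=json&limit=0" |
  jq -r '.ResultsStatus.Results[] |
         [.Name, .DeviceName, .SentTime,
          (.ReceivedTime // "1970-01-01T00:00:00")] | @csv' > "$CSV"

# Load the CSV into a pre-created table with matching columns.
# (Requires local_infile to be enabled on both client and server.)
mysql --local-infile=1 wcg -e "
  LOAD DATA LOCAL INFILE '${CSV}' INTO TABLE results
  FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
  LINES TERMINATED BY '\n';"
```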
adriverhoef
Master Cruncher, The Netherlands | Joined: Apr 3, 2009 | Post Count: 2346
That's a very useful script, Mark, thank you very much!
I thought I should do the same and integrate some SQL statements into my own 'wcgresults' script, so right now I'm testing my efforts. There's one question popping up for me when looking through your script, considering this line:

    printf "\"1970-01-01T00:00:00\"," >> "${output_file}"

Why is it there, could you enlighten me?
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0
Hi adriverhoef,
Thanks for taking a look at the script!

> There's one question popping up for me when looking through your script, considering this line:
>
>     printf "\"1970-01-01T00:00:00\"," >> "${output_file}"
>
> Why is it there, could you enlighten me?

Sure. When I first downloaded sample workunits from the API, I thought there were 18 fields. But later, when I downloaded a full set of workunits, I noticed that some records had an extra field, "ReceivedTime", which is not sent for a workunit until after the work has completed and it has been 'received' by the server. The field doesn't show up in the JSON for a record until it has a timestamp, so a complete record is really made of 19 fields.

So I decided to insert the "ReceivedTime" field as a placeholder, even for records that don't have it assigned yet. I chose to fill it with the Unix epoch date rather than NULL, as it stands out and lets me query all workunits that haven't been received, knowing that later UPDATE statements will overwrite the epoch date once the workunit is processed.

Hope that helps!

Best,
-mark
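To make the epoch-placeholder idea concrete, here is a small sketch of the kind of queries it enables. The wcg database and the results table with its name, device_name, and received_time columns are illustrative assumptions, not Mark's actual schema:

```bash
#!/usr/bin/env bash
# Workunits that still carry the epoch placeholder have not yet
# been received by the server, so they are easy to select.
mysql wcg -e "
  SELECT name, device_name
  FROM results
  WHERE received_time = '1970-01-01 00:00:00';"

# A later import run can then overwrite the placeholder once the
# workunit has been processed (timestamp and name are made up).
mysql wcg -e "
  UPDATE results
  SET received_time = '2019-04-10 12:34:56'
  WHERE name = 'hypothetical_wu_name';"
```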
adriverhoef
Master Cruncher, The Netherlands | Joined: Apr 3, 2009 | Post Count: 2346
Ah yes, that makes sense now. Thanks Mark!