World Community Grid Forums

Thread Status: Active | Total posts in this thread: 28

Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
When the application starts up, it allocates 300 MB of memory. This can put a strain on the swap space of smaller or older machines. The United Devices application allocated much less memory when it was running LigandFit.

Eric-Montreal
Cruncher | Canada | Joined: Nov 16, 2004 | Post Count: 34 | Status: Offline
Same here.

Athlon 2500XP, 512 MB RAM

WCGrid_Rosetta process:
  Memory usage: 6,332 KB
  Peak memory usage: 58,696 KB
  Virtual memory size: 297,336 KB

UD.exe (the interface):
  Memory usage: 11,196 KB
  Peak memory usage: 16,012 KB
  Virtual memory size: 9,828 KB

Clearly, the software will be a drain on any machine with less than 256 MB of RAM. Reducing the memory footprint should be a priority, or most users will experience slowdowns and simply give up.

Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
So... when we wrote Rosetta, we were underpaid grad students, so we wrote it to take advantage of all the killer computers we had. We never thought 500 MB was big. IBM has shrunk the declared memory to 300 MB, but Rosetta is a big code base that does a lot more than just fold proteins, so there was a limit to how far we could shrink the footprint. I think there will be some fixes along these lines soon, so the client could shrink to potentially half of what it is now.

Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
I've actually been running it on a few of the used computers we have lying around at the shop, and even some Pentium 166s with 64 MB of EDO RAM seemed to handle it fine. Sure, it takes longer to load, but once it's done it kind of hides away.

Bonta768
Cruncher | Joined: Nov 17, 2004 | Post Count: 6 | Status: Offline
On my system, WCGrid_Rosetta.exe currently has 326 MB of virtual memory allocated, but is only using 5 MB of physical RAM at the moment.

Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
I noticed this too. As a professional software engineer, I don't know much about protein folding, but this seems like a stupidly large amount of memory. You can't say it has anything to do with the size of the code base, because that couldn't possibly account for more than about 2 MB!

It is just another thing that will put people off using it. It has to have virtually no effect on the user's PC for them to consider using it.

Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
Perhaps they could make this one of the options set by the user. I would personally prefer the program to use more physical memory and less virtual memory, because I have plenty of physical RAM and would rather avoid the extra wear on my hard drive. I would love to see an option that loads the program into memory and then uses nothing but physical RAM, so it stops constantly accessing the hard drive and shortening the drive's life.
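
For the curious: on Windows, the kind of "stay in physical RAM" option being asked for could be sketched with the Win32 VirtualLock API. This is only an illustration; nothing in this thread says the WCG agent actually works this way, and the buffer size and quota margins below are invented for the example.

```c
/*
 * Sketch of a hypothetical "keep it in physical RAM" option using the
 * Win32 VirtualLock API. Illustration only -- the WCG agent is not
 * known to do this; buffer size and quota margins are invented.
 */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    const SIZE_T size = 64u * 1024 * 1024;   /* 64 MB work buffer (example) */

    /* Commit the buffer in virtual address space. */
    char *buf = VirtualAlloc(NULL, size, MEM_COMMIT | MEM_RESERVE,
                             PAGE_READWRITE);
    if (buf == NULL)
        return 1;

    /* Raise the process working-set quota so the lock below can succeed. */
    SetProcessWorkingSetSize(GetCurrentProcess(),
                             size + (16u << 20),   /* minimum */
                             size + (32u << 20));  /* maximum */

    /* Pin the pages: while locked, they stay resident in physical RAM
       and are never written out to the swap file. */
    if (!VirtualLock(buf, size)) {
        printf("VirtualLock failed, error %lu\n", GetLastError());
        return 1;
    }

    /* ... crunch work units entirely in RAM here ... */

    VirtualUnlock(buf, size);
    VirtualFree(buf, 0, MEM_RELEASE);
    return 0;
}
```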

Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
Ditto!
----------------------------------------
300 MB increase in the swap file. It's the LARGEST VM usage in the list, much greater than Photoshop (which is typically only 50 MB with no document open, though it did go to 100 MB once) or any other program. I'm using WinXP. In the Task Manager it shows:
  Mem usage: 5K (no problem there)
  VM Size: 298,000K
[Edit 2 times, last edit by Former Member at Dec 3, 2004 7:05:19 AM]

Alther
Former World Community Grid Tech | United States of America | Joined: Sep 30, 2004 | Post Count: 414 | Status: Offline
Quoting the earlier post:

  "I noticed this too. As a professional software engineer, I don't know much about protein folding, but this seems like a stupidly large amount of memory. You can't say it has anything to do with the 'code base' size, because this couldn't possibly be more than about 2 MB! It is just another thing that will put people off using it. It has to have virtually no effect on the user's PC for them to consider using it."

How can you make that statement "as a professional software engineer" when you admittedly don't even know the problem space? The code base is much larger than your 2 MB guess. We use the Rosetta application, which was developed by the Baker Lab at the University of Washington. Rosetta is designed to do a lot more than what we are currently using it for, hence the large size. ISB themselves run Rosetta on a very speedy cluster. Rosetta does a LOT.

A note about the memory it uses: yes, it allocates ~300 MB of RAM, but it only touches ~70 MB. Initially this causes quite a bit of strain on the computer if you don't have enough RAM. After a while, Windows realizes that ~230 MB hasn't been touched and permanently swaps it out, freeing up physical RAM.

When we first got the Rosetta code from ISB, it was taking up ~500 MB. We got it down to ~300 MB by the time deployment came around, and I am still working on reducing its footprint. You have to realize we are making modifications to a very large and complex program, and the changes need to be made carefully to ensure the results stay correct.

You'll be happy to know that we've reduced the footprint by a substantial amount. This updated version should be deployed on the servers sometime this week. The new RAM specs are 200 MB of virtual memory and 25 MB of physical RAM. There is still some work to do, but that's a big step in the right direction and should help quite a few people out.

Rick Alther
Former World Community Grid Developer
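
To make the "allocated but untouched" behavior Rick describes concrete: a committed page only consumes physical RAM once it is first written. Here is a minimal Win32 sketch of that effect; it is an illustration, not WCG code, the 300 MB / 70 MB figures are simply borrowed from the post above, and it must be linked against psapi.lib.

```c
/*
 * Demonstrates commit vs. working set: committing 300 MB barely moves
 * the working set; only the pages actually touched get physical RAM.
 * Illustration only -- not WCG code. Link with psapi.lib.
 */
#include <windows.h>
#include <psapi.h>
#include <stdio.h>

static void print_working_set(const char *label)
{
    PROCESS_MEMORY_COUNTERS pmc;
    if (GetProcessMemoryInfo(GetCurrentProcess(), &pmc, sizeof(pmc)))
        printf("%-28s working set = %6lu KB\n",
               label, (unsigned long)(pmc.WorkingSetSize / 1024));
}

int main(void)
{
    const SIZE_T commit = 300u * 1024 * 1024;  /* ~300 MB, as in the post */
    const SIZE_T touch  =  70u * 1024 * 1024;  /* ~70 MB actually used   */
    SIZE_T i;

    print_working_set("before commit:");

    /* Reserve and commit 300 MB; no page has been touched yet. */
    char *buf = VirtualAlloc(NULL, commit, MEM_COMMIT | MEM_RESERVE,
                             PAGE_READWRITE);
    if (buf == NULL)
        return 1;
    print_working_set("after commit (untouched):");  /* barely changes */

    /* Write one byte per 4 KB page across the first 70 MB. */
    for (i = 0; i < touch; i += 4096)
        buf[i] = 1;
    print_working_set("after touching 70 MB:");      /* grows by ~70 MB */

    VirtualFree(buf, 0, MEM_RELEASE);
    return 0;
}
```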

Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
Hurray, Rick!! Yes, I have noticed that the working page set is only a quarter as large as the VM allocated. So the only time this really affects the computer is when Rosetta is first loaded into memory, usually at boot when a lot else is also going on. But I can sympathize with the shock and denial that others seem to feel.
I am going to do the "Back when I was a GOOD programmer" routine, so anybody who does not want to be terminally bored can mutter "WAS is the important word" and stop reading.

I can still remember, 30 years ago, working on a gamma-ray spectral analysis program that had to invert a matrix. It was with a sense of shock that is hard to understand now that I suddenly realized it was an impossibly huge problem. Our new 1970s analyzer was just a small machine we put into a corner of an upstairs lab room, but it could produce much more data than a 1960s machine at about a tenth the price. Yet I could not analyze all those spectral lines in our computer. By dropping three quarters of them I could cut the matrix down to 4 megabytes in single precision. But this was not a sparse matrix, and paging it through virtual memory would have been prohibitively time-consuming. I worked out the time and space requirements for several different approaches and took them to our department head. Then I was assigned a new project.

These days it is impossible to describe the visceral shock produced by a simple program that could require 5 MB to 129 MB, depending on the problem. But if you had spent as much time as I did researching a $5,000 proposed expenditure for an additional 16 KB of memory for our minicomputer, you might understand how Bill Gates would later describe 640 KB as enough for any home computer. Later, working on numerical analysis programs in Van Vleck Hall, I would cynically keep in mind that the exercises in the textbooks were carefully chosen to be solvable in a tiny memory footprint. The real world required hundreds of megabytes that only a few national laboratories could afford.
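
For scale, assuming the usual 4 bytes per single-precision element (my assumption; the post does not give the matrix dimensions), that 4 MB matrix works out to:

  4 MB / 4 bytes per element = 2^20 ≈ 1,000,000 elements, i.e. roughly a 1024 x 1024 dense matrix
  inversion cost on the order of n^3 = 1024^3 ≈ 10^9 arithmetic operations

Every one of those operations touches the matrix, which is why paging a dense matrix through 1970s-era virtual memory was hopeless.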