Rhino fills up my RAM. When I use a 60 MB file, it takes about 6–8 GB of memory. It starts at about 500 MB when I open the file, and then it escalates rapidly from there. With some commands it just gets out of control. Worst is when I try to "print to PDF": it peaks at 16 GB in no time. Rhino, of course, stops working for a while when this happens.
I hadn't noticed this problem before, because I had been working on smaller files. Those were OK, even though some functions like "print to PDF" were incredibly slow.
I have noticed that some other software, like Photoshop, also takes up a lot of memory and can be slow to release it, but not like this. I don't think it should take 7–8 GB of RAM to print out a page.
I have Yosemite 10.10.4, a 2.2 GHz i7, and 16 GB of RAM.
Anybody experienced the same thing or have a solution?
I had a similar issue too.
I'm a beginner, so hopefully others will suggest a solution soon. In the meantime, to get by, you can look at: Tools menu > File Utility > Purge Unused Information. (Maybe Command+S first to be safe, as I'm a beginner here.)
Also, you can get the [free] Memory Clean app from the Mac App Store.
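If you'd rather not install anything, you can also keep an eye on Rhino's memory from Terminal. A minimal sketch, assuming the process is named "Rhinoceros" (the usual process name for Rhino for Mac; adjust the pattern if yours differs):

```shell
# Snapshot of Rhino's resident memory (RSS), converted from KB to MB.
# "Rhinoceros" is an assumed process name; check Activity Monitor if it doesn't match.
ps -axo rss=,comm= | awk '/Rhinoceros/ { printf "%.0f MB  %s\n", $1/1024, $2 }'
```

Run it again after a few prints to see whether the footprint keeps climbing; wrapping it in `while true; do …; sleep 5; done` gives a rough live monitor.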
Thanks! I’ll check it out!
Can you please post or send us your model so we can duplicate this? See Reporting Problems as a guide for the kind of information we need. Thanks.
@epiphany How long was your modeling session? Are you noticing any patterns of activity (for example, printing to PDF, as mentioned above) that seem to be causing the memory footprint to balloon? Does it happen with specific files?
modelling session: 6 hours
PDF printed 2 times
I'm having the same problem now… any ideas?
It opens and immediately slows down the whole system.
I can use Rhino for modeling for as long as I want; there doesn't seem to be any problem. But I have learned that when I want to print, especially to PDF, I need to restart the program within ten prints or so. I have been working around the problem for some time now, but today Rhino crashed at 66 GB while I was trying to do a render preview. I shut down, opened the file again, and when I pushed the render-preview button the memory started to accumulate and Rhino crashed at 66 GB again. I fiddled around a little, but nothing seems to work. Is this a problem for just a few of us? I can't imagine.
I have the exact same problem. Rhino keeps hogging RAM until I finally get a notice saying that my Mac is out of application memory. This happens with all files, and faster with larger ones (a 25 MB file with mostly lines and hatches will take around half an hour to use up 20 GB of memory). Memory usage keeps increasing even if I'm not really doing anything (just rotating and moving the view). So I'm able to model, but I need to quit/force quit regularly. The memory usage doesn't go down when I close the file, only when I close the application itself. Navigating the model is also relatively sluggish.
I have a Mac mini, Intel Core i7 3 GHz, 16 GB of memory, El Capitan 10.11.4. My version of Rhino is 5.1 (5B161). My HD has 200 GB of free space left. The only other program running is Firefox.
I've tried deleting duplicate lines/hatches and bad objects, and purging. This might be an OS X problem (with permissions?), but does anyone have any idea what it's about? It seems that Rhino keeps saving some kind of data that doesn't need to be saved.
Printing used to be incredibly slow (printing an A4 with a few lines would take up to 20 minutes), but that was fixed by updating to the latest version of Rhino.
I had the same issue with a model and managed to resolve it by removing blocks. RAM usage dropped from 30 GB to 1 GB, while the model's file size only dropped by 100 MB. From this test, is it safe to assume that Rhino for Mac and Rhino for Windows process blocks in slightly different ways (or, more likely, that the OS interfaces with Rhino differently)? I do find that complicated geometry in blocks tends to eat up RAM a lot faster on Mac. The same issue does not occur when the model references another file.
OS X Yosemite 10.10.3, iMac 3.4 GHz Core i5, 8 GB RAM, Rhino 5.2.1 (5C254)