Pagefile size

I was wondering about this: the pagefile sometimes just becomes very big (especially when working with heavy files). That much is expected behaviour: when RAM is full, Windows writes data to disk.

Yet, after things have settled down, the pagefile should normally also get cleaned up a bit. That does not seem to happen properly. The pagefile is currently 17 GB, and after restarting or deleting it, it tends to grow rapidly again when using Rhino.

Is it possible Rhino is missing a cleanup for the pagefile somewhere?

Right now it isn’t causing problems yet, but I’ve worked on some very heavy files before, and these just filled the entire C: drive with the pagefile and then, after a while, crashed the computer or Rhino.



The pagefile should be purely system-specific. You can even define a maximum size if you like. Once the application is closed, the system should free any associated resources, and a reboot should completely purge the pagefile.

What file size and memory footprint are you seeing? I have never needed a pagefile with 16 GB of RAM.

The problem is that Rhino does not seem to release all its resources after closing or finishing an action. I’m also using 16 GB of RAM here. The file in question contains a complex hose with high detail and a large length. The file size itself is not shocking, but the amount of calculation needed to display a mesh of it is high and takes very, very long.

Can you verify that in the task manager?
Depending on your Windows version you can get pretty detailed information on how much memory is allocated, used and reserved for a certain app. Once you close Rhino and the process is fully gone from the task manager, all memory should be reclaimed by Windows.

Rhino doesn’t really have anything to do with the pagefile; as far as Rhino is concerned it’s just part of memory (I remember using Rhino way back when, using virtual memory to max out the 2 GB maximum available on Windows 2000), and its operation is not quite as simple as “when RAM is full, use the disk”: Windows shuffles stuff around all the time. If you’re using a ton of memory, there’s either a memory leak (which would be indicated if usage keeps going up and up) or you’ve just got a huge model. Do you have any undos? Undo can use a ton of memory, as it saves complete copies of everything that changes. Having memory use “settle down” after operations is not really expected, except for specific things like rendering.

I believe I do have a high number of undos. And yes, I had a feeling there might be a slight memory leak, but I believe only one file really crashes the PC: the notorious, monstrous shower hose.

The undo settings are:
Minimum number: 99
Maximum memory used: 999 MB (about 1 GB)

This feels like something that should not cause problems when you have 16 GB of RAM.

I forget if it’s still the case, but I’m pretty sure the minimum-undo setting overrides the max memory used, so on a big file that could very easily cause crazy memory use. Set the minimum to 0 and the max memory to something sensible for what you have.

Well, I’ll never set the minimum to 0; that would be very bad when mistakes happen during work. But I’ll see if I can lower it to, say, a minimum of 30 undos.

In my workflow it is not unusual to do some actions, copy a couple of objects, undo back 15 steps, and paste the new objects there. : )

Also, said file, the big one, already fills up my hard drive when the only actions are: open the file, then change the perspective viewport from wireframe to shaded.

No, set it to 0. That setting should simply be removed from the dialog, to stop people from running out of RAM; use 1 if you insist. Use the max memory instead: on a simple model with simple operations, 1 GB will provide many, many undos.

Yes, of course, but this is a strange limitation for the software to have.
In Maya, say, I have unlimited undos set and have never run out of memory in 10 years.

If software forced me to have 0 undos on very heavy files, that would seem strange to me, to say the least.

I think if you set it to zero, it ‘disables’ that function. Sort of like mesh parameters.
But it’s just a guess.


Well, however you want to interpret the numbers, they can come into conflict.
Say you have a file that uses more than 1 GB per undo: a minimum number of 1 will then exceed the memory limit. Now the question is, will the memory limit be ignored until the minimum number is reached? If so, the max memory setting is pretty useless for heavy cases. If the max memory overrides the minimum number, you will end up with one or even no undo anyway.

It’s probably a good guess that the minimum number overrides the max memory, as enforcing that number makes more sense in a workflow. So if you need to preserve memory, a minimum of 0 ensures the memory limit is observed, while a minimum of 1 ensures at least one undo is recorded but allows the memory limit to be exceeded once.
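The two interpretations above can be sketched as a tiny simulation. This is purely hypothetical, with made-up function names and sizes; it is not Rhino’s actual implementation, just the eviction logic being guessed at:

```python
# Hypothetical sketch of the two possible priority rules for the undo
# stack discussed above. Illustration only, not Rhino's real code.

def record_undo(stack, new_size_mb, min_count, max_mem_mb, min_wins):
    """Append an undo record of new_size_mb, then evict old records.

    min_wins=True : the minimum count is enforced even if the memory
                    limit is exceeded (old records survive until the
                    count floor is reached).
    min_wins=False: the memory limit is enforced strictly, possibly
                    dropping below the minimum count.
    """
    stack.append(new_size_mb)
    floor = min_count if min_wins else 0
    while sum(stack) > max_mem_mb and len(stack) > floor:
        stack.pop(0)  # evict the oldest undo record
    return stack

# A single 1500 MB operation against a 1000 MB limit:
print(record_undo([], 1500, 1, 1000, True))   # [1500]  limit exceeded once
print(record_undo([], 1500, 1, 1000, False))  # []      no undo survives
```

Either way, a heavy file ends up with at most one oversized undo, which matches the trade-off described above.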

Now, having said that, for active processes Windows will try to keep swapping to a minimum. That is, unless your file pushes Rhino past about 12 GB, no swapping should occur and the pagefile should remain more or less untouched. Check the allocated memory with the Task Manager. Does it exceed your physical 16 GB? If so, what is the amount Rhino takes up?

Jim is correct - the minimum number of undos is removed from the UI in V6, and Undo only pays attention to the amount of memory you assign. In V5, you should set the minimum number of undos to 1 and assign as much memory to Undo as seems reasonable for your system. This amount of memory will then be available for whatever number of Undo operations fit.


Really? Removing the minimum undos from the options, meaning that on heavy files you might end up with 0 undos? That seems like a strange “improvement” to me. Could you elaborate on why this is a logical thing to do? As said, with very heavy files it will then default to 0 undos.

Yes, I’ve been tracking the process in the Task Manager, and tracking the pagefile size while changing the viewport from wireframe to shaded.

But all in all, is there a way to make sure Rhino does not fill up the entire drive? Or a way to improve performance? Perhaps you could take a look at the object in question, @pascal?

That’s the choice: lose your undos, or run the system out of memory (if the undo exceeds the memory available) and crash, thus potentially losing some work.

This sounds odd… How big is the file? What are you doing that could eat up all your RAM in a single operation’s undo? That, and the pagefile size you indicated, seem to point to a memory leak somewhere…


The file used to be 1 GB but has been stripped down to 266 MB by removing a lot of geometry. The memory used to fill up just from switching the perspective viewport from wireframe to shaded…

I’ll have to run a test on the 266 MB file; I have not worked with it since.


I’ve exported the file that is the main cause of the problem. Anyone got an idea? Perhaps there is a really obvious solution, but I don’t know about it.
It is built using a spiral, but there is no good way to give the spiral fewer control points; strangely enough, with fewer control points the spiral becomes less round.

The file is 90 MB compressed, so I can’t upload it here (20 MB limit).

Hi Peter- users regularly set this number to 99 or something and end up crashing Rhino with ‘Out of memory’ errors. Just set the maximum allowable memory to some large number that makes sense for the amount of memory on your system; then you’ll always have that available. If you are working in a way where one undo can use that up, then something is wrong: either the number is too low (again, you need to pay attention; maybe 10% of total RAM is a reasonable start) or your operations or models are … massive.


The model is quite massive, but is it ONLY the undo function that is causing this massive memory usage? If so, I know I’ll have to work without undos on this model.

What mesh settings do you have on that one? And how many polygons does that result in?

Hi Peter- it is possible, I guess. What is the memory allocated to Undo (Options > General page)? What proportion of your total RAM is that?

So: if you set the minimum # of Undo to 1, you will get at least one Undo, no matter how much memory is assigned, provided that one Undo does not run the machine out of memory. If your operations are smaller than the allotted memory, then you will accumulate Undos until that memory limit is reached. Depending upon the operations, this could be a few or many- the point is that the memory allowed, after one undo, is the limiting factor. Does that make sense?
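That rule can be pictured with a small sketch (hypothetical, not Rhino’s code): every operation pushes an undo record, and the oldest records are evicted once the assigned memory budget is exceeded, but the most recent one always survives:

```python
# Hypothetical sketch of the rule described above: at least one undo is
# always kept; beyond that, the assigned memory budget decides how many
# records accumulate. Illustration only, not Rhino's implementation.

def push_undo(stack, record_mb, budget_mb):
    stack.append(record_mb)
    # Evict the oldest records past the budget, but always keep the newest.
    while sum(stack) > budget_mb and len(stack) > 1:
        stack.pop(0)
    return stack

stack = []
for op in [100, 100, 100]:             # small operations accumulate...
    push_undo(stack, op, 1000)
print(len(stack))                      # 3: all fit within the budget

push_undo(stack, 900, 1000)            # a big operation evicts older records
print(stack)                           # [100, 900]: oldest undos dropped
```

So with small operations you get many undos; one huge operation quietly costs you the older ones, which is exactly the “memory is the limiting factor” behaviour described.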