Sky high memory usage while working with SubD

Working on a simple SubD part, and Rhino WIP [Mac] is gobbling up a lot of RAM.
It came up to 13.7 GB [I have only 16 GB on this old MBP].
I quit and reopened the same model, and it went down to normal usage [700 MB].
There really needs to be a way for Rhino to cache some of this memory to disk or release unneeded memory.

  • In ZBrush, where I also work, you can define the amount of memory the program can use; the rest it writes to a temp disk in the background, all without performance issues [and it keeps a separate undo for each of its subtools [their layer-system equivalent], so that's a lot of memory to manage discretely].

thanks a lot
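The spill-to-disk scheme described above (keep a fixed RAM budget, write the rest to temp files in the background) can be sketched roughly like this. The `SpillingStore` name and its least-recently-used eviction policy are illustrative assumptions, not how ZBrush or Rhino actually implement it:

```python
import os
import pickle
import tempfile
from collections import OrderedDict

class SpillingStore:
    """Keep at most `budget_bytes` of values in RAM; spill the
    least-recently-used entries to temp files on disk."""

    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.in_ram = OrderedDict()   # key -> pickled value, oldest first
        self.on_disk = {}             # key -> temp file path
        self.ram_bytes = 0

    def put(self, key, value):
        if key in self.in_ram:
            self.ram_bytes -= len(self.in_ram.pop(key))
        blob = pickle.dumps(value)
        self.in_ram[key] = blob
        self.ram_bytes += len(blob)
        # Evict oldest entries to disk, but always keep the newest in RAM.
        while self.ram_bytes > self.budget and len(self.in_ram) > 1:
            old_key, old_blob = self.in_ram.popitem(last=False)
            self.ram_bytes -= len(old_blob)
            fd, path = tempfile.mkstemp(suffix=".spill")
            with os.fdopen(fd, "wb") as f:
                f.write(old_blob)
            self.on_disk[old_key] = path

    def get(self, key):
        if key in self.in_ram:
            self.in_ram.move_to_end(key)  # mark as recently used
            return pickle.loads(self.in_ram[key])
        # Fault the entry back in from disk and promote it to RAM.
        path = self.on_disk.pop(key)
        with open(path, "rb") as f:
            blob = f.read()
        os.remove(path)
        value = pickle.loads(blob)
        self.put(key, value)
        return value
```

With a small budget, adding a second value pushes the first out to a temp file, and reading it back promotes it into RAM again, evicting something else if needed.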

Thanks for the report of what you’re seeing. Can you reproduce this memory use spike with a certain series of commands or steps? If you can, could you please provide a 3dm and explain what you do when it occurs? I’ll need to reproduce this here for a bug report. I’ll also keep an eye on memory here to see if I can find what might not be releasing it, but I haven’t noticed this myself yet. Thanks again.

You’re also on the Mac v7 WIP correct?

Hi Brian,
I can send you the file, but I forgot the upload URL?

Yes, V7 WIP, Mac 10.14.6.
Now, for example:

  1. Restarted Rhino and opened the file, at 700 MB RAM.
  2. Did only the Fill command on one hole made of 2 faces.
  3. Tried the Fill Auto option, didn’t like the result, and undid it.
  4. Tried the one-face option, didn’t like the result, and undid it.
  5. Tried Auto in 2 stages, one per corresponding face: perfect results.
  6. And some zooming and rotating [using the basic 3Dconnexion device].
    That’s all, not even a layer switch… and I’m already at 5 GB+.

thanks a lot

That’s what operating systems DO. Mac programs pre-OS X had to manage their own ‘virtual memory,’ but that was a long time ago.

Thanks for the specific steps, I’ll try to repro here using any model. You can private message me here on the forum, or use the upload link and say it’s for BrianJ in the comments, along with what you’re reporting.

I tried to send you the file via upload and got this:


If the file is smaller than 20 MB, you could email it to me as another approach. I’m not sure what the problem is with the upload link… It is working fine here, so at a guess it may be a network limitation or firewall on your end.

Regarding the memory spike, it may be the undo stack taking up the memory. Please try the command ClearUndo to see how that impacts the memory usage when you see this. Any change?
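The hypothesis above, that the undo stack is what holds the memory, can be simulated outside Rhino with Python’s `tracemalloc`. The 10 MB `bytearray` records below are a stand-in for per-step geometry snapshots, not Rhino’s actual undo format; the point is only that clearing the stack releases what the records held:

```python
import tracemalloc

tracemalloc.start()

# Simulated undo stack: each record holds a full copy of the model
# state, roughly what a per-step geometry snapshot might cost.
undo_stack = []
for step in range(3):
    undo_stack.append(bytearray(10_000_000))  # ~10 MB per undo record

with_stack, _ = tracemalloc.get_traced_memory()

undo_stack.clear()  # rough equivalent of running ClearUndo

after_clear, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()

# with_stack is ~30 MB while the records are alive;
# after_clear drops to almost nothing once they are released.
```

If Rhino’s behavior matched this model, ClearUndo would drop the process memory by roughly the size of the retained snapshots.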

Thanks Brian,
I was able to upload the file just now; it’s 28 MB.
I’ll try the ClearUndo command later on [was not aware of it].
Still, 3 undos should not bring memory up by 4.3 GB; that’s about 1.4 GB of RAM per undo…
It feels like this has something to do with working in SubD, as I did not have this issue with NURBS and meshes.
thanks a lot

Hi Brian
Thanks for the new SubD video, it’s excellent.
Here with the aforementioned file, working only with SubD commands [deleting faces, using the new Fill command, and Bridge], memory usage jumps at a rate of nearly 1 GB of RAM per command executed. ClearUndo does free a lot of memory, yet working without an undo stack is not a viable solution.

thanks a lot

Thanks for the file and the report. I have reproduced the issue and recorded a video for the developers to help explain. I have filed this as RH-54827 which is a confidential bug report due to it containing your model. Thanks again!


[Not entirely on topic, but I haven’t found another post with similar issues]

I have a similar situation in the files I am working on. When I am cleaning out point clouds, my memory usage goes “sky high” and can even max out 64 GB. Clearing the undo stack drops it back down a bunch, but when I continue to clean out (delete) chunks of point clouds, it goes back up. Why is this possible? I thought the setting in the options would limit the amount of RAM the program uses for undo functions, but when dealing with large point clouds, this doesn’t seem to be the case.
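An undo memory limit like the option mentioned above is usually implemented as a budgeted history that evicts its oldest records once the total size exceeds the cap. A minimal sketch; the `BoundedUndoStack` name and the `sys.getsizeof` size estimate are assumptions for illustration, and Rhino’s real accounting may differ:

```python
import sys
from collections import deque

class BoundedUndoStack:
    """Undo history with a memory budget: the oldest records are
    dropped once the total estimated size exceeds `limit_bytes`."""

    def __init__(self, limit_bytes):
        self.limit = limit_bytes
        self.records = deque()  # (record, estimated size), oldest first
        self.total = 0

    def push(self, record):
        size = sys.getsizeof(record)  # crude per-record size estimate
        self.records.append((record, size))
        self.total += size
        # Evict from the old end, but always keep the newest record.
        while self.total > self.limit and len(self.records) > 1:
            _, old_size = self.records.popleft()
            self.total -= old_size

    def pop(self):
        record, size = self.records.pop()
        self.total -= size
        return record
```

Note the `len(self.records) > 1` guard: the latest record is kept even when it alone blows the budget. If Rhino’s limit works similarly, a single huge point-cloud snapshot could exceed the configured cap, which would be consistent with the behavior reported above.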


@pascal @stevebaer

Can you share a point cloud file with me in a private message here, or use the upload link and mention this thread (“Sky high memory usage while working with SubD”) in the comments?

Also indicate the steps you take to see the spike please.

Hello - thanks, I’ll see if I can reproduce that.
@Mason - I did this, just as a test - I made a large mesh - 7 million or so points - and made a point cloud from it.
Watching Task Manager, I alternately moved the mesh and the point cloud. When I move the mesh, the memory use goes up, then comes back down to about where it was - sometimes not quite, but close, on the graph. If I move the point cloud, the memory use goes up and stays up. I don’t know what this tells us exactly, though it seems to support the idea that there is something amiss, perhaps with how point clouds are handled. Moving the mesh sometimes lets the memory usage drop to below where it was before the move, if it had been jacked up by moving the point cloud.

Volumes could be written describing what I don’t know about how all this works, or ought to work, under the hood… @stevebaer - does it seem to you from my description that there is something to look into?
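The alternating-move experiment above can be mimicked outside Rhino with `tracemalloc`: one operation retains a copy of the data after it finishes (as moving the point cloud appears to), while the other releases its copy on return (as moving the mesh does). All names here are illustrative, and the 5 MB buffers merely stand in for duplicated geometry:

```python
import tracemalloc

def memory_delta(op):
    """Run `op` and return (result, bytes still allocated afterwards)."""
    tracemalloc.start()
    before, _ = tracemalloc.get_traced_memory()
    result = op()
    after, _ = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return result, after - before

retained = []  # stands in for document state that keeps a copy alive

def sticky_op():
    # Like moving the point cloud: the copy stays referenced.
    retained.append(bytearray(5_000_000))

def transient_op():
    # Like moving the mesh: the copy is released when the call returns.
    temp = bytearray(5_000_000)
    return len(temp)
```

Measured this way, `sticky_op` leaves roughly 5 MB allocated per call while `transient_op` leaves almost nothing, matching the up-and-stays-up versus up-and-back-down pattern seen in Task Manager.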


Video of issue uploaded to link above.

Sounds like there is something to look into.

Added RH-57573 Point clouds and memory use

It’s got your name on it, for now… =)