Denoiser memory leak (both Intel & Nvidia)

Hello everyone! Long-time user (more than half my life…), first-time poster. I’ve never had much trouble with Rhino…until I recently started dipping my toes into the wonderful world of rendering.

I’ll try to keep it short. I’m doing a project involving a sequential series of point clouds (check it out if this piques your interest :slight_smile:). I’m using Grasshopper to import these point clouds and do some simple Delaunay meshing. All good so far. In the past I’ve simply used the default Grasshopper slider animation function to export decent animations using the viewport and the Mesh Preview component, but in the new year I want something better, so I’m trying to work with the render engine in Rhino 7.

To that end, I rewrote an ancient baking/rendering/saving/deleting script from these very forums (I’ll edit this and link to it if anyone is interested, but it’s on my other computer) and integrated it into my GH definition, allowing me to use the real render window and export files programmatically, effectively creating my own batch renderer within GH. I proceeded to make such a render and was reasonably happy with the results, although there were odd glitches in about a fifth of the output frames, requiring repeated re-renders. Alas, that seems to be a problem for another day.
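For anyone curious, the loop is roughly the following. This is a minimal sketch, not my exact script: the command strings are Rhino’s scripted (dashed) forms, the `frame_path` helper is illustrative, and `meshes` is assumed to be the list of Delaunay meshes coming out of GH.

```python
import os

try:
    # Only available when running inside Rhino/Grasshopper
    import rhinoscriptsyntax as rs
    import scriptcontext as sc
except ImportError:
    rs = sc = None  # lets the pure path helper below be used outside Rhino

def frame_path(folder, index, ext="png"):
    """Zero-padded output path for one animation frame."""
    return os.path.join(folder, "frame_{:04d}.{}".format(index, ext))

def render_frames(meshes, folder):
    """Bake each mesh, render, save the render window, then delete the bake."""
    for i, mesh in enumerate(meshes):
        obj_id = sc.doc.Objects.AddMesh(mesh)  # bake into the document
        rs.Command("_Render", echo=False)      # render with the current engine
        rs.Command('-_SaveRenderWindowAs "{}"'.format(frame_path(folder, i)),
                   echo=False)
        rs.Command("-_CloseRenderWindow", echo=False)
        sc.doc.Objects.Delete(obj_id, True)    # delete the baked mesh
```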

The real problem is that after the initial batch, I wanted a smoothed version of the render. I checked the box for ‘Intel Denoiser,’ having downloaded the plugin, and let the script do its work. Much to my chagrin, this caused a severe memory leak into my regular RAM, on the order of a GB per frame rendered.

I built this machine a few years ago for ML purposes, so I only have 16GB of RAM. With the denoiser enabled, Rhino chews up that entire GB per frame and gives nearly none of it back until it is about to go completely out of memory. At that point it seems to do a small but inadequate amount of garbage collection, and the used memory drops by about 3GB. But over the course of the next four frames or so, that freed-up bit is chewed up again. It goes through this cycle about 3 or 4 times before the application finally crashes without so much as a whisper. The OS is left unscathed and Rhino can safely be restarted, though it doesn’t create an emergency save of the file in question. GH does make a recovery file, but it’s out of date, from a bit before the animation was started.

I’ve tried both the Nvidia and Intel denoisers at this point (I don’t have anything AMD on this machine) and they both exhibit the same memory leak. There is a difference in how they crash: with the Nvidia one, Rhino appears to believe that the GPU is going OOM instead of the RAM, and is able to throw an error window and render a few black frames before finally crashing, with an EMERGENCY_SAVE file successfully created. However, I have used nvidia-smi repeatedly throughout the rendering process to confirm that no such OOM is occurring; the render itself takes about 1GB of VRAM (for a 1024px-square frame) during rendering, but it is successfully released every time the render window closes, and the ‘baseline’ amount stored in GPU memory returns to the same value every time.
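For reference, this is the sort of check I was running: polling nvidia-smi’s CSV output while the batch renders. A sketch, with a hypothetical `parse_csv_line` helper of my own; the query fields and format flags are standard nvidia-smi options.

```python
import subprocess

def parse_csv_line(line):
    """Parse one line of nvidia-smi `csv,noheader,nounits` output into MiB ints."""
    used, total = (int(x.strip()) for x in line.split(","))
    return used, total

def gpu_memory_mib():
    """Current (used, total) GPU memory in MiB for the first GPU."""
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=memory.used,memory.total",
        "--format=csv,noheader,nounits",
    ]).decode()
    return parse_csv_line(out.splitlines()[0])
```

Watching `memory.used` this way is what showed the GPU returning to the same baseline after every frame, which is why I’m confident the leak is in system RAM.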

Well, so much for keeping it short. Kudos on all the work you all have done to get Rhino 7 up and running; overall it is a fantastic upgrade! I hope this urgent issue can be examined and addressed at your earliest convenience.


I wrote this novella and forgot to specify my system:
Windows 10 64bit
GTX 1080
Rhino 7.1.20343
GH 1.0.0007


@jeff.geiringer, thanks for reporting. I see this behavior. Reported as RH-62316 Using denoisers leaks memory for @DavidEranen