Estimating the memory footprint of Grasshopper definitions

Posting this here because it should be of interest to anyone running Rhino/Grasshopper as part of a cloud application.

After spending some time (ok, some years) developing a system that makes Rhino and Grasshopper as stable as possible when used as a backend application, there is one more thing I am dreaming of: a way to quickly get a good estimate of the memory footprint of a Grasshopper model in its current state (including its volatile data).

Why is this important?

To keep response times as fast as possible, we want to keep as many Grasshopper models loaded in memory as possible. Going too far, however, will eventually cause Rhino to crash, and we clearly want to minimize those events. This means we need a stable decision criterion for when to unload which model.

Deducing a Grasshopper model’s memory footprint by measuring changes in Rhino’s overall memory consumption is not a reliable approach (believe me, I tried): the .NET garbage collector gets in the way, since memory is reclaimed at unpredictable times.

A crude way to do it would be to iterate over all components of the Grasshopper model, over all parameters of each component and their volatile data trees, and get an estimate for each data object. Ideally, there would be a way to query the (unmanaged) memory usage of each CommonObject. We would also need a way to avoid counting duplicates, since the same object can be referenced from more than one parameter. I guess both shouldn’t be too hard to support by exposing further properties/members on CommonObject, but I might be mistaken. Is this something McNeel would be willing to help with?
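To illustrate what I mean, here is a minimal sketch of the traversal-and-deduplication logic, written in plain Python rather than against the actual SDK. The dictionary structure, `estimate_size`, and `footprint` are stand-ins I made up for this example; in the real thing the loop would walk `GH_Document.Objects`, each `IGH_Param.VolatileData`, and the hypothetical per-CommonObject size property would replace `estimate_size`:

```python
import sys

def estimate_size(obj):
    # Placeholder estimator. For real CommonObjects we would need a
    # property exposing the (unmanaged) memory usage instead.
    return sys.getsizeof(obj)

def footprint(components):
    """Sum estimated sizes over all volatile data, counting each object once."""
    seen = set()  # identities of objects already counted (the dedup step);
                  # safe here because the objects stay alive in the tree
    total = 0
    for component in components:
        for param in component["params"]:
            for item in param["volatile_data"]:
                if id(item) in seen:
                    continue  # same object referenced twice counts once
                seen.add(id(item))
                total += estimate_size(item)
    return total

# Two parameters referencing the same list object: it is counted only once.
shared = [0.0] * 1000
components = [
    {"params": [
        {"volatile_data": [shared]},
        {"volatile_data": [shared, [1.0] * 10]},
    ]},
]
print(footprint(components))
```

The important part is the `seen` set: without it, referenced geometry that appears in several parameters would inflate the estimate, which is exactly the duplicate-counting problem mentioned above.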