"Disk Usage Analyzer" for Rhino files - listing objects by size

It would be cool to list which objects, layers, or blocks take up the most space in Rhino files.

The Purge command helps a lot when cleaning large files, but in more developed models it takes time to find which parts of the model bloat it. They may be hidden in layers or block definitions.

For inspiration: Apps/DiskUsageAnalyzer - GNOME Wiki!

The Audit3dmFile command in Rhino builds a list of everything in the file, including how many bytes each element uses.
Parsing the list can be a bit of a chore, but it has the detail I think you’re looking for.

Thanks, I didn’t know about this command. Yes, that is the idea. It was slow, but it did work. More “graphical” output would also be nice, though.

Hello - @Daniel_Krajnik you can try this script - it counts down the number of objects you ask for, starting from the largest, using Rhino’s ‘memory estimate’ for the run-time memory use, not the size on disk.

FindLargestObjects.py (4.0 KB)

To run the Python script, use RunPythonScript, or a macro:

_-RunPythonScript "Full path to py file inside double-quotes"
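For anyone curious how a "top N by size" listing works in principle, here is a plain-Python sketch of the core logic (mock data only - in Rhino the per-object size would presumably come from something like `obj.Geometry.MemoryEstimate()` in RhinoCommon, and the object names and sizes below are made up for illustration):

```python
import heapq

# Mock inventory of (object name, estimated size in bytes).
# In a real Rhino script these pairs would be built by iterating
# the document's objects and asking each for a memory estimate.
objects = [
    ("small_curve", 1_200),
    ("dense_mesh", 4_800_000),
    ("trimmed_srf", 350_000),
    ("big_brep", 9_100_000),
    ("point", 96),
]

def largest_objects(items, n):
    """Return the n largest (name, size) pairs, biggest first."""
    return heapq.nlargest(n, items, key=lambda pair: pair[1])

for name, size in largest_objects(objects, 3):
    print("{}: {:,} bytes".format(name, size))
```

`heapq.nlargest` avoids sorting the whole list when you only want the top few, which matters in documents with many thousands of objects.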



Thanks, that looks really cool. I’ve tested it on two models; one seemed to work fine, but the other kept yielding these errors:

at different indices. Not sure why yet. The output in this model also seemed wrong: it flagged some small, flat surfaces.

Could well be buggy - can you send me the offending file?


I will ask about it tomorrow, but at first glance it may be because it wasn’t checking for blocks? I will test that.

yep, most likely. I can add that.
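As a side note on why blocks matter here: geometry inside a block definition (possibly nested inside other definitions) won't show up in a flat walk over document objects, so its size has to be accumulated recursively. A plain-Python sketch of that idea, using a made-up block table rather than the real Rhino instance-definition API:

```python
# Hypothetical block table: each definition holds the byte sizes of its
# direct geometry plus the names of nested block definitions it instances.
blocks = {
    "Chair": {"sizes": [40_000, 12_000], "nested": []},
    "Table": {"sizes": [300_000], "nested": ["Chair"]},
    "Room":  {"sizes": [5_000], "nested": ["Table", "Chair"]},
}

def block_size(name, table):
    """Total bytes for a block definition, recursing into nested blocks."""
    d = table[name]
    return sum(d["sizes"]) + sum(block_size(n, table) for n in d["nested"])

# "Room" counts its own geometry plus everything inside Table and Chair.
print(block_size("Room", blocks))
```

A real implementation would also want to guard against circular references and multiply by instance counts, but the recursion is the essential missing piece when a flat object scan misses bloat hidden in definitions.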


Thanks, but I tried it and the error was the same. I will see if I can send the file later.