I’m trying to clean up and optimize geometry on some large project files (Rhino v5). I’m wondering if there’s a tool that shows how much each layer contributes to the overall file size — this would give me some broad indication of which parts of the model are bloated and where I should direct my attention.
This is a quick hack, but you can try this Python script; it should print a memory estimate for each layer.
(I sorted the layers alphabetically)
import rhinoscriptsyntax as rs

mem_dict = {}
for layer in rs.LayerNames():
    total_mem = 0
    # ObjectsByLayer returns None for layers with no objects, so guard with "or []"
    for obj in rs.ObjectsByLayer(layer) or []:
        total_mem += rs.coercerhinoobject(obj).MemoryEstimate()
    mem_dict[layer] = total_mem

print "Memory estimates by layer:\n"
for key in sorted(mem_dict.keys()):
    print "{} : {} bytes".format(key, mem_dict[key])
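Since you're hunting for the bloated layers, you might prefer sorting the totals by size rather than alphabetically. A small sketch of that step (the layer names and byte counts here are made up for illustration — in the script above, mem_dict is already populated from the document):

```python
# Hypothetical data standing in for the mem_dict built above
mem_dict = {"Walls": 1200000, "Furniture": 350000, "Site Context": 9800000}

# Sort descending by estimated memory so the heaviest layers print first
for layer, size in sorted(mem_dict.items(), key=lambda kv: kv[1], reverse=True):
    print("{} : {:.1f} MB".format(layer, size / 1e6))
```

Note that MemoryEstimate() reports in-memory size, not on-disk size, so treat the numbers as a relative ranking rather than an exact breakdown of the .3dm file.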