File size contribution per layer

Tags: filesize, rhino5, layers

#1

I’m trying to clean up and optimize geometry on some large project files (Rhino v5). I’m wondering if there’s a tool that shows how much each layer contributes to the overall file size - this would give me some broad indicators of which parts of the model are bloated, and that I might direct my attention to.

Has anyone done anything like this?


#2

This is a quick hack, but you can try this Python script; it should print a memory estimate per layer.
(I sorted the layers alphabetically.)

import rhinoscriptsyntax as rs

layers = rs.LayerNames()
mem_dict = {}
for layer in layers:
    total_mem = 0
    for obj in rs.ObjectsByLayer(layer):
        # MemoryEstimate() returns an approximate size in bytes
        total_mem += rs.coercerhinoobject(obj).MemoryEstimate()
    mem_dict[layer] = total_mem

print "Memory estimates by layer:\n"
for key in sorted(mem_dict.keys()):
    print "{} : {} bytes".format(key, mem_dict[key])

–Mitch
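For readers without Rhino at hand, the aggregation pattern in the script above can be sketched in plain Python; the layer names and byte counts below are made up for illustration, standing in for `rs.ObjectsByLayer()` and `MemoryEstimate()`:

```python
# Sketch of the per-layer accumulation used above, with hypothetical data
# in place of Rhino's object table.
def memory_by_layer(objects):
    """objects: iterable of (layer_name, byte_estimate) pairs."""
    mem_dict = {}
    for layer, mem in objects:
        # Accumulate each object's estimate under its layer name
        mem_dict[layer] = mem_dict.get(layer, 0) + mem
    return mem_dict

sample = [("Default", 1024), ("Curves", 2048), ("Default", 512)]
totals = memory_by_layer(sample)
for key in sorted(totals):
    print("{} : {} bytes".format(key, totals[key]))
```

(Note that Rhino 5 scripts run under IronPython 2, where `print` is a statement; the sketch above is written so it runs under either Python 2 or 3.)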


#3

Thanks Mitch, this is very helpful!


#4

Here’s an alternate script, shamelessly based on @Mitch’s script above, sorted by size (descending) and printing full layer path names:

import rhinoscriptsyntax as rs
import operator

#m_size_unit = ""   # = Bytes
#m_size_unit = "KB" # KB
m_size_unit = "MB"  # MB
m_digits_count = 10

#------------------------------------------
def LeadingZeros(aInteger):
    global m_size_unit
    global m_digits_count

    # Float division so fractional KB/MB values survive under Python 2
    if m_size_unit == "KB":
        num = round(aInteger / 1024.0, 1)
    elif m_size_unit == "MB":
        num = round(aInteger / (1024.0 * 1024.0), 1)
    else:
        num = round(aInteger, 1)  # = bytes
        m_size_unit = "Bytes"
    sInt = str(num)
    while len(sInt) < m_digits_count:
        sInt = "0" + sInt
    return sInt
#------------------------------------------

layers = rs.LayerNames()
mem_dict = {}
for layer in layers:
    total_mem = 0
    for obj in rs.ObjectsByLayer(layer):
        total_mem += rs.coercerhinoobject(obj).MemoryEstimate()
    # The zero-padded prefix makes the string sort below follow size order
    mem_dict[layer] = "{} {} | {}".format(LeadingZeros(total_mem), m_size_unit, layer)

sorted_list = sorted(mem_dict.items(), key=operator.itemgetter(1), reverse=True)
for layer, item in sorted_list:
    print item

Example output (from my test file), with the unit set to “KB” in this case:

00000318.9 KB | Default::Layer 01
00000000.0 KB | Huh?
00000000.0 KB | Default
00000000.0 KB | ASM::BEARINGS::ROLLERS::BDef
00000000.0 KB | ASM::BEARINGS::ROLLERS
00000000.0 KB | ASM::BEARINGS
00000000.0 KB | ASM2::BEARINGS::ROLLERS::BDef
00000000.0 KB | ASM2::BEARINGS::ROLLERS
00000000.0 KB | ASM2::BEARINGS
00000000.0 KB | ASM2
00000000.0 KB | ASM

// Rolf
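A possible refinement of the script above (sketched here with made-up layer names and sizes, outside Rhino): keep the raw byte totals in the dict and sort numerically, which avoids the zero-padding and the fixed digit count altogether:

```python
# Sketch: store raw byte totals and sort numerically, formatting only at
# print time. The layer names and byte counts are hypothetical.
mem_dict = {"Default::Layer 01": 326554, "Huh?": 0, "ASM": 12288}

# Sort by the byte count (the dict value), largest first
sorted_list = sorted(mem_dict.items(), key=lambda kv: kv[1], reverse=True)
for layer, total in sorted_list:
    print("{:10.1f} KB | {}".format(total / 1024.0, layer))
```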


#5

Very useful tool 🙂

Thanks guys !

It’s already in a toolbar here 😉

EDIT:

Thanks Rolf for the ‘operator.itemgetter()’ thing …
I didn’t know about that …