Model/geometry data size

Is there any way to see how much memory/data it takes to define a particular geometry? For example, for me to see how much data is in a chair model (which is an element of an architectural model), I have to copy it, paste it into a new file, and then save to see what the resultant file size is. Is there any way to select an object and find out how much data is in that object/geometry?

Hi Lawrence - not in plain Rhino - I seem to remember seeing something like this in the RhinoCommon SDK, but I can’t find it now, so I was probably hallucinating. I’ll poke a bit more but I’m not too optimistic.

-Pascal

It’s a bit crude, but you could make a new file with nothing in it and save it.
Record the file size. This is the file overhead.
Make another file with the object in it and save it.
Subtract the two to get a rough estimate of how much bigger the new file is.
Things like embedded images, render meshes, material assignments, etc. will greatly affect the file size.
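If you want to automate that comparison, something along these lines might work - just a rough sketch that scripts the -Export command to write the current selection to a temporary 3dm and reports its size on disk (command options are accepted at their defaults, so treat the number as approximate):

import os
import tempfile
import rhinoscriptsyntax as rs

def exported_size():
    # rough sketch: export the current selection to a temporary .3dm
    # and report its size on disk (render meshes etc. are included)
    if not rs.SelectedObjects():
        print "Select something first."
        return
    path = os.path.join(tempfile.gettempdir(), "size_test.3dm")
    # scripted -Export takes the file name; _Enter accepts the default options
    rs.Command('-_Export "{}" _Enter'.format(path), echo=False)
    if os.path.exists(path):
        print "Exported size: {} KB".format(round(os.path.getsize(path) / 1024.0, 1))
        os.remove(path)

exported_size()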

@pascal, did you mean this?

c.

I guess what I was hoping for was something more like the properties window. When you select an object, you can see its layer, material, type, etc. Why not the amount of data required to define its geometry as well?

The Audit3dmFile command breaks down the file size based on categories (not individual objects), but you might find what you need in that list.

@clement - yes, thanks that’s the chap! I could not find it again - thanks. Dunno if it is really useful here but maybe worth an experiment.

Did I do this right?

import Rhino
import scriptcontext as sc
import rhinoscriptsyntax as rs


def test():
    # pick one object (pre-selected geometry is accepted)
    Id = rs.GetObject(preselect=True)
    if not Id: return

    # report the object's estimated runtime memory footprint in kilobytes
    x = sc.doc.Objects.Find(Id)
    print "Runtime memory use estimate:", str(round(x.MemoryEstimate() / 1024.0, 3)), "k"

test()

@lawrenceyy the above bit of python may tell you what you want to know - it is not the file size on disc but the runtime memory footprint that is estimated. I have no idea yet if or how things like materials and textures affect the estimate or if it counts render meshes (probably not). Wrong. It does appear to take render meshes into account.

Here it is as an actual py file (RunPythonScript).

ObjectMemoryEstimate.py (292 Bytes)

-Pascal

It sounds like what I’m looking for does not exist (displaying object data size in the properties panel). Maybe this could make it into the next version of Rhino? Sometimes when I open another person’s model, I can’t tell what is making the model slow or the file size so large. Being able to see the object data size would be really useful for auditing the model.

What would be really cool is a shaded mode where data size is represented by color, so you can see where there are atypical objects that are using a lot of data. Or maybe something like an infrared filter, where a high concentration of geometry and data results in a hotter color while areas with less geometry or data are cooler.

It may not be 100% accurate, but the heaviest part of your model will in most cases be the mesh or render mesh used to display your NURBS objects. So there is a relationship between how many triangles each object’s mesh has and how heavy the file is (with block instances the mesh counts only once, even if you have many of them). As mentioned above, there is no automated way to preview your file with a ‘heavy filter’ (I like the idea though, it should be doable via scripting), but if you want to check any particular objects, try the _PolygonCount command. It will give you the number of triangles per object, or per selection if many are selected. As a rule of thumb, anything above 100,000 means ‘heavy’, and the increase in an object’s ‘weight’ in the file is probably close to linear with its polygon count.

Most NURBS object definitions don’t take that much memory/disk space, since they are mostly math - it is the meshes representing them that do. That’s why the SaveSmall command makes files small: render meshes are discarded (files with mesh objects will not really get smaller).
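If you prefer a scripted version of that check, something like this could work - just a sketch that reports the render mesh face count per selected object (note that quads count as one face here, so the numbers won’t exactly match _PolygonCount, which reports triangles):

import Rhino
import scriptcontext as sc
import rhinoscriptsyntax as rs

def render_mesh_polycount():
    # sketch: report the render mesh face count for each selected object
    ids = rs.GetObjects("Select objects to check", preselect=True)
    if not ids: return
    for id in ids:
        obj = sc.doc.Objects.Find(id)
        # GetMeshes returns the cached render meshes (empty if none have been created yet)
        meshes = obj.GetMeshes(Rhino.Geometry.MeshType.Render)
        faces = sum(m.Faces.Count for m in meshes)
        print "{} : {} render mesh faces".format(id, faces)

render_mesh_polycount()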

I may take a stab at scripting the ‘infrared’ mode at some point since I like the idea. Unless someone gets to it first.
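Just to sketch the idea (untested, and it overwrites the objects’ display colors, so try it on a copy of the file): map each object’s memory estimate onto a blue-to-red gradient and assign it as the object color.

import scriptcontext as sc
import rhinoscriptsyntax as rs

def memory_heatmap():
    # sketch: tint each object from blue (light) to red (heavy)
    # based on its runtime memory estimate
    ids = rs.NormalObjects()
    if not ids: return
    estimates = {}
    for id in ids:
        obj = sc.doc.Objects.Find(id)
        estimates[id] = obj.MemoryEstimate()
    top = max(estimates.values())
    if top == 0: return
    for id, est in estimates.items():
        t = est / float(top)  # 0.0 (lightest) .. 1.0 (heaviest)
        rs.ObjectColor(id, (int(255 * t), 0, int(255 * (1.0 - t))))
        rs.ObjectColorSource(id, 1)  # display color from object, so the tint is visible
    sc.doc.Views.Redraw()

memory_heatmap()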

hth,

–jarek

@lawrenceyy, I too like the idea of some kind of display mode using hot/cold colors; however, it might be slow to draw all objects according to their memory estimate. :wink: Below is a script which just selects the object (must be a selectable PointCloud, Curve, Surface, PolySurface or Mesh) with the largest memory estimate.

SelectLargestMemoryEstimate.py (1018 Bytes)
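For anyone who can’t grab the attachment, the idea is roughly this (just a sketch, not the attached file, and without the object type filtering):

import scriptcontext as sc
import rhinoscriptsyntax as rs

def select_largest_memory_estimate():
    # sketch: find and select the single object with the largest runtime memory estimate
    largest_obj, largest_est = None, 0
    for obj in sc.doc.Objects:
        est = obj.MemoryEstimate()
        if est > largest_est:
            largest_obj, largest_est = obj, est
    if largest_obj:
        rs.UnselectAllObjects()
        largest_obj.Select(True)
        print "Largest memory estimate: {} k".format(round(largest_est / 1024.0, 3))
        sc.doc.Views.Redraw()

select_largest_memory_estimate()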

Note that an object’s memory estimate or its polygon count is not necessarily what “slows down” a file. I could imagine a 10-million-polygon mesh shown without wireframe navigating smoothly, while a single cube with its isocurve density bumped up too high influences the display speed much more.

c.

Oh yeah! You’re always faster :wink:

c.

Good point @clement, what usually slows Rhino down is object count, not how heavy the objects are. You can have 1 object with 1,000,000 polygons running very fast vs. 10,000 light objects going very slow. Sounds like you bring geometry from other software - I have seen SKP or Revit files where a small shrub element slowed down the entire building model because the block consisted of tens of thousands of single mesh triangle faces…

-j

Nice - finding the biggest one is probably more useful than a per-object query.

-Pascal

Hi Clement & Pascal, Thanks for these great scripts - very useful!

I have lots of items in blocks in my file. With @pascal’s script I get a nominal object size, presumably corresponding to the cost of an instance, not the underlying geometry. With @clement’s script I get another object, not part of a block, which I suspect is smaller than some of the block objects.

Did either of you (or anyone else out there) do anything to handle blocks / block instances? I would like to be able to identify the objects, including blocks, which are increasing my file size.

Best regards,

Graham

Hi @Dancergraham, in this case you would need to recursively (in case of nested blocks) “explode” the block virtually and see if the method gives you the memory sizes of the geometric objects inside the block. I have not done this yet because I use blocks only to reduce the memory footprint in case of many, many geometric instances, which I of course optimize before turning them into a block.
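Untested, but the recursion could look something like this - just a sketch that sums the memory estimates of the definition geometry, so instance transforms are ignored and nested definitions shared by many instances get counted per instance rather than once:

import Rhino
import scriptcontext as sc
import rhinoscriptsyntax as rs

def memory_estimate(obj):
    # sketch: for a block instance, recurse into the definition (and any
    # nested definitions) and sum the estimates of the contained geometry
    if isinstance(obj, Rhino.DocObjects.InstanceObject):
        return sum(memory_estimate(sub) for sub in obj.InstanceDefinition.GetObjects())
    return obj.MemoryEstimate()

def test():
    id = rs.GetObject("Select object or block instance", preselect=True)
    if not id: return
    obj = sc.doc.Objects.Find(id)
    print "Memory estimate: {} k".format(round(memory_estimate(obj) / 1024.0, 3))

test()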

_
c.
