Is there any way to see how much memory/data it takes to define a particular geometry? For example, to see how much data is in a chair model (which is an element of an architectural model), I have to copy it, paste it into a new file, and then save to see what the resulting file size is. Is there any way to select an object and find out how much data is in that object/geometry?
Hi Lawrence - not in plain Rhino - I seem to remember seeing something like this in the RhinoCommon SDK, but I can't find it now so I was probably hallucinating. I'll poke a bit more but I'm not too optimistic.
-Pascal
It's a bit crude, but you could make a new file with nothing in it and save it.
Record the file size. This is the file overhead.
Make another file with the object in it and save it.
Subtract the two to get a rough estimate of how much bigger the new file is.
Things like embedded images, render meshes, material assignments, etc. will greatly affect the file size.
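If you want to automate that comparison, here is a rough sketch in Python (RunPythonScript) that exports the current selection to a temporary .3dm and reports its size on disk. It assumes the scripted -_Export command will accept its default options with an Enter, and the number it prints still includes a small amount of file overhead:

import os
import tempfile
import rhinoscriptsyntax as rs

def selection_file_size():
    # pick the objects to measure (keep them selected so Export uses them)
    ids = rs.GetObjects("Select objects to measure", preselect=True, select=True)
    if not ids: return
    # export the selection to a temporary 3dm file, accepting the default options
    path = os.path.join(tempfile.gettempdir(), "size_test.3dm")
    rs.Command('-_Export "{0}" _Enter'.format(path), False)
    if os.path.exists(path):
        print "Exported selection is roughly", round(os.path.getsize(path) / 1024.0, 1), "KB on disk"

selection_file_size()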
I guess what I was hoping for was something more like the properties window. When you select an object, you can see its layer, material, type, etc. Why not the amount of data required to define the geometry as well?
The Audit3dmFile command breaks down the file size based on categories (not individual objects), but you might find what you need in that list.
@clement - yes, thanks, that's the chap! I could not find it again - thanks. Dunno if it is really useful here but maybe worth an experiment.
Did I do this right?
import Rhino
import scriptcontext as sc
import rhinoscriptsyntax as rs

def test():
    # pick one object (pre-selected objects are accepted)
    Id = rs.GetObject(preselect=True)
    if not Id: return
    # look up the RhinoObject and print its runtime memory estimate in KB
    x = sc.doc.Objects.Find(Id)
    print "Runtime memory use estimate:", str(round(x.MemoryEstimate() / 1024.0, 3)), "k"

test()
@lawrenceyy the above bit of python may tell you what you want to know - it is not the file size on disc but the runtime memory footprint that is estimated. I have no idea yet if or how things like materials and textures affect the estimate or if it counts render meshes (probably not). Wrong. It does appear to take render meshes into account.
Here it is as an actual py file (RunPythonScript).
ObjectMemoryEstimate.py (292 Bytes)
-Pascal
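To see the render-mesh contribution mentioned above, a variation like the following sketch reports both the memory estimate and the render-mesh face count for each selected object (it assumes the cached render meshes are accessible via RhinoObject.GetMeshes; running _ClearAllMeshes and re-running it should show the estimates drop):

import Rhino
import scriptcontext as sc
import rhinoscriptsyntax as rs

def report_selection():
    ids = rs.GetObjects("Select objects to report", preselect=True)
    if not ids: return
    for id in ids:
        obj = sc.doc.Objects.Find(id)
        # count the faces of any cached render meshes on this object
        meshes = obj.GetMeshes(Rhino.Geometry.MeshType.Render)
        faces = sum(m.Faces.Count for m in meshes) if meshes else 0
        print "{0}: {1} KB estimate, {2} render mesh faces".format(
            obj.ShortDescription(False), round(obj.MemoryEstimate() / 1024.0, 1), faces)

report_selection()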
It sounds like what I'm looking for does not exist (displaying object data size in the properties panel). Maybe this could make it into the next version of Rhino? Sometimes when I open another person's model, I can't tell what is making the model slow or the file size so large. Being able to see the object data size would be really useful for auditing the model.
What could be really cool is if there was a shaded mode where data size was represented by color, so you could see where there may be atypical objects that are using a lot of data. Or maybe like an infrared filter, where high concentrations of geometry and data show up as hotter colors while areas with less geometry or data are cooler.
It may not be 100% accurate, but the heaviest part of your model would in most cases be the mesh or render mesh used to display your NURBS objects. So there is a relationship between how many triangles each object's mesh has and how heavy the file is (well, with Block Instances it will count only once even if you have many of them). So, as mentioned above, there is no automated way to preview your file in a "heavy filter" (I like the idea though, should be doable via scripting), but if you want to check any particular objects, try the _PolygonCount command. It will give you the # of triangles per object, or per selection if many are selected. Rule of thumb: anything above 100,000 means "heavy". There is probably a close to linear increase in an object's "weight" in the file based on its polygon count. Most NURBS object definitions don't take that much memory/disk space, since it is mostly math. It is the representing meshes that do. That's why the SaveSmall command makes files small - render meshes are discarded (files with mesh objects will not really get smaller).
I may take a stab at scripting the "infrared" mode at some point since I like the idea. Unless someone gets at it first.
hth,
-jarek
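In the meantime, here is a very rough sketch of that "infrared" idea. Assumptions: it ranks objects by MemoryEstimate rather than polygon count, and it simply overrides each object's display color, so run it on a copy of the file or Undo afterwards:

import scriptcontext as sc
import rhinoscriptsyntax as rs

def heat_map():
    objs = list(sc.doc.Objects)
    if not objs: return
    estimates = [obj.MemoryEstimate() for obj in objs]
    lo, hi = min(estimates), max(estimates)
    span = float(hi - lo) or 1.0
    for obj, est in zip(objs, estimates):
        # 0.0 = lightest object (blue), 1.0 = heaviest object (red)
        t = (est - lo) / span
        color = (int(255 * t), 0, int(255 * (1.0 - t)))
        rs.ObjectColor(obj.Id, color)
    sc.doc.Views.Redraw()

heat_map()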
@lawrenceyy, I too like the idea of some kind of display mode using hot/cold colors, however it might be slow to draw all objects according to their memory estimate. Below is a script which just selects the object (must be a selectable PointCloud, Curve, Surface, PolySurface or Mesh) with the largest memory estimate.
SelectLargestMemoryEstimate.py (1018 Bytes)
Note that an object's memory estimate or its polygon count are not necessarily what "slows down" a file. I could imagine a 10-million-polygon mesh shown without wireframe navigating smoothly, while a single cube with its isocurve density bumped up too high influences the display speed much more.
c.
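A minimal sketch of that approach (not the attached script itself, just the general idea of filtering by object type and selecting whichever object reports the largest MemoryEstimate) might look roughly like this:

import Rhino
import scriptcontext as sc

# object types to consider, as described above
TYPES = (Rhino.DocObjects.ObjectType.PointSet,
         Rhino.DocObjects.ObjectType.Curve,
         Rhino.DocObjects.ObjectType.Surface,
         Rhino.DocObjects.ObjectType.Brep,
         Rhino.DocObjects.ObjectType.Mesh)

def select_largest():
    largest, largest_size = None, 0
    for obj in sc.doc.Objects:
        if obj.ObjectType in TYPES and obj.MemoryEstimate() > largest_size:
            largest, largest_size = obj, obj.MemoryEstimate()
    if largest:
        sc.doc.Objects.UnselectAll()
        largest.Select(True)
        sc.doc.Views.Redraw()
        print "Largest memory estimate:", round(largest_size / 1024.0, 1), "KB"

select_largest()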
Oh yeah! You're always faster…
c.
Good point @clement, what usually slows Rhino down is object count, not how heavy the objects are. You can have 1 object with 1,000,000 polygons running very fast vs. 10,000 light objects going very slow. Sounds like you bring geometry from other software - I have seen SKP or Revit files where a small shrub element slowed down an entire building model because the block consisted of tens of thousands of single mesh triangle faces…
-j
Nice - finding the biggest one is probably more useful than per object query.
-Pascal
Hi Clement & Pascal, Thanks for these great scripts - very useful!
I have lots of items in blocks in my file. With @pascal's script I get a nominal object size, presumably corresponding to the cost of an instance, not the underlying geometry. With @clement's script I get another object, not part of a block, which I suspect to be smaller than some block objects.
Did either of you (or anyone else out there) do anything to handle blocks / block instances? I would like to be able to identify the objects, including blocks, which are increasing my file size.
Best regards,
Graham
Hi @Dancergraham, in this case you would need to recursively (in case of nested blocks) "explode" the block virtually and see if the method gives you memory sizes of the geometric objects inside the block. I have not done this yet because I use blocks only to reduce the memory footprint in case of many, many geometric instances, which I of course optimize before making them into a block.
_
c.
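A rough sketch of that recursive idea, assuming MemoryEstimate can simply be called on the objects inside an InstanceDefinition (note that a definition shared by many instances gets counted once per instance here, so the totals overstate what shared blocks actually add to the file):

import Rhino
import scriptcontext as sc
import rhinoscriptsyntax as rs

def estimate(obj):
    # for block instances, recurse into the definition (handles nested blocks)
    if isinstance(obj, Rhino.DocObjects.InstanceObject):
        return sum(estimate(o) for o in obj.InstanceDefinition.GetObjects())
    return obj.MemoryEstimate()

def report():
    ids = rs.GetObjects("Select objects (blocks allowed)", preselect=True)
    if not ids: return
    for id in ids:
        obj = sc.doc.Objects.Find(id)
        print "{0}: ~{1} KB".format(obj.ShortDescription(False), round(estimate(obj) / 1024.0, 1))

report()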