Rhino file size Again:

I’m working on a Wiki article now taken from info in this thread and others… Just haven’t had time to finish it, too many other things on at the moment, hopefully before the end of this week. Here is the work in progress… The “discovery” section is more or less finished, the “remedy” section is yet to come.

Edit - 05.10.17 - finished first draft, have a look and please feel free to comment or add stuff…

–Mitch

5 Likes

Thanks Mitch. I think @ec2638 is working on a medal for you - at least that’s what he promised. I’m pretty sure. Up there ^.

-Pascal

Might be a while. Still in the ‘conceptual’ design phase.


Very cool command!

@John_Brock Do textures applied to a surface add to the byte size of the actual object, or does Audit3dmFile report purely the object geometry?

I imported many environments to check which HDRI to use, and the file became huge… just removing the unused HDRIs saved a huge amount of space. I didn’t realize those, and also textures, are sometimes so huge…

Any possibility to upload to a cloud drive and send a link and password to mcneel support?

No. The material definition will be stored as a material in the file. It was a Vray bug, which attached multiple material definitions as user data to the individual objects, that caused the bloat I referred to.

Finished first draft of Wiki article… --Mitch

6 Likes

Very useful, Mitch. Got a couple of good tips. —Mark

Hi Mitch

I came across this incident a few months ago, where a part model that should never have exceeded 30 MB ended up at 2 GB.

A bit of background: I use Rhino to perform feasibility studies on the parts our company manufactures. The way it goes, one CAD model travels between at least 4 design groups: the OEM, our part design contractor, our tooling source, and of course myself.

Now, in this back-and-forth travel, and before the part is officially released by the OEM, the model is modified by all the above-mentioned people in as many as 3 or 4 different CAD programs.

In the mentioned incident, the OEM would design in Catia, the tool shop in NX, the contractor in Catia, and myself in Rhino. At one point the tool shop was modifying my own part, which I had exported as STEP at about 30 MB. There must have been close to 2000 surfaces in that model.

When the shop exports it as STEP again (after the change), the part becomes 2 GB. All the troubleshooting by the NX designer inside his machine revealed nothing, as the part was also sized correctly at 24–28 MB.

Exported as STEP, though, it grows to 2 GB. Inside Rhino (5/SR14) I could not use any tool to identify the culprit (better said, maybe I did not know how to use all the tools to identify the anomaly).

Notice I did not use the words “bad object”, because the ExtractBadSurfaces/SelectBadObjects tools did not work in this case. I eventually solved it, not in 3 clicks but rather in 3 hours, by saving a copy and then deleting and running IncrementalSave until I found it. One measly fillet surface on its own was sized at about 1.8 GB. Neither the shop nor I generated that surface.

Now, the point of my blurb is that I read your procedure twice, and I do not think it would have caught my oversized surface (again, not a bad object as far as Rhino is concerned).

Since this post came up, I did play with the Audit3dmFile and Audit tools, and it seems to me that these commands were not designed for people doing modeling day in and day out, but rather for people doing scripting. I should say, they are not written in “plain English” :wink:

As well, the tools _List and _What did not help, as one cannot read through all that info for 2000 surfaces. And even if one did, I do not see where exactly I could have caught the “little” troublemaker.

I am sorry I do not have that file anymore, but the funny thing is that I did think at the time of posting the file here on the forum. Then I too thought the size would have been an issue, so I gave up.

Regards

Costel

Hi Costel - it would have been good to get that object to see if we could reproduce the problem. I don’t *think* this is the sort of thing Douglas1 is referring to, as there’s no easy way out. As I read Douglas1’s comments, I think the nub of his gripe, without all the decoration, is this: if Rhino can make a bloated file smaller with a few commands, then why make me run the three commands - just do it. The answer depends on what the bloat is, as discussed above, which is why we’d like to get a file in these cases.

-Pascal

In the case of bloat as a result of geometry, for example an inserted part with lots of detail, @Helvetosaur and @RIL provided scripts …

https://discourse.mcneel.com/t/file-size-contribution-per-layer/42241

… which list the layers within a file and the size of the geometry on each layer. The list is shown on the command line, where it can be reviewed or copied to the clipboard.

The process of figuring out where the heavy geometry is can be simplified by identifying the layer where it resides.

That geometry can then be exported out of the file if appropriate, and brought back in when needed if it’s causing a slow down, or the layer can be isolated (others turned off), to make it easier to find the problem geometry.
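The per-layer bookkeeping those scripts perform boils down to summing each object’s estimated byte size into a bucket keyed by its layer. Here is a minimal, Rhino-independent sketch of just that aggregation step (the layer names and sizes are made-up stand-ins; the real scripts get the sizes from RhinoCommon):

```python
from collections import defaultdict

def size_per_layer(objects):
    """Sum estimated byte sizes into per-layer totals.

    `objects` is a list of (layer_name, size_in_bytes) tuples; in a real
    Rhino script the size would come from the object's MemoryEstimate().
    Returns (layer, total) pairs sorted largest first, like the scripts'
    command-line report.
    """
    totals = defaultdict(int)
    for layer, size in objects:
        totals[layer] += size
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# Made-up example data
objects = [('Hull', 1200000), ('Details', 350000),
           ('Hull', 800000), ('Annotations', 15000)]
for layer, total in size_per_layer(objects):
    print('%-12s %10.1f KB' % (layer, total / 1024.0))
```

The heaviest layer tops the list, so you immediately know where to look (or which layer to export and re-reference later).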

If you need to know how to paste the script onto a button…

https://discourse.mcneel.com/t/python-script-to-rhino-button/38806

Yes.
And along the same lines, we can quickly look for the largest object by memory size:

import Rhino
import rhinoscriptsyntax as rs

def main():
    gids = rs.GetObjects('Objects to check?', preselect=True)
    if not gids:
        return
    # Pair each object id with RhinoCommon's estimated memory footprint
    obs = [(gid, int(Rhino.DocObjects.ObjRef(gid).Object().MemoryEstimate()))
           for gid in gids]
    obs.sort(key=lambda x: x[1], reverse=True)
    print('Size = %.1f KB' % (obs[0][1] / 1024.0))
    # Select the largest object so it is easy to locate in the viewport
    rs.UnselectAllObjects()
    rs.SelectObject(obs[0][0])

main()

HTH, regards

1 Like

Can you clarify here? Was the .stp 2 GB, or only the Rhino file after importing and saving? If the .stp was already huge, then clearly the problem was caused by NX. If the file only got huge after the STEP was imported, then the problem might be with the Rhino STEP import function - for example, it might have had some trouble with a weirdly defined surface exported by NX…

It would really be interesting to have seen that surface and to understand how it was created - if it’s one single fillet, it must have had a gazillion control points and some extreme render mesh or something else… If I generate a 1920 x 1920 point surface and save it, I get a file size of only about 90 MB. So it would have to have had 20x that many points to get to 1.8 GB!
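As a back-of-the-envelope check on those figures: storing just the 3D control points of a 1920 x 1920 surface at 3 doubles (24 bytes) each already accounts for most of that ~90 MB, and scaling to 1.8 GB does indeed take on the order of 20x the points. A rough estimate only, ignoring knots, weights, headers and render meshes:

```python
BYTES_PER_POINT = 3 * 8  # x, y, z stored as 64-bit doubles; knots/weights ignored

def raw_point_bytes(u_count, v_count):
    """Rough lower bound on the control-point data of a NURBS surface."""
    return u_count * v_count * BYTES_PER_POINT

mb = raw_point_bytes(1920, 1920) / (1024.0 ** 2)
print('1920 x 1920 surface: ~%.0f MB of raw points' % mb)              # ~84 MB
print('factor needed for 1.8 GB: ~%.0fx as many points' % (1.8 * 1024 / mb))  # ~22x
```

So a single ordinary fillet reaching 1.8 GB suggests something other than control points, e.g. an enormous render mesh or attached user data.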

Where did it come from then - it had to come from somewhere…?

Well, the object table would have been huge, but in V5, yes, there is no per-object detail, so you would have to go on an Easter-egg hunt. In V6, as John Brock said, there will be a more granular analysis. On top of that, as RhinoCommon has a function to estimate a Rhino object’s “memory footprint”, there are some scripts that will allow you to analyze the whole file and find the heaviest object(s), such as Emilio’s above.

I will add a bit more info to the Wiki article soon for this type of situation, thanks for sharing your experience.

–Mitch

Gentlemen

It is common knowledge that people who become so skilled never want to ‘consider/try’ anything else. With all due respect, that appears to be occurring!

John Brock’s statement is welcome.
However: if Select All or other tools fail to detect it, will the improved Audit3dmFile command in Rhino V6 report all of the object detail?

Likewise the welcome suggestion of Toshiaki_Takano
However: to upload to a cloud drive, or anywhere else, requires the same stable internet link.

Next, one notes the exhaustive effort of Mitch in writing his Wiki article; more on this later.

Costel
This information is very relevant. In the past I have experienced the mysterious appearance of surface slivers, 3 in all, that the lads at Orcad3D found in a drawing that was messing up their Stations-Buttocks-Waterlines.

If I understand Costel correctly, neither Catia, NX (at the tool shop) nor Rhino visibly identified the rogue sliver, which apparently took Costel 3 hours to locate. He also states that others in his group spent time.
‘Gentlemen: Time + Money’
Note the points he makes regarding the Audit3dmFile and Audit tools, and reading through all that info, and probably still not being able to see where exactly he could have caught the “little” troublemaker.
“Where did it come from then,” he exclaims - “it had to come from somewhere…?”
Costel also mentions that he thought the size would have been an issue (obviously for uploading it), so he gave up.

(Douglas1 comment)
I would not be too concerned as to where it came from. More concerning is the fact that you have no way of knowing - apart from the Rhino file size, that is.

Pascal
Is ‘not’ correct in thinking this is ‘not’ the sort of thing Douglas1 is referring to.
To all intents and purposes, it is! (That which one cannot see.)

BrianM
Points to scripts available from @Helvetosaur and @RIL.
If one is attempting to assemble a solid from numerous surfaces, as the surfaces are joined they all end up on one layer.

emilio
States one can quickly look for the largest object by memory size.
Does it really have to be the largest object by memory size?
As Costel states, his final product has been through four different CAD programs. What if they have all contributed rogue lines, curves and surfaces that he cannot see? There could be zillions of them!
One cannot take the file size of another format; the only relevant file size is the one Rhino reports.

UPDATE:

The following has been applied to individual copies of the 616 MB file.

  1. _SaveSmall. IN 616 MB, Out = 609 MB.
  2. _SaveAs with a different name and the box “Save small” checked in the dialog. IN 616 MB, Out = 616 MB.
  3. _Purge command with _Materials=_Yes. IN 616 MB, Out = 616 MB.
  4. _-SaveAs (with the dash!); on the command line you will see an entry _SavePlugInData. Set that to _No and save. IN 616 MB, Out = 616 MB.
    As one can see, only Item 1 made a difference, and of only 7 MB - hardly anywhere near a reduction from 616 MB down to 11.6 MB.

It is clear that the file size on 18 Sep 2017 was 379 MB; by 21 Sep 2017 the file was 616 MB.
Was the 18 Sep 2017 file (379 MB) already inflated or not?
To ascertain ‘FACT’, the same 3-clicks procedure has been applied, resulting in a reduced file size of 41.6 MB. Therefore it is clear that the 18 Sep 2017 file was already inflated by ‘337.4 MB’ before it shot to 616 MB.
Items 1–4 have been applied to the original 379 MB 18 Sep 2017 file, which yielded a similar result to the above, i.e. only Item 1, _SaveSmall, made any difference.

Gentlemen, there is a problem:
Rhino does not identify ‘all’ that is in the file!
Furthermore: No one save the writer appears to have a method of seeing what is there!
Example attached

Wiki articles – awards are therefore premature!

Untrue. Dropbox, and probably any other good cloud drive, will eventually have the file synced to the cloud. It will take some time if the connection is unreliable, but eventually it will be ‘on the cloud’. Then you share a link with us, and we can all download and investigate and, if bugs are found, fix code.

Until such a moment it is equally a waste of ‘gentlemen time’ to continue writing long texts with guesses, however educated those may be.

1 Like

From my own experience, it’s always a plugin or, in most cases (99%), the imported data. But as long as nobody knows anything about that file, nobody can help you.

I personally never tried a file of half a GB, but if it continues to upload from where it left off, that seems a good solution in this case.
Many have around 5 GB free, I suppose.

@Douglas1 seems like a plan… though not something you want from McNeel… at least things may move forward. Just my opinion…

I’d be more frustrated with a laptop maker promising an XX Wh battery when the max it ever reached was (XX minus 5) Wh, and only for a week… only to see it go down… errh

BS…

If file size is what you are concerned about, the largest object(s) will be the likely culprits. However, it is possible that your file size is due to thousands of small objects; in that case no.

I would think this would be a very rare occurrence - “zillions” of phantom objects that you cannot detect in a Rhino file - but I guess it could happen. And the question would be: why are they “rogue”? Whose responsibility is it?
I think this would not be a Rhino problem then; it would be best to complain to the “other” program’s support.

Sure you can. Windows explorer reports file sizes on all files.

You have still not told us what 3 clicks they were. Obviously you know something we all don’t, so please enlighten us with your knowledge so that all may benefit from it.

Your image example doesn’t tell us anything at all. What has Rhino failed to identify and what steps were used to identify them?

And that’s the way it will stay. If you cannot provide us with an example to be examined, NOTHING will ever get fixed.

It’s a bit like going to your car mechanic and saying “There’s something wrong with my car”. The mechanic says “OK, what’s wrong?” You say “It’s not running right.” Mechanic: “Well, that could be a lot of things, can I see the car and take it for a test drive?” You: “No.” Well, your car’s never going to get fixed that way…

Wiki articles aren’t written for awards, they are written for information. There is no one-procedure-solves-everything; hopefully the info will be useful for a number of people, but nothing can be written that covers every possible case that could go wrong. That’s also why support forums exist. But to get proper support, one has to provide sufficient info to allow support people to find the problem. In this case, it’s only the file that can help.

–Mitch

Here’s a slightly different version,

  • if any items are preselected, it will report the memory footprint of all selected items
  • if nothing is preselected, it will ask you how many items you want to check. This mode will also include hidden or locked objects, as well as those on off or locked layers.
  • Lists (largest first) the object ID, est. memory use, and whether it’s valid, locked or hidden

Formatting is not incredible, but it gives you the info in a text out window that you can copy somewhere.
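For anyone curious about the report logic without opening the attachment, here is a rough, Rhino-independent sketch of the sorting and flag formatting (the IDs and sizes are made up; the real script pulls sizes from RhinoCommon’s MemoryEstimate() and real object attributes):

```python
def format_report(items):
    """Build report lines, largest object first.

    `items` is a list of (obj_id, est_bytes, is_valid, is_locked, is_hidden)
    tuples; in the real script these come from the document's object table.
    """
    lines = []
    for oid, size, valid, locked, hidden in sorted(items, key=lambda it: it[1],
                                                   reverse=True):
        # Collect the status flags that apply to this object
        flags = [name for name, on in (('INVALID', not valid),
                                       ('locked', locked),
                                       ('hidden', hidden)) if on]
        lines.append('%-10s %10.1f KB  %s' % (oid, size / 1024.0,
                                              ', '.join(flags) or 'ok'))
    return lines

# Made-up example data
for line in format_report([('aa01', 512000, True, False, False),
                           ('bb02', 2048000, True, True, False),
                           ('cc03', 10240, False, False, True)]):
    print(line)
```

The text-out window then just shows these lines, ready to copy elsewhere.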

Edit: revised version
DisplayObjMemEstimate2.py (2.7 KB)

–Mitch

1 Like

Looking at the attached JPG in @Douglas1’s post above, I doubt whether you could reduce the size of the file anyway - it looks like a really heavy mesh with thousands of faces.

Here is a screenshot showing how meshes, converted from a clean surface in this case, can increase the file size, obviously growing with the fineness of the mesh.
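That growth is easy to quantify: each mesh vertex stores a position (and usually a normal) and each face stores vertex indices, so the stored size scales linearly with the face count - and refining the mesh multiplies the face count fast. A rough estimate with assumed per-element sizes (illustrative only, not Rhino’s exact .3dm layout):

```python
def mesh_bytes(vertices, faces):
    """Very rough mesh size estimate: 3 floats per vertex position,
    3 floats per vertex normal, 4 ints per (quad) face."""
    return vertices * (3 * 4 + 3 * 4) + faces * (4 * 4)

# Halving the mesh edge length roughly quadruples the face count;
# on a large quad grid the vertex count is close to the face count.
for faces in (1000, 4000, 16000, 64000):
    vertices = faces
    print('%6d faces: ~%8.1f KB' % (faces, mesh_bytes(vertices, faces) / 1024.0))
```

Two refinement steps already mean a 16x larger mesh, which is why a fine render mesh on an otherwise light surface model can dominate the file size.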

Thanks a lot @Helvetosaur and @emilio for these scripts, really useful having object size knowledge just a click away.