Huge Autosave File Sizes

Does anybody have any experience of massive autosave file sizes? My example in question is a 280 MB file whose autosave file becomes a whopping 11 GB. This is crashing my system every time. Of course I can switch autosave off, but I should be able to use the autosave function for safety, so that's hardly a fix.

So far McNeel think it may be something materials-related. I also have suspicions about it being something related to UV meshes, like the default surface mapping in the 0-1 space. I can't attach a file right now as the problem hasn't so far been replicated, and the file also contains sensitive information. I did notice a peculiarity during Purge, where there were 2094 instances of 'legacy materials'.
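
For anyone who wants to check their own file for the same symptom, here is a minimal sketch using the open-source rhino3dm Python package to count the materials stored in a 3DM file outside of Rhino; the file path is just a placeholder.

```python
# Minimal sketch (pip install rhino3dm): count the materials stored
# in a 3DM file outside of Rhino. The file path is a placeholder.
import rhino3dm
from collections import Counter

model = rhino3dm.File3dm.Read("my_project.3dm")  # placeholder path
if model is None:
    raise SystemExit("Could not read the file")

materials = list(model.Materials)
print("Materials in file:", len(materials))

# Duplicate names (e.g. thousands of 'legacy' entries) show up here.
for name, count in Counter(m.Name for m in materials).most_common(10):
    print("{!r}: {} instance(s)".format(name, count))
```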

I know I can't expect any kind of fix with the info provided so far, but I just want to see if anyone has shared my experience. It's really not a complex or taxing file. (Lesson learnt: I could break the file up a bit more, of course.)



systeminfo-hugeautosavefile.txt (2.2 KB)

Did you use any large image files for material definitions, backgrounds, or Picture surfaces?
They are stored in the 3DM file by default.
Use the lowest resolution images you can that do not degrade the display clarity you need.
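
If large source images are the culprit, batch-downscaling them before referencing them keeps the embedded copies small. A sketch with the Pillow library, where the folder names and the 2048-pixel cap are just illustrative assumptions:

```python
# Batch-downscale texture images before referencing them in Rhino,
# so smaller copies get embedded in the 3DM file.
# Folder names and the 2048 px cap are illustrative assumptions.
from pathlib import Path
from PIL import Image

src = Path("textures_original")   # hypothetical input folder
dst = Path("textures_small")      # hypothetical output folder
dst.mkdir(exist_ok=True)

MAX_EDGE = 2048  # cap the longest edge at ~2K

for img_path in src.glob("*.jpg"):
    with Image.open(img_path) as im:
        im.thumbnail((MAX_EDGE, MAX_EDGE))  # shrinks in place, keeps aspect ratio
        im.save(dst / img_path.name, quality=85)
print("done")
```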

Render meshes can add about 10x of data to the file, too.
SaveSmall strips the render meshes when saving, but they will need to be regenerated when the file is opened again.
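
To get a rough sense of how much render-mesh data is riding along in a file, something like this rhino3dm sketch can help. I'm assuming here that BrepFace.GetMesh() with MeshType.Render returns the cached render mesh (None once it has been stripped, e.g. by SaveSmall):

```python
# Rough estimate of render-mesh data riding along in a 3DM file,
# using rhino3dm outside of Rhino. Assumes BrepFace.GetMesh() returns
# the cached render mesh (None if it was stripped, e.g. by SaveSmall).
import rhino3dm

model = rhino3dm.File3dm.Read("my_project.3dm")  # placeholder path
total_vertices = 0
for obj in model.Objects:
    geo = obj.Geometry
    if isinstance(geo, rhino3dm.Brep):
        for i in range(len(geo.Faces)):
            mesh = geo.Faces[i].GetMesh(rhino3dm.MeshType.Render)
            if mesh is not None:
                total_vertices += len(mesh.Vertices)
print("Render-mesh vertices cached in file:", total_vertices)
```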

“Bad objects” can create huge render meshes too.
Run SelBadObjects and get them fixed.
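
If you'd rather check from a script (run inside Rhino via EditPythonScript), this just wraps the command and reports what it selected:

```python
# Run inside Rhino (e.g. via the EditPythonScript command):
# select any "bad" objects and report how many there are.
import rhinoscriptsyntax as rs

rs.UnselectAllObjects()
rs.Command("_SelBadObjects", echo=False)
bad = rs.SelectedObjects()
if bad:
    print("{} bad object(s) found - consider ExtractBadSrf and rebuilding".format(len(bad)))
else:
    print("No bad objects found")
```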

I've also done ClearAllMeshes. SaveSmall is not even completing right now.

Texture-wise, I have used some at approx 2K; we're talking 26 KB files.

audit.txt (941 Bytes)

Plug-ins can save lots of information in the file too.
What extra plug-ins have you added?

The Audit3dmFile command in V6 will give you the detail.
The bottom section of the file is the plug-in section.
Scan through the results looking for big data chunks.
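
If the report is long, a rough parsing sketch like the one below can surface the biggest chunks; the regex assumes sizes appear as "... NNN bytes" in the text, so adjust it to match your report.

```python
# Rough sketch: scan an Audit3dmFile text report for the largest
# data chunks. Assumes sizes appear as "... 12345 bytes" in the text;
# adjust the regex if your report is formatted differently.
import re

sizes = []
with open("audit3dmfile.txt", encoding="utf-8", errors="replace") as f:
    for line in f:
        m = re.search(r"(\d+)\s+bytes", line)
        if m:
            sizes.append((int(m.group(1)), line.strip()))

# Print the 20 biggest chunks, largest first.
for nbytes, line in sorted(sizes, reverse=True)[:20]:
    print("{:>12,} bytes | {}".format(nbytes, line))
```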

Dear John,

Can you read anything into the above audit text? Apologies, away from my desk for a bit now.

No.
The Audit command doesn't generate the needed detail that the Audit3dmFile command does.

So out of curiosity I checked my own autosave files, and as of yesterday they seem inflated also. Not as big, but the last couple are 1.4 GB, where the file saved with meshes is about 1 GB (300 MB without), and all the rest of the autosaves hanging around for this project are closer to the SaveSmall size; autosave is set to not save meshes. I can't say for sure something weird is actually going on, but… huh.
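
In case it helps anyone compare, here's a stdlib-only sketch that lists 3DM files in a folder by size; point it at your own autosave folder (the location is shown in Options > Files), as the path below is just a placeholder.

```python
# Stdlib-only sketch: list 3DM files in a folder by size, to compare
# autosaves against the regular saves. Point AUTOSAVE_DIR at your own
# autosave folder (see Options > Files); the path here is a placeholder.
from pathlib import Path

AUTOSAVE_DIR = Path(r"C:\path\to\AutoSave")  # placeholder

for f in sorted(AUTOSAVE_DIR.glob("*.3dm"),
                key=lambda p: p.stat().st_size, reverse=True):
    print("{:>10.1f} MB  {}".format(f.stat().st_size / 1e6, f.name))
```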

I see the autosaves are larger than expected here as well… I’ll take a look - thanks.

original: Archive size = 2883882 bytes (end mark size = 2883882)
autosave: Archive size = 4144990 bytes (end mark size = 4144990)

@JohnM - do you have an idea?

In my case, there are render meshes on the autosaved objects, plus apparently some other bloat. Still poking…

https://mcneel.myjetbrains.com/youtrack/issue/RH-53781

-Pascal

Here’s my full audit3dmfile.

jh-audit3dmfile.txt (2.1 MB)

Save render and analysis meshes in autosave file is set to off.

Also getting a large TMP file - as Rhino chugs and tries to complete the SaveSmall command, I can watch the TMP file being written, gradually overtaking the 3DM file size by a large margin again.
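
To put numbers on that, here's a tiny stdlib watcher that polls the TMP file's size once a second while Rhino writes it; the path is a placeholder.

```python
# Tiny stdlib watcher: poll a file's size once a second while Rhino
# writes it, to see how far the TMP file overtakes the 3DM.
# The path is a placeholder.
import os
import time

path = r"C:\path\to\rhino_save.tmp"  # placeholder

last = -1
while True:
    try:
        size = os.path.getsize(path)
    except OSError:
        break  # file gone (save finished or was aborted)
    if size != last:
        print("{:.1f} MB".format(size / 1e6))
        last = size
    time.sleep(1)
```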

Anybody have any tips? I think I'll just have to break the file I'm working on down into smaller chunks and hope things work out.

Hi all,

Okay, so I managed (after an arduous morning) to manoeuvre myself out of my problem, siphoning parts of the file out until I was left with whatever was bloating it. The end result is an incredible 9 GB 3DM file, with the following audits. Naturally, this isn't something I can easily transfer via WeTransfer etc. Any further thoughts?

audit-3dmfile-v9-02-design_9.12GB.txt (210.4 KB) v9-02-design_9.12GB.txt (861 Bytes)

Have you tried splitting the file? A worksession is a good option for maintaining the project.

Hi Javier,
Yeah, I split the file up to get all the constituent parts and isolate the bogey part. Even so, I'm pretty sure there's something going on that is more than meets the eye. The dodgy part was a very basic Make2D, if the contents of the file as I see them are what is reflected in the file size (however, I think it is probably not that simple).

I've sent a download link to Daniel Wunsch at McNeel, with the problematic file in question (~9.12 GB). I also sent him my simple polysurface, which saves and loads easily at 10.5 MB, but the generated Make2D kills it completely.

Hi JD

Could you also send the link to me (andy@mcneel.com)? I would like to look into this.

  • Andy

Hi Andy, I've sent you the WeTransfer.

Steps to replicate (a scripted version follows after the list):

  • Bottom View

  • Make2D

  • Attempt to SaveAs

  • Rhino becomes extinct
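
For what it's worth, here are the same steps as a RhinoPython sketch (run inside Rhino); the dashed -SetView, -Make2D, and -SaveAs command strings are my assumptions about the scriptable options and may need tweaking for your version:

```python
# The replication steps as a RhinoPython sketch, run inside Rhino.
# The dashed command strings are assumptions and may need tweaking.
import rhinoscriptsyntax as rs

objs = rs.GetObjects("Select the polysurface for Make2D")
if objs:
    rs.SelectObjects(objs)
    rs.Command("-_SetView _World _Bottom", True)           # 1. Bottom view
    rs.Command("-_Make2D _Enter", True)                    # 2. Make2D with defaults
    rs.Command('-_SaveAs "make2d-test.3dm" _Enter', True)  # 3. attempt the save
```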


The 9 GB file that I received has a corrupt texture mapping table. I can see from your audit that it wasn't corrupted when it left, so something has gone wrong in the download/upload.

What I notice is that the 9 GB version of the file contains only curves and text dots (as objects). Since the problem is caused by the texture mapping table, and curves don't have texture mappings, it's possible that, in making the file smaller, you deleted all of the objects that caused the problem in the first place.

Ideally I would need a link to the original, gigantic file, no matter how big it is.

  • Andy

Thanks Andy.

Were you able to replicate the separate Make2D aspect? This contains the polysurface that becomes problematic once the curves are produced.

Do you have a preferred way for me to send the 9GB file?

Bump… has anybody replicated my issue?

Did you send Andy the gigantic file as he asked?