I’m running a pretty complex GH definition in a pretty big Rhino file. I’ve been working on it for more than a month, but as of today it became insanely slow. The problem seems to be that every time I plug in some component (either at an input or an output) it recalculates the whole definition, even when the part I’m changing doesn’t affect it (e.g. even linking a panel to a text). Do you have any knowledge of updates to GH, or of plug-ins, that could cause this?
Well, as a workaround you can deactivate Autosave (or reduce its frequency to the maximum interval), with the drawback of not having autosave…
As for the deeper problem of why saving takes so long: I guess there are many potential reasons — a lot of components on the canvas, internalized data, etc. I think clustering, or bundling components into bigger script components (probably precompiled?), could help, as less data needs to be duplicated, but I don’t really know the details. I think there have also been other posts here related to this issue.
Thank you dsonntag! Unluckily, I doubt the problem is caused by the “architecture” of the definition, because last week it was working as expected and this morning it is 4–6 times slower. I tried to see if there were some updates but couldn’t find anything so far. I’ll reduce the autosaves hoping to accelerate it a little, but I think this problem needs a long-term fix.
Edit: I managed to restore its previous speed by disabling autosave for every wire and data mapping event.
The reason autosave is slow is almost always that you have internalised a large amount of data. It just takes time to serialise a thousand breps or meshes.
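To see why internalised geometry dominates save time, here is a generic sketch in plain Python (this is not Grasshopper's actual serialiser; `fake_mesh` and the vertex counts are made-up stand-ins): serialisation cost grows roughly linearly with how much data the file carries, so a canvas with a thousand internalised meshes pays that cost on every autosave.

```python
# Generic illustration of serialisation cost scaling with data size.
# Not Grasshopper's real format: a "mesh" here is just a list of vertex
# tuples, and pickle stands in for the *.gh serialiser.
import pickle
import time

def fake_mesh(n_vertices):
    """Stand-in for an internalised mesh: a list of (x, y, z) tuples."""
    return [(float(i), i * 0.5, i * 0.25) for i in range(n_vertices)]

def time_serialise(n_meshes, n_vertices=1000):
    """Serialise n_meshes fake meshes; return (seconds, bytes written)."""
    meshes = [fake_mesh(n_vertices) for _ in range(n_meshes)]
    start = time.perf_counter()
    blob = pickle.dumps(meshes)
    return time.perf_counter() - start, len(blob)

for count in (10, 100, 1000):
    elapsed, size = time_serialise(count)
    print(f"{count:5d} meshes -> {size / 1e6:6.2f} MB, {elapsed * 1000:7.1f} ms")
```

The exact numbers depend on the machine, but the point is the trend: ten times the internalised data means roughly ten times the work on every wire-triggered autosave.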
A possible solution to this is to break your file into smaller pieces, and especially to put any precomputed or internalised data into separate files. You then write this data to one or more *.ghdata files using the Data Output component, and reference the data back into your working file using the Data Input component.
You can also split your working file into a sequence of loose files this way to reduce complexity.
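The Data Output / Data Input split can be sketched as a plain-Python analogy (the file name and JSON format here are assumptions for illustration only — the real components write binary *.ghdata files): heavy data is computed and saved once in a separate file, and the working file only reads the result back, so it stays small and cheap to autosave.

```python
# Analogy for splitting a definition: a "precompute" step writes heavy
# data to disk once, and the working file just reads it back, instead of
# carrying it internalised. JSON is a stand-in for the *.ghdata format.
import json

def write_data(path, data):
    """Analogy for Grasshopper's Data Output component."""
    with open(path, "w") as f:
        json.dump(data, f)

def read_data(path):
    """Analogy for the Data Input component."""
    with open(path) as f:
        return json.load(f)

# Precompute file: run rarely, saved separately from the working file.
write_data("precomputed_example.json", {"points": [[0, 0, 0], [1, 2, 3]]})

# Working file: small and fast to autosave, because the heavy data
# lives on disk rather than on the canvas.
points = read_data("precomputed_example.json")["points"]
print(points)
```

The design point is the same as with the real components: each save of the working file no longer has to re-serialise the precomputed geometry.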