Save all from inside of a cluster?

Hi,
this is something that has been bothering me for a long time now (especially since switching to Rhino 6, which seems to love crashing here and there) and I haven’t been able to find a solution anywhere so far.

Problem: I’m often working on large files with lots of clusters that take a long time to calculate, and the only way I know to save the whole file is to save and exit the cluster I’m in, get all the way back to the main script and save everything from there. That’s super annoying when you’re working inside a large cluster that takes a lot of calculation time, not to mention that when you re-open the cluster, you’re not zoomed in where you left off. In the past I’ve tried “disable solution”, close, save, open, “enable solution” to save time, but then sometimes the cluster doesn’t get its inputs right.

Because of that, one gets lazy about closing, saving and re-opening, but then when Rhino crashes while you’re inside a cluster, you just lose everything…

Am I missing some feature for effectively saving everything from inside a cluster without leaving it? Or, if nothing like this exists in Grasshopper, would it be so hard to add a “save all” button?

Thank you,
MH

I believe you’re misusing the clusters.

Yeah, they are useful for shrinking the definition and keeping it neat and clear of wires, but you should only create a cluster when you’re absolutely sure the definition inside it works properly. My understanding is that clusters are meant for reuse, RE-use: create a definition, make a cluster, then create a UserComponent (or whatever it was called) and simply drag it onto the canvas like a normal component.

If you don’t mind the additional work and you’re this annoyed about going inside the clusters, then you’ve lost the advantage they give you, so you’re better off exploding them.

I can’t think of a way to perform a save-all in GH; it would make sense to have such a feature.

If your clusters aren’t embedded but rather referenced to actual files on disk, then the saving wouldn’t have to be recursive all the way up the document tree, but there are other complications that arise when using referenced documents as clusters as opposed to just nested documents.

You can use .gh files as clusters?
How?

But that’s a .ghcluster, not a .gh. It’s the same as if it were dragged from the ribbon, am I getting this right?

It’s not like Rhino worksessions, right?

Ivelin

→ In a way you’re right about me misusing the cluster concept, but just for illustration, I’m attaching a screenshot of a file I’m working on. Pretty much everything in the screenshot is a cluster, sometimes consisting of hundreds of Grasshopper elements, or even more clusters… There’s no way to keep this organised without clusters (with just groups, for example)… It’s super effective to keep the file organised like this, especially when you often need to update a cluster to a new version or replace it completely.

I’m used to working like this from Rhino 5, where I could work for hours inside a cluster within another cluster and nothing crashed unless I did something drastic (like causing total CPU or memory overload). Now Rhino 6 seems to like crashing randomly at the simplest tasks, which, in deep frustration, brought me to start this forum discussion…

David → Thank you for your answer. It would be more than super-appreciated to have a “save all” feature, ideally with a convenient keyboard shortcut. I guess when you have a cluster saved as an external file, it works better with autosave? I should start using that…

Another, smaller question: is there a way to set up Data Dams so that they don’t pass data when the file is opened? I don’t really see a reason why they do by default, actually.

Thank you for your answers guys, it’s much appreciated!

Yeah, it’s not ideal either. The potential is there and you can edit referenced clusters from whatever file uses them (you don’t have to open your important, massive file that takes forever to solve), but it’s hardly an improvement.

We’d really like to learn more about that. Are you submitting those crash reports? Is there a ‘submit report’ window after a crash?

I feel you, brother! You should see my definitions. But exactly because I cannot save all, I simply group things and keep the groups distant from each other instead of creating clusters. Hopefully in GH2 clusters will be able to update themselves; then I’ll use them more.

Another way to avoid such crashes is to put Data Dams where you know you’ll trigger a huge calculation. This will help you make changes smoothly.

Also I try to use ghpython as much as possible, thus reducing the number of components even more.


I usually submit when I get a crash report, but often I only get the standard Windows “this program has stopped working” dialog with no submit-report window; it’s kind of 50/50. With this particular file I often get crashes just from “running” the whole file “at once” (if I disable some key elements, then go from start to finish enabling one, letting it calculate, enabling another one and another one, etc., then it works fine). Rhino 5 would not do that; it might make me wait a good couple of minutes, but it would still load in the end… Then I often get a crash while running something super simple like a Move or Mirror component. If I then open the same file and do the same thing with the same inputs, suddenly there’s no problem. It seems to happen quite randomly, not really connected to any element in particular…

Yep, just having “save all” would sort my problem, as I’m quite used to pressing Ctrl+S almost automatically, but as described earlier this is not a solution when you’re inside a cluster…

I am already using Data Dams (highlighted in green in my previous screenshot). Would you have an answer to my earlier question:

“Is there a way to set up Data Dams so that they don’t pass data when the file is opened? I don’t really see a reason why they do by default, actually.”

I often use C# as well to reduce the computation time…

@DavidRutten will have to answer this; I assume they use sticky or some other technique to store the data in memory, and it needs to be updated (filled with data) on launch.
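(For context: “sticky” here refers to the scriptcontext.sticky dictionary that GhPython scripts can use to keep values alive between solutions within the same Rhino session. Purely to illustrate the kind of technique being guessed at, not how the Data Dam is actually implemented, a caching sketch could look like this; the parameter names x, recompute and a are just assumptions for the example:)

```python
# Illustrative only: caching a value in scriptcontext.sticky so it survives
# recomputes within the same Rhino session. Not the Data Dam's actual code.
import scriptcontext as sc

key = "my_cached_result"              # hypothetical cache key

# x = expensive upstream data (input), recompute = Boolean Toggle (input)
if recompute or key not in sc.sticky:
    sc.sticky[key] = x                # refresh the cached value

a = sc.sticky[key]                    # output the cached value on every solution
```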

Yep, I’d be keen to find out. It would make my life so, so much easier to have Data Dams that don’t pass data on launch… Otherwise the whole component kind of loses its meaning for me.

If they can’t, you can create a simple GhPython script with two inputs, one for the data and the other for a Boolean Toggle, and it won’t let anything through unless you toggle it. :wink: Something like the sketch below.
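A minimal sketch of such a gate (the parameter names x, gate and a are just placeholders you’d set on the component yourself):

```python
# Minimal GhPython "gate" sketch.
# Inputs:  x    - the data to pass through (list access)
#          gate - wired to a Boolean Toggle
# Output:  a    - the gated data

if gate:
    a = x      # toggle is True: let the data through
else:
    a = None   # toggle is False: nothing reaches the downstream components
```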

I use such ‘switches’ to get some time to save while using self-updating components :smiley:

I suppose I could do that; it’s just that it would be nice to use the default component when it’s already there…

Bi-stable switch: Open / Close

[screenshots: the switch in its False state and in its True state]

// Rolf

Thank you for that, but that’s not really a solution to my problem: this way I would have to switch all the “gates” to False manually before saving and closing the file, otherwise they’ll be open when the file is opened again… What you’re proposing is a more complicated version of what I’m already using. You can also just break the wire with a bridge element and disable/enable it to let the data through or not…


Yeah, there are many different ways to skin a cat… :slight_smile:
// Rolf

I think what you need is the “False Start Toggle” component. Unfortunately, it belongs to the Ladybug add-on, and installing it for just that one component seems counterproductive.

Just like a normal Boolean Toggle, except it always reverts to “False” on file open.

As a side note, I think what Rolf is proposing is slightly better than what you are doing: you could connect one single Boolean to multiple gates, disabling them all with a single double-click.