Managing Large Projects

Simply give a name to the input of the output component.

How? Where’s the documentation for this?

Right-click on the empty slot (where components usually have a letter as the input name).

OMG, somebody must be kidding. I renamed the input to ‘D’ and now it works.

So I can save only one list per file? Aha, I see “+” adds more inputs to Data Output so different types of data can be mixed in the same file. Thank you very much.
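For anyone else puzzled by this: conceptually, one Data Output file is a single container holding several named streams, and a Data Input component picks streams out of it by name. A rough Python sketch of that idea (this is *not* the actual .ghdata format, just an illustration using `pickle`; the file name and stream names are made up):

```python
import pickle

# One file can hold several named streams of mixed types,
# analogous to renaming each Data Output input ('D', 'Labels', ...).
streams = {
    "D": [1, 2, 3],               # integers
    "Labels": ["a", "b", "c"],    # strings
    "Factors": [0.5, 1.5],        # floats
}

# Write side: everything goes into one container file.
with open("test_out.dat", "wb") as f:
    pickle.dump(streams, f)

# Read side: a downstream definition loads the container and
# selects streams by name, like matching Data Input names.
with open("test_out.dat", "rb") as f:
    loaded = pickle.load(f)

print(loaded["D"])  # [1, 2, 3]
```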

Where is the documentation for this feature? :man_facepalming:

test_out.gh (7.4 KB)
test_in.gh (6.5 KB)


…except they are extremely unreliable.

They have been quite reliable for me. I have 24 of them in one project, with 16 MB of combined data: points, curves, Breps, meshes, integers, floats. I use these data files to pass data between project files.

My projects are more around 1.5 GB, but that’s not the issue.
In my case, which I have already reported to David, some data doesn’t go through, and that breaks the deal.
There are also issues with some file paths, but hey! I’m happy it works for you.

I guess they are “rarely” unreliable instead of “extremely”? :man_shrugging:

The problem I have is that they don’t work with server file paths.

@osuire, regarding the data not going through: I have had that problem, but that was because the data was in a custom class/param and I hadn’t serialized/deserialized the class properly.
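To illustrate that failure mode in plain Python (a generic sketch, not the Grasshopper serialization API): a custom class only survives a file round trip once you give it an explicit write/read representation. The `Panel` class and its field names here are entirely hypothetical:

```python
import json

class Panel:
    """Hypothetical custom class that must be (de)serialized explicitly."""
    def __init__(self, name, width, height):
        self.name = name
        self.width = width
        self.height = height

    # Without an explicit representation like this, a generic writer
    # cannot reconstruct the object on the reading side.
    def to_dict(self):
        return {"name": self.name, "width": self.width, "height": self.height}

    @classmethod
    def from_dict(cls, d):
        return cls(d["name"], d["width"], d["height"])

p = Panel("P-01", 1.2, 2.4)
data = json.dumps(p.to_dict())           # write side
q = Panel.from_dict(json.loads(data))    # read side
print(q.name, q.width, q.height)         # P-01 1.2 2.4
```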

OK, let’s just say that when it doesn’t work, I can’t finish my project.
In that case, even “rarely” unreliable breaks the deal completely.


No, in my case, it was a specific Brep that just wouldn’t pass through for some reason.

1.5 GB… that’s some Brep!
In theory this is irrelevant. The disk write looks like just a binary dump of the data struct from the object in question. In low-level languages this is trivial; in very structured languages, there can be problems passing data, or pointers to data that big, within the framework. .NET and Java are famous for problems like this. I have no idea what language GH is written in, but given the multi-platform goals of Rhino, and the notes I saw once about brittle C++, it is probably very structured.

It’s hard to imagine a Brep that size. Is it a complete building all joined together or something? How many surfaces/solids? Just curious.

What the hell are you talking about?
It’s the size of the Rhino file, not the size of a single Brep…

The implication was that the Brep was 1.5 GB, but whatever. You seem upset, so I’ll refrain from further comment.

Learn to read, man.
Yeah, feel free to “refrain”.

To be fair, we have been talking about the Data Input and Data Output components. @Proterio mentioned he has 24 of these components in one project, which amounts to about 16 MB of data being passed on.

@osuire, in what seems to be a misreading, mentioned he works with 1.5 GB files, which is a totally bizarre piece of data for the conversation. Who cares what size your files are, and why is it relevant to the discussion? :rofl: Clearly @Proterio was talking about the amount of data being passed on, not the size of the Rhino file. Maybe you need to learn to read, man?

Why do you feel the need to add more irrelevancy here?
It’s not a size issue, of course. I just mentioned the size of my projects because Proterio seemed to claim he was passing lots of data through the INPUT/OUTPUT.
If you can READ, I specified:

I think that if your goal is just a bit more ambitious than playing around, this workflow cannot be trusted, and it ends up being a waste of time, as it is not robust enough yet.

We found a workaround for the server file path problem: we store our data on Google Drive and mapped it to a virtual local drive with the Google file-share system.

Yet there is no way to dynamically change the GHDATA file path, and this information needs to be set by right-clicking the component, which is awkward for a “data-flow” system.
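One scripted workaround for the hard-coded path problem (a conceptual Python sketch, not the Data Output component itself) is to resolve the file location at run time, e.g. from an environment variable, so each machine can point at its own mapped drive. The `DATA_DIR` variable name and file name here are made up:

```python
import os
import pickle
import tempfile

# Hypothetical: each machine sets DATA_DIR to its own mapped drive
# (e.g. the Google Drive letter), so the definition stays portable.
data_dir = os.environ.get("DATA_DIR", tempfile.gettempdir())
path = os.path.join(data_dir, "project_stage1.dat")

# Write side: serialize the named streams to the resolved path.
payload = {"Curves": [0.0, 1.0, 2.0]}
with open(path, "wb") as f:
    pickle.dump(payload, f)

# Read side: any downstream definition resolves the same path.
with open(path, "rb") as f:
    restored = pickle.load(f)
print(sorted(restored))  # ['Curves']
```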

Regarding the specific Brep that never managed to pass through in the GHDATA file, I’m struggling to lay my hands on it right now.

Needless to say, without that single bit of data the whole process came to a halt, and all the effort of segmenting the workflow into multiple GH files was rendered useless.
We had to reconfigure our whole approach to the problem, which was a huge waste of time and a big disappointment.

I have the feeling that the discussions in this forum are really losing their constructive and positive character in almost every thread. I don’t know if this is related to the general Corona depression. Any software-related problem can be solved to a certain extent. It’s not about providing perfect solutions, but about giving guidance and proposing different ways of thinking. This, however, requires people to stop complaining, invest their own brain power in asking for help correctly, and find their own solutions.

If Data Input/Output is not going to work for some reason, maybe someone with coding knowledge can propose a different solution. By the way, there are pointers and low-level solutions for C# and the .NET platform in general. I deal with even bigger data in time-critical applications, so there is definitely a solution to whatever the problem is. Just saying…

I am merely stating my painful experience here.
It’s great that you have the time and skills to develop your own data transfer system between GH definitions, as you are keen to point out.
I don’t have those skills yet, and I hoped to rely on the INPUT/OUTPUT GH components.
To tell the full story, I first discovered this feature during advanced GH training with David in Barcelona.
We used it to carry data along a chain of definitions, each with its own didactic focus.
It gave me goosebumps as I understood the potential of this to split the workload between multiple developers on a large upcoming project.
Sadly, it just didn’t work.

Unfortunately, we are not allowed to map certain folders on the server. We’ll have to implement a workaround when we get to the next stage of the project.

Are you able to try internalising the Brep and saving the file? Then, when you reopen the file, is the Brep still there?
That would check whether the Brep has issues serializing or deserializing in the first place.
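That check can also be automated: serialize the suspect object, read it straight back, and compare. A generic Python round-trip helper (with `pickle` standing in for the actual .ghdata writer, so this only illustrates the idea, not Grasshopper itself):

```python
import pickle

def round_trips(obj):
    """Return True if obj survives a serialize/deserialize cycle intact."""
    blob = pickle.dumps(obj)          # write side
    return pickle.loads(blob) == obj  # read side, compared to the original

# A plain data structure passes the check...
print(round_trips({"pts": [(0, 0), (1, 1)]}))  # True

# ...while an object the serializer cannot handle (here, a lambda)
# raises instead — the Python analogue of a Brep failing to pass through.
try:
    round_trips(lambda x: x)
    print("round-trip ok")
except Exception:
    print("object could not be serialized")
```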