Managing Large Projects

Why do you feel the need to add more irrelevancy here?
It’s not a size issue, of course. I just mentioned the size of my projects because Proterio seemed to claim he was passing lots of data through the INPUT/OUTPUT.
If you can READ, I specified:

I think that if your goal is just a bit more ambitious than playing around, this workflow cannot be trusted, and ends up being a waste of time as it is not robust enough yet.

We found a workaround to the server file path problem, as we store our data on Google Drive and mapped it to a virtual local drive with the Google Fileshare system.

Yet, there is no way to dynamically change the GHDATA file path; this information needs to be set by right-clicking the component, which is awkward for a “data-flow” system.
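
For reference, a scripted workaround may be possible: a C# component that writes the archive itself, taking the path as an ordinary input. This is only a sketch, untested, and it assumes the GH_IO.Serialization.GH_Archive API (AppendObject, WriteToFile, ReadFromFile, ExtractObject) behaves the way its documentation suggests; the key “payload” is an arbitrary name.

```csharp
// Sketch for a C# script component: write a Brep to an archive at a path
// that arrives as normal component input, so the target can change per solve.
// Assumes GH_IO.Serialization.GH_Archive works as documented (untested).
private void RunScript(Brep brep, string path, bool write, ref object success)
{
  if (!write || brep == null) { success = false; return; }

  var goo = new Grasshopper.Kernel.Types.GH_Brep(brep);
  var archive = new GH_IO.Serialization.GH_Archive();
  archive.AppendObject(goo, "payload");             // "payload" is an arbitrary key
  success = archive.WriteToFile(path, true, false); // overwrite, no backup

  // Reading it back elsewhere would mirror this:
  //   var inArchive = new GH_IO.Serialization.GH_Archive();
  //   if (inArchive.ReadFromFile(path)) {
  //     var inGoo = new Grasshopper.Kernel.Types.GH_Brep();
  //     inArchive.ExtractObject(inGoo, "payload");
  //   }
}
```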

Regarding the specific B-rep that never managed to pass through the GHdata file, I’m struggling right now to get my hands on it.

Needless to say, without that single bit of data the whole process came to a halt, and all the effort spent segmenting the workflow into multiple GH files was rendered useless.
We had to reconfigure our whole approach to the problem, which was a huge waste of time, plus a big disappointment.

I have the feeling that the discussions in this forum are really losing their constructive and positive character in almost every thread. I don’t know if this is related to the general Corona depression. Any problem related to software can be solved to a certain extent. It’s not about providing perfect solutions, but about giving guidance and proposing different ways of thinking. This, however, requires people to stop complaining and to invest their own brain power in asking for help correctly and finding their own solutions.

If Data Input/Output is not going to work for some reason, maybe someone with coding knowledge can propose a different solution. By the way, there are pointers and low-level solutions for C# and the .NET platform in general. I deal with even bigger data in time-critical applications, so there is definitely a solution to whatever the problem is. Just saying…
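
To make the “low-level” remark a bit more concrete, here is a minimal sketch of passing a large byte buffer between two processes through a named memory-mapped file. Plain .NET, nothing Grasshopper-specific; the map name and the 4-byte length prefix are arbitrary choices of mine, and named maps like this are Windows-only.

```csharp
using System.IO.MemoryMappedFiles;

static class SharedBufferDemo
{
  // Writer side: copy a large payload into a named shared-memory region.
  // The returned handle must stay alive while readers use the region,
  // because the named mapping disappears once every handle is disposed.
  public static MemoryMappedFile WriteShared(byte[] payload)
  {
    var mmf = MemoryMappedFile.CreateOrOpen("gh_shared_buffer", payload.Length + 4);
    using (var view = mmf.CreateViewAccessor())
    {
      view.Write(0, payload.Length);                  // 4-byte length prefix
      view.WriteArray(4, payload, 0, payload.Length); // raw bytes
    }
    return mmf;
  }

  // Reader side (possibly another process): open by name, copy the payload out.
  public static byte[] ReadShared()
  {
    using (var mmf = MemoryMappedFile.OpenExisting("gh_shared_buffer"))
    using (var view = mmf.CreateViewAccessor())
    {
      int length = view.ReadInt32(0);
      var payload = new byte[length];
      view.ReadArray(4, payload, 0, length);
      return payload;
    }
  }
}
```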

Merely stating my painful experience here.
It’s great that you have the time and skills to develop your own data transfer system between GH definitions, as you are keen to point out.
I don’t have these skills yet, and hoped to rely on the INPUT/OUTPUT GH components.
To tell the full story, I first discovered this feature during advanced GH training with David in Barcelona.
We used it to carry data along a chain of definitions, each of which had its own didactic focus.
It gave me goosebumps as I understood the potential of this to split the workload between multiple developers on a large upcoming project.
Sadly, it just didn’t work.

Unfortunately, we are not allowed to map certain folders on the server. We’ll have to implement a workaround when we get to the next stage of the project.

Are you able to internalise the brep and save the file? Then, when you reopen the file, is that brep still there?
That would check whether the brep has issues serializing or deserializing in the first place.
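
If scripting is an option, the same round trip can also be forced inside a C# component without saving at all: serialize the suspect Brep to bytes and read it back through the same machinery that saving uses. A sketch, assuming GH_IO.Serialization.GH_LooseChunk exposes Serialize_Binary/Deserialize_Binary the way I remember:

```csharp
// Sketch for a C# script component: push the suspect Brep through GH's
// (de)serialization machinery and report whether it survives the round trip.
private void RunScript(Brep brep, ref object survives)
{
  var outGoo = new Grasshopper.Kernel.Types.GH_Brep(brep);

  // Write the goo into a chunk and flatten it to bytes.
  var outChunk = new GH_IO.Serialization.GH_LooseChunk("roundtrip");
  outGoo.Write(outChunk);
  byte[] bytes = outChunk.Serialize_Binary();

  // Read the bytes back into a fresh goo.
  var inChunk = new GH_IO.Serialization.GH_LooseChunk("roundtrip");
  inChunk.Deserialize_Binary(bytes);
  var inGoo = new Grasshopper.Kernel.Types.GH_Brep();
  bool ok = inGoo.Read(inChunk);

  survives = ok && inGoo.Value != null && inGoo.Value.IsValid;
}
```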

Hi Christopher,

I’m unable to get my hands on that specific example, unfortunately.
The Brep in question was a GH object, not a referenced Rhino object, if I remember correctly.
At the time, I just baked said Brep and checked whether Rhino considered it a “bad” object, which it wasn’t. It was a complete mystery.
I think I sent the whole thing to David, but I can’t find that darn message for some reason.

Just asking: have you succeeded in isolating the Data I/O issues into a minimal definition that others can test? I’m a little bit interested here.

Yep, that’s what I wanted to propose. Create a new thread with a minimal example, and maybe someone will have fun solving it for you.

I really don’t like this statement. Almost everybody in the Western world has time. Even though I work full time and care for a child after work, I frequently find time for some reading and playing around.
Of course, you have to invest a lot of time to get better at something. But essentially, creating GH definitions and solving coding puzzles are not so different at all. It’s just that writing code gives you more freedom. In the end, you might save time at work if you implement a tailor-made solution.

I don’t like this statement either; however, I agree with osuire. :rofl:

PS: it’s my first time using nested quotes!

All my data between serializing and deserializing is generated in Grasshopper, hence why I was wondering whether something about that brep could be blocking the read/write process when saving.

The data may seem fine going between components, as in my case, but when it came to saving, there were issues with the data.

I agree, of course. But if a feature exists, I’d rather use it than redo it myself, especially if the one who made it is David.
If you have made something that works better than the OUTPUT/INPUT components, you might want to share it here, since this is the “Managing large projects” thread, instead of patronizing.

There are indeed other (de)serialization solutions; I don’t know if you’ve tried any.
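
For instance, geometry can be pushed through a plain .3dm file with RhinoCommon’s File3dm, which uses Rhino’s own serializer and so makes a useful cross-check when a GH archive chokes on a specific Brep. A rough sketch; the path handling and file version are placeholder choices:

```csharp
using Rhino.FileIO;
using Rhino.Geometry;

// Sketch: hand geometry between definitions via a .3dm file instead of a
// .ghdata archive, as an independent test of the Brep's serializability.
static class ThreeDmTransfer
{
  public static bool WriteBrep(Brep brep, string path)
  {
    var file = new File3dm();
    file.Objects.AddBrep(brep); // add more objects as needed
    return file.Write(path, 7); // 7 = Rhino 7 file version
  }

  public static Brep ReadFirstBrep(string path)
  {
    using (var file = File3dm.Read(path))
    {
      foreach (var obj in file.Objects)
        if (obj.Geometry is Brep brep)
          return brep;
    }
    return null;
  }
}
```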

Christopher, did you send this example to David?

8 posts were split to a new topic: Brep goes invalid between grasshopper files

That was a custom Param, so I found the error in the code and fixed it.
The error meant the data worked in code when moving between components, but it could not be saved when closing and reopening the file.
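
For anyone hitting the same symptom: this is the typical shape of the bug when a custom goo type does not override Read/Write. Instances then move between components fine at runtime but persist nothing on save. A sketch with an invented type (not my actual Param), showing where the fix lives:

```csharp
using GH_IO.Serialization;
using Grasshopper.Kernel.Types;

// Hypothetical custom goo: without the Read/Write overrides at the bottom,
// instances travel between components fine while the document is open, but
// their payload is silently dropped when the file is saved and reopened.
public class WeightGoo : GH_Goo<double>
{
  public WeightGoo() { }
  public WeightGoo(double value) { Value = value; }

  public override bool IsValid => true;
  public override string TypeName => "Weight";
  public override string TypeDescription => "A single weight value";
  public override IGH_Goo Duplicate() => new WeightGoo(Value);
  public override string ToString() => $"Weight({Value})";

  // The actual fix: persist the payload into the document chunk on save…
  public override bool Write(GH_IWriter writer)
  {
    writer.SetDouble("weight", Value);
    return base.Write(writer);
  }

  // …and restore it on load.
  public override bool Read(GH_IReader reader)
  {
    Value = reader.GetDouble("weight");
    return base.Read(reader);
  }
}
```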

my my my… someone needs a hug…


I was contributing here before, and I’m sorry this thread is too generic for your particular problem. There is not always a generic solution to something. The initial thread was about dealing with definitions that contain a lot of components, not big data! I don’t know if my solution fits your use case. This is why I’m asking you to define it properly in a new thread with a minimal example.

You are right. I’m just kind of pissed off by the folks who come out of the blue to tell you everything is working fine on their end.

Been there )))) see my thread on Surfacing :confused:


Care to share the link?

Well, you are right. Large projects and data flow are related but they are not exactly the same thing.
One could deal with large geometry and properties data sets without using a generative workflow.
Yet, we are in the Grasshopper part of the forum, and it makes sense to use dataflows on large projects involving teams working in different locations.
So I guess anything related to the INPUT and OUTPUT paradigm is relevant here.

Of course, there are many ways to do this, and Frontinc even gave it a name: Building Information Generation, wisely presented as a more mature form of BIM.