I’ve uploaded "20211224 Moondoor grid maker" to my account "moondoor" / "firstname.lastname@example.org".
The configuration updates correctly in the GH model. But when I upload it to ShapeDiver, the ShapeDiver Export Download component stops working after one successful attempt.
Please visit: Test environment to see how it doesn’t update.
I just tried it; it happens on both the new and the legacy platform.
However, the downloader seems to work fine when I output a .3dm file instead. So I guess it has to do with the compilation of my .IFC file. I will ask GeometryGym for help.
Good idea, I actually did not. But now that I did, I can confirm the streaming works fine locally.
I copied the data from the stream for two different configurations and they came out as expected: Test locally 2.txt (15.8 KB) Test locally.txt (12.1 KB)
I’m not allowed to upload IFC files, so here they are as .txt. Changing the extension back to .ifc allowed me to load them into an IFC viewer (Solibri Anywhere) just fine.
I could not reproduce this issue on my side yet. Is it possible for you to create a minimal file using the ggIFC stream component, check whether you can reproduce the issue with this minimal file, and then share it with me for testing?
Geometry Gym IFC components don’t conform to typical Grasshopper conventions.
IFC has a primary hierarchy for the elements it contains (the spatial breakdown): every element must either be contained in a spatial element or decompose a parent element, so a host is mandatory.
But years ago we decided to make the host element a mandatory input to a component, rather than requiring all the elements to be wired into a host. This is also because there is a secondary placement hierarchy, in which a product’s placement should be relative to its host. Rather than force users to comply with this, it’s easier to mask this (and other complexity) back of house.
Another reason there are orphaned chains of components without a single root collector component is that some users want to edit an existing IFC data set. Wiring all these changes into a single root component isn’t ideal.
So our workaround at present is to force a staged solution: an event watcher drives the solve in multiple steps, and the file stream is forced to wait until all other ggIFC components have solved.
If ShapeDiver looks for components with expired upstream data, it might never force our file stream component to recompute when it is orphaned.
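To illustrate that point, here is a minimal sketch in plain Python (not actual Grasshopper or ShapeDiver code; all class and component names are invented for illustration). A scheduler that only recomputes a changed component and whatever is wired downstream of it will never reach a sink that has no wired inputs:

```python
class Component:
    def __init__(self, name, sources=()):
        self.name = name
        self.sources = list(sources)   # upstream wires
        self.solve_count = 0

def downstream(root, components):
    """Everything that transitively reads from `root` (plus root itself)."""
    reached = {root}
    changed = True
    while changed:
        changed = False
        for c in components:
            if c not in reached and any(s in reached for s in c.sources):
                reached.add(c)
                changed = True
    return reached

def new_solution(changed_component, components):
    """Recompute only the changed component and its downstream chain."""
    dirty = downstream(changed_component, components)
    for c in components:               # assumed to be in topological order
        if c in dirty:
            c.solve_count += 1

slider = Component("slider")
ifc_element = Component("ifc_element", sources=[slider])
# The file-stream sink is *orphaned*: it has no wired inputs and is
# normally triggered by the plugin's own event watcher instead.
file_stream = Component("file_stream")

new_solution(slider, [slider, ifc_element, file_stream])
print(ifc_element.solve_count)   # 1: wired downstream of the slider
print(file_stream.solve_count)   # 0: orphaned, never recomputed
```

Under this model, a single external call to the solver (as a server would issue) leaves the orphaned sink stale, which matches the behavior described above.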
I have considered a more typical approach a number of times, but have never really come up with an acceptable solution (i.e. in a bottom-up approach, how do you warn a user that an element isn’t wired into an appropriate host when you can’t detect that at compute time?). And do you force every editing component to somehow be wired into a root component? We don’t treat the IFC objects as immutable, and part of this is because of the multiple tiers of relationships (e.g. a material that is common to many elements).
I’m more than happy to discuss this. The streaming did work in ShapeDiver when we looked at this, but that was some time ago and perhaps subsequent changes have stopped the streaming from working.
Jon, first of all thanks for your elaborate answer.
Just wanted to add that I just uploaded a twin version. In this version I connected the panel showing the IFC data to the data-export. Just to test.
It gave me an export result that was neither one I had previously exported (it was the first export) nor the current state, but an intermediate state of the geometry produced by moving the sliders before exporting. Afterwards, this export again remains the same whatever I do.
Let me explain what’s happening on ShapeDiver’s workers: Whenever one of our workers processes a compute request, it sets the input data and then makes a single call to GH_Document.NewSolution. Once this is done we read the results from the volatile data of those components which are denoted as carrying output data.
I guess this explains why the staged-solution approach is causing problems. I wonder how it could ever have worked properly on ShapeDiver; probably it never did, or only by chance.
What would be the best criterion to find out whether Geometry Gym has finished all processing steps? Would that be possible based on GH_Document.SolutionState?
I went and checked the test files.
Seems that the streaming was forced to solve last by wiring "dummy" dependencies, something like the attached. Not sure if we’ve broken this in later builds of the plugin, but it should work with the version on ShapeDiver.
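To make the "dummy dependency" idea concrete, here is a hedged sketch in plain Python (names invented for illustration, not the actual plugin code): once the file-stream sink has any upstream wire, even a throwaway one, ordinary expiry propagation reaches it on every recompute:

```python
class Component:
    def __init__(self, name, sources=()):
        self.name = name
        self.sources = list(sources)   # upstream wires
        self.solve_count = 0

def recompute_downstream(changed, components):
    """Mark the changed component dirty and propagate to downstream wires."""
    dirty = {changed}
    for c in components:               # components listed in topological order
        if c is changed or any(s in dirty for s in c.sources):
            dirty.add(c)
            c.solve_count += 1

slider = Component("slider")
ifc = Component("ifc", sources=[slider])
# Workaround: wire the sink to the last real component, even if the
# sink ignores the data -- the wire only exists to carry expiry.
file_stream = Component("file_stream", sources=[ifc])

recompute_downstream(slider, [slider, ifc, file_stream])
print(file_stream.solve_count)   # 1: now part of the expiry chain
```

The dummy wire changes nothing about what the sink computes; it only makes the sink visible to a scheduler that follows wired dependencies, which is presumably why it fixes the single-recompute behavior on the server.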
The fact that the issue seems like a bug sounds promising for my particular case.
Often I try two or three times to make sure. My retry just now also failed.
Just now I also uploaded it to the old platform. The initial upload was successful, but then I got this:
"Oops! The server encountered an issue with your model:
Computation of your model failed. Please ask for help on the forum, including the reference shown below in your message.
Error: Computation of your model failed for an unknown reason."
"Error: Oops! The server took too long to process your model. Model ‘d85c9ec9-781f-46b1-99a4-9a5644b6db45’ on system https://sdeuc1.eu-central-1.shapediver.com. Please check our forum on how to tune performance of your model."
Does this mean it exceeds the computation time limit (10 seconds)?
Since I reduced the previously accepted .gh file to a bare-minimum version, the extra time consumption must lie with the new "dummy component" workaround that Jon and you suggested. Since the workaround deliberately delays information, that is not unthinkable.
Is it possible for you to check where the problem lies?
Also, the old platform returns:
"Oops! The server encountered an issue with your model:
Computation of your model failed.