Batch analysis

I am trying to supplement a simple beam analysis with some uncertainty quantification calculations. I thus have a dataset of 100 [beam width, beam height, beam length, Young's modulus, force] samples, represented in GH as a data tree (or, in GH Python, as a list of lists).

I would like to feed this into Karamba and get the corresponding dataset of 100 [displacement, stress] results. How can I do this efficiently, i.e. without duplicating the same model with 100 different input parameter sets?
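For reference, here is a minimal sketch of how such a dataset could be set up in GH Python. The variable names, units, and distributions are my own illustrative assumptions, not part of any actual model:

```python
# Sketch only: build 100 [width, height, length, E, force] samples as a
# list of lists, then convert to a GH data tree (one branch per sample).
import random

random.seed(0)  # reproducible illustrative samples
samples = [
    [random.gauss(0.20, 0.01),    # beam width  [m]
     random.gauss(0.40, 0.02),    # beam height [m]
     random.gauss(6.0,  0.10),    # beam length [m]
     random.gauss(2.1e11, 1e10),  # Young's modulus [N/m^2]
     random.gauss(1.0e4,  1e3)]   # force [N]
    for _ in range(100)
]

# Inside a GHPython component, this turns the nested list into a data tree:
import ghpythonlib.treehelpers as th
tree = th.list_to_tree(samples)   # one branch {i} per sample
```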

You could step through each state with a timer + a counter, and save the results with a data recorder. Here is a very simple example that changes a load and records the displacement. Use the counter to pick your parameters from a list instead (see the sketch below the attachment):


[GIF: stepper]
Very simple beam.gh (16.2 KB)
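To make the picking step concrete, here is a minimal GHPython sketch of what the "counter picks the parameters" component could look like. The input and output names are my assumptions, not taken from the attached file; `i` is wired to the timer-driven counter and `samples` is the list of lists from the question:

```python
# GHPython "picker" component - a sketch, assuming:
#   i       : int input, wired to the timer-driven counter
#   samples : the list of 5-item sample lists [width, height, length, E, force]
# Outputs (wire these into the Karamba model definition):
#   width, height, length, E, force

idx = i % len(samples)   # wrap around so the timer can keep running past 100
width, height, length, E, force = samples[idx]

# Downstream, the Karamba model's displacement/stress outputs feed a
# Data Recorder, which accumulates one result per counter tick.
```

With this setup there is only ever one Karamba model in the definition; the timer re-solves it once per tick with the next parameter set, and the recorder collects the 100 results.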

Hi @hkit.yong, John's suggestion is indeed a great way to analyse and record many sets of data. You could also try looking into Colibri: Colibri Release | CORE studio
