Memorising and using data in a definition

Hi all,

To the best of my knowledge, GH always recomputes the entire definition when an input is changed (if this isn’t so, please correct me). This becomes problematic in calculation-intensive definitions, where static data keeps getting recalculated unnecessarily.

For instance, let’s say I have a grid of 1 million points that acts as a “resolution grid” for other calculations in my problem. This grid is set once at the beginning, and I shouldn’t need to recalculate it every time I change a parameter downstream, right?
I will eventually create a component where this issue is easily solved, but having this level of control inside the definition environment could be useful. Can it be done?
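The caching behaviour described above can be sketched with a simple memoisation pattern: rebuild the heavy result only when the parameters that define it actually change. This is plain Python, not a Grasshopper API; `make_grid` and the dict-based cache are illustrative (in a GHPython component, the cache would typically live in `scriptcontext.sticky` so it survives between solutions):

```python
# Memoisation sketch: rebuild the grid only when its defining
# parameters change. A plain dict stands in for a persistent
# cache such as scriptcontext.sticky in a GHPython component.
_cache = {}

def make_grid(nx, ny):
    """Illustrative 'expensive' step: build an nx * ny point grid."""
    return [(i, j) for i in range(nx) for j in range(ny)]

def cached_grid(nx, ny):
    key = (nx, ny)
    if key not in _cache:        # only recompute on a cache miss
        _cache[key] = make_grid(nx, ny)
    return _cache[key]

g1 = cached_grid(100, 100)
g2 = cached_grid(100, 100)       # second call reuses the stored grid
print(g1 is g2)                  # prints True: same object, no recomputation
```

Downstream parameter changes that don’t touch `nx`/`ny` would then hit the cache instead of rebuilding the grid.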

You can use the “Data Dam” component.

But again, regarding your grid of 1 million points: if you want it static and your definition is built correctly, it shouldn’t recalculate at all!
Maybe a changing variable is used somewhere in the construction of the grid, which forces the recalculation.
A component recalculates only if one of its upstream components has updated.

If you make a subtraction like A - A = B, B will always be 0, but B will still update at every change of A, and every component linked to B will update accordingly.
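The propagation rule above can be sketched as a tiny dependency graph: a node recomputes whenever any of its upstream nodes changes, even if its own output value ends up identical (as in the A - A = B example). This is a toy model in plain Python, not how Grasshopper is actually implemented:

```python
# Toy sketch of dataflow update propagation: changing a node
# triggers every downstream node, regardless of whether the
# recomputed value is different.
class Node:
    def __init__(self, func, *upstream):
        self.func = func
        self.upstream = list(upstream)
        self.downstream = []
        for u in upstream:
            u.downstream.append(self)
        self.recomputes = 0
        self.value = None

    def set_value(self, v):
        """Acts like a slider: set the input and expire downstream."""
        self.value = v
        self._invalidate()

    def _invalidate(self):
        for d in self.downstream:
            d._recompute()

    def _recompute(self):
        self.recomputes += 1
        self.value = self.func(*(u.value for u in self.upstream))
        self._invalidate()

a = Node(None)                   # an input parameter
b = Node(lambda x: x - x, a)     # B = A - A, always 0

a.set_value(1)
a.set_value(2)
print(b.value, b.recomputes)     # prints "0 2": B stayed 0 but recomputed twice
```

This is why the Data Dam helps: it breaks the invalidation chain so changes upstream stop propagating until you release them.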

Turn on Display > Canvas Widgets > Profiler to see which components are taking the most time.

Riccardo, thanks for your clarification.
The Data Dam component directly answers my question.

The Profiler widget always shows the last calculation time, so it’s hard to tell whether a recalculation was actually performed. Maybe there could be a mark next to the calculation time indicating whether the component updated. That would also reveal whether the component has an indirect dependency on the input being changed, which can be hard to identify visually at the moment.

By the way, how can I display the complete solution time? I know this info shows up after a certain threshold. Can I control that threshold?


You could manually internalize the data, or use the Data Dam (which does not save data in the .gh file), and then disable the heavy components.
Or something like this:

to “disable” components by removing their inputs. (I know it’s a dumb solution…)
It really depends on the situation; maybe it’s better to try to optimize the algorithm instead…

For the total runtime…
At the bottom of the Grasshopper window there is a single-line message box; usually it says:
“Autosave complete (x seconds ago)”
but after a long calculation it will tell you how long it lasted (a couple of seconds or more).