Let me give you an example:
I work a lot with very “dirty” meshes. They need some very heavy processing to clean them up before I can start my actual analysis. If the GH definition has to recalculate starting from this cleaning step, every change (like an iterative form-finding run) would take forever; I can’t even keep the cleaning definition open, because it slows everything else down. For other steps in the process I have different definitions for different meshes (depending on their quality), so it makes sense to modularize only those steps that need variants. And so on. But at times several such steps must be active at the same time in a long process chain: due to user interaction, one cannot save partial results temporarily and then open the next definition; it must be one responsive chain of definitions (see the sketch below).
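To make the caching idea concrete, here is a minimal sketch in plain Python (not the Grasshopper/Rhino API): the expensive cleaning step is memoized, so iterative downstream changes reuse its result instead of recomputing it. `clean_mesh`, `form_finding_iteration`, and the mesh name are hypothetical stand-ins.

```python
import functools

@functools.lru_cache(maxsize=8)
def clean_mesh(mesh_id: str) -> str:
    """Hypothetical heavy cleanup; runs once per distinct input."""
    print(f"cleaning {mesh_id} (expensive)...")
    return f"{mesh_id}-cleaned"

def form_finding_iteration(mesh_id: str, step: int) -> str:
    # Each iteration reuses the cached clean mesh instead of re-triggering
    # the whole cleanup, which is roughly what a Data Dam gives you in GH.
    clean = clean_mesh(mesh_id)
    return f"{clean} @ iteration {step}"

for i in range(3):
    print(form_finding_iteration("facade_mesh", i))  # cleanup runs only once
```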
There may be alternative workarounds, like Data Dams and logical “process paths”, but overly large definitions simply aren’t easy to maintain. Definitions should be like components (chained into networks), only at a higher level.
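A rough sketch of what “definitions as components chained into networks” could look like, again in plain Python under assumed names: each definition is just a callable stage, and a chain is a list of stages applied in order, so the whole pipeline stays one live, responsive unit.

```python
from typing import Callable, List

Stage = Callable[[dict], dict]

def run_chain(stages: List[Stage], data: dict) -> dict:
    # Pass the working data through each active definition in sequence,
    # with no intermediate saving/loading between stages.
    for stage in stages:
        data = stage(data)
    return data

# Illustrative stages, not real GH definitions:
def cleaning(data: dict) -> dict:
    return {**data, "mesh": data["mesh"] + "-cleaned"}

def analysis(data: dict) -> dict:
    return {**data, "result": f"analysed({data['mesh']})"}

print(run_chain([cleaning, analysis], {"mesh": "raw"}))
```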