I have posted this problem before, but I guess I didn't explain it well. Let me give it one more try.
I have a workflow that converts 2D images into IDF files, and I need to repeat this process to create multiple IDF files from the corresponding 2D images. Image 1 shows my workflow to convert 2D into 3D; the path node takes the path of the images to be processed.
Fig 1:
1. Select an item (an image) from the list of images.
2. Generate an IDF file (named after the corresponding image).
3. Clear the workspace.
4. Repeat until all images are processed.
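To illustrate step 2, the IDF filename can be derived from each image path with plain Python string handling (the example paths and the `idf_name_for` helper below are hypothetical, not part of the actual Honeybee workflow):

```python
import os

def idf_name_for(image_path):
    """Derive an IDF filename from an image path, e.g. 'room_01.png' -> 'room_01.idf'."""
    base = os.path.splitext(os.path.basename(image_path))[0]
    return base + ".idf"

# Hypothetical list of input images
image_paths = ["C:/images/room_01.png", "C:/images/room_02.png"]
idf_names = [idf_name_for(p) for p in image_paths]
print(idf_names)  # ['room_01.idf', 'room_02.idf']
```

Inside the loop, this name can then be passed to whatever component writes the IDF, so each output file matches its source image.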
So far, the problem I am facing: I created a list of images to be fed into the loop, but I don't have a way to clear the workspace after each image is processed.
In image 3, the image is being fed to the path node (shown in image 1).
Fig 3:
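For reference, one common way to build such a list of image paths outside Grasshopper is with `glob`; this is a sketch, not part of the original definition, and the folder path and extensions are assumptions:

```python
import glob
import os

def list_images(folder, extensions=(".png", ".jpg")):
    """Collect image file paths from a folder, sorted for a stable processing order."""
    paths = []
    for ext in extensions:
        paths.extend(glob.glob(os.path.join(folder, "*" + ext)))
    return sorted(paths)
```

Sorting the result gives a stable order, so the loop processes the images (and names the IDF files) deterministically on every run.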
IDF files are basically used for energy simulation in EnergyPlus, and a plugin called Honeybee has been used to create them.
Also, by "workspace" I mean Rhino's display window. Basically, I don't want the models to overlap, so I want the previous model to be cleared before the new one gets created.
```python
import scriptcontext as sc
from Rhino import RhinoDoc

# Point scriptcontext at the Rhino document so redraw/export affect Rhino
sc.doc = RhinoDoc.ActiveDoc
for i in _yourloop:
    # do your things
    sc.doc.Views.Redraw()
    # export image
# Restore scriptcontext to the Grasshopper document
sc.doc = ghdoc
```
OK, so I guess @Will_Wang’s recommendation above should work. Or you could use this function, which recomputes the GHPython component every time you call it:
```python
import Grasshopper as gh

def update_component():
    """Updates this component, similar to using a Grasshopper timer."""
    # written by Anders Deleuran, andersholdendeleuran.com
    def call_back(e):
        """Defines a callback action"""
        ghenv.Component.ExpireSolution(False)
    # Get the Grasshopper document
    ghDoc = ghenv.Component.OnPingDocument()
    # Schedule this component to expire
    ghDoc.ScheduleSolution(1, gh.Kernel.GH_Document.GH_ScheduleDelegate(call_back))
```
If your GHPython component is recomputed, all the components downstream of it should recompute too. This means that if your display data is created there, it will be cleared, or rather overwritten, by the new data, causing the display to update.