Hi everyone,
I want to run Meta's new segmentation algorithm inside Grasshopper:
It is publicly available at https://github.com/facebookresearch/segment-anything. The repository provides code for running inference with the Segment Anything Model (SAM), links for downloading the trained model checkpoints, and example notebooks that show how to use the model.
It requires PyTorch and Torchvision.
Can I run it using GhPython? What about GH_CPython?
Do I need to create an environment, run it in the background, and query it from Grasshopper?
Thanks a lot!
It won't work in GhPython. Python 3.8 with pip is required, which in practice means CPython.
How are you feeding the image in from Grasshopper anyway? It's straightforward to save it to a file and let an external Python instance process that file. SAM even ships with a CLI.
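Roughly like this, from an external CPython process (a sketch assuming the scripts/amg.py entry point and flags shown in the segment-anything README; the checkpoint and file paths are placeholders):

```python
# Sketch: run SAM's bundled CLI on an image saved from Grasshopper.
# The scripts/amg.py script and its flags come from the segment-anything
# README; the checkpoint name and paths here are placeholders.
import subprocess

subprocess.run(
    [
        "python", "scripts/amg.py",
        "--checkpoint", "sam_vit_h_4b8939.pth",  # downloaded model weights
        "--model-type", "vit_h",
        "--input", "viewport_capture.png",       # image exported from Grasshopper
        "--output", "masks/",                    # directory of mask outputs
    ],
    check=True,  # raise if SAM exits with an error
)
```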
Hi James,
I evaluate perspectives captured from the viewport and analyze them using MSER. However, Segment Anything seems far superior.
Here are a paper and a presentation covering part of my research.
For me, it is essential to run it inside Grasshopper, because I use the output of the analysis as a fitness criterion for Galapagos. So I can't save an external file and run it manually.
Thank you,
V
That’s really interesting - thanks for those links.
The user doesn't need to know a temporary file is being used behind the scenes (or are many iterations required?). If SAM can be run locally, it doesn't know what context it's being called from. It's definitely possible to run SAM on a temporary image file and read its output back into Grasshopper (via another file if necessary). I'd be very surprised if it's not possible to export an image from the viewport in Grasshopper to a file.
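That round trip could look something like this, assuming the Python API shown in the segment-anything README (the file names are placeholders):

```python
# Sketch: run SAM locally on a temporary image file written by Grasshopper
# and write lightweight results back out for Grasshopper to read.
# API calls follow the segment-anything README; paths and the checkpoint
# name are assumptions.
import json

import cv2
from segment_anything import sam_model_registry, SamAutomaticMaskGenerator

image = cv2.imread("tmp_capture.png")           # temp file from Grasshopper
image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)  # SAM expects RGB

sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
mask_generator = SamAutomaticMaskGenerator(sam)
masks = mask_generator.generate(image)          # one dict per detected mask

# Keep only lightweight metadata for Grasshopper to parse.
summary = [{"bbox": m["bbox"], "area": m["area"]} for m in masks]
with open("tmp_masks.json", "w") as f:
    json.dump(summary, f)
```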
Doing the same thing entirely within Grasshopper is a large job with no guarantee of success. I'm not sure how else to get an image from a Grasshopper format (a System.Drawing.Bitmap or SVG?) into a Python 3 process running MSER. Back-porting SAM to Python 2 and IronPython is an option, but should be a last resort.
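For the viewport-to-file step, a minimal GhPython (IronPython) sketch using standard RhinoCommon and .NET calls might look like this (the output path is a placeholder):

```python
# Sketch: capture the active Rhino viewport to a PNG from GhPython.
# CaptureToBitmap and Bitmap.Save are standard RhinoCommon / .NET calls;
# the output path is a placeholder.
import Rhino
from System.Drawing.Imaging import ImageFormat

view = Rhino.RhinoDoc.ActiveDoc.Views.ActiveView
bitmap = view.CaptureToBitmap()  # returns a System.Drawing.Bitmap
bitmap.Save(r"C:\temp\viewport_capture.png", ImageFormat.Png)
```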
I’ve got some availability at the moment if you would like me to work on this.
Hi James, when one navigates a solution space using, for example, Galapagos, one creates hundreds of variations and, therefore, hundreds of images. It is not reasonable to save them all, run SAM in the background, and open the output each time.
I found a solution with Hops. I capture the viewport as a System.Drawing.Bitmap, encode it as base64, send it to a CPython Flask server via Hops, and bring the result back as contour points. It works pretty well!
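For anyone finding this later, the server side looks roughly like this (a sketch built on the ghhops-server and rhino3dm packages; the /segment endpoint name and the extract_contours helper are hypothetical stand-ins for my actual pipeline):

```python
# Sketch: CPython Flask server exposing a Hops component that takes a
# base64-encoded viewport capture and returns contour points.
# Built on ghhops-server and rhino3dm; the endpoint name and the
# extract_contours helper are hypothetical stand-ins.
import base64
import io

import ghhops_server as hs
import numpy as np
import rhino3dm
from flask import Flask
from PIL import Image

app = Flask(__name__)
hops = hs.Hops(app)


def extract_contours(image):
    """Hypothetical stand-in: run SAM on `image` and trace the mask
    outlines, returning a list of (x, y) pixel coordinates."""
    raise NotImplementedError


@hops.component(
    "/segment",
    name="Segment",
    description="Segment a viewport capture and return contour points",
    inputs=[hs.HopsString("Image", "I", "Base64-encoded PNG of the viewport")],
    outputs=[hs.HopsPoint("Contours", "C", "Contour points",
                          access=hs.HopsParamAccess.LIST)],
)
def segment(b64_png):
    # Decode the base64 string back into a pixel array.
    image = np.asarray(Image.open(io.BytesIO(base64.b64decode(b64_png))))
    return [rhino3dm.Point3d(float(x), float(y), 0.0)
            for x, y in extract_contours(image)]


if __name__ == "__main__":
    app.run()
```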
Thank you!
Thanks for the explanation. Glad you’ve figured it out - well done!