CURL for Hops component

Hi all!

I’m working on modularizing my models with Hops components. In particular, I want this to work together with rhino.compute.

The issue I have is where and how to host my Hops definitions so that they are accessible to my rhino.compute instances, but not freely on the internet. I’ve been thinking of a few solutions:

  1. Host them on GitHub and work with a Personal Access Token, but it seems that passing a PAT in the URL has been deprecated.
  2. Host the definitions on each server in a dedicated folder, but for a reason I don’t really get, having rhino.compute run on a specific server doesn’t seem to imply that I can query the subdefinitions from that very same server.
  3. Host the definitions on open blob storage and whitelist IPs, but that seems kind of shady and not very practical, especially for my local machine, which doesn’t have a static IP.

Something useful (I think) would be the ability to turn the Hops path into a curl-style request, so that credentials for logging into a private GitHub repo (for example) could be included.
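To illustrate the idea, here is a hypothetical C# sketch of what that could look like outside of Hops: fetching a definition from a private GitHub repo with a PAT sent as a header (rather than embedded in the URL) and caching it locally, so the Hops Path input can point at the local copy. The repo name and the `HOPS_GITHUB_PAT` variable are made up for illustration.

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class PrivateDefinitionFetcher
{
    static async Task Main()
    {
        // PAT is read from an environment variable instead of being embedded in the URL.
        string pat = Environment.GetEnvironmentVariable("HOPS_GITHUB_PAT")
                     ?? throw new InvalidOperationException("HOPS_GITHUB_PAT is not set.");

        // GitHub contents API endpoint for a file in a (hypothetical) private repo.
        string url = "https://api.github.com/repos/my-org/my-defs/contents/hops_multiplication.gh";
        string localPath = Path.Combine(Path.GetTempPath(), "hops_multiplication.gh");

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Add("User-Agent", "hops-definition-fetcher");   // required by the GitHub API
        client.DefaultRequestHeaders.Add("Accept", "application/vnd.github.v3.raw"); // ask for the raw file content
        client.DefaultRequestHeaders.Add("Authorization", "token " + pat);           // credentials go in a header, not the URL

        byte[] bytes = await client.GetByteArrayAsync(url);
        File.WriteAllBytes(localPath, bytes);

        Console.WriteLine($"Definition cached at {localPath}; point the Hops Path input there.");
    }
}
```

The download could run at startup or on a schedule, so the path that Hops actually sees stays a plain local path and no credentials ever appear in the definition itself.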

Do you see another possible solution? For option 2, which is what I actually tried, is there something I appear to be missing?

Thanks in advance!

Hi all, to be a bit more specific about option 2:

What I tried (and what seems to be my best option so far) is:

  1. I developed a small component that gets an environment variable from its name. It seems to work fine: cf. the screenshot here. (A minimal C# sketch of the idea follows this list.)
  2. I installed the plugin and set the environment variable on both my local machine and the remote server.
  3. Locally it runs fine with my local server; here’s the answer to the request: as you can see, the path is correct, a Mesh is provided and the calculated height is correct.
  4. When I try to query the remote compute server, that server doesn’t seem to compute the Hops component even though the path it gets is correct (see string output):
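For context, a minimal sketch of the idea behind the environment-variable component, assuming it simply wraps `Environment.GetEnvironmentVariable`. In Grasshopper this body would sit inside a C# script component, with the variable name as input and the folder path as output; `HOPS_DEFINITIONS` is just an example name.

```csharp
using System;

public static class HopsPathHelper
{
    public static string GetDefinitionFolder(string variableName = "HOPS_DEFINITIONS")
    {
        // Returns whatever the *current process* sees; a compute service may see a
        // different value than an interactive Rhino session if the variable was set
        // at another scope or after the service started.
        string folder = Environment.GetEnvironmentVariable(variableName);

        if (string.IsNullOrEmpty(folder))
            throw new InvalidOperationException(
                $"Environment variable '{variableName}' is not set on this machine.");

        return folder;
    }
}
```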

BUT: the solution works in two very similar cases:

  1. If I run it with the GUI on my remote server, the solution computes fine.

  2. Or if I hardcode the path and then run it on the remote server again.


    Note: here I developed on my local machine and the path associated with the environment variable is not the same, so obviously the solution doesn’t compute.


Result of the request with the hardcoded path (which is exactly the same as the one from step 4 that didn’t work, if you look at the screenshot).
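For reference, a direct request to a compute server’s /grasshopper endpoint might look roughly like the sketch below, assuming the pointer/values schema used by the compute client samples; the server address, the RhinoComputeKey value and the definition path are placeholders.

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

class GrasshopperRequestSketch
{
    static async Task Main()
    {
        var body = new
        {
            // Path (or URL) to the top-level definition, resolved on the server.
            pointer = @"C:\hops\Easy hops model with hardcoded folder path.gh",
            // Input trees omitted here for brevity.
            values = Array.Empty<object>()
        };

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Add("RhinoComputeKey", "<api-key>"); // only if the server requires a key

        var content = new StringContent(
            JsonSerializer.Serialize(body), Encoding.UTF8, "application/json");

        var response = await client.PostAsync("http://my-compute-server/grasshopper", content);
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}
```

Sending the same body to the local and remote servers at least narrows the problem down to how the remote server resolves the child Hops definition.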


Am I missing something here?
@AndyPayne sorry for Yet Another Compute Related Question, but I think this may be closely related to the topics we’re already discussing?

All three files used can be found here (note that they use our component for reading the folder path from an environment variable, but I guess you can do it with a simple C# component):
hops_multiplication.gh (3.2 KB)
Easy hops model with hardcoded folder path.gh (16.4 KB)
Easy hops model.gh (17.2 KB)

Thanks in advance!

Note: after more tests, I am now pretty certain that the Hops component doesn’t work in compute if the Path input is changed during the compute phase.

See the following model, where the user is prompted to choose between two files located on an open GitHub repo: hops_choose_component.gh

This works fine with one of the models (the default one when the model is launched with compute) but fails with the other one.


Example working with hops_addition (default option)


Example failing with hops_multiplication

Why is this important, IMO: I think having such an option would be an interesting way to share files that rely on a Hops subdefinition that should remain private, making it accessible via an environment variable, etc.
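On the server side, one way to make such a variable visible to the compute process is to register it at machine scope, e.g. with a small one-off utility like the sketch below. The variable name and folder are examples; this needs administrator rights, and the compute service has to be restarted before it sees the new value.

```csharp
using System;

class RegisterHopsFolder
{
    static void Main()
    {
        Environment.SetEnvironmentVariable(
            "HOPS_DEFINITIONS",                 // example variable name
            @"C:\hops-definitions",             // example folder holding the private .gh files
            EnvironmentVariableTarget.Machine); // machine scope, visible to services after restart
    }
}
```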

Happy to discuss this anytime, or to hear if somebody finds a workaround.