Hi everyone,
I’m building custom Grasshopper components using the Rhino 8 ScriptEditor (CPython) and packaging them into a YAK installer. Currently, my users have to perform two manual setup steps before they can use my components:
1. Create & initialize the Rhino-Python venv via a GH ScriptEditor component
I have them drop a special Grasshopper definition with a ScriptEditor component that runs this snippet on first use:
#! python3
# venv: myvenv
import locale
locale.setlocale(locale.LC_ALL, 'en_US')
# r: numpy==1.21.5, pyarrow==7.0.0, pandas==1.4.2, comtypes==1.4.8, pydantic
That creates the "myvenv" environment under %APPDATA%\.rhinocode\py39-rh8\site-envs\ and installs all public PyPI dependencies listed in the # r: line.
2. Install private GitHub-hosted packages
We host several in-house libraries on our company’s GitHub. Today each user must:
- Clone each repo locally.
- Manually add its path to their C:\Users\<USERNAME>\.rhinocode\python-3.pth.
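Those two manual steps can be scripted; here is a minimal sketch (the clone locations in the commented example call are hypothetical) that appends cloned repo paths to that .pth file without duplicating entries:

```python
from pathlib import Path

def add_to_pth(repo_dirs, pth_path=None):
    """Append each repo directory to Rhino's python-3.pth,
    skipping entries that are already present."""
    if pth_path is None:
        pth_path = Path.home() / ".rhinocode" / "python-3.pth"
    pth_path.parent.mkdir(parents=True, exist_ok=True)
    existing = set()
    if pth_path.exists():
        existing = {line.strip() for line in pth_path.read_text().splitlines()}
    with pth_path.open("a") as f:
        for repo in repo_dirs:
            entry = str(Path(repo).resolve())
            if entry not in existing:
                f.write(entry + "\n")

# Example (hypothetical clone locations):
# add_to_pth([r"C:\dev\inhouse-lib-a", r"C:\dev\inhouse-lib-b"])
```

You could run something like this from a post-install step, though as discussed below, vendoring or a proper package index avoids the .pth dance entirely.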
I’d love to turn this into a zero-config install so that users can simply run the YAK and immediately use every component. Ideally the YAK would:
- Vendor both the public and private packages inside the bundle (so no pip installs or .pth edits).
- Or leverage the ScriptEditor’s # r: feature to automatically pip-install everything (including private GitHub URLs) on first component run.
My Questions:
- What’s the simplest, most reliable way to bundle both my public PyPI packages and private GitHub libraries into a single YAK so that users never have to manually set up a venv or edit .pth files?
- If I opt instead to lean on the ScriptEditor’s # r:-style requirements, how can I handle private GitHub installs (authentication, caching, version pinning) without exposing tokens or forcing extra config?
- Are there any other workflows for delivering complex Python-powered Grasshopper plugins with ~zero user setup?
Thanks in advance for all the help!
Daniel
CC: @eirannejad, would love your input if you have the time 
There is diffCheck, which is a Python Grasshopper plugin, and here are some extra details on how it works with no pre-installation on the user’s side.
Compas is also based on the same structure as diffCheck. In both cases, you create a PyPI package with all the dependencies declared in its setup.py, build the wheel, publish it to the PyPI index, and in all your components use the interpreter’s # r: mycustompackage==x.x.x notation to download your library and all the necessary dependencies.
At least this is one way to go, and it’s fully CI-friendly, but there are many others of course.
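In that scheme, the component header is just the package pin; a sketch of what such a component starts with (package name and version are placeholders). This is a Rhino ScriptEditor fragment, so it only runs inside Rhino:

```python
#! python3
# r: mycustompackage==1.2.3

# On first run the ScriptEditor resolves the pin above, downloading the
# wheel and its declared dependencies; afterwards the import just works.
import mycustompackage
```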
How private and secure do the private repos need to be? Is it allowable to include a token in the plug-in?
It’s not great security, it’s limited to sdists, and I’ve had a bit of flak for it (but no suggestions for alternative solutions). But I often pip install as normal, from URLs to private GitHub repos, with a limited-privilege, read-only-scoped Personal Access Token. Works fine for me, but I don’t know what’s different about GitHub Enterprise.
TY Andrea! I’ll take a closer look at these plugins.
Hey James,
I think I have an idea of what you mean. How do you usually do it?
Do you drop something like this inside your GH python component?
#! python3
# r: git+https://<YOUR_TOKEN>@github.com/yourorg/privaterepo.git#egg=privaterepo
Just a word of warning: I’ve only used this for compute jobs, and web server apps I install and run only myself, and completely control access to.
For shipping code to users, as well as limiting the token scope to the absolute minimum, you should be even more careful - either the update process needs to account for token expiry, or you’ll be forced to use eternal non-expiring tokens.
That said, I’ve not tried it inside a CPython3 component, but I would imagine something like 'privaterepo @ git+https://<YOUR_TOKEN>@github.com/.../..' will do the trick, as long as it’s a valid pip “dependency specifier”, which can just be a URL. Within a pyproject.toml, I used it like this:
dependencies = [
    "pypi-compatible-project-name @ git+https://github_pat_11A....UIB@github.com/OrgUser/PrivateRepoName",
]
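To avoid shipping the token literally in code or in a pyproject.toml, one option is to assemble the specifier at runtime from an environment variable. A sketch (the GH_READONLY_TOKEN variable name and the install-at-runtime approach are my assumptions, not something the ScriptEditor provides):

```python
import os
import subprocess
import sys

def private_requirement(org, repo, ref="main"):
    """Build a pip direct-reference specifier for a private GitHub repo,
    pulling the token from the environment instead of hardcoding it."""
    token = os.environ.get("GH_READONLY_TOKEN")  # hypothetical variable name
    if not token:
        raise RuntimeError("GH_READONLY_TOKEN is not set")
    return f"{repo} @ git+https://{token}@github.com/{org}/{repo}.git@{ref}"

def install(requirement):
    # Run pip against the current interpreter's environment.
    subprocess.check_call([sys.executable, "-m", "pip", "install", requirement])

# Example (hypothetical org/repo):
# install(private_requirement("yourorg", "privaterepo", ref="v1.0.0"))
```

The token then lives in the user’s environment (set once by an installer or IT policy) rather than in the distributed plug-in, which also makes rotation on expiry easier.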
Andrea,
I have a question about the diffCheck approach.
The way I’m understanding this is that diffCheck GH components have # r: diffCheck baked in. So, when running any given component, I assume it will install diffCheck and its dependencies into the Python environment here:
C:\Users\<USER>\.rhinocode\py39-rh8\site-envs\default-XXXX
And then, when we run other GH components in that library, it won’t reinstall diffCheck because it’s already in the environment. Since we don’t know which GH component gets used first, all of them have # r: diffCheck inside.
Does this sound right?
Yes correct!
The catch here is that all the dependencies and their corresponding versions are defined in the setup.py of the Python wheel:
setup(
    name="diffCheck",
    version="1.3.1",
    packages=find_packages(),
    install_requires=[
        "numpy",
        "pybind11>=2.5.0",
        # other dependencies...
    ],
)
This limits the conflicts you could have (not as good as a separate virtual env) and groups together all the needed dependencies. When we stamp the interpreter flag # r: diffCheck via the CI pipeline, it will always grab the most recent PyPI version.
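If reproducible installs matter more than auto-updating, the CI could instead stamp a pinned version (the version shown is just the one from the setup.py above); again a Rhino ScriptEditor fragment:

```python
#! python3
# r: diffCheck==1.3.1
```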