RhinoCode ScriptEditor for development of libraries

@Petras_Vestartas OK, I got it to work, but the solution is quite complicated:

Important Notes:

  1. compas_wood depends on wood_pybind11.cp39-win_amd64.pyd, which has its own list of dependencies.

The command dumpbin /dependents C:\Users\ein\compas_wood\src\wood_pybind11.cp39-win_amd64.pyd provides the list of DLLs that the .pyd depends on; we need to make sure Python can find these DLLs. (A scripted version of this check is sketched after these notes.)

File Type: DLL
  Image has the following dependencies:
    liblapack.dll
    libblas.dll
    python39.dll
    libcblas.dll
    KERNEL32.dll
    VCRUNTIME140.dll
    api-ms-win-crt-stdio-l1-1-0.dll
    api-ms-win-crt-heap-l1-1-0.dll
    api-ms-win-crt-runtime-l1-1-0.dll
    api-ms-win-crt-string-l1-1-0.dll
  2. The .pyd is compiled against your conda environment's Python, which means that the environment's python.exe can easily find the aforementioned DLLs.

  3. The compas_wood package is also installed there, which means the Lib/site-packages/easy-install.pth file points the Python runtime to where the compas_wood module is located.
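
For convenience, that dependency check can also be scripted. A minimal sketch, assuming dumpbin.exe is on PATH (e.g. you are in a Visual Studio developer prompt); the .pyd path is the one from above:

import subprocess

PYD = r"C:\Users\ein\compas_wood\src\wood_pybind11.cp39-win_amd64.pyd"

# run MSVC's dumpbin and keep only the lines that name a .dll
out = subprocess.run(
    ["dumpbin", "/dependents", PYD],
    capture_output=True, text=True, check=True,
).stdout
deps = [line.strip() for line in out.splitlines() if line.strip().lower().endswith(".dll")]
print("\n".join(deps))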

Now, to get the Python in Rhino to load a package installed in a conda environment, we need to do the following:

#! python 3

import os
import os.path as op
import sys
import ctypes

CONDA_ENV = r'C:\Users\ein\.conda\envs\wood-dev'
COMPAS_WOOD_PATH = r'C:\Users\ein\compas_wood\src'

# add the path of site-packages in the conda environment so other packages that
# compas_wood depends on can be found, e.g. compas
sys.path.append(op.join(CONDA_ENV, r"Lib\site-packages"))

# add the location of compas_wood source so we can import this
sys.path.append(COMPAS_WOOD_PATH)

# tell python where it can find dlls. this is required to find all other .dll files that
# are installed as part of the other packages in the conda environment e.g. fblas
os.add_dll_directory(op.join(CONDA_ENV, r'Library\bin'))

# also add the folder containing wood_pybind11*.pyd to the DLL search path
os.add_dll_directory(COMPAS_WOOD_PATH)

# now we can import the module and test
from compas_wood.joinery import test

print(test)
test()
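
If the import still fails with a DLL load error, the ctypes module imported above can be used to probe the dependencies reported by dumpbin one by one; whichever fails first is the one Python cannot resolve. A rough sketch, with DLL names taken from the dumpbin output above:

import ctypes  # already imported in the script above

for name in ["liblapack.dll", "libblas.dll", "libcblas.dll"]:
    try:
        ctypes.WinDLL(name)
        print("loaded :", name)
    except OSError as exc:
        print("failed :", name, exc)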

I also added a ticket to possibly improve this:

RH-80486 Investigate loading a conda environment directly in Rhino

3 Likes

Thank you, going through the instructions, hopefully it will work on my machine too.

Interesting that it depends on these .dlls:

    liblapack.dll
    libblas.dll
    python39.dll
    libcblas.dll
    KERNEL32.dll
    VCRUNTIME140.dll
    api-ms-win-crt-stdio-l1-1-0.dll
    api-ms-win-crt-heap-l1-1-0.dll
    api-ms-win-crt-runtime-l1-1-0.dll
    api-ms-win-crt-string-l1-1-0.dll

These seem to be Windows compilation dependencies.

When it comes to these kinds of issues, how do you debug the whole process?

It works.

I need to buy you many beers…

1 Like

Do you think it would be possible to automate this process of linking these two directories, similarly to how egg-links are made?

Maybe. I'm not sure about the extent of what is possible with egg-link files.

I think the more failsafe approach would be to allow binding Rhino to a conda environment. It has a full deployment of Python (at least on Windows), so it should theoretically be fairly easy. That's what I made the ticket for.
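
Until something like that exists, the manual steps from the solution above could at least be wrapped in a small helper and reused. A minimal sketch, assuming a Windows conda environment laid out like the one above (the helper name is made up):

import os
import os.path as op
import sys

def bind_conda_env(env_path):
    # make the environment's pure-Python packages importable
    sys.path.append(op.join(env_path, "Lib", "site-packages"))
    # make the environment's native DLLs resolvable by extension modules
    os.add_dll_directory(op.join(env_path, "Library", "bin"))

bind_conda_env(r"C:\Users\ein\.conda\envs\wood-dev")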

1 Like

Hello @eirannejad, super interesting topic, thanks @Petras_Vestartas for bringing this up! Does it mean that if you compile the .pyd against the Rhino Python it would work?

I have a problem quite similar to Petras's: I am using an editable pip install and it all works except for the .pyd. I have this script:

import sys
sys.path.append(R"F:\diffCheck\build\Release")

import diffCheckBindings  # <-- this is my wrapper

But RhinoPython cannot recognize it (whereas the standard Python interpreter on my machine can) and throws this error:

ModuleNotFoundError: No module named 'diffCheckBindings'
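
One quick thing to check when a .pyd imports in one interpreter but raises ModuleNotFoundError in another is whether the file's tag matches what that interpreter accepts. A small sketch (not specific to this setup): run it once in plain CPython and once in the Script Editor and compare the output.

import importlib.machinery
import sys

print(sys.version)
# e.g. ['.cp39-win_amd64.pyd', '.pyd'] on CPython 3.9 for Windows; a module
# built with a different version/ABI tag will simply not be found
print(importlib.machinery.EXTENSION_SUFFIXES)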

I tried this in CMake and it does not seem to change anything:

set(PYBINDMODULE_NAME diffCheckBindings)

download_submodule_project(pybind11)
add_subdirectory(deps/pybind11)

set(Python3_EXECUTABLE "C:/Users/andre/.rhinocode/py39-rh8/python.exe")

# Find the Python interpreter and libraries
find_package(Python3 COMPONENTS Interpreter Development REQUIRED)

# print the path to the python interpreter
message(STATUS "Python3_EXECUTABLE: ${Python3_EXECUTABLE}")
message(STATUS "Python3_INCLUDE_DIRS: ${Python3_INCLUDE_DIRS}")
message(STATUS "Python3_LIBRARIES: ${Python3_LIBRARIES}")

pybind11_add_module(${PYBINDMODULE_NAME} src/diffCheckBindings.cc)

target_link_libraries(${PYBINDMODULE_NAME} PRIVATE ${SHARED_LIB_NAME})
target_include_directories(${PYBINDMODULE_NAME} PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/src)

Although the python interpreter is set correctly, it doesn’t seem to change anything.

Could this be because of the missing “invisible” DLLs you mentioned? I have these, for example:

1 Like

@andrea.settimi I'm a little blind here as I cannot see your setup. Is there anything you can share that I can set up here to test and help? Have you seen the post above marked as the solution, which sets up search paths for native DLLs inside Rhino Python 3? Does any of that work / not work?

Indeed, I apologize @eirannejad for the lack of context! Yes, it worked, and that's actually how I started developing wrappers for Python in Rhino. Following your advice, I now distribute my PyPI package for Rhino with bindings in a similar way. This is the __init__.py of the module:

import os
import sys

__version__ = "0.0.24"

# register the package's bundled native DLL folder so the compiled bindings can resolve their dependencies
PATH_TO_DLL = "dlls"
extra_dll_dir = os.path.join(os.path.dirname(__file__), PATH_TO_DLL)
os.add_dll_directory(extra_dll_dir)

This approach is taken directly from how numpy does it.

And my package structure looks like this (with the dlls folder containing the DLLs):

[screenshot: package layout with the dlls folder]

By using the # r: my_module flag, it works well for an ordinary user who just wants to install the plug-in via the Yak installer (without pointing to a conda environment).
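
For illustration, the end-user script then reduces to something like this (my_module stands in for the actual package name, which is not given here):

#! python 3
# r: my_module
# the package's __init__.py above runs on import and registers its bundled
# dlls/ folder, so the compiled bindings resolve without any manual setup
import my_module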

Sorry for polluting a solved thread, but this is way too interesting a topic :slight_smile:

1 Like

Oh, you're absolutely fine. Thanks a lot for the explanation. I can see now that the natively compiled libraries are under the dlls/ folder inside your package distribution, and you have added that path to the DLL search paths in Python.

This is a great example!

What is the PyPI package, by the way? I would like to add it to my unit tests so I can detect if installing this package ever fails using the # r: <> syntax.

Great! We are still testing it internally, but as soon as we have a stable release it would be great if you could test it! I'll post a message here again.

@andrea.settimi @eirannejad

Consider the following scenario:

  1. Two identical laptops.
  2. One laptop builds the C++ code via pybind11 (or any other binding library) so it can be used from Python, and you push it to PyPI.
  3. The other laptop can install the package via PyPI in Rhino 8, but it does not load.

Does it have something to do with the fact that “the other” laptop has no MSVC compiler?
What are the .dlls you need to ship together, and how does this dynamic-library shipping work across different operating systems?

I am wondering why this never happens with .NET, since standard .dlls like KERNEL32.dll should already be loaded by Rhino anyway, given that it is C++ software.

[screenshot showing the DLLs in question]

I am wondering if it is possible to bundle them via static linking with CMake?

if(MSVC)
    # switch from the dynamic MSVC runtime (/MD) to the static runtime (/MT)
    set(CompilerFlags
        CMAKE_CXX_FLAGS
        CMAKE_CXX_FLAGS_DEBUG
        CMAKE_CXX_FLAGS_RELEASE
        CMAKE_C_FLAGS
        CMAKE_C_FLAGS_DEBUG
        CMAKE_C_FLAGS_RELEASE
    )
    foreach(CompilerFlag ${CompilerFlags})
        string(REPLACE "/MD" "/MT" ${CompilerFlag} "${${CompilerFlag}}")
    endforeach()
endif()

@Petras_Vestartas

The DLLs in the screenshot above should be available on the target machine when Rhino 8 is installed. python39.dll is also available in the Python environment under .rhinocode\py39-rh8.
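
If in doubt about which runtime Rhino is actually using, a few standard-library calls printed from the Script Editor will tell you (nothing here is specific to Rhino):

#! python 3
import os
import sys

print(sys.version)                    # CPython build that is running
print(sys.executable)                 # may be the embedded python.exe or the host app
print(os.path.dirname(os.__file__))   # folder of the standard library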

What’s the package name on PyPI? I can test and make sure it installs and loads if you have an example.

The same one: wood-nano :slight_smile:

A user is trying to install it on Windows and cannot. I do not know why.

1 Like

We can try to reproduce the issue if you send us your .pyd file (or pip instructions).

The most minimal ScriptEditor code for testing is the following:

# r: wood_nano, compas_wood
from wood_nano import test as wood_nano_test
wood_nano_test()

This needs to be tested on a Windows machine that has no C++ compiler installed.

I’ll test today and will report in :smiley: Thanks!

I tested on my old home computer with Windows 11. The dependencies seem to install.

But I get this exception, the same one a user sent me.

How do I find out what is happening?


Same grasshopper file:
10_vidy_chapel.gh (63.2 KB)

UPDATE

The problem is that the .whl installed by the Script Editor does not have the files that the actual wheel has.

These are the contents of what the Script Editor installed:

And these are the contents of the actual .whl:

Why does this happen? There is only one wheel on PyPI.
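
One way to see what actually got installed, without digging through folders, is to ask the interpreter itself. A sketch, assuming the distribution is registered under the name wood-nano (try wood_nano if that raises PackageNotFoundError):

#! python 3
import importlib.metadata as md

dist = md.distribution("wood-nano")
print(dist.version)
# list the files recorded for this installation and compare with the wheel on PyPI
for f in (dist.files or []):
    print(f)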

After I specify the latest version, e.g. wood-nano==0.1.1, and run this file, it seems to work:
10_vidy_chapel.gh (66.7 KB)
But I do not understand why it did not install the latest (and only) package in the first place…

The user tried installing by explicitly naming the version, and it worked: # r: wood-nano==0.1.1.
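
So the pinned variant of the minimal test from earlier is simply the following (whether compas_wood needs the same pinning is not clear from the thread):

#! python 3
# r: wood-nano==0.1.1, compas_wood
from wood_nano import test as wood_nano_test
wood_nano_test()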

But if you have Grasshopper components that still have just import wood_nano, it still does not work.
I need to manually delete all these # r lines.

The safest solution for now is the following:

  1. Open the Rhino Script Editor and install the pip dependencies using # r.
  2. Then run any needed code without mentioning # r; that way you never need to manually update example files (see the sketch just below).
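
In other words: run something like this once in the Script Editor so the packages get installed,

#! python 3
# r: wood_nano, compas_wood
import wood_nano, compas_wood   # forces the install and confirms both import

and from then on scripts and Grasshopper components can import without the requirement line:

#! python 3
from wood_nano import test as wood_nano_test
wood_nano_test()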

It still feels like # r could be automated by other means, e.g. by using a package manager.

Phew… this was very tricky to catch…

This is what I got on Rhino 8.8 RC with this script in Grasshopper:

# r: wood_nano, compas_wood
from wood_nano import test as wood_nano_test
wood_nano_test()

wood-nano: [screenshot]

compas-wood: [screenshot]

Thank you for checking.

Do you have any clue what could have happened?