Memory leak when using Karamba in Hops

Hi!
For a project I tried to encapsulate a Karamba definition in a Hops component. This leads to the rhino.compute server consuming all the available memory. It seems like each solution stays in memory, even though caching is disabled.

I tried to reproduce the problem without Karamba components, but couldn’t, so it seems to be Karamba-related. I put together a small example with nothing but the “simple frame” example file from the Karamba website inside the Hops definition.
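
In case it helps, this is roughly how I am watching the memory of the compute child process grow between solutions. It is only a minimal sketch, assuming the child process is called compute.geometry.exe and that psutil is available; the poll interval is arbitrary:

import time
import psutil

def compute_memory_mb():
    """Sum the resident memory of all compute.geometry.exe processes, in MB."""
    total = 0
    for proc in psutil.process_iter(['name', 'memory_info']):
        if proc.info['name'] == 'compute.geometry.exe':
            total += proc.info['memory_info'].rss
    return total / (1024 * 1024)

# Poll while the Hops definition is solved repeatedly and watch whether
# the number keeps climbing.
while True:
    print(f"compute.geometry.exe memory: {compute_memory_mb():.1f} MB")
    time.sleep(10)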

What is happening here? Mistake on my side? Bug?
hops memory test function.gh (45.5 KB)
hops memory test.gh (7.1 KB)

I have tried it on different machines now, and the problem did not occur on a machine where version 2.2 is still installed.

Hello @cristoph,
sorry for my late reply. I found the bug in version 3.1.40531 and a corresponding new version will be out soon.
– Clemens

I’ve been dealing with a Rhino Compute memory leak for some time now. My workaround is to terminate the Rhino Compute process from the command line before starting Rhino Compute again, since the memory doesn’t free up until the compute idle time ends. This is probably not the best solution, but it works.

Something like this:

import winrm  # pywinrm

# VM_URL, USERNAME and PASSWORD are placeholders for the VM's WinRM
# endpoint and credentials.

def terminate_compute_process():
    """Terminates the compute.geometry.exe process on the Windows VM and returns success status."""
    try:
        session = winrm.Session(
            VM_URL,
            auth=(USERNAME, PASSWORD),
            transport='ntlm')
        # Force-kill every compute.geometry.exe instance on the VM.
        result = session.run_cmd('taskkill /F /IM compute.geometry.exe')
        print(result.std_out)

        return True

    except Exception as e:
        print(e)
        return False
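
I call it right before (re)starting Rhino Compute on the VM. The start command below is only a placeholder for whatever you actually use to launch compute, so treat it as a sketch:

# Hypothetical usage: kill any lingering compute.geometry.exe processes,
# then launch Rhino Compute again over the same WinRM connection.
# START_COMPUTE_CMD is a placeholder for your actual start command.
if terminate_compute_process():
    session = winrm.Session(VM_URL, auth=(USERNAME, PASSWORD), transport='ntlm')
    session.run_cmd(START_COMPUTE_CMD)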

Hello @abadi,
did the Karamba3D version I sent to you not fix the problem?
– Clemens

Hi!
Yes, that fixed my problem. Thank you!