Is it possible to define an image and assign it to a material?

Too bad I don’t have time to read this (ø:

GhGL could do this, but it would only work for OpenGL display and not for rendering.

You know me, I prefer realtime anyway, but I’m happy now that the script is working well :slight_smile:

@nathanletwory Thanks a lot for your in-memory texture sample.

I’m having some trouble getting it to update the texture each time I run the command. It seems to generate a custom texture the first time, then reload the same one even if the supplied bitmap has changed.

If I modify the mapping channel, it seems to update once, but then reloads that new bitmap when it’s called again.

//rtex.SetProjectionMode(TextureProjectionMode.WcsBox, RenderContent.ChangeContexts.Ignore);
rtex.SetProjectionMode(TextureProjectionMode.MappingChannel, RenderContent.ChangeContexts.Ignore);
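In case it helps anyone hitting the same caching behavior: a minimal sketch of how one might force the RDK to notice that the texture’s content changed, by wrapping the edits in `BeginChange`/`EndChange`. This is only a guess at the cause (the bitmap swap in the sample is presumably happening outside any change notification, so the RDK keeps its cached simulation); the bitmap-swap line is sample-specific and hypothetical:

```csharp
// Hypothetical fix sketch: rtex is the custom in-memory RenderTexture
// from the sample. BeginChange/EndChange bracket edits to a RenderContent
// so the RDK is notified and invalidates any cached simulation.
rtex.BeginChange(RenderContent.ChangeContexts.Program);
rtex.SetProjectionMode(TextureProjectionMode.MappingChannel, RenderContent.ChangeContexts.Program);
// ... swap in the updated bitmap here (however the sample stores it) ...
rtex.EndChange();
```

This can’t be verified outside a running Rhino session, so treat it as a direction to test rather than a confirmed fix.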

The RDK writes one or more versions of any texture that is used. If you add a procedural texture to a material you’ll find that you get an image in tmp for that procedural as well - it is also just a memory/on-the-fly texture.

Where would this sample be saving these images? Maybe if I clear this folder, as @clement suggested.

I’d like to use this sample as a workaround for the memory issues with DisplayBitmap noted here

Cheers

@nathanletwory’s example is a great starting point! Now I want to use it in conjunction with a conduit; however, it seems RenderTexture doesn’t want to be a Texture that fits into DisplayMaterial - any additional ideas on how to push it directly to the display?

FWIW - I tried with SimulatedTexture().Texture(), but it internally calls a new Texture(), so the in-memory BitmapTexture property seems to be detached at this point (or maybe even earlier than I thought).

I don’t know how to get a RenderTexture to show in a DisplayConduit. I’m not sure if it is possible. We’ll have to ask @andy or maybe @jeff.

Thanks @nathanletwory!

A more general question would be how to stream an in-memory bitmap to a material visible in a DisplayConduit. We need to update it at 30 FPS and are trying to avoid the I/O latency that comes with saving to and subsequently reading from a file.

We don’t have any sort of streaming texture support built into Rhino. It would be a great feature to have, but at the moment it doesn’t exist.


Thanks @stevebaer!

As a workaround, I’m conceptually treating a mesh as a bitmap and updating its vertex colors on the fly. This isn’t ideal, as it forces quite high vertex counts and viewport performance suffers, but it works for now.

It is relatively simple to do directly in OpenGL. But it would be great to have a cross-platform and future-proof way of doing this directly through RhinoCommon.
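For reference, a minimal sketch of the mesh-as-bitmap workaround described above (class and method names are my own; it assumes a dense grid mesh with one vertex per “pixel”):

```csharp
using System.Drawing;
using Rhino.Display;
using Rhino.Geometry;

// Sketch: render a dense mesh as a stand-in for a bitmap by pushing
// per-frame colors into its vertex color list and drawing it from a conduit.
class VertexColorConduit : DisplayConduit
{
  private readonly Mesh _mesh; // dense grid mesh, one vertex per "pixel"

  public VertexColorConduit(Mesh mesh) { _mesh = mesh; }

  // Hypothetical per-frame entry point: one Color per mesh vertex.
  public void PushFrame(Color[] pixels)
  {
    // SetColors replaces the entire vertex color list in one call.
    _mesh.VertexColors.SetColors(pixels);
  }

  protected override void PostDrawObjects(DrawEventArgs e)
  {
    // DrawMeshFalseColors draws the mesh shaded by its vertex colors.
    e.Display.DrawMeshFalseColors(_mesh);
  }
}
```

After calling `PushFrame` you still need to trigger a viewport redraw (e.g. `doc.Views.Redraw()`), which is where the performance cost at high vertex counts shows up.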

I noticed that Bongo has video textures. I didn’t look into it further, and I wasn’t able to find out whether Bongo offers an exploitable API. Nonetheless, it seems to have managed to integrate video textures into Rhino materials.

Note though that this isn’t realtime playback, but frame by frame updating the image for the material, and Bongo playback isn’t typically realtime either. It is essentially the same as updating the material and redrawing the entire viewport. It is not the same as what is sought after here.
