Load Windows bitmap in memory to SetBitmapTexture

If I have my own code generating System.Drawing bitmaps and want to pass them as textures to a surface in RhinoCommon, is this possible? Currently I only see loading from a filename.

Example: how to assign blit below.

Bitmap blit = new Bitmap(RawImage.ToBitmap(), 640, 480);
Material mat = doc.Materials[material_index];
mat.SetBitmapTexture(blit); // ??? Already in memory, don't want to write to and then read from disk
mat.CommitChanges();

Yep, this is currently the only way - sorry.
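A minimal Python sketch of that disk round-trip, assuming a System.Drawing.Bitmap and the filename-based Material.SetBitmapTexture overload (the helper name and temp path are just for illustration):

```python
import os
import tempfile

def set_bitmap_texture_via_temp_file(doc, material_index, bitmap):
    """Write the in-memory bitmap to a temp file, then point the
    material's bitmap texture at that file path."""
    path = os.path.join(tempfile.gettempdir(), "blit.png")
    bitmap.Save(path)               # System.Drawing.Bitmap.Save(filename)
    mat = doc.Materials[material_index]
    mat.SetBitmapTexture(path)      # the filename-based overload
    mat.CommitChanges()
    return path
```

The file write is exactly the overhead being complained about here, but it is the supported route.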

OK. I was hoping to pump a webcam feed onto a nominated surface. Think VideoFrame, as in PictureFrame.

Any other way of pulling that off?

Hi Chris,

Can you provide me some more details as to what you want to do and why? What problem are you trying to solve?

– Dale

Hi Dale

I'm kicking around in OpenCV and can detect circles, squares, rectangles, and edges, and even numbers and fonts, from a live feed. It however lacks the complexity to then deal further with those components. I'm wanting to expand on that in Rhino.

The video feed CAN be displayed in a docked panel quite easily, but I'd prefer it was fed into a Rhino window itself as a "texture" that updates at 5-15 fps on a given surface. The simplest way is to pull the raw data as a bitmap frame, but that resides in memory as a System.Drawing bitmap, not on disk. Even using a temp memory "drive" to write to and then read back from brings speed WAY down because of the overhead.

This is not urgent or critical, it's a "would like" at best, and I don't currently see any practical application at all, other than that it will keep the devil's tools busy.

So I was able to pull circles, ellipses, and any form of three- or four-sided geometry, as well as lines and arcs, from a live stream. The data is stored in an array, broken up into type and then characteristics, based on a size relative to the pixel size of the capture frame.

It's pretty easy to turn these into geometry and pass them to Rhino actually, far easier than I had imagined… the kicker is getting scale and location figured out, but I'm onto it.

WIP showing only the circles at any one time (drawing all at once drags it down to 2-3 fps).

If there is, in the future, a way to copy an image in memory to a texture in Rhino, the calculations would be far easier using the plane size…

@dale, I have a very similar task to solve. I create a System.Drawing.Bitmap that I have to draw into, like an animated pencil stroke. So far I have not found a way to assign my bitmap as a texture unless it was written to a file. My animation runs in a conduit.

I tried to save multiple images to disk before animating, one for each animation frame, and then reload the texture on frame change; however this takes very long, as the images are 2K or 4K in size. I also tried to update the assigned texture after drawing into it, but since I have to save and refresh the texture, this took even longer.

Is there a way to assign a bitmap without saving it to a file? I see this is possible with a Display.Bitmap, but only if I then display it flat in screen space. I would need it as part of a Material or Display.Material so I can change it using SetBitmapTexture().
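For reference, the flat screen-space route I mean looks roughly like this in Python (a sketch that only runs inside Rhino; I'm assuming the Rhino.Display.DisplayBitmap constructor and the DisplayPipeline.DrawBitmap pixel-coordinate signature):

```python
import Rhino
import System.Drawing as sd

class FlatBitmapConduit(Rhino.Display.DisplayConduit):
    """Draws a bitmap flat in screen space via a display conduit.
    This is the part that works; it is not the material texture
    assignment I am actually after."""
    def __init__(self, bitmap):
        # Wrap the in-memory System.Drawing.Bitmap for display use.
        self.dbmp = Rhino.Display.DisplayBitmap(bitmap)

    def DrawForeground(self, e):
        # Pixel coordinates, anchored at the viewport's top-left corner.
        e.Display.DrawBitmap(self.dbmp, 0, 0)

bmp = sd.Bitmap(256, 256)
conduit = FlatBitmapConduit(bmp)
conduit.Enabled = True
```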

_
c.

I think I wrote a sample for exactly this thread to work in v5. I’ll dig it up tomorrow.

@nathanletwory, thanks, this would be great as I need it for V5.

_
c.

I remembered two separate sample projects; they got mixed up a bit in my head.

I did a MemoryBitmapTexture implementation for v6, and a MaterialTextureEvaluator for v5. But I think you might be able to marry the two together to get what you want. Note that this means you'll have to use the RDK as much as possible, i.e. RenderMaterial and TextureEvaluator. Not sure how easy it will be in v5; my recollection is that the RDK was much more cumbersome to use in v5. My time has been spent almost exclusively with v6.

Thanks @nathanletwory for the examples. I've been trying to start with V5, where I get an error in the line below which does not occur in V6:

rm.SetChild(rtex, "bitmap-texture")

I have to admit I am not sure how to transfer your example to Python. Working with the RDK and using RenderMaterial has always been challenging. Is it generally possible to use an RDK material in a display conduit?

_
c.

I'll see if I can dabble in v5 some tomorrow (when I'm no longer sleeping). Until then I'll let @pascal have a look.

To be honest I have no idea, but I'm sure once I've asked @andy, and maybe some others, I can give a better answer - tomorrow :)

/Nathan

@nathanletwory, thanks. I currently have a problem setting up the texture evaluator. You've got this line, which I cannot find a reference to:

var eval = new MemoryBitmapEvaluator(evaluatorFlags) { BitmapTexture = bm };

I do not understand what MemoryBitmapEvaluator belongs to. It is only used once in your code and cannot be found in the docs. Also, I have no idea how to translate what you wrote in the curly braces to Python.

_
c.

This is actually code straight from v6, so you can run the command TestNathanMemoryTexture. A new material with the in-memory texture will appear in the material editor. You can assign it to an object and turn on Rendered mode, or even Raytraced. Internally, the CreateEvaluator function that creates the evaluator is called, so that the display modes know how to draw the texture. I think that is the same idea as in v5.
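On the curly braces specifically: that is C#'s object-initializer syntax, which in Python is just attribute assignment after construction. A stand-in class is used below, since the real MemoryBitmapEvaluator lives in the sample project:

```python
class MemoryBitmapEvaluator:
    """Stand-in with the same shape as the evaluator in the sample."""
    def __init__(self, evaluator_flags):
        self.flags = evaluator_flags
        self.BitmapTexture = None

evaluator_flags = 0          # placeholder for the real flags value
bm = "bitmap-placeholder"    # placeholder for the System.Drawing.Bitmap

# C#: var eval = new MemoryBitmapEvaluator(evaluatorFlags) { BitmapTexture = bm };
ev = MemoryBitmapEvaluator(evaluator_flags)
ev.BitmapTexture = bm
```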

Does that help?

The two central pieces are the RenderTexture implementation and the TextureEvaluator implementation. The latter is only ever accessed through the CreateEvaluator implementation of the former.
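Very roughly, in Python, the shape of it is something like this (a sketch, not the actual sample: RenderContent registration is omitted, the UV-to-pixel mapping in GetColor is my assumption, and the CreateEvaluator signature differs between v5 and v6):

```python
import Rhino
from Rhino.Display import Color4f
from Rhino.Render import RenderTexture, TextureEvaluator

class MemoryBitmapEvaluator(TextureEvaluator):
    """Samples an in-memory System.Drawing.Bitmap at the requested UV."""
    def __init__(self, bitmap):
        self.bmp = bitmap

    def GetColor(self, uvw, duvwdx, duvwdy):
        # Clamp UV to the bitmap and map to pixel coordinates (V flipped).
        x = min(max(int(uvw.X * self.bmp.Width), 0), self.bmp.Width - 1)
        y = min(max(int((1.0 - uvw.Y) * self.bmp.Height), 0), self.bmp.Height - 1)
        c = self.bmp.GetPixel(x, y)
        return Color4f(c.R / 255.0, c.G / 255.0, c.B / 255.0, c.A / 255.0)

class MemoryBitmapTexture(RenderTexture):
    """RenderTexture whose pixels come from memory, never from disk."""
    def __init__(self, bitmap):
        self.bmp = bitmap

    def CreateEvaluator(self, flags):  # v6 passes flags; the v5 override takes none
        return MemoryBitmapEvaluator(self.bmp)
```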

I've sent some; not sure if I should post that half-baked code in public.

_
c.

I’ll read the script tomorrow.

Re half-baked scripts: they are great for community effort in improvement, imo :)

Half-baked scripts are ALL I do…