Now I’m testing whether I can upload a simple definition to check how value lists and textures will work in it.
I’m having some problems, but I’m pretty sure it’s due to how I set up the definition:
I’m using a value list to filter between 3 different image files. Then with Squid I’m reading the pixel dimensions and converting them into a rectangular surface. Then I’m texturing the surface with the referenced image, using a ShapeDiver component.
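For anyone debugging the same step: the pixel dimensions Squid extracts can also be read straight from the file header. Here is a minimal Python sketch, assuming the image is a PNG (the function name and the standalone-script approach are mine, not part of the GH definition):

```python
import struct

def png_dimensions(data: bytes) -> tuple[int, int]:
    """Read width/height from a PNG byte stream.

    The 8-byte PNG signature is followed by the IHDR chunk:
    4-byte length, 4-byte chunk type ("IHDR"), then big-endian
    width and height (4 bytes each, at offsets 16-23).
    """
    if data[:8] != b"\x89PNG\r\n\x1a\n":
        raise ValueError("not a PNG file")
    width, height = struct.unpack(">II", data[16:24])
    return width, height
```

The returned width/height pair can then drive the X and Y dimensions of the rectangular surface so it keeps the image’s aspect ratio.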
I can then upload the definition and see the value list input appear in the viewer, but no 3D visuals, not even the surface (I did try outputting the preview as well). It just keeps on “buffering”.
Did I do something wrong in choosing which inputs/outputs to group and rename?
(Also, the value list input appearing in the viewer apparently has no specific range.)
By the way, I’m not a web developer at all and I’d never used Visual Studio before, so I was pretty happy to be able to see something. And I think Rhino Compute could really be useful for a lot of projects I’m doing.
I’m just new at this; usually I would keep on doing trial and error until I found a solution, but I’m already running out of time. Thanks for your help!
Edit: OK, I’ve changed the input type for the value list (from number to integer) and now I can see the surface. So it reads the files and Squid works; is the problem texturing with ShapeDiver? What should I use instead?
All right, let me update this post just in case some fellow n00bs need it.
First of all: I’m really sorry. I just realized I posted the wrong GH definition screenshot in the previous post (the texture files weren’t correctly connected to the texture input…). Even with them connected, though, it still didn’t work.
Now I’ve solved one issue and can answer one question: yes, Compute can read textures.
Not the best logic, but I just needed to see if it worked.
However, as you can see, I had to change the input type from a value list to a slider.
A value list would be a better choice for easy usability on the client side, so before going to production I need to find out how to make it work.
I’ll see what I can do.
I’ve been trying to get texture mapping working as well, and this has been quite helpful.
But since this relies on assigning vertex colours to the mesh (as opposed to mapping a texture onto it), it requires a significant number of vertices for output at a decent resolution, and may not be the most optimal solution for web-based workflows.
At the moment it seems Human has texture mapping and is able to preview it as well, but assigning the texture to the mesh and exporting it for Rhino Compute is still a mystery to me. Keen to hear if someone else has found a solution to this.
My recommendation at the moment is to return a RhinoDoc as a base64 string from your compute functions (be that from a GH definition or a custom plugin). This is because GH objects aren’t associated with a Rhino file, which is where materials and textures are stored.
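To illustrate the transport mechanics only, here is a Python sketch of the encode/decode round trip (the function names are mine; the actual sample presumably saves the RhinoDoc to .3dm bytes via RhinoCommon first):

```python
import base64

def doc_to_base64(doc_bytes: bytes) -> str:
    """Encode the raw bytes of a saved .3dm document as a base64 string
    so it can travel inside a JSON compute response."""
    return base64.b64encode(doc_bytes).decode("ascii")

def base64_to_doc(doc_string: str) -> bytes:
    """Decode the base64 string back into .3dm bytes on the client side,
    ready to hand to a 3dm parser in the viewer."""
    return base64.b64decode(doc_string)
```

Because the whole document round-trips, materials and textures stored in the file survive, which is the advantage of this approach over returning bare GH geometry.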
Hi Luis, I’m still having a few issues with displaying the base64 versions for some reason. I’ve just tried it with the example file (docString.gh) as well, with the same result. Any idea what the issue might be?
So I’ve been trying to figure out why exactly objects with textures are (still) not appearing.
I thought it might be the base64 conversion, but I don’t think that’s the case.
I’ve also fixed a couple of issues with the GH file I’d sent previously, so textures are definitely being assigned.
But I found this while digging into the code:
Rhino.TextureType.Bitmap is commented out in rhino3dm.js. Is this a possible reason?
The Bitmap texture type is deprecated. We use the Diffuse texture map for this purpose.
Unfortunately I haven’t had a chance to create an example this past week. I will test this out this week and come up with a way to pass objects with materials and textures. Textures and materials both need some work in the 3dmLoader, but there should be enough functionality to get basic textures across.
Hello @vnatarajan, yes, I finally got something working. I will clean up the sample and add it here before I do a full cleanup and add it to the repo. The main thing is that I wasn’t too familiar with the more modern materials and textures API in RhinoCommon. @andy pointed me to this sample: Add Texture with C#, Python, VB, which gave me a way to add a texture to a RenderMaterial and add that to a document. The rest is serializing an image as an input and eventually converting the doc into a byte array. I’ll post the relevant files tomorrow after some cleanup.
Have either of you attempted to resolve this on the three.js viewer end rather than at the compute side?
I’ve been trying to add textures to the returned geometry, but something seems to be off with the returned mesh’s UV maps. Can the UVs of the mesh be defined before they are sent back to the client’s viewer?
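If the returned mesh arrives without usable texture coordinates, one viewer-side workaround is to generate simple planar UVs from the mesh’s XY bounding box before building the geometry. A rough sketch of the math in Python (illustration only; in a three.js viewer the same values would go into the geometry’s uv buffer attribute):

```python
def planar_uvs(vertices):
    """Map each (x, y, z) vertex to (u, v) in [0, 1] by normalizing
    x and y over the mesh's XY bounding box (simple planar projection)."""
    xs = [v[0] for v in vertices]
    ys = [v[1] for v in vertices]
    min_x, max_x = min(xs), max(xs)
    min_y, max_y = min(ys), max(ys)
    span_x = (max_x - min_x) or 1.0  # avoid division by zero on degenerate spans
    span_y = (max_y - min_y) or 1.0
    return [((x - min_x) / span_x, (y - min_y) / span_y) for x, y, _ in vertices]
```

A planar projection like this only makes sense for roughly flat geometry such as the rectangular surface discussed earlier in the thread; doubly curved meshes would need a proper unwrap.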