Using ShapeDiverImageInput to define texture URLs on the fly

Instead of creating separate model files to accommodate designs that differ only in the textures of one or two meshes, I want to have one ShapeDiver model and define those specific textures via JS. One product (model) can potentially have 1000s of finishes (textures) and I’d prefer to define the relationships vs create 100s of models.

When the model is loaded, an initial texture would be set via updateAsync and the ShapeDiverImageInput. Custom user inputs would enable choice between variations of the texture (could be up to 30). Upon selection, the URL of the texture variation would be set in ShapeDiverImageInput.
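To make the idea concrete, here is a minimal sketch of the URL side of that flow. The helper and the URL scheme are my own assumptions, not part of the ShapeDiver API; the version segment anticipates cache-busting, since the platform caches by URL. The `updateAsync` call in the comment is only indicative of how the resulting URL would be fed to the ShapeDiverImageInput parameter:

```javascript
// Sketch only: the parameter name, URL scheme, and surrounding viewer calls
// are placeholders, not the actual ShapeDiver API contract.

// Build the texture URL for a given product finish. The version segment is a
// cache-buster: since images are cached by URL, a changed image needs a new URL.
function textureUrl(baseUrl, productId, finishId, version) {
  return `${baseUrl}/${productId}/${finishId}-v${version}.jpg`;
}

// Hypothetical usage with the viewer (names are placeholders):
//
//   const url = textureUrl("https://cdn.example.com/finishes", "table-01", "oak-smoked", 3);
//   await api.parameters.updateAsync([{ name: "TextureUrl", value: url }]);
```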

Are there any limitations I should be concerned about? What is the effect on load times? Will the images be cached? Is there a better way of accomplishing this?

Thanks

It is a good approach to have only one model with all shape variations and different materials. Loading external textures with the ShapeDiverImageInput gives you the flexibility to update your material collection, and any image that is part of the material definition in Grasshopper will get cached to improve performance. Keep in mind that if you change the texture but the URL stays the same, the cached original will be rendered.

A few important points regarding textures:

  • keep textures reasonably small, ideally below 1MB (files larger than 4MB will not work)
  • make sure to use seamless textures for the best results
  • textures should be square, with sizes of 256, 512, 1024 or 2048px
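With a large finish library, it may be worth encoding these constraints as a pre-upload check. The thresholds below are taken from the list above; the function itself is just an illustrative sketch, not anything provided by ShapeDiver:

```javascript
// Pre-upload sanity check for finish textures: square, power-of-two size,
// and within the file-size limits mentioned above. Limits per the forum post.
const ALLOWED_SIZES = [256, 512, 1024, 2048];
const HARD_LIMIT = 4 * 1024 * 1024; // files of 4 MB or more will not work
const SOFT_LIMIT = 1 * 1024 * 1024; // ideally stay below 1 MB

function checkTexture(widthPx, heightPx, fileBytes) {
  const problems = [];
  if (widthPx !== heightPx) problems.push("texture is not square");
  if (!ALLOWED_SIZES.includes(widthPx)) problems.push("size is not 256/512/1024/2048 px");
  if (fileBytes >= HARD_LIMIT) problems.push("file is 4 MB or larger and will not load");
  else if (fileBytes > SOFT_LIMIT) problems.push("file exceeds the recommended 1 MB");
  return problems; // empty array means the texture passes all checks
}
```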

Find more details in this article:

Thanks, Pavol.

I had read the links you presented but wasn’t 100% sure if images introduced via ShapeDiverImageInput would be cached for any reasonable length of time. I’ll use a naming convention that ensures updated images are served when there is an update.

Regarding texture mapping, I’m using the C# script component (it builds a mesh from polylines) that Edwin uses in “How To Create An Online Table Configurator”, with the U and V texture size inputs receiving values from integer input parameters. Each texture will have an associated “to-scale” integer that determines the value used for the texture size. Assuming standard-sized images (thanks for the recommended sizing) and predictable (linear) behaviour, it should be relatively straightforward to map the textures consistently with a formula. Can you provide any insight here? For example, a 512px square image that represents a 1000mm square portion of texture will require a texture size value of {magicNumber} to be at scale in the model.

This doesn’t facilitate texture rotation, but as this value is static, I’ll just define it in the model using a ShapeDiverTextureTransform component. It’s up to us to ensure the texture images are oriented consistently.

The LoftMesh component helps with texturing too, as it allows you to input the texture size in model units. If a texture covers a 1000mm portion of a pattern, set the U and V TextureSize inputs to 1000.

Ideally, all your textures have the same orientation and pixel size and represent the same real-world area of a pattern. If that’s not the case, use the ShapeDiverTextureTransform component for scaling. For example, if the LoftMesh U/V texture size is set to 1000mm but the image represents a 500x500mm area, the image has to repeat twice, i.e. be scaled by a factor of 2; see the example and article below. The number of pixels (resolution) needed depends on the real-world size of the texture: a 1000x1000mm area will obviously need more pixels than a 500x500mm one. It’s best to test and see, as it also depends on the size of the embedded viewer on your web site, model settings such as enabled/disabled zoom, and so on.
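The arithmetic above reduces to a single ratio. This tiny sketch (the function name is mine; the components only need the resulting number) shows how the scale factor for ShapeDiverTextureTransform would be derived, and why the "magic number" is 1 when the image's real-world coverage matches the texture size:

```javascript
// If the LoftMesh U/V texture size is given in model units (mm) and the image
// represents a known real-world area, the TextureTransform scale is the ratio.
function textureScale(textureSizeMm, imageCoversMm) {
  // How many times the image must repeat across one texture tile.
  return textureSizeMm / imageCoversMm;
}

textureScale(1000, 500);  // image covers 500 mm, tile is 1000 mm -> repeats twice, scale 2
textureScale(1000, 1000); // image matches the tile -> scale 1 (the "magic number")
```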

test texture mapping.gh (30.7 KB)


Thanks, I appreciate the definition - one is a great magic number