Heat Method

Hello Yanna, this code looks amazing, but after installing it following the instructions, the GH component does not run and reports ‘there is no module named compas_slicer’. Can you tell me what is wrong? Thank you very much.

Hello! You can find more info on that library here.
https://compas.dev/compas_slicer
If you have more questions, you can create an issue on github, or contact the authors directly :slight_smile:

1 Like

Dear Mr Piker,
I am very lucky to see such an amazing algorithm, which could be revolutionary for 3D printing.
Unfortunately, I cannot fully understand the theory part because of the language barrier.
So I want to ask if you could add more comments to the code to make it easier to follow.

Thank you in advance

@DanielPiker awesome as always. Using TextureCoordinates to encode patterns is really interesting. And how about the Vector Heat Method, could this be implemented in your code?
Thank you for all the contributions, Daniel!

I did try some stuff with not just vectors, but the delightfully named ‘N-RoSy fields’
(rotationally symmetric fields of given order)


Nrosyfields_example.gh (261.0 KB)

You can set the rotation order and input guide directions as lines

It will work best on a tri-remeshed mesh.

14 Likes

:astonished: wow!!! Thank you @DanielPiker for sharing! Is there any static version of this?
In the case of Symmetry = 1, is there any way to embed TextureCoordinates in your script, ideally to work with your IsoContour or to apply a texture map?
T.

Direction fields (vector fields and N-RoSy fields) themselves cannot be contoured in the same way as scalar fields. (You can trace streamlines in them, but this is a whole different approach with many other issues to consider around sampling etc.)
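The streamline tracing mentioned above can be sketched in a few lines. This is a deliberately minimal toy version on an analytic 2D field using forward Euler integration; real tracing on a mesh also has to handle edge crossings, seeding density, and termination — the "other issues" referred to above. All names here are illustrative, not from any posted script.

```python
import math

def field(x, y):
    # Example direction field: unit tangents of concentric circles
    # around the origin (a field with closed-loop streamlines).
    r = math.hypot(x, y) or 1.0
    return (-y / r, x / r)

def trace_streamline(x, y, step=0.05, n_steps=200):
    """Trace a streamline by stepping along the field (forward Euler)."""
    pts = [(x, y)]
    for _ in range(n_steps):
        vx, vy = field(x, y)
        x, y = x + step * vx, y + step * vy
        pts.append((x, y))
    return pts

pts = trace_streamline(1.0, 0.0)
```

Even this toy case shows the sampling issues: forward Euler drifts outward on a circular field (each step adds `step**2` to the squared radius), so practical tracers use higher-order integration and adaptive step sizes.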

You can go from a scalar field to a vector field in a simple way by calculating its gradient, but going the other way round is a bit different.
In some of the other scripts further up this thread I do show ways of generating a scalar field with gradient matching a vector field. In those examples, the vector field is also coming from the scalar field, so they are linked cyclically, but it would also be possible to generate the vector field separately first with the direction smoothing script I just posted and use that.
For direction fields that form closed loops, you also need to include some form of value wrapping for the scalar, since around a loop you obviously can’t assign every point a single value that is higher than the one before it.
I can’t recall if I included this in any of the examples already posted, but it is something I already have code for.

I even tried some funky stuff with smoothing complex valued fields that I’ve been meaning to write up and post examples of.

But before getting too far ahead of ourselves:
what are you actually trying to do here? What’s the result you have in mind?

2 Likes

Thanks for your detailed explanation @DanielPiker. I definitely need to review those field concepts.
The stripe pattern is probably closer to the result I would like, since it takes an input vector, and I already saw that your IsoContour works.

A stripe pattern is simple to get using directional reaction diffusion. Am I missing something?

One difference between using a reaction diffusion approach like that and the method described in the stripe patterns paper is that you get a scalar field rather than a parameterisation, which is what could be used for mapping textures.

2 Likes

With this direction field smoothing you can also set multiple sample directions in a few places and have it generate a smooth interpolation between them over the whole mesh


Nrosyfields_interp.gh (117.4 KB)

8 Likes

Hi Daniel,
Really cool image, I’ve been trying to do something similar to make gcode for a 2.5 axis plotter I’m building. How are you finding the shading values on the surface that then drives the line thickening? Is this made in grasshopper or is it possible to reference the shadows from geometry in the Rhino viewport?
Also, is there a simple way to find the segments of the curves which are visible to the camera? At the moment I’m using mesh ray collisions for each point on the polylines with the meshes that created them, but this is producing some weird artefacts where the face normals are perpendicular to the camera ray.
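One common fix for those grazing-angle artefacts is to nudge each sample point slightly towards the camera before casting the occlusion ray, so the face the point itself lies on never registers as an occluder, and to compare hit distances with a small tolerance. Here is a toy, pure-Python sketch of that idea using Möller–Trumbore ray–triangle intersection (in Rhino you would cast against the mesh instead, e.g. with `Intersection.MeshRay`); all names are illustrative, not from the script above.

```python
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def ray_hits_triangle(orig, direc, tri, eps=1e-9):
    """Moeller-Trumbore: distance t to the triangle along direc, or None."""
    v0, v1, v2 = tri
    e1, e2 = sub(v1, v0), sub(v2, v0)
    h = cross(direc, e2)
    a = dot(e1, h)
    if abs(a) < eps:              # ray parallel to the triangle's plane
        return None
    f = 1.0 / a
    s = sub(orig, v0)
    u = f * dot(s, h)
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = f * dot(direc, q)
    if v < 0.0 or u + v > 1.0:
        return None
    t = f * dot(e2, q)
    return t if t > eps else None

def is_visible(pt, camera, triangles, nudge=1e-4):
    """True if pt can see camera past the given occluder triangles."""
    to_cam = sub(camera, pt)
    dist = dot(to_cam, to_cam) ** 0.5
    d = tuple(c / dist for c in to_cam)
    # Nudge the sample point toward the camera so the face it sits on
    # never occludes it -- the source of the grazing-angle artefacts.
    start = tuple(p + nudge * c for p, c in zip(pt, d))
    for tri in triangles:
        t = ray_hits_triangle(start, d, tri)
        if t is not None and t < dist - nudge:
            return False
    return True
```

With this, a point lying exactly on a mesh face is still reported visible (the face behind the nudged start point is ignored), while a face genuinely between the point and the camera still occludes it. In practice you would tune the nudge to the model scale, or nudge along the face normal instead when the camera ray is nearly tangent to the surface.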