Linetype Taper on Make2d

Did a little test with the new linetypes in GH, which taper lineweights based on distance from the camera. It’s a little rough, but it begins to suggest some sense of depth, which is nice for section perspectives. I used ‘Silhouette’ so the curves would be in 3D. Files attached for anyone interested.

The results:


The only problem is that because the lines are in 3D, the intersections aren’t handled if you hide the 3D geometry from the viewport. (If anyone has any ideas for solving this problem, I’d be very grateful!)

line_tapering_by_dist_from_camera.3dm (6.6 MB) (26.4 KB)

Linetypes defined with widths in model units will be thinner the further they are from the camera. This does not involve defining any sort of taper.
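
To illustrate the point above: under a perspective projection, the on-screen size of anything with a fixed model-space dimension falls off roughly in proportion to its distance from the camera, which is why model-unit widths look thinner farther away. Here is a minimal sketch of that relationship (the function name and the simple pinhole model are my own illustration, not Rhino’s implementation):

```python
def apparent_width(model_width, distance, focal_length=1.0):
    """Approximate on-screen width of a line whose width is fixed in
    model units, under a simple pinhole-camera perspective model."""
    return focal_length * model_width / distance

near = apparent_width(0.5, distance=2.0)    # line close to the camera
far = apparent_width(0.5, distance=20.0)    # same line, ten times farther
print(near, far)  # the far line draws ten times thinner
```

No explicit taper is defined anywhere; the thinning is just perspective division.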

Ah, I was under the impression that I’d have to manually scale the width values defining the line in order to get it to hold up in Make2D. Is there any way to keep it?

Any way to keep what? I’m not sure what you are asking for.

Sorry for the lack of clarity! I will try to explain:

Here are the lineweights scaled by the viewport, as you have described. Since the curves are in 3D space, and the width uses ‘model units’, we see the closest object has the thickest lines:

That’s already very good, but the lines which extend into the scene have no taper and are of uniform thickness.

Anyway, if I try to Make2D these silhouette lines, all the lines will have the same thickness. Which makes sense: since they all end up on the same plane, they are all the same distance from the camera.

Ultimately, I want to keep those differences in lineweights through Make2D, so I can export them to Illustrator.

(If I simply export the 3D curves to Illustrator, this is what I get.)

I guess it’s in a similar vein to the side tangent about transparency based on distance from the camera, fog, etc. The goal is to have Make2D linework whose width scales with distance from the camera.

So if I have a line, for example, that we know is extending from close to us further into the distance:

We would want this line to have some visual effect of “vanishing” into the distance:

Of course, the vanishing point we talk about in perspective drawing doesn’t translate directly here. But it’s a similar idea.

Maybe I am misunderstanding the way it works, but I thought that every curve had the width as an attribute, so shouldn’t I be able to:

  1. Get the width of each existing curve
  2. Project curve onto plane
  3. For each projected curve, reassign the width from the original (unprojected) curve
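
The three steps above could be sketched roughly like this. This is plain Python, not RhinoCommon; the `Curve` class, the `width` attribute, and the flatten-to-a-plane projection are all illustrative assumptions standing in for the real API:

```python
from dataclasses import dataclass

@dataclass
class Curve:
    points: list          # (x, y, z) control points
    width: float = 0.0    # per-curve plot-width attribute

def project_to_plane(curve, z=0.0):
    """Step 2: flatten a 3D curve onto a horizontal plane at height z.
    Step 3: carry the original curve's width over to the projection."""
    flat = Curve(points=[(x, y, z) for (x, y, _) in curve.points])
    flat.width = curve.width   # width read in step 1 survives projection
    return flat

curves_3d = [Curve([(0, 0, 5), (1, 1, 9)], width=0.8),
             Curve([(2, 0, 1), (3, 1, 2)], width=2.0)]
curves_2d = [project_to_plane(c) for c in curves_3d]
print([c.width for c in curves_2d])  # widths survive: [0.8, 2.0]
```

The catch in practice is getting access to the pre-projection Make2D curves so the widths can be read before flattening, which is what the SDK question later in the thread is about.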

Edit to add:
In the Grasshopper script I attached earlier, I used the ‘start width’ and ‘end width’ of each curve to set its width.

I took the distance between each endpoint and the camera point, and I remapped the value. The smaller the number, the closer the point is to us, so the thicker the width. And vice versa.
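
A minimal sketch of that remapping (the distance bounds and width range here are made-up values, not the ones from the attached GH file): shorter camera distances map to thicker widths, longer distances to thinner ones.

```python
import math

def remap(value, src_min, src_max, dst_min, dst_max):
    """Linearly remap value from [src_min, src_max] to [dst_min, dst_max]."""
    t = (value - src_min) / (src_max - src_min)
    return dst_min + t * (dst_max - dst_min)

def width_from_distance(point, camera, d_min, d_max,
                        w_thick=3.0, w_thin=0.3):
    """Width for a curve endpoint, from its distance to the camera point."""
    d = math.dist(point, camera)
    d = min(max(d, d_min), d_max)        # clamp into the source range
    # Inverted mapping: smaller distance -> thicker line, and vice versa
    return remap(d, d_min, d_max, w_thick, w_thin)

camera = (0.0, 0.0, 0.0)
print(width_from_distance((0, 0, 10), camera, 10, 100))   # nearest point
print(width_from_distance((0, 0, 100), camera, 10, 100))  # farthest point
```

Applied to both endpoints of a curve, this gives the start and end widths that produce the taper.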

I hope that is clearer. Thank you!

They do, it’s just not enough to show with the scene that you are displaying.

I understand now. You want linetypes applied to the resulting geometry from Make2d to show their thickness as if they were still in 3d. @GregArden or @rajaa is it possible from the SDK to get at the resulting Make2D curves before they are projected to 2D?

No matter what, I don’t think the results will be very usable in Illustrator. We still need to figure out how to best get curves with taper exported to applications like that.


I see, thanks for the screenshot! It’s good to know that. In the meantime, I guess this is the closest I’ve gotten to something with vector export (although wildly inefficient):

  1. Create silhouette curves

  2. Pipe/multipipe them into physical geometry.

  3. Since they are physical geometry, the “width” (size of pipe) will scale based on distance from camera

In the screenshot, I assigned a black Plaster material in the rendered view, so we can see the pipes more clearly.

  4. MeshOutline the pipes

  5. Hatch them, to get the “lines” (really a bunch of fills)

Anyway, my takeaway is that even if the tapered lines don’t translate directly to Illustrator, I hope that, perhaps as an intermediate step, we will be able to use DupBorder on them or explode them into planes/hatches instead of taking a screenshot and doing an image trace. That way we can have some kind of vector export, even if it’s a fill and not a line.

(Sorry for the many questions! Based on the Perspective Rotate feature, I am curious whether the lines are actually planar geometry that always faces us?)


All curve widths are computed in what we would call camera coordinates (always face us).