The goal is to get the points p1’ and p2’ (green) from the initial points p1 and p2, and their respective direction vectors v1 and v2, which are unit vectors.
Multiplying v1 and v2 by the offset distance d won’t work (cf. example). The distance d is measured perpendicular to the line l. Neither vector is perpendicular to l, which means that their magnitudes must be bigger or smaller than d, depending on their angles to the line l!
The lengths of lines l and l’ shouldn’t be relevant here, but yes, both lines only have the same length if both vectors v1 and v2 are perpendicular to line l, and thus to l’!
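A quick numeric check makes the problem concrete (a sketch with assumed values: l along the x-axis, a direction vector at 45°, and d = 1; none of these numbers come from the original example file):

```python
import math

# Assumed setup: line l along the x-axis, offset distance d = 1,
# and a unit direction vector at 45 degrees to l.
d = 1.0
v = (math.cos(math.pi / 4), math.sin(math.pi / 4))  # unit vector, 45 deg to l

# Naive offset: move the endpoint by d along v.
p_naive = (d * v[0], d * v[1])

# The perpendicular distance to l (the x-axis) is just the y-coordinate.
# It comes out to ~0.707, not 1.0, so the naive offset falls short of d.
print(p_naive[1])
```

This is exactly why the vectors need to be scaled by more than d whenever they are not perpendicular to l.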
Line l’ is the line between the points p1’ and p2’, which I want to find in the first place.
I would have appreciated @Joseph_Oster’s input, as I did @maje90’s suggestion (even though that also was what I specifically didn’t ask for), if he hadn’t started by throwing around false accusations that instantly exposed him as a hater (at least of this discussion).
And after my reply, calling for censorship in the forum and simply announcing that he now ignores me seems like another piss-poor, childish approach. Be a man and face your mistakes!
Thanks @HS_KIm, I’m going to check it out. Looks promising!
If I understand the question correctly, you are looking for the scalar multiplier for each vector which takes that end point to a line at some distance d from, and parallel to, your original line l.
You can use the dot product to get the length of the projection of each vector onto a unit vector perpendicular to l, then divide d by this, like so: dotproduct.gh (12.6 KB)
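The idea in plain Python (a minimal 2D sketch, not the attached Grasshopper definition; the function name and tuple representation are my own assumptions): if n is the unit normal of l, then p + t·v lies at perpendicular distance d when t·(v·n) = d, so t = d / (v·n).

```python
import math

def offset_endpoint(p, v, l_dir, d):
    """Scale the unit vector v so that p + t*v lands on the line
    parallel to l at perpendicular distance d (2D case).
    p, v, l_dir are (x, y) tuples; l_dir is the unit direction of l."""
    # Unit normal to l: rotate the direction 90 degrees.
    n = (-l_dir[1], l_dir[0])
    # Length of the projection of v onto n (the dot product).
    proj = v[0] * n[0] + v[1] * n[1]
    # Scale factor: t * (v . n) must equal d.
    t = d / proj
    return (p[0] + t * v[0], p[1] + t * v[1])

# Example: l along the x-axis, v at 45 degrees, d = 1.
v = (math.cos(math.pi / 4), math.sin(math.pi / 4))
p_new = offset_endpoint((0.0, 0.0), v, (1.0, 0.0), 1.0)
print(p_new)  # lands at y = 1.0, i.e. exactly distance d from l
```

Note that this divides by v·n, which goes to zero as v becomes parallel to l; in that degenerate case no finite scale factor exists, which matches the geometry.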
I think this kind of thing is covered in Rajaa’s guide: