Context (in case there’s an obvious easier way to achieve what I want):
I’m ultimately trying to get transform numbers from a list of geometries and export them as CSV: 3 translation numbers (x, y, z), 3 rotation numbers (yaw, pitch, roll), and 3 scale numbers (x, y, z).
E.g. I need a string like this for each piece of geometry:
1, 2, 1, 45, 0, 0, 40, 20, 35
representing Translate(1,2,1), Rotate(45, 0, 0), and Scale(40,20,35)
I’m sending transforms into a GHPython node and parsing them into the string I want.
For rotations, I’m using the transform.GetYawPitchRoll() method, which returns a bool indicating whether the input transform represents a pure rotation, along with the three angles. When it works, it looks like this:
(True, 1.5707963267948966, 0.0, 0.0)
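For reference, here’s roughly what a yaw/pitch/roll extraction looks like for a pure rotation matrix, assuming the common Z-Y-X convention (yaw about Z, pitch about Y, roll about X). I haven’t verified that this is exactly the convention GetYawPitchRoll uses internally, so treat it as an illustrative sketch, not RhinoCommon’s implementation:

```python
import math

def yaw_pitch_roll(R):
    """Extract (yaw, pitch, roll) from a 3x3 rotation matrix R,
    assuming R = Rz(yaw) * Ry(pitch) * Rx(roll)."""
    yaw = math.atan2(R[1][0], R[0][0])   # rotation about Z
    pitch = math.asin(-R[2][0])          # rotation about Y
    roll = math.atan2(R[2][1], R[2][2])  # rotation about X
    return yaw, pitch, roll

# A 90-degree rotation about the Z axis:
a = math.pi / 2
Rz = [[math.cos(a), -math.sin(a), 0],
      [math.sin(a),  math.cos(a), 0],
      [0,            0,           1]]
print(yaw_pitch_roll(Rz))  # ~ (1.5707963..., 0.0, 0.0), matching the tuple above
```

This only makes sense when the matrix is a pure rotation (orthonormal, no translation or scale), which is presumably why the method returns a bool alongside the angles.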
But for certain uses of the Rotate3D node around e.g. the shape’s centroid, something is breaking and returning:
(False, -1.2343210123432101e+308, -1.2343210123432101e+308, -1.2343210123432101e+308)
(These look like RhinoCommon’s RhinoMath.UnsetValue sentinel, -1.23432101234321e+308, which is returned when a computation has no valid result, rather than the most negative 64-bit double, which would be about -1.8e+308.)
The only case I’ve found where Rotate3D produces a transform that GetYawPitchRoll accepts as a rotation is when the rotation center is the origin (0,0,0).
Anyone have any idea why this would be happening? Is there a better way to do what I want?
Update: Just realized I could be looking at the transform matrices, and the afflicted rotations have non-zero values in the 4th column, where translation data is stored. I don’t recognize the numbers there. Any clues what they represent?
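My guess at what’s happening: rotating about a centroid c is typically built as T(c) · R · T(−c), i.e. translate the center to the origin, rotate, translate back. Multiplying that out leaves c − R·c in the 4th column, so the result is a rotation plus a translation, not a pure rotation, which would explain both the unfamiliar numbers and GetYawPitchRoll returning False. A plain-Python sanity check of that decomposition (hand-rolled 4x4 matrices, no RhinoCommon):

```python
import math

def matmul(A, B):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(x, y, z):
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]

# Rotate 90 degrees about Z around a center c = (1, 2, 1):
cx, cy, cz = 1, 2, 1
R = rot_z(math.pi / 2)
M = matmul(translation(cx, cy, cz), matmul(R, translation(-cx, -cy, -cz)))

# The 4th column of M holds c - R*c, which is non-zero here:
col4 = [M[0][3], M[1][3], M[2][3]]
print(col4)  # approximately [3.0, 1.0, 0.0], since R*c = (-2, 1, 1)

# Zeroing that column would recover the pure rotation R,
# whose yaw/pitch/roll could then be extracted.
```

So one workaround might be to read the translation out of the 4th column first, strip it, and only then ask for yaw/pitch/roll, though note the transform then represents a different motion than the original rotate-about-centroid.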
Rotations.gh (30.1 KB)