Besides using Rhino to present my sculptures built in ferrocement ( http://www.ohlers.com/ ),
I am also into sound engineering and experimental music.
I just saw this combination of VBAP (Vector Base Amplitude Panning, which is basically volume panning between multiple speakers) and Max (patch-based software, similar to Grasshopper, but for audio and video design).
have a look at these links:
now… I imagine this: using Rhino to control the 3D panning environment. For example: build a 3D model in Rhino of the room where the actual music/performance is going to take place, position the speakers in this virtual room, and then pan points around inside the Rhino 3D space (e.g. "hands on" with the mouse, or along custom-drawn panning lines or curves; simple stuff).
The x,y,z coordinates of these "panning points" could then be transferred to VBAP and/or Max.
In other words, the question is: how do I get the x,y,z coordinate data "out" of Rhino?
Preferably live, so I can record the panning in Max and repeat it.
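One common way to get coordinates "out" live is OSC (Open Sound Control) over UDP, which Max can receive natively with its [udpreceive] object. Below is a minimal sketch of the sending side in plain Python: it packs a point's x,y,z into an OSC message by hand, so it has no dependencies. The host, port, and OSC address are placeholders you would match to your own Max patch; inside Rhino, the coordinates themselves could come from something like rhinoscriptsyntax's PointCoordinates (an assumption about your setup, not tested here).

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """Pad a byte string with NULs to a multiple of 4 bytes, as the
    OSC spec requires (strings always get at least one terminating NUL)."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_point_message(address: str, x: float, y: float, z: float) -> bytes:
    """Build a minimal OSC message carrying three 32-bit floats (x, y, z)."""
    return (osc_pad(address.encode("ascii"))   # OSC address pattern
            + osc_pad(b",fff")                 # type tags: three floats
            + struct.pack(">fff", x, y, z))    # big-endian payload

def send_point(x, y, z, host="127.0.0.1", port=7400, address="/pan/1"):
    """Fire one point at Max as a single UDP datagram.
    host/port/address are hypothetical -- set them to match a
    [udpreceive 7400] object in your Max patch."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(osc_point_message(address, x, y, z), (host, port))
    sock.close()
```

Calling send_point() every time the point moves (e.g. from a script hooked to mouse dragging) would give you the live stream you describe; Max can then record and replay the incoming messages on its side.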
Or: can I record the movements of objects in Rhino and replay them?
(Here it would be multiple points, each moving along a dedicated curve, each point representing an instrument or a sound track.)
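Even without built-in animation recording, "a point moving along a dedicated curve" can be sampled into a list of timestamped coordinates that you record once and replay as often as you like. The sketch below uses a horizontal circle as a stand-in for a Rhino curve; in Rhino you would instead evaluate the real curve (e.g. with rhinoscriptsyntax's EvaluateCurve, which is an assumption about your scripting environment, not something this snippet calls).

```python
import math

def sample_circle_path(radius=3.0, height=1.5, steps=16):
    """Stand-in for evaluating a Rhino curve: walk once around a
    horizontal circle and return a list of (t, x, y, z) frames,
    with t as the normalized position (0.0 .. <1.0) along the path."""
    frames = []
    for i in range(steps):
        t = i / steps                # normalized position along the path
        a = 2 * math.pi * t          # angle for the stand-in circle
        frames.append((t, radius * math.cos(a), radius * math.sin(a), height))
    return frames

def replay(frames, send):
    """Replay recorded frames by handing each point to a sender callback,
    e.g. an OSC/UDP sender feeding Max. In a live setup you would also
    pace the loop using each frame's t value."""
    for t, x, y, z in frames:
        send(x, y, z)
```

One list of frames per point/track gives you exactly the "multiple points, each on its own curve" scenario: record each list once, then replay (or re-time) them independently.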
Or another scenario: using the SpaceMouse to simply fly around in the space.
Sorry about my "non-tech" vocabulary.
I can foresee that I will have to learn some level of programming, but for now I would like to hear your comments on this, and I hope for some creative input and ideas from you guys here.