Touch gestures in RhinoCommon with a hook into WM messages

I’m testing the touch features of Rhinoceros 7.16 from RhinoCommon, specifically the effect of the EnableWindowsGestures parameter, and I’ve found that:

  • If I set this parameter to true, touch zoom/pan works as described in the original thread that introduced the option, but the native drawing features no longer respond to touch (for example, a _-Sketch script with deviceDown/dragToDefineCurve/deviceUp works only from the mouse)
  • If I set this parameter to false, the native drawing features work again and the touch gestures are disabled (the sketch example can now be drawn with both mouse and touch devices)

Would it be possible to have both features when it is true, and only the drawing features when it is false? That way the parameter would control only whether Rhinoceros itself handles the touch gestures.

A more subtle problem, related to my needs, is that a WndProc hook (via the Windows API) on the Rhinoceros 7 window handle (IntPtr) receives the WM_GESTURE message (281) only with the begin and end events, and none of the internal gesture data. The same code attached to Rhinoceros 5 exposes the internal data of the zoom, pan, and rotate gestures, as parsed by Windows before delivery (see SetGestureConfig function (winuser.h) - Win32 apps | Microsoft Docs and GESTURECONFIG (winuser.h) - Win32 apps | Microsoft Docs). If the parameter is false, client plug-ins should be able to hook these events to implement their own version of each gesture, transforming their macro-objects according to the gesture the user performs.

If you restore the Rhino 5 behavior of forwarding WM message 281 when the parameter is false, my problem is solved, and the native features work from both mouse and touchscreen. Let me know if you need more info.

Hi @cavalli.stefano,

My apologies - I don’t know how to help…

– Dale

I have answered in the main post of this feature.