Touch as normal mouse input

I am trying to use the Rhino Sketch command with a touchscreen. The problem is that touch input in Rhino views is interpreted as panning/rotating and not as mouse input.

I tried several hacks/workarounds:

i) Calling UnregisterTouchWindow -> Rhino crashes when this is applied
ii) Installing a low-level mouse hook (WH_MOUSE_LL) -> Rhino crashes as well
(I only tried this by directly manipulating the registry, but it did not work for me)
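For reference, the first workaround boils down to a call like the following ctypes sketch. The helper name `unregister_touch` is mine; `UnregisterTouchWindow` is the real user32 API, and `hwnd` would have to be the Rhino view's window handle (which, as noted above, crashed Rhino when I tried it):

```python
import ctypes
import sys

def unregister_touch(hwnd):
    """Ask Windows to stop delivering WM_TOUCH to a window, so that
    touches fall back to emulated mouse messages (WM_LBUTTONDOWN etc.).
    Returns True on success, False on failure or on non-Windows systems.
    """
    if sys.platform != "win32":
        return False  # UnregisterTouchWindow only exists in user32.dll
    user32 = ctypes.windll.user32
    # A NULL or otherwise invalid hwnd makes the call fail and return 0.
    return bool(user32.UnregisterTouchWindow(hwnd))
```

The unregister call has to target the specific view window that was registered for touch; Rhino registers its views itself, which is presumably why pulling the touch registration out from under it crashes.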

Would it be possible to imitate the Sketch command with a touchscreen through a digitizer plugin?

Probably not. I do know that both Sketch and DigSketch share a lot of the same code, and both tools have special-case mouse handling. Why do you need the Sketch command to work with a touch device?

I need Sketch to draw curves that will later be glued by a robot.
After two days of googling and experimenting, I found a hack that works on Windows 7: a replacement driver called udpp, which works like a charm.

For Windows 8 there could be a different free solution.

Let me rephrase this:
Why would one NOT want (or need) to use the touchscreen or a digitizer stylus as input for a “freehand” command such as Sketch?

If I had just replied, “sorry, no,” then I would not have learned much, correct? Asking questions helps me learn more about what the customer is trying to do and why, which helps in my decision-making process.

Hope this helps.

– Dale

I love how you guys quickly react to all our wishes so my post wasn’t meant as a criticism of any kind.

Maybe it’s because I use Rhino on a touch-capable laptop that my view on things is slightly biased towards touch input.

I used the touch interaction (pan, zoom, rotate) in Rhino to present my work and totally love it. Still, I can see that touch input is not precise enough for most modelling commands. On the other hand, I frequently use Rhino as a vector drawing program because I think the snap functionality of Illustrator is utterly useless for anything half precise.

So in fact, being able to sketch and draw with my finger or a stylus would be totally awesome for me.

On a side note: viewport interaction is a nice thing, but once I start a command, viewport manipulation is secondary to data input. My laptop came with a touch stylus that provides a pick precision similar to a normal pen, so precision may not be that much of a problem in any case.
So how much of a problem would it be to generally switch the input mode once a command is executed?
There are a few commands, such as Sketch, that definitely lend themselves to touch or pen input, and others that might work well enough.

We actually spent quite a bit of time playing with Rhino 5 WIPs on touch devices, and we could never get the “feel” right. Modeling always felt “weird” at best. Much of this has to do with all of the code we’ve written that was designed just for mouse input. And there were modeling features that we just didn’t know how to implement. For example, object snaps work on mouse-over, and there is no such thing as mouse-over on a touch device. So the decision was made to make Rhino 5 work like our iRhino viewing application.

Going forward, we are looking at and playing with lots of touch devices and trying to figure out what touches, gestures, etc. feel good enough for modeling. I’m sure when we get serious about this, we’ll ask for input from our helpful community.

I’m not sure I can answer this. But eventually (I hope), Rhino will work with both inputs simultaneously.

In our application we need CAD features to display DXF data and to rebuild sketch-drawn curves. We do not need any snapping, and the tolerances for the glue curves are high. The glue curves will be “modelled” by people who may not have any computer experience, so “touch as mouse input” is the way to go for this project. What makes it even simpler is that everything is 2D. Actually, Rhino is somewhat overkill for this simple project.
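As an illustration of the “rebuild sketch-drawn curves” step (my own standalone sketch, not Rhino SDK code): with loose tolerances, the raw touch samples can be thinned with Ramer-Douglas-Peucker simplification before a curve is rebuilt through the surviving points.

```python
import math

def _point_segment_dist(p, a, b):
    """Distance from point p to the segment a-b (all 2D tuples)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamped to its endpoints.
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def simplify(points, tol):
    """Ramer-Douglas-Peucker: drop points that deviate less than tol
    from the chord between the endpoints, recursing on the farthest one."""
    if len(points) < 3:
        return list(points)
    a, b = points[0], points[-1]
    idx, dmax = 0, 0.0
    for i in range(1, len(points) - 1):
        d = _point_segment_dist(points[i], a, b)
        if d > dmax:
            idx, dmax = i, d
    if dmax <= tol:
        return [a, b]
    left = simplify(points[:idx + 1], tol)
    right = simplify(points[idx:], tol)
    return left[:-1] + right  # drop the shared middle point
```

In Rhino itself the surviving points could then be fed to an interpolated-curve command; the tolerance here plays the same role as the glue tolerance.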

I just started using an Intuos Pen and Touch CTH-680 tablet on my Dell XPS27 with Windows 8.1. It works great, with the exception that I’m unable to right-click or double-click using the pen in Rhino. I can double-click by clicking twice, but not using the rocker-switch settings. I can right-click and double-click using the Intuos express keys and touch mode, but both of these are unwieldy to work with.

I realize this was a while ago, but you might try running the 32-bit version of Rhino. I found that it does exactly what you’re looking for (although I considered it a bug). Touch input creates a selection window, and can be used with commands to draw lines, etc. Touch/pinch zoom works, but there’s no way to pan/rotate using touch.

Thanks Owen. I finally got around to checking this out and you’re right, it does do exactly what I’m looking for. Could you explain the difference in functionality between 32 bit and 64 bit? I’ve been using the 64 bit for so long that it seems like a step backward to use 32 bit. Maybe that’s not the case though.

I also wish the 32-bit touch functionality was available in later releases.

Dale, the thing about partial functionality without mouseover makes sense, but totally changing the input behavior results in a frustrating user experience, especially for people who work across multiple applications. If people can use touch input as a mouse in Illustrator, Photoshop, Chrome, etc. it’s definitely going to “feel weird” for Rhino to work differently.

Couldn’t there just be a setting that allows people to choose between the iRhino behavior and normal touch-as-mouse behavior?

@Carsten_Rodin, this is a conversation that should be happening in the Serengeti category.


– Dale