OpenGL Feature Levels

@stevebaer Somewhat related:

Do you have a list of which V6 features require a specific level of OpenGL support? And what happens if the required level isn’t available (e.g., a built-in workaround, benign failure, crash, etc.)?

In my case I’m asking because I use Rhino on a MacBook Pro Retina with an Nvidia GeForce GT 650M via Parallels. The latest Parallels release only supports OpenGL 3.2. I realize McNeel doesn’t offer support for virtualization, but I’m hoping that with this info I’ll have a better idea of what to expect.

I suspect this info would also help users running Windows natively on older machines.

I don’t have a document put together that lists which features are enabled as OpenGL driver support levels get higher. When a required level isn’t available, we either fall back to a slower technique that achieves the same results or drop the feature entirely. This has always been the case for Rhino. In V5, if you switched to software-mode OpenGL (which is OpenGL 1.2), Rhino would still display geometry, but it would be slower and some of the effects in the rendered display would not be applied.
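
Roughly, that gating looks something like the sketch below. This is a simplified illustration, not our actual code; it assumes a current GL context and a GLAD-style loader, and the names here are made up for the example:

```cpp
// Simplified sketch of per-feature gating on the reported OpenGL level.
// Not Rhino's actual code; names are illustrative.
#include <cstdio>
#include <glad/glad.h>  // assumed loader providing the GL entry points

struct GlLevel { int major = 0; int minor = 0; };

GlLevel QueryGlLevel()
{
  GlLevel level;
  // GL_VERSION is available on every context; the string begins with
  // "<major>.<minor>" (e.g. "3.2 ..." under current Parallels).
  const char* version = reinterpret_cast<const char*>(glGetString(GL_VERSION));
  if (version)
    std::sscanf(version, "%d.%d", &level.major, &level.minor);
  return level;
}

bool Supports(const GlLevel& gl, int major, int minor)
{
  return gl.major > major || (gl.major == major && gl.minor >= minor);
}

void SetupRenderedModeEffects(const GlLevel& gl)
{
  if (Supports(gl, 3, 3))
  {
    // Full shader-based effect path.
  }
  else
  {
    // Fall back to a slower technique that gives the same result,
    // or skip the effect entirely.
  }
}
```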

Most of Rhino’s display only needs OpenGL 3.3 features, with the exceptions of:

  • GPU tessellation for wire drawing (GL 4.0). We still draw wires without this feature, but it can be significantly slower.
  • Shader caching (GL 4.1). This feature is just for speed, so the user doesn’t have to wait for shaders to recompile every time they start Rhino or switch to a different display mode (see the sketch after this list).
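
For the shader cache, GL 4.1 provides glGetProgramBinary / glProgramBinary, which let a linked program be saved to disk and restored later instead of recompiled. Here is a rough sketch of that idea (not our exact implementation; the file handling is simplified and a GLAD-style loader is assumed):

```cpp
// Rough sketch of GL 4.1-style shader caching: after the first compile,
// save the linked program's binary so the next run can skip compilation.
// Not Rhino's actual implementation.
#include <cstdio>
#include <vector>
#include <glad/glad.h>  // assumed loader providing the GL 4.1 entry points

// Save a linked program's binary to a cache file (path is hypothetical).
bool SaveProgramBinary(GLuint program, const char* path)
{
  GLint length = 0;
  glGetProgramiv(program, GL_PROGRAM_BINARY_LENGTH, &length);
  if (length <= 0)
    return false;

  std::vector<unsigned char> blob(static_cast<size_t>(length));
  GLenum format = 0;
  glGetProgramBinary(program, length, nullptr, &format, blob.data());

  std::FILE* f = std::fopen(path, "wb");
  if (!f)
    return false;
  std::fwrite(&format, sizeof(format), 1, f);
  std::fwrite(blob.data(), 1, blob.size(), f);
  std::fclose(f);
  return true;
}

// Try to restore a program from the cache; returns 0 if it has to be
// recompiled (e.g. a driver update invalidated the cached binary format).
GLuint LoadProgramBinary(const char* path)
{
  std::FILE* f = std::fopen(path, "rb");
  if (!f)
    return 0;

  GLenum format = 0;
  std::vector<unsigned char> blob;
  if (std::fread(&format, sizeof(format), 1, f) == 1)
  {
    unsigned char buffer[4096];
    size_t n = 0;
    while ((n = std::fread(buffer, 1, sizeof(buffer), f)) > 0)
      blob.insert(blob.end(), buffer, buffer + n);
  }
  std::fclose(f);
  if (blob.empty())
    return 0;

  GLuint program = glCreateProgram();
  glProgramBinary(program, format, blob.data(), static_cast<GLsizei>(blob.size()));

  GLint linked = GL_FALSE;
  glGetProgramiv(program, GL_LINK_STATUS, &linked);
  if (linked != GL_TRUE)
  {
    glDeleteProgram(program);
    return 0;  // fall back to compiling the shaders from source
  }
  return program;
}
```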

It turns out that almost all native Windows machines, even older ones, support at least OpenGL 3.3.

I spent a good part of the last few years trying to get Rhino to run well on VMware and got pretty close, but still not to a level we can support. Parallels traditionally had even worse OpenGL support until their recent release. I’m hoping to investigate both again after 6.0 is released to see if there are improvements we can make in a service release of Rhino. These environments are special, and the reported GL support level only tells half the story: there are already a bunch of workarounds in the code to handle VMware’s GL driver, and I’m sure there will be many more.
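
For what it’s worth, a common way to decide when such driver-specific workarounds should kick in is to look at the vendor and renderer strings the driver reports. The sketch below is only illustrative, not our exact detection code, and the strings it checks are assumptions that vary by product and version:

```cpp
// Illustrative only: recognizing a virtualized GL driver by its vendor /
// renderer strings so that extra workarounds can be switched on.
#include <cstring>
#include <glad/glad.h>  // assumed loader providing the GL entry points

bool LooksLikeVirtualizedGlDriver()
{
  const char* vendor   = reinterpret_cast<const char*>(glGetString(GL_VENDOR));
  const char* renderer = reinterpret_cast<const char*>(glGetString(GL_RENDERER));
  if (!vendor || !renderer)
    return false;

  // VMware guest drivers commonly report "VMware, Inc." / "SVGA3D ...";
  // Parallels drivers commonly mention "Parallels".  Exact strings vary.
  return std::strstr(vendor, "VMware")    || std::strstr(renderer, "SVGA3D") ||
         std::strstr(vendor, "Parallels") || std::strstr(renderer, "Parallels");
}
```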

Thanks, Steve. I have sent a couple of support requests to Parallels asking them to up their game on OpenGL support. Let’s see if anything happens. Maybe other users who would like to use Parallels or VMware but don’t because of outdated OpenGL support could let their virtualization vendor know how much latent demand there is for up-to-date support.