When doing a view capture (Rhino.Display.ViewCapture.CaptureToBitmap(view)), the display thickness of curves and edges in certain display modes (e.g. Pen) is scaled up if the capture resolution is larger than the viewport.
This is different from the previous behaviour, where line thickness stayed the same regardless of resolution and let us get finer lines and more detail, especially at larger resolutions: a 1 px line stayed 1 px no matter what resolution the image was output at.
Setting ViewCapture.ScaleScreenItems = False doesn’t seem to have any effect…
Has there been a change in the way this is handled? Is there a new setting to toggle this?
Also, calling ViewCapture.CaptureToBitmap(ViewCaptureSettings) throws an error in Python, saying it expects a RhinoView but got a ViewCaptureSettings. The documentation says CaptureToBitmap has an overload that takes a ViewCaptureSettings object…
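For reference, here is a minimal sketch of what I'm doing (the 7200x4800 size is just an example, and the ViewCaptureSettings constructor arguments are my reading of the docs):

```python
import Rhino
import System.Drawing

view = Rhino.RhinoDoc.ActiveDoc.Views.ActiveView

# Instance-based capture at a resolution larger than the viewport
viewCap = Rhino.Display.ViewCapture()
viewCap.Width = 7200
viewCap.Height = 4800
viewCap.ScaleScreenItems = False  # seems to have no effect

imageCap = viewCap.CaptureToBitmap(view)  # Pen-mode curves come out thicker

# The documented ViewCaptureSettings overload raises the type error instead:
vcSettings = Rhino.Display.ViewCaptureSettings(view, System.Drawing.Size(7200, 4800), 72.0)
# imageCap = viewCap.CaptureToBitmap(vcSettings)  # "expecting a RhinoView, got a ViewCaptureSettings"
```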
Thanks, @dale, but what I meant is that the docs say CaptureToBitmap() can take either a RhinoView (view in your example) or a ViewCaptureSettings, and it throws an error when I try to pass a ViewCaptureSettings object…
The overload is a static function, so you'll probably need to call it with

Rhino.Display.ViewCapture.CaptureToBitmap

when you want to pass the ViewCaptureSettings class.
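Roughly like this (a sketch; the (view, size, dpi) constructor for ViewCaptureSettings and the DPI value are assumptions, so check the docs for your build):

```python
import Rhino
import System.Drawing

view = Rhino.RhinoDoc.ActiveDoc.Views.ActiveView
settings = Rhino.Display.ViewCaptureSettings(view, System.Drawing.Size(7200, 4800), 72.0)

# The static overload is called on the class itself, not on a ViewCapture instance
bitmap = Rhino.Display.ViewCapture.CaptureToBitmap(settings)
if bitmap:
    bitmap.Save(r"C:\temp\capture.png")  # hypothetical output path
```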
These are full-scale crops of a 7200x4800 screen capture, to emphasize the difference.
The one on the left uses imageCap = viewCap.CaptureToBitmap(activeView) and the one on the right uses the static function imageCap = rc.Display.ViewCapture.CaptureToBitmap(vcSettings). The behaviour on the right, using the static method, is how it should be and how it has been in the past. Maybe it's a feature, not a bug.
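In other words (a rough reconstruction of the two calls; rc is assumed to alias the Rhino namespace, and vcSettings is assumed to be built from the active view):

```python
import Rhino as rc
import System.Drawing

activeView = rc.RhinoDoc.ActiveDoc.Views.ActiveView
vcSettings = rc.Display.ViewCaptureSettings(activeView, System.Drawing.Size(7200, 4800), 72.0)

# Left image: instance method -- screen items scale up with resolution
viewCap = rc.Display.ViewCapture()
viewCap.Width = 7200
viewCap.Height = 4800
imageCap = viewCap.CaptureToBitmap(activeView)

# Right image: static overload -- 1 px lines stay 1 px (the old behaviour)
imageCap = rc.Display.ViewCapture.CaptureToBitmap(vcSettings)
```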
For comparison, @stevebaer, this version works like the image on the left. Note that setting viewCap.ScaleScreenItems to True or False does not seem to have any effect.