Anti-aliasing doesn’t work for me on this 2011 iMac since the new patch. I can’t select anything but None in the preferences.
Thanks for fixing the command options dialog. Another improvement in my case: while you are in the layer panel, using a keyboard shortcut that starts with the same letter as a layer name jumps to that layer instead of executing the shortcut.
Software information
Software versions
Rhinoceros version: 5.0 Wenatchee 2014-05-19 (513)
OS X version: Version 10.9.3 (Build 13D65)
Plug-ins
None
Hardware information
Computer hardware
Hardware model: iMac11,2
Processor: Intel Core i3 CPU 540 @ 3.07GHz
Memory: 8 GB
Architecture: Intel 64 bit
Video hardware
Graphics: ATI Radeon HD 4670 256 MB
Memory: 256 MB
Screen size: 1920 x 1080
Displays: iMac
Third party kernel extensions
com.steelseries.BoardListener (8.56)
USB devices
PI-208: 1394A/USB2.0/eSATA combo drive
Apple: Internal Memory Card Reader
Apple Inc.: Bluetooth USB Host Controller
La-VIEW Technology: SteelSeries
Apple Computer, Inc.: IR Receiver
Apple Inc.: Built-in iSight
Bluetooth devices
Apple: Apple Wireless Keyboard
OpenGL information
OpenGL software
OpenGL version: 2.1 ATI-8.24.13
Render version: 2.1
Shading language: 1.20
Maximum texture size: 8192 x 8192
Z-buffer depth: 24 bits
Maximum viewport size: 8192 x 8192
Implementation settings
Use texture compression: No
A bit off-topic, but the software/hardware report you’ve attached looks like something every Mac user should know how to create, but I don’t. Where does it come from? How do you make it?
a. Open the Rhino Preferences panel.
b. Select the OpenGL tab. This displays a description of your computer.
c. Click the Copy to clipboard button. This copies the contents of the tab to your clipboard.
d. Paste the copied computer description into the post.
Which is right for Rhino? Does Rhino draw 4 pixels for every 1 on a Retina display, or does the OpenGL report get confused? I know that when I set Rhino to full screen it covers all the screen real estate.
There is a problem in the 2014-05-19 and 2014-05-20 WIP releases when determining the maximum allowed value for anti-aliasing. Rhino currently decides that anti-aliasing is never possible. This should be fixed in the next WIP release.
However, Rhino does place limits on the highest anti-aliasing setting. These rules were determined from the volume of crashes in the Apple OpenGL drivers or reports of display corruption. Here are the current rules:
You are limited to a maximum of 4X antialiasing if any of the following are true:
you have a retina laptop
you have more than one display
your GPU has less than 1024 MB
You are limited to a maximum of 2X antialiasing if any of the following are true:
you have a retina laptop and an external display
your GPU has less than 1024 MB and more than one display
your GPU has less than 512 MB
Anti-aliasing is disabled entirely if:
you have a retina laptop and two external displays
you have a retina laptop set to higher than normal resolution
your GPU has less than 512 MB and more than one display
you have an Intel GPU and it is not a HD 4000 or later
Again, these are performance and stability restrictions based on past performance, and can change at any time.
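The rules above can be sketched as a single decision function. This is an illustrative reconstruction, not Rhino's actual code: the parameter names are made up, and the unrestricted maximum (assumed 8x here) really comes from whatever the GPU driver reports.

```python
def max_antialiasing(retina_laptop: bool, displays: int, vram_mb: int,
                     scaled_resolution: bool = False,
                     intel_pre_hd4000: bool = False) -> int:
    """Return the highest allowed anti-aliasing level (0 = disabled)."""
    # Assume the laptop's built-in panel counts as one of `displays`.
    external = displays - 1 if retina_laptop else displays - 1

    # Disabled entirely
    if intel_pre_hd4000:
        return 0
    if retina_laptop and (external >= 2 or scaled_resolution):
        return 0
    if vram_mb < 512 and displays > 1:
        return 0

    # Limited to 2x
    if vram_mb < 512:
        return 2
    if retina_laptop and external >= 1:
        return 2
    if vram_mb < 1024 and displays > 1:
        return 2

    # Limited to 4x
    if retina_laptop or displays > 1 or vram_mb < 1024:
        return 4

    # Otherwise, whatever the driver allows (assumed 8x for this sketch).
    return 8
```

For example, a retina laptop with one external display lands on 2x, and setting a retina laptop to a scaled (higher than normal) resolution disables anti-aliasing outright, matching the rules listed above.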
One of the reports listed above has anti-aliasing disabled because it is a retina laptop set to a higher than normal resolution, and is not affected by this bug.
Which is it? It is both! The OpenGL page is reporting the System Preferences setting because this setting can impact performance.
The screen has 2880 x 1800 actual pixels, and, in the System Preferences > Display panel, normal resolution is presented as 1440 x 900 points. In this case, Rhino draws at the 2880 x 1800 pixel resolution and this gets copied directly to the display.
Setting the screen resolution to 1680 x 1050 on a retina laptop means that everything has to get drawn into an offscreen buffer at 3360 x 2100 resolution, then scaled down to 2880 x 1800. This really kills performance.
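A quick sketch of the pixel cost makes the performance hit concrete. The resolutions come straight from the post; the scale factor of 2 is the standard retina backing scale:

```python
# Retina 15" panel: 2880 x 1800 physical pixels.
NATIVE = (2880, 1800)

def backing_buffer(points, scale=2):
    """Offscreen buffer size OS X renders into for a scaled display mode."""
    w, h = points
    return (w * scale, h * scale)

buf = backing_buffer((1680, 1050))   # the "scaled" mode from the post
assert buf == (3360, 2100)           # ...which is then downsampled to 2880 x 1800

# Relative cost versus drawing straight into the native resolution:
overdraw = (buf[0] * buf[1]) / (NATIVE[0] * NATIVE[1])
print(f"{overdraw:.2f}x the pixels, plus a downscale pass")  # 1.36x
```

So the scaled mode costs about 36% more pixels per frame, plus an extra scaling pass, which is where the performance goes.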
Never woulda thunk it, but I guess it makes some sort of sense. Why doesn’t Rhino draw at the native 2880x1800 on the Retina display? Is it not possible at the OS X/driver level right now, or is it something in Rhino? Or would single-pixel diameter/width entities be just too small?
Rhino does draw at 2880x1800 on the retina laptop. All those pixels take up the same physical space as a “normal” 1440 x 900 display.
This is most obvious if you plug in one of Apple’s 27" displays to your laptop. The external display is 109 pixels per inch; the laptop display is 220 pixels per inch. That’s almost a perfect 4 to 1 ratio.
There are two ways to handle this difference, and OS X and MSWindows take opposite tacks.
On OS X, drag a Rhino window from one display to the other. The apparent size of the window does not change, but it is much sharper and crisper when drawn on the retina display. That’s because OS X and Rhino are drawing 4 times as many pixels to fill the same amount of physical screen space. Four pixels on the retina display are the same physical size as a single pixel on the external display.
When Apple introduced the retina laptops, it did a bunch of work in OS X to make this work almost seamlessly. Text and standard controls scale themselves automatically. Applications that do custom drawing and do not know they are running on retina displays continue to work; they just look fuzzy on a retina display. So OpenGL drawing still works, but applications need to make adjustments to get OpenGL drawing to look sharp. Rhino pays attention, and you get very crisp viewport drawing on a retina display.
MSWindows is different. There, a pixel is a pixel, and Windows does not seamlessly scale text and drawing to accommodate different pixel densities. Dragging a Rhino window from an external display onto the retina laptop display shrinks the window quite a bit and makes it hard to read. This is getting better, but the fixes on Windows are a bit piecemeal.
Thanks for taking the time to explain this. I gather that for most of the non-viewport stuff, Rhino just carries on as if it is working on a 1440x900 screen and OS X does a smart job of “up converting” to the retina screen – pretty much like a new 4K TV upscales an HD video. I don’t have the luxury of doing the 2-monitor experiment you suggested, but I certainly agree that the Rhino viewports on retina look like they are using the full resolution. Still not clear to me whether this is just OS X upscaling while Rhino thinks it’s working with 1440x900, or whether Rhino knows it’s working with 4 times as many pixels and draws accordingly.
I know you’d rather code than write a book on this, and I’m sure the Rhino Mac users would also prefer it.
This behavior causes big, big hiccups for me too. Very disorienting – especially with lots of layers nested in a long list. Maybe this is a “feature” for some(?), but I’m guessing more people use the command line and shortcuts heavily than have all layers alphabetically organized and use this method to get around the layer structure.
If the existing behavior is necessary to keep (for reasons I’m missing), can an option be created to turn it off – to simply pull up the command line without having to click in the modeling window first? That would be a much appreciated improvement!
Gotta say, working without anti-aliasing is a lot more fatiguing than I thought it would be. Eagerly awaiting reinstatement of this previously unappreciated feature!
Glad to see this thread 'cause I thought I was overdue for an eye exam!
Please post this in a new thread. This has nothing to do with anti-aliasing and posting in this thread will make this topic very hard to find in the future.
Thanks for fixing the anti-aliasing. However, the max option is now 2x, and things still appear jagged. I used to be able to set it to 4x. Hopefully there is a way for you to implement the higher anti-aliasing settings again.
USB devices
Apple Inc.: FaceTime HD Camera (Built-in)
Apple Inc.: Bluetooth USB Host Controller
Apple Inc.: Apple Internal Keyboard / Trackpad
Apple Inc.: iPad
Logitech: USB Receiver
Apple Computer, Inc.: IR Receiver
Bluetooth devices
None
OpenGL information
OpenGL software
OpenGL version: 2.1 ATI-7.32.12
Render version: 2.1
Shading language: 1.20
Maximum texture size: 16384 x 16384
Z-buffer depth: 24 bits
Maximum viewport size: 16384 x 16384
Implementation settings
Use texture compression: No
Well, it might be either your older OS X or my dilapidated laptop. Whatever it is, I guess I should stop resisting this jagged vision and let my brain do the filtering.
Software information
Software versions
Rhinoceros version: 5.0 Wenatchee 2014-06-10 (515)
OS X version: Version 10.9.3 (Build 13D65)
Plug-ins
None
Hardware information
Computer hardware
Hardware model: MacBookPro8,2
Processor: Intel Core i7-2675QM CPU @ 2.20GHz
Memory: 16 GB
Architecture: Intel 64 bit
Video hardware
Graphics: AMD Radeon HD 6750M 512 MB
Memory: 512 MB
Screen size: 1920 x 1200, 1440 x 900
Displays: LED Cinema Display, Color LCD
Third party kernel extensions
com.steelseries.BoardListener (8.56)
com.bresink.driver.BRESINKx86Monitoring (9.0)
USB devices
Apple Inc.: FaceTime HD Camera (Built-in)
Apple Inc.: Bluetooth USB Host Controller
Apple Inc.: Apple Internal Keyboard / Trackpad
La-VIEW Technology: SteelSeries
Apple Inc.: Apple Keyboard
Apple Inc.: Display iSight
Apple Inc.: Apple LED Cinema Display
Apple Inc.: Display Audio
Apple Computer, Inc.: IR Receiver
Bluetooth devices
None
OpenGL information
OpenGL software
OpenGL version: 2.1 ATI-1.22.25
Render version: 2.1
Shading language: 1.20
Maximum texture size: 16384 x 16384
Z-buffer depth: 24 bits
Maximum viewport size: 16384 x 16384
Implementation settings
Use texture compression: No
I stated the rules for the maximum anti-aliasing setting in an earlier post in this thread. Please see that post. Niels’ maximum anti-aliasing setting is limited by those rules.
The current implementation and its rules have been in place since November of last year. Rhino starts with what the driver gives as its maximum anti-aliasing value, then applies the additional rules.
The maximum is up to the GPU driver software. I haven’t seen a value above 8x, but I cannot say what might be possible in the future.
I am not sure I am very fond of these new rules you came up with. I have always run Rhino Mac smoothly with 4x anti-aliasing on an external display, even back in 2010 on a 13" MacBook.
Also, pretty much every MacBook in history, including the new retinas, is now not allowed to run Rhino with 4x anti-aliasing enabled on an external display. Only the 2011 and mid-2012 models with the 1 GB GDDR5 Radeon HD GPUs can supposedly handle it.
So in order for me to properly use Rhino on a MacBook with an external display, I will have to go looking for a second-hand 2011/2012 model.
Please have mercy on the ones that just bought a new retina laptop and Thunderbolt Display for 3.5k and will be running Rhino OS X in a Duke Nukem 3D 320x200 fashion.
Luckily there is a workaround for this: start Rhino with your external display unplugged and select 4x anti-aliasing in the OpenGL preferences. Restart Rhino, then plug in the display, and you will be able to enjoy some futuristic 4x filtering.
Now please don’t fix this workaround – or instead, let Rhino decide anti-aliasing automatically but also give us the option to set it manually, with a hazard warning message. Thanks.