Anti-aliasing not available?

I definitely see your point. However, Rhino for Mac is used by people who, well, use Macs! And since most of Apple's computers DON'T have a dedicated graphics card, it would make sense for Rhino to be optimized for integrated graphics as well :slight_smile: MacBook Airs, most MacBook Pros (including mine), Mac Minis, and half of all iMacs use Intel graphics, and that's a huge portion of the Rhino for Mac user base. :smiley:

We don't get to choose what hardware Apple puts into their systems. Apple designs computers to be optimized for specialized tasks. The driving complaint with laptops is battery life, so they have moved to a chip that uses less power. The trade-off is features and performance. The Mac Minis also use this Intel chip because they are intended for server and support roles. The Macs with the larger 15" and 17" screens are hybrid computers with a second graphics chip that is there for higher graphics demands. I'm sorry you chose a system that does not have all the features you need.


I understand. I just can't understand how a simple thing like anti-aliasing can be such a big deal to accomplish :slight_smile: But thanks for your explanation!

Only the Intel HD 3000 disables all anti-aliasing. All other Intel GPUs work fine on the Mac.

Rhino for Mac sometimes checks the OpenGL settings too early and gets the anti-aliasing information wrong. This should be better in the next WIP release.

In the meantime, please post your OpenGL settings. I can tell from your settings if this is the case, or if there is some other problem. In the current WIP, you should first open a model, then open the Preferences panel.

Software information

Software versions
Rhinoceros version: 5.0 Wenatchee 2014-07-28 (519)
OS X version: Version 10.9.4 (Build 13E28)

Plug-ins
None

Hardware information

Computer hardware
Hardware model: MacBookPro11,2
Processor: Intel Core i7-4750HQ CPU @ 2.00GHz
Memory: 16 GB
Architecture: Intel 64 bit

Video hardware
Graphics: Intel Iris Pro (null)
Memory: 1024 MB
Screen size: 1680 x 1050, 1680 x 1050
Displays: Philips 200W, Color LCD

Third party kernel extensions
None

USB devices
Apple: Internal Memory Card Reader
Burr-Brown from TI : USB Audio CODEC
Apple Inc.: Apple Internal Keyboard / Trackpad
Apple Inc.: Bluetooth USB Host Controller
Razer : Razer Orochi 2013
Apple, Inc: Apple Keyboard

Bluetooth devices
None

OpenGL information

OpenGL software
OpenGL version: 2.1 INTEL-8.28.30
Render version: 2.1
Shading language: 1.20
Maximum texture size: 16384 x 16384
Z-buffer depth: 24 bits
Maximum viewport size: 16384 x 16384

Implementation settings
Use texture compression: No

Appearance settings
Antialiasing: None
Mip map filtering: None
Anisotropic filtering: Medium

Anti-aliasing is disabled in your current configuration because you have your retina laptop display set to higher than the default resolution. See this thread, especially my first post in the thread, for the anti-aliasing restrictions.

See this article, which explains why changing the laptop screen resolution requires much more GPU memory. Adding anti-aliasing on top of that requires even more, which led to corrupted display images in the past, so it is restricted.
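To get a feel for the memory involved, here is a back-of-the-envelope sketch. This is purely illustrative (Rhino's actual buffer layout and the driver's real allocations are not documented here, and a GPU typically allocates additional buffers on top of these):

```python
# Rough estimate of GPU memory for an off-screen (scaled retina)
# framebuffer: color + depth, optionally multiplied by MSAA samples.
# Illustrative numbers only, not Rhino's actual accounting.

def backing_store_bytes(width_pts, height_pts, scale=2, msaa_samples=1,
                        bytes_per_pixel=4, depth_bytes=4):
    """Color + depth memory for a scale-x backing store, in bytes."""
    w, h = width_pts * scale, height_pts * scale
    color = w * h * bytes_per_pixel * msaa_samples
    depth = w * h * depth_bytes * msaa_samples
    return color + depth

# "Best for retina" (1440x900 points -> 2880x1800 pixels), no AA:
base = backing_store_bytes(1440, 900)
# "More space" scaled mode (1680x1050 points -> 3360x2100 pixels), 4x AA:
scaled_aa = backing_store_bytes(1680, 1050, msaa_samples=4)

print(f"best-for-retina, no AA : {base / 2**20:.0f} MiB")
print(f"1680x1050 mode, 4x AA  : {scaled_aa / 2**20:.0f} MiB")
```

Even with these conservative assumptions, the scaled mode with anti-aliasing needs several times the memory of the default mode, which is why the restriction bites hardest on integrated GPUs that share system RAM.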

I see… The thing is, retina displays are never set to their native 2880x1800 resolution, otherwise the UI would be ridiculously tiny. In the OS you can set how the display is scaled; I have it set to imitate a 1680x1050 display so that it matches my external monitor. It can't be set to 1800p resolution. Does that mean it is by definition impossible to have anti-aliasing on a retina display?

I have now set the laptop display to "best for retina". I don't like the bigger UI, but that's not such a problem. Anti-aliasing now works at 2x :slight_smile: Too bad it doesn't work with other screen scale factors…

I don't own a Mac, but the "best for retina" resolution is apparently exactly half of the native resolution. With that resolution and 2x anti-aliasing, I think Rhino will render the viewports at the native resolution and downsample them to half the size (using a filter that averages pixels, so it will look better than rendering at half resolution, but worse than the original image before downsampling).
The viewports should look better with the laptop set to native resolution and no anti-aliasing, and I think it will render faster.
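The "render at 2x, then average down" idea described above can be sketched in a few lines. This is a minimal box-filter illustration of the general technique, not how Rhino or OS X actually implements the scaling:

```python
# Minimal sketch of 2x supersample-then-average downsampling:
# each 2x2 block of the high-resolution image becomes one output
# pixel (a simple box filter; real scalers may use better filters).

def downsample_2x(pixels):
    """pixels: 2D list of grayscale values with even dimensions."""
    h, w = len(pixels), len(pixels[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block = (pixels[y][x] + pixels[y][x + 1] +
                     pixels[y + 1][x] + pixels[y + 1][x + 1])
            row.append(block / 4)
        out.append(row)
    return out

# A hard black/white edge rendered at 2x resolution:
hi_res = [
    [0, 0, 255, 255],
    [0, 0, 255, 255],
    [0, 255, 255, 255],
    [0, 255, 255, 255],
]
print(downsample_2x(hi_res))
```

Note how the block straddling the edge averages to an intermediate gray: that averaging is exactly what softens jagged edges, and also why the result is blurrier than the full-resolution image it came from.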

You are correct, they do look better at native resolution without anti-aliasing. But if I have an external monitor hooked up, I will use the monitor as my primary screen for modeling and the MacBook for toolbars and such. In that case I'll set the laptop screen to "best for retina", to get at least some anti-aliasing on my main monitor :slight_smile:

Actually, this is not correct. The “best for retina” resolution is stated as 1440 x 900, but that is in points, not in pixels. We have a wiki article about retina displays that explains this difference and I also referenced that article earlier in this thread.
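The points-versus-pixels distinction is easy to trip over, so here is a tiny illustration (the numbers are the standard ones for a 13" retina panel; the function is hypothetical, not an Apple API):

```python
# On a retina display the UI is laid out in *points*, and each point
# maps to scale x scale physical *pixels* (backing scale factor 2 on
# retina Macs). Illustrative helper, not part of any real API.

def points_to_pixels(width_pts, height_pts, backing_scale=2):
    return width_pts * backing_scale, height_pts * backing_scale

# "Best for retina" on a 13" MBPr reports 1440 x 900 points,
# which is the panel's full 2880 x 1800 native pixel grid:
print(points_to_pixels(1440, 900))  # -> (2880, 1800)
```

So "best for retina" is not half-resolution rendering at all; content is drawn 1:1 onto the native pixel grid, just with UI elements sized as if the screen were 1440 x 900.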

Thanks, I have a habit of answering when I have no idea of what I’m talking about.

“…In the two other cases, OS X creates an off-screen bitmap in the GPU’s memory that is larger than the physical display, and Rhino draws into that bitmap. OS X then scales the contents of that bitmap down to the size of the physical display and draws the scaled down version to the physical display.”

wtf Apple? Does it at least do some anti-aliasing filtering when scaling down? :stuck_out_tongue:

Good, my question was already answered…

I got a new 13" MBPr with a Intel Iris Graphics 6100 card. Just for the record: it behaves just as described above - “2x antialiasing” works if the setting is “best for retina” - otherwise only “none” is available.

But, like Rok_Sraka, I still quietly hope for a way to boost the AA settings.
I even tried TestSetAALevel… :wink:

Hi Mirko, aside from the above issue, I'm curious how you've found using Rhino on that hardware. I'm thinking of upgrading to that model myself.

Thanks in advance.

Apple calling something “PRO” and adding Intel graphics is beyond understanding.

I upgraded recently and had to go top of the line for this reason…


Hi Alister, sometimes I think the difference from my 5-year-old 15" MBP is not big enough to justify the new computer. But I am quite happy about 16 GB RAM vs 8 GB before. And as far as the graphics card goes, I see no difference at all, except it's much quieter!! :smile:
And the overall size and weight is much smaller, obviously…

I used the old computer with Boot Camp, while the new one is running OS X only (until now, at least). Eventually I'll have to try Boot Camp again to make it speedier. All the Mac apps are somewhat slow, IMHO.

Cheers

Hi Mirko, sorry to bother you so long after you posted this comment. I'm currently using a MacBook Pro 15" mid-2010, upgraded to 8 GB of RAM and a 512 GB SSD.
I'm planning to switch to the MBP retina 13". I'm an architect, and I usually model in 3D with Rhino and render with Artlantis 4. I'd like to know if you are still happy with your choice of moving to the MBP retina 13", and I'd appreciate some advice concerning this change.
regards
Andrea

No worries :smile:

I am still very happy with the MBPr 13". I have to admit that I am not a hardcore CAD user (anymore) and I seldom use a second display at work. But I absolutely prefer the small size and weight over the bigger model.
I do not use the standard resolution of the screen, but the 'smallest', so I have enough space for toolbars etc.

However, if I only had the laptop screen and used CAD more than ~50% of the time, I would go for the 15".

I hope this helps!
Mirko

Thank you so much, it has been very helpful!! I'm a frequent user, but still not over 50%!
In the office I use my MBP only plugged into a 24" Apple screen, so I won't see any difference, I guess… I'm a fan of portability like you, and my workflow has changed in a way that I spend 40% of my time out of the office. Now I only have to check if my Artlantis 4 works the same way as CAD and Rhino!
Thanks a lot!!
Andrea
Thanks a lot!!
Andrea

@marlin
While we are at it, are Rendered-mode shadows possible on that GPU?
The last time I tried, they weren't with Rhino 5.

Thanks,
-C-H-A-R-L-E-S-