@boatkinson
So here’s been my sordid trip through Mac Video Hardware.
I have three 2010 towers. All started out life as dual 4-core 2.6s with stock 285 graphics cards, and all have since been upgraded to dual 3.46GHz hex-core Xeons. Up until Rhino 7, all were parked at OS 10.13.6, and all of them had dual NVIDIA GTX 970s in 'em running stock (non-Mac-EFI-flashed) firmware. No fancy external power supplies required, just a couple of splitter cables for the power.
The boxes were all running NVIDIA's CUDA drivers for the sole purpose of acting as a 6-GPU render farm for Blender. It worked like a champ and was fast, as in I could render a 60-second 4K ray-traced animation in Cycles in under 24 hours.
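To put that in perspective, here's the rough per-frame budget that claim implies. The frame rate is my assumption (the post doesn't state one); I'm using 24fps:

```python
# Rough per-frame render budget for the 6-GPU Blender farm described above.
# Assumption (mine, not stated in the post): 24 fps animation,
# using the full 24-hour wall clock.
fps = 24
duration_s = 60                  # a 60-second animation
frames = fps * duration_s        # total frames to render
budget_s = 24 * 3600             # 24 hours, in seconds
per_frame = budget_s / frames    # seconds available per frame

print(frames, per_frame)         # prints: 1440 60.0
```

So "under 24 hours" for that clip means each 4K ray-traced frame was finishing in under a minute across the six GPUs.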
Had zero issues, all boxes were rock solid and could be left pretty much unattended for weeks. That was all great.
I knew I couldn't go past 10.13, as Apple shut down NVIDIA's ability to install kexts as a third party with 10.14 and later. I was ready to live with that.
10.14 came out and Apple nuked more APIs, namely OpenCL, OpenGL, and everything else even vaguely 3D-rendering related, in favor of its own Metal.
When Rhino 6 came out, it needed to support 10.14, which meant it also had to support Metal. The only option at that point was to ditch the NVIDIA cards in one of the boxes and drop in an AMD RX 580, just to not be completely crippled when using Rhino.
While the card works, it's slow (slower than the four-year-old GTX 970s it replaced, mind you) and has no support for any GPU-based rendering of any kind in Blender's Cycles.
Cycles renders on that box are now CPU-only, and even with 12 of the fastest Xeon cores you can put in a tower, it's a quarter the speed of the CUDA GPU renders.
But at least I can still run Rhino on one of the 3 towers and do what I need to do.
So mid last year, thanks to COVID, I had an opportunity to borrow a buddy's spiffy new Mac Pro tower, since we were both working from home.
On the surface, this thing should have been beyond badass. It had 16 cores to my 12 (albeit at 3.2 vs. my 3.46GHz), dual AMD Vegas, and 96GB of RAM (that was a $1000 option alone, which cracked me up, as I'd just recently dropped 96GB into my "main" tower with used server-pull RAM for a whopping total of $80). All told, this thing cost his employer over $12K. I've got less than $2000 in all three of my 2010 towers fully loaded, mind you.
So on to the testing. This was highly informal, as I just wanted to get an idea of where things stood. Performance in Rhino was about the same for general (non-Cycles-raytrace) stuff, and complex Booleans completed in about the same time. In short, for most of what I do in Rhino I couldn't tell the difference between the two rigs, despite mine costing about 1/20th of what the new Mac tower did.
Moving on to Blender: I did a marginally complex Rhino export via Nathan's 3dm importer plug-in, dialed up a bunch of surfaces to test most of what Cycles can do, threw in a few volumetric lights, some fog, and some other crap, and hit render. The new Mac Pro was only slightly faster (like 12%) than the AMD-equipped main box I run Rhino on, and was 65% slower than my old boxes still running CUDA on the dual GTX 970s.
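For concreteness, here's what those informal percentages work out to in render time, normalizing the CUDA box to 1.0x. How to read "65% slower" is my assumption (I'm treating it as 35% of the CUDA box's effective speed):

```python
# Implied relative render times from the informal numbers above.
# Assumptions (mine, not the post's): "65% slower" = 0.35x the CUDA
# box's speed; "12% faster" compares the new Mac Pro to the RX 580 box.
cuda_speed = 1.0                             # dual GTX 970s, baseline
macpro_speed = cuda_speed * (1 - 0.65)       # ~0.35x
rx580_speed = macpro_speed / 1.12            # new Mac Pro ~12% faster

# Render time scales as 1/speed; normalize to a 1-hour CUDA render.
macpro_time_h = 1.0 / macpro_speed           # ~2.9 hours
rx580_time_h = 1.0 / rx580_speed             # ~3.2 hours

print(round(macpro_time_h, 2), round(rx580_time_h, 2))
```

In other words, a scene the old CUDA farm finished in an hour would take roughly three hours on either of the newer AMD-based setups, under those assumptions.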
I loaded up a couple older animation projects that had some fairly complicated stuff going on and ran previews. My old tower had marginally higher frame rates than the $12K wonder. Go figure.
Now, I don't do video stuff at all anymore, and I'm sure that had I fired up Final Cut it would have smoked my decade-plus-old tower big time.
But that's the thing: if you aren't leveraging Apple's own software, written to utilize the hardware and Apple's APIs to the fullest, you really don't get much of a performance boost despite literally dumping tens of thousands of dollars on the hardware.
Now I know 10.14.x is the end of the line for my towers and frankly, I just don’t care. I’ll have gotten well over a decade of use out of them for next to nothing cost wise when amortized out over the years.
I'll keep running them until some piece of core software I have to use (probably Rhino) drops support for 10.14.x, and at that point I'll jump to Windows. As much as I've loved the Mac OS since 1986, when I got my first SE, it's just not worth it to me to have to drop that kind of coin just to run an app. I don't do much animation or rendering anymore, as most of my Rhino work these days is destined for actual fabrication.
So if I were you, I'd drop an RX 580 in there (non-EFI-flashed; you don't need it), toss some RAM in it if you haven't already, maybe upgrade the procs to hex-core Xeons (they're cheap if you get lidded ones and delid 'em yourself, which is actually pretty easy to do), and call it a day. Unless you're doing lots of Cycles ray-traced stuff, it'll be more than fine.
I've got several Rhino projects that literally weigh in at multiple GB each, with thousands and thousands of surfaces, and they get along just fine without shiny new Apple hardware. It all depends on what you're doing.
Anyway, that’s been my experience with this whole thing and hopefully it’ll be helpful to you.
Cheers.
Mark