An Nvidia RTX 4060, 4070, 4080, 4090, A6000, A3000, and Others Chart

I am working on this chart for myself and for a friend, and I thought I would share it.

  • Basically, what you are looking at is a relative performance chart using Cycles, albeit in Blender, but the relative performance should be similar. I wrote to the person who manages the scores, and he thankfully said that the CPU and GPU scores are directly comparable.

  • The CPS column is cost-per-score: the card's price divided by its benchmark score (see the sketch after this list).

  • The factor I used for reference is a GTX 1080 as 1x; that is also close to a 16-core AMD chip, while an Intel would be a bit faster.

  • The chart suggests that buying a bunch of low-end video cards isn't the cheap way to go. Also, the prices on the RTX 30-series are way too high, and power efficiency has improved since then.
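
For anyone who wants to reproduce the two derived columns, here is a minimal sketch of the arithmetic, assuming placeholder prices and scores rather than the real benchmark data:

```python
# Minimal sketch of the chart's derived columns (not the actual spreadsheet).
# Prices and median Cycles scores below are placeholders, not real data.
cards = {
    # name: (price in USD, benchmark median score) -- placeholder values
    "GTX 1080": (500.0, 1000.0),
    "RTX 4070": (600.0, 6000.0),
    "RTX 4090": (1600.0, 12000.0),
}

baseline = cards["GTX 1080"][1]  # GTX 1080 is the 1x reference factor

for name, (price, score) in cards.items():
    factor = score / baseline  # relative performance, 1x = GTX 1080
    cps = price / score        # cost-per-score: dollars per benchmark point
    print(f"{name:10s} {factor:5.1f}x  CPS = {cps:.3f} $/point")
```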

It shows some interesting things, such as a dual-Epyc machine with 96 cores cannot render even as fast as an RTX 4070.

It shows that memory is expensive, but likely necessary. Especially looking at the A6000: if I could have any card, the A6000 would be it, but it's too expensive for mere mortals, unless your company needs to close out the quarter and show a loss. Why would I want it? The memory makes it future-resistant. And given that the A6000 is last-gen, that thing must have a whole lot of shader cores.

I have read rumors that they were coming out with a 16GB RTX 4060 Ti, but I am not hearing about an RTX 4070 Ti with 16GB, as in WTF?

Also, there may be a different 4070:

"One more thing…" Actually two, because the A6000 replacement is on the way. If you want this stuff, you want a 1200-watt power supply, as the card itself might take 800 watts!

[I needed to upgrade the memory in my last computer, and it didn't have NVMe. I am glad I bought an 850-watt power supply. When I bought it, everyone was telling me that we didn't need that much power any longer, LOL!]

8 Likes

Thank you very much for sharing this. I assume you get the scores from here? I also assume that the scores have the RTX cards utilizing OptiX, right?

It's really interesting to see how inefficient a general-purpose CPU is for rendering compared to specialized silicon, like GPU cores or even more dedicated parts like RT cores.

2 Likes

Check the Blender Benchmark for some numbers too.

2 Likes

Nvidia GPUs: https://tinyurl.com/44yc3rkb

Do you have some info on the cost of power consumption in relation to all the cards? As far as I understand, the CPS only takes the price of the card into account.

1 Like

Yes, the data came from Blender’s Open Data.

No, I didn't include power in the CPS, but the 3090, for instance, would do pretty badly, and so would the A6000, as well as any of the CISC processors.

Also, oddly and fortunately, Nvidia is now quoting maximum power as opposed to the TDP, which everyone lies about anyway.
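
If someone did want to fold power into the comparison, a rough sketch might look like the following; the wattage, electricity price, and rendering hours are placeholder assumptions, not measured figures:

```python
# Rough sketch of a power-aware cost-per-score, with placeholder assumptions.
# Total cost = card price + (watts / 1000) * hours spent rendering * price per kWh.
def cps_with_power(price_usd, score, watts, render_hours, usd_per_kwh=0.15):
    """Cost per benchmark point, including electricity over the card's service life."""
    energy_cost = (watts / 1000.0) * render_hours * usd_per_kwh
    return (price_usd + energy_cost) / score

# Example: a hypothetical 450 W card rendering 2,000 hours over five years.
print(cps_with_power(price_usd=1600, score=12000, watts=450, render_hours=2000))
```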

Exact same thing for me :) I'm very happy with my new RTX 4070 Ti; I find it a good middle ground between being ripped off and still getting something that won't need replacing in two years already…

Thanks for sharing your charts and findings! :heart:

1 Like

Yes, with Nvidia owning the market presently, the financial suffering is harsh.

I like the 4070 and the Ti. With some 8K renders but mostly 4K, and my zest for 4K textures, I am a little worried about memory. For instance, the room is meant to be cropped to a 4:3 ratio, so this is a different kind of thing, and I want 4K vertically. If it weren't for the sun, I could render the room sideways : )

Everything else I have rendered for my résumé has been 4K. I don't know if I will feel up to working again, but I want to keep my options open.

This is only my opinion, but in Cycles, I like my final renders to have 12,000 passes.

I know that as soon as I get a new card, I am going to tweak and re-render nearly all of my projects. Just being able to do real-time adjustments will allow the renders to finally be set up right. It's difficult when it takes 15 minutes to see whether a minor adjustment was enough, while a new video card might do that in 1 minute. For a bearing race, I made a brushed material that is much better looking than what I had worked up for the diner.

The 4070 Ti fits nicely between the 4070 and 4080. It’s nearly 12 times faster than what I am using. : O

The problem is, I'm on a fixed income, and I have to think about useful service life. As my GTX 1080 is now 6 years old, I want the next card to have a 5-year life.

The diner (below) was too big for reliable rendering on my video card, so I was left doing it on a 12-core AMD 3900X. That scares me, as does paying $1200 for a video card when you really want the 4090. LOL! When really, the 4070 would make more sense financially. The diner is a rendering nightmare: it's full of chrome and glass. Also, the charging stations needed 12k passes before the noise faded.

There's another problem. I…just fantasized about doing some animations. : O
That’s not good.

This is really difficult to talk about but…I…like modeling stuff in Rhino. It’s a sickness : )

1 Like

Yes, that is where I got the numbers.

I didn’t include a link because I didn’t want to send anyone elsewhere, if you get my meaning.

We do have a lot in common, lol.
I had to ditch my old Quadro K2200 WS… no OpenGL in Rhino since RH7 = big discomfort. Also, the K2200 was not able to handle resolutions up to UHD, or even HD, for denoising in Maxwell Render, or for rendering in general.
…and I was also looking for something that will at least have some useful RAM with regard to ray tracing for the next few years. I heard that everything under 8GB is ridiculous for future ray-tracing applications and more of a label than real functionality.

Well, I finally had to flex away some parts from the PC tower, because the new card is huuuuuuge! :slight_smile:
So not only the power supply and the card itself, but also a brand-new big tower is in the budget, if you don't want it to look like my Fallout-style post-apocalyptic steampunk computer :nauseated_face:

Although Nvidia has announced something to make up for the new price ranges that made customers furious (of course, after I bought my card), I'm pretty sure the 4060 Ti will have some downsides somewhere, even if the RAM is adjusted, and probably the price tag as well.

If you can afford it, go for it; it does feel great to have the latest war machine. But we all know that the latest evolution is always more expensive in relation to performance than the "second best" solution, just because you can say: I have the best one (for a while). However, the jump from the K2200 to the RTX 4070 Ti and its performance difference is more than enough to keep me happy for a long time :slight_smile:

This all started about 18 years ago for me. It took 8 entire days to render 120 photorealistic images at a ridiculously low resolution, which resulted in a 5-second video. At the time I had a quad-core, and octa-cores had just come out. Rendering with graphics cards was starting to get known. (You basically needed an entire rack of graphics cards to do it more efficiently than a CPU; I think the standard was about 512MB of RAM for a graphics card at the time, if I remember correctly.) So in regards to your last comment:

I think you’ll end up doing some animations.

1 Like

FWIW - there's a 16 GB variant of the 4060 Ti out later this month. Not the speediest thing, but the extra RAM is nice, and it'll be priced right, I bet.

Yes, but no 16GB 4070 or 4070 Ti?

Correct, unfortunately.

1 Like

Aw. : (

Have you already tried:

1 Like

Thank you : )

I will do my next render in it, on my aging video card for now, while I can resist…
My current card cannot even do OptiX, though it was a good card in its day.

2X would be a great jump in performance.

For a 6K render:
[image]

CPU rendering alone has already increased in performance, and CUDA is better too.

1 Like

Dispersion / Chromatic aberration?
It would be fun for me, but quite handy for the jewelry people.

I guess they tend to use a mixture of path-tracing and ray-tracing to do dispersion?

Anyway, if I got even 1.5x improvement, that would be helpful.

Edit: someone already asked for chromatic aberration / dispersion.

No new features added, just targeting feature parity. There are likely still bugs.

2 Likes

Why not hunt around for a lightly used 3090 or 3090 Ti? They're definitely out there, won't break the bank, and have oodles of RAM.

1 Like