Any plans to fix RAYTRACED mode, or is it now where it's gonna stay?

I’ll try to put an end to the price discussion, so bear with me.

So if you want a good deal, search eBay for a second-hand GeForce 960.
They seem to be selling for as low as $50, and one should be about 5x faster than what you have: only ~30% slower than a 970, faster than a 1050(!), and about as fast as a 1060. (Maybe the 1060 has faster cores, but I don’t know… google it if you like.)

550 has 288 cores (yours)
750 has 384 cores (my laptop)
960 has 1024 cores
970 has 1664 cores
980 has 2048 cores
1050 has 640 cores
1050 Ti has 768 cores
1060 has 1024 cores (same as the 960)
1070 has 1920 cores
1080 has 2560 cores
So you get a lot of bang for the buck if you don’t go for the top performer.
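To put rough numbers on that, here is a tiny back-of-the-envelope sketch in Python. It assumes performance scales with core count alone, which ignores clock speed and architecture, so treat the ratios as rough indicators rather than benchmarks. (A later post in this thread corrects the 550 Ti to 192 cores, which would make the gap even wider.)

```python
# Back-of-the-envelope only: relative Raytraced throughput estimated from the
# CUDA core counts listed above. Assumes performance scales with core count;
# clock speed and architecture are ignored, so treat the ratios as rough.
cores = {
    "550":  288,
    "750":  384,
    "960":  1024,
    "970":  1664,
    "1080": 2560,
}

baseline = cores["550"]
for name in sorted(cores, key=cores.get):
    n = cores[name]
    print("GTX {:<5} {:>4} cores  ~{:.1f}x the 550".format(name, n, n / float(baseline)))
```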

Happy shopping!

Oh, and keep in mind regarding future-proofing:
Neon was amazing, but it was bound to the CPU and had limits on what it could do in the future.
Cycles, on the other hand, will open doors for future development that Neon just can not, and it is still being actively developed.

Nathan also made a marvellous render plugin that uses Cycles. It iterated the image just like in the viewport, you could zoom in to check quality while it was working, and it was just great, but sadly it didn’t make it into V6. The current ViewCaptureToFile, with a status bar when rendering larger than the viewport, just isn’t a good solution. But that’s how it is. For now at least.

Luckily for you, Neon still works in V5, and since you have V5 you can copy-paste geometry from V6 to V5 to do your magic.

Remember: you CAN spend 1000 bucks on a graphics card, but you don’t have to, just like you can spend tons of money on a dual multicore Xeon rig to speed up an older render engine.
Oh, and Cycles isn’t “free”: it’s part of new software, software that requires good hardware to perform at max speed, just like Neon requires hefty CPU(s) to perform at max speed too.

The 550 Ti actually has only 192 cores (according to this list). A good CPU will probably outperform that in Raytraced.

:open_mouth: ! (wow ! )

I included that info (192 CUDA cores) in my previous reply. Seriously though, nearly TWO HUNDRED parallel-processing cores isn’t enough to get reasonable results? And that’s just TIME. Noise? A thousand-dollar card STILL had “noise” at the end of its run, and seriously, nearly TWO HUNDRED cores? :slight_smile: !

1 Like

Interesting, but if you don’t include the PRICE of each card as a function of upgrading for a “free” feature, this info is at BEST incomplete… AND nearly two hundred parallel-processing cores isn’t enough to get REASONABLE performance (LESS than 55 minutes per FRAME for what’s SUPPOSED to be an RTR solution)? And that 55 minutes per frame applies to ANY movement of the display PER mouse move, and it was reported by someone ELSE here, not me. I usually give up at 5 minutes or so. 3 minutes per frame MIGHT be workable. HOW MUCH would I NEED TO SPEND for, say, 3-5 MINUTES between keystrokes? I could POSSIBLY (~) see “Raytraced” as an UPGRADE for RHINO RENDER (also an EXCELLENT solution, btw), but as an RTR? Seriously?

Obviously you should read up on the technology :slight_smile:
Keep in mind that a GPU can have 2000 cores on a single CHIP. Obviously those cores are much smaller than the 4 cores on an i7, so each one can calculate a lot less. But while they can’t do much on their own, they compensate by being many…
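As a purely illustrative bit of arithmetic: path tracing is embarrassingly parallel (every pixel sample is independent), so total throughput is roughly cores times per-core speed. The 5% per-core figure below is a made-up, hypothetical number chosen only to show the shape of the trade-off, not a measurement.

```python
# Illustrative arithmetic only. Path tracing is embarrassingly parallel, so
# total throughput is roughly cores x per-core speed. The 5% figure is
# HYPOTHETICAL; real ratios depend on clocks, memory bandwidth, etc.
cpu_cores = 4                  # e.g. a quad-core i7
gpu_cores = 2000               # e.g. a big GPU chip
gpu_core_speed = 0.05          # assumed: one GPU core ~5% of one CPU core

print("CPU throughput: {:.0f}".format(cpu_cores * 1.0))             # -> 4
print("GPU throughput: {:.0f}".format(gpu_cores * gpu_core_speed))  # -> 100
```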

@ec2638 -

TOO FUNNY ! THANKS ! Good way to end the week . Doesn’t CHANGE anything, but DEFINITELY a fun way to wrap it up !

They PAID me WELL and FAIRLY for my work, and we saved $$MILLION$$ as a result, and finished YEARS ahead of the competition, saving even MORE $$MILLION$$ (!) with Management Crediting CAD on-project and OUR TEAM as one deciding factor in that success !

ALL the BEST to ALL from Texas, going forward !

Thanks -

C.

If you can supply a model I will gladly test it out and see what I can do to speed that process up on your system.
By the way, have you tried rendering with Cycles on the CPU instead?

PS! Logging off here, past midnight now and beer is calling. Cheers!

No, it is an INTERACTIVE solution, NOT realtime. That is what it says here, too:

https://www.rhino3d.com/6/new/presentation#raytraced-viewport-mode

Yes, EXCEPT NEON was intended to be RHINO’s approach to RTR, and was promoted as such, and Raytraced is its intended replacement.

I’d already encountered the info at the link you provided, and was LOOKING FORWARD to a GOOD RENDERER I could use INSIDE RHINO, with a possible option in addition to RHINO-RENDER, apparently the intended use for Raytraced. ALL of that works well as a concept, and offering it as an included (no additional cost: “FREE” ) internal feature is a WONDERFUL idea.

So- to my point: a “free” offering (included in the PRICE OF PURCHASE of the base package) requiring $350+ up to a THOUSAND DOLLARS or more to utilize … AND the stated intent of at SOME point REPLACING RHINO Render ? See where I’ve been TRYING to go with this, all along ?

Don’t get me wrong, I’m TRULY excited about the capability, and genuinely hope for success, but consider: let’s say we find that magic combination of settings that pulls the time back to something “reasonable”, say 3 to 5 minutes per frame. Have we solved the NOISE problem? Are we expecting to run Animations at whatever frame-time we DO get to? Is there maybe a possibility of a truncated version where we get “usable” results earlier than the complete cycle-run (as in NEON): fewer bounces, fewer cycles, something? These are just questions, but I sincerely promise, they’re actually a real thing.

You guys have done something WONDERFUL with all of this, and with including A-O as a feature of the RENDERED viewport, with or WITHOUT “Shadows” on, but the result there is marginal too. I’ll check again for any additional improvement that may have been made since I last had to disable it out of desperation to get my project completed and delivered …

Ok, I think I’ve about covered it. Everything I’ve reported is EXACTLY as I’ve presented it, and these will remain barriers to implementation generally, as we (the customer base) look at adopting the product going forward. They aren’t insurmountable, but they DO have to be dealt with by folks on THIS side of the table, and I can assure you, while $350+ might be a reachable stretch (I’m looking to budget that with my next client invoice), $500, $650, $1000+ and MORE may NOT necessarily be for MOST. I mean no ill will, but I DO mean clear, open and honest communication on this important topic. I’ve been a RHINO fan and user for MANY years, and hope to be for many more to come! I can’t thank you folks enough for such a POWERFUL tool!

Thanks again -

C.

You know I’ve been using Rhino since 1997 or so. I think the first machine I ran it on might have been a 486 DX 50 overclocked to 83 MHz? Is that possible? That or a Pentium 166. I remember back then, when most “3D accelerators” were frankly kinda garbage, people asking on the original Rhino 1 Beta newsgroup for Rhino to make more use of GPU hardware. Now finally with V6 they’re really doing that, with great results, but there is a price to pay. The GTX 550 was a low-end gaming card, with less performance than a card I bought in 2009.

Yeah, maybe not so much.
Anyone wondering can look it up and see my version is A LOT closer than the one you’ve presented.

SO, you started at V1. Well THAT beats me by a couple of versions! I only started at V3! I guess I’m just a young whipper-snapper compared with YOU! When I got my card it was newly out, and mid-range. Yes, it was a gaming card, but that’s all that was available for less than a THOUSAND DOLLARS! See the trend emerging here?

Processor-shaming went out with the demise of the Pentium as THE power-house of choice, so really, I’m gonna let YOU have this one and bid you good night. Oh, by the way, EVERY SINGLE NUMBER I’ve reported STANDS, no matter HOW clever you may want to be in side-stepping THAT inconvenient reality :wink: ! Mine runs OpenGL 4.1, modern enough for ANYTHING written for nearly 200 cores of parallel computing power (RHINO, for instance), and if configured correctly, I can have a 15-minute animation RUN and FULLY RENDERED overnight that YOU wouldn’t have until FRIDAY. How do I know? I’ve _DONE_ it.

LOTS & LOTS of CORES ? LOTS and LLOOTTSS of $$$$ :wink: ! Why I’m sure YOUR cores are the BIGGEST AND THE BEST ! Bet YOU got LOTS of money to make sure YOU got THE BIGGEST BADDEST video card bragging rights ANYWHERE ! Very COOL !!!

So, render-ON, my man ! RENDER ON !

Sincerely, this has been fun! Everything I’ve put forth is EXACTLY as I’ve represented it, and MAYBE we’ll see a viable product SOON? I think we’re ALL hoping so! I know I am. ALL the BEST to you and yours -

C.

Charles

Yes - but the point you are missing is that Neon isn’t using your 550 Ti. At all.

Andy

2 Likes

But the picture you posted yourself says the release price was $150. In 2011. That’s “low-end,” and it was not the only thing that wasn’t $1000; please stop that. Please post an actual file that illustrates this speed difference you’re talking about instead of asserting what sounds to everyone else who uses and develops these products like nonsense.

1 Like

Des

What GPUs do these school computers have?

It sounds like you’re running in CPU mode.

Andy

As it is a public school, most systems don’t have a GPU; they just have the basic on-board video. This was not a problem with V5 & Neon, as the systems could easily accommodate the ray tracing. The trouble is that the students use the lab for 1 hour and then a new class comes in, so if the ray trace takes over 30 minutes it is not going to happen. Likewise, having 20 students staring at a screen tracing for that long is asking for trouble in the classroom.

1 Like

As Andy says, forget about the 550, it just isn’t good enough. It’s like complaining that Neon is shit because it doesn’t render faster on an old Core 2 Duo CPU…

So on a final attempt to help you: answer my questions already instead of repeating the price of a beastly 1080 :wink:

1- What CPU do you have?
2- Did you even try to render on the CPU?

Use RhinoCycles_SelectDevice and set the value to 0 to select the CPU. (A small timing sketch follows below the numbers.)
Here, my laptop’s four-core i7 renders almost twice as fast as the 750M, so rendering on the GPU doesn’t make that much sense.

CPU, 300 samples: 2.45 minutes
GPU, 300 samples: 4:10
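For anyone who wants to run the same CPU-vs-GPU comparison, here is a minimal sketch for Rhino 6’s Python editor. It assumes the render device has already been chosen manually with RhinoCycles_SelectDevice (0 for the CPU, as above) and that the scripted -ViewCaptureToFile accepts a file name followed by Enter; the exact prompt flow can differ between versions, so treat it as a starting point, not a recipe. The output path is a placeholder.

```python
# Rough timing sketch for Rhino 6's Python editor. Pick the render device first
# with RhinoCycles_SelectDevice (0 = CPU, per the post above). The scripted
# -ViewCaptureToFile prompt flow may differ between versions, so adjust as needed.
import time
import rhinoscriptsyntax as rs

path = r"C:\temp\raytraced_test.png"   # placeholder output path

start = time.time()
rs.Command('_-ViewCaptureToFile "{}" _Enter'.format(path), echo=False)
print("Capture finished in {:.1f} seconds".format(time.time() - start))
```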

This is all reflected light, something that you can not achieve with Neon UNLESS Brazil is installed and you know how to set up a scene so Neon can benefit from it efficiently.

Thanks to all of you who had constructive input. I cannot test this on the random “Bunky” machines at school since it is a weekend, but I have found a way to make it work for my situation.

Changing between CPU and GPU had very little effect (I’m using my home PC, which has a GPU).

What makes it possible is changing the samples to 128; this rendered in 7 minutes.
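If you want to estimate other sample counts from that data point, Cycles render time scales roughly linearly with samples (ignoring fixed overhead), so a quick sketch like this gives ballpark figures; the numbers below are extrapolations, not measurements:

```python
# Linear extrapolation from the data point above (128 samples ~ 7 minutes).
# Cycles render time scales roughly linearly with sample count; fixed overhead
# (scene upload, BVH build) is ignored, so these are ballpark figures only.
known_samples = 128
known_minutes = 7.0
minutes_per_sample = known_minutes / known_samples

for samples in (64, 128, 300, 500):
    print("{:>3} samples  ~{:4.1f} min".format(samples, samples * minutes_per_sample))
```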

Still a problem with the black being a mirror, but I can live with that. Knowing McNeel, they will tweak a few things down the track.
My main reason for finding solutions is that Autodesk & SketchUp are dominating the educational market, and I know Rhino is better for the students.

Once again, thank you all for your help

Des

2 Likes

Oh Yeah, the output

Two ways of tweaking the black material are:

  • set the reflection COLOR to dark gray (see the sketch after this list)

  • turn on Fresnel reflection, but then you can’t control the amount
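Here is a small scripted sketch of the first option for Rhino’s Python editor. It assumes the object has an object-level material assigned; with Rhino 6’s newer material types this legacy reflective-color property may not map one-to-one, so fall back to the material editor if it doesn’t take. The RGB value is just an example.

```python
# Sketch of option 1: set the picked object's material reflection color to a
# dark gray. Assumes an object-level material is assigned; tweak the RGB to taste.
import rhinoscriptsyntax as rs

obj = rs.GetObject("Select the black object")
if obj:
    index = rs.ObjectMaterialIndex(obj)
    if index >= 0:
        rs.MaterialReflectiveColor(index, (64, 64, 64))  # dark gray reflection
    else:
        print("No object-level material assigned; assign one first.")
```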