Is Cycles only for viewport rendering, or can one render out a large image (4500 × 3200 pixels) with it? And if so, how? I’ve been trying to figure it out with no success.
In v6 you can use -ViewCaptureToFile or -ViewCaptureToClipboard (the dashed versions) to set what resolution to capture (render) at.
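For example, a macro along these lines (the option names are taken from the scripted -ViewCaptureToFile prompts and may differ slightly per build, so check the command line as you type):

```
_-ViewCaptureToFile _Width=4500 _Height=3200 _Enter
```

The command then asks for a file name, and the capture renders at the requested resolution rather than the viewport size.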
In v7 (Rhino WIP currently) you can use Rhino Render (same engine as Raytraced).
I tried to use the rendering engine and it did its cycling through all the steps, and at the end I got the message at the bottom of the render window: 0 samples and 33 sec. Just a black image. I tried to render out in the viewport and it just crashed to the desktop. Cycles is not even close to being ready for rendering. My file is huge and it’s obviously taxing the render engine.
Maybe you could post the output of the Rhino command _SystemInfo so I know what we’re dealing with.
A large render should be possible.
And here is a heavier scene at 100 samples:
@PaulS, would it be possible for you to share the file with us confidentially? If so, please use firstname.lastname@example.org - it should be able to handle huge files. Once uploaded, I’ll be automatically notified of the completion. A link to some cloud drive location would work too; send it to the e-mail address I placed in the upload link.
Anyway, you may also want to try the CPU as the render device to see if rendering even starts. As a test, say, 10 samples should be enough; the sample count shouldn’t matter.
edit: also bear in mind that if you already have a Raytraced viewport running, memory usage will double when doing either a capture or a _Render. If you use _Render, I suggest you first switch the viewport away from Raytraced.
Unfortunately, I am unable to share the file. Strict NDA.
Understood. So to recap the things to try:
- Use _Render and ensure no vp has Raytraced running
- Use the CPU as the render device (Tools > Options > Cycles); to be on the safe side, again use just _Render with no viewport in Raytraced.
Thanks I’ll do that.
A question: I received this file as blocks and it is 400 MB in size, but after assigning a metal material it still renders in grey plaster. Do blocks prevent material assignment? I know this is the issue in Brazil, which is why I am trying Cycles. If I explode to polysurfaces, the file balloons to 4 GB.
I tried 1200 × 900 at ‘draft’ quality and it rendered out in grey (I’m using the blocks model, 400 MB) with the CPU. I tried to render the same size at the next higher quality and it stopped, said out of memory, and then just froze.
It depends on how the blocks have been created.
(Simple) example of how it works:
- create box
- set box material to Use Object Parent
- create a block out of the box
- assign the block instance a material
You can duplicate the block instance and assign different materials to different instances; it should all just work, for the simple case anyway. If you have deeply nested structures it won’t work that well.
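The steps above as a rough Rhino-Python (rhinoscriptsyntax) sketch - this only runs inside Rhino’s script editor, and the material-source value 3 meaning “from parent object” is my reading of the RhinoScript docs, so treat it as an untested illustration:

```python
import rhinoscriptsyntax as rs  # only available inside Rhino

# 1. create a box from its eight corner points
corners = [(0,0,0),(1,0,0),(1,1,0),(0,1,0),
           (0,0,1),(1,0,1),(1,1,1),(0,1,1)]
box = rs.AddBox(corners)

# 2. set the box material source to Use Object Parent (3 = from parent)
rs.ObjectMaterialSource(box, 3)

# 3. turn the box into a block definition (True deletes the input geometry)
rs.AddBlock([box], (0,0,0), "MyBox", True)

# 4. insert two instances and give each its own material color
for i in range(2):
    inst = rs.InsertBlock("MyBox", (i * 2, 0, 0))
    idx = rs.AddMaterialToObject(inst)
    rs.MaterialColor(idx, (255 * i, 0, 255))
```

Because the geometry inside the block defers to its parent, each instance picks up the material assigned to it.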
Are textures used in the model?
What does _PolygonCount for the entire model say? And how many objects (instances) are in it?
When memory pressure becomes a problem for rendering it often helps to optimize the scene, probably best to do with a copy:
If there are for instance thousands of bolts you won’t see anyway - hide them.
If there is a large number of objects that share the same material, it may be useful to _ExtractRenderMesh them and join the results into one large disjoint mesh. Hide the originals.
14 million quads and 13 million tris… 42 million tris if forced… and that’s only half of the model :-)
A respectable size. I’ll see if I can recreate a huge block-instanced scene. Maybe do some nested blocking of, say, a torus until there are 10k or so instances in there, with a reasonably dense render mesh.
edit: I’m trying a scene with a bunch of block instances of the predator head, _PolygonCount says around 106M polygons.
Another question: much of this rendering will be from far enough away that details will be minimal. What is the most effective way to reduce polycount? I’ve been bringing the polygon adjustment slider to 0 and setting the absolute tolerance to much larger numbers, but I’m not getting much change in polycount. Is there a better way to bring the model down to a very, very coarse mesh?
If this is an object that will indeed not be in close-ups, I’d do these steps, starting from the original full-blown model. Best to work on a copy, so you always have access to the original should you ever need it again.
edit: leave the absolute tolerance alone, it’ll save you some headaches later on.
- delete any parts that are going to be invisible anyway
- select block instances of pieces that have the same material, then _ExtractRenderMesh
- delete the original selection
- select newly created meshes (I believe _SelLast would work here)
- _Join them into one disjoint mesh.
- _ReduceMesh on this one mesh.
Repeat until you have a model that is acceptable for your renderings. It may take some time, but this way you have the best control of how much geometry is going to be rendered. It’ll save you computing power and time.
If the entire model has pretty much one material (color), then maybe the easiest is to just select everything, _ExtractRenderMesh, delete the original selection, and join all the meshes into one. Finish off with _ReduceMesh.
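That shortcut as command macros - run the first part, delete the original geometry by hand, then run the second part (_ReduceMesh will pop up its dialog, where you set how aggressive the reduction is; _SelLast picking up the extracted meshes is the assumption from the list above):

```
_SelAll
_ExtractRenderMesh
```

```
_SelLast
_Join
_ReduceMesh
```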
Thanks… makes sense. I figure it’ll take 3-4 days of babysitting, but I’ll be able to render it at the end.
Who knows, with all the AI fuss maybe in a few years we’ll have one that can do render optimization for us.