Rhino Render: Glass and liquid

When rendering a glass and a liquid, what is the recommended approach?

The liquid is a closed surface that uses the same surface as the glass inner wall.
In that case, the output below is what I get when rendered, with some funkiness to it.

Should I make the liquid smaller, so there is a gap? Or larger and let it intersect?
Or is there some setting?

When I do this type of thing I offset the liquid surface from the glass surface - a few tenths of a millimetre should do it

Thanks!

Hi Toshiaki - try using one surface/polysurface corresponding to the liquid/glass interface with a material having an IOR of:

IOR liquid/IOR glass
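(As a ballpark, assuming typical values of roughly 1.5 for glass and 1.33 for water, that interface surface would get a material IOR of about 1.33 / 1.5 ≈ 0.89.)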

-Pascal

Hi Pascal,

Changing the IOR seems to reduce the pattern a bit, but it still appears.
I’m using the Mac native Rhino Render at the moment… maybe I should check out another rendering engine.
Any recommendations? (non-CUDA…)

Well, the idea is to use one surface only as the interface between glass and liquid - is that what you have? If you use two coincident surfaces, you’ll get interference.

Test file added…

SmallGlass_IORTEST.3dm (5.1 MB)

-Pascal

Ah, got it.

Hello,

Fuse has the right idea for items like glasses of liquid; the offset works well. Conversely, if you have transparent items that are finished on the backside and the viewer is looking through the transparent material to a finished opaque surface, I’ve found it helpful to actually extend the transparent material’s surface past the opaque surface for the best-looking result. In the meantime, here is output directly from within Rhino using Thea (T4R). I could have spent more time on modeling to make it more realistic, but hopefully it gives you an idea of the glass / liquid interaction, the simple model notwithstanding.

Hi OSTexo,

I guess if they overlap there’s less reflection/refraction, and what’s behind the object shows through better?

Output from Thea render looks pretty good.
I also tried the evaluation version on Mac (standalone), but still haven’t got the hang of the UI yet, especially changing materials. Any good tutorials?

Waiting for any plugin version for Mac Rhino…

I use Blender for rendering. There’s a plugin in some stage of development, but being Mac based I’m not waiting for that. I just name all my groups in Rhino and export as OBJ, import to Blender, assign materials and render. On my old tower Mac Pro with a couple of inexpensive GTX 970s it renders extremely fast (3 minutes for an HD frame is very long, and that’s with lots of transparent materials with odd IORs or for fluid simulations), and most stuff is well under a minute per frame, even landscape stuff with millions of blades of grass / plants as real geometry (i.e. not textures).

It’s open source (free), is always under development (i.e. not abandonware) and runs on all platforms. It has a very decent physics engine, can do full-on fluid sims, the render engine is node based and pretty easy to get the hang of for writing your own shaders, and there are boatloads of tutorials and other material available for learning it. I’ve already been down the road with more very expensive and dead-end render engines that have gone defunct over the years than I can count, especially on the Mac platform.

Thanks for the tip! I haven’t tried it out yet.

I don’t know how many Mac Rhino users are out there, but being the first to make a rendering plugin for Mac Rhino seems like good leverage in the business…
I’m surprised there’s not one out yet, even in beta…

It’s common practice to let the surfaces intersect slightly when doing glass and liquids.

Sometimes what we think makes sense in our brains is not actually what happens in real life. If you go fill a glass with water you don’t see the glass where the water is. It looks like the water goes right to the outer surface.

You can duplicate that by letting the surfaces intersect.

Brilliant! Works better than my offset… Now if only I could get my brushed stainless steel looking as good as that…

Hello,

Identical glass model; the difference is that on the left the liquid is offset into the glass, and on the right the liquid is offset toward the center, not intersecting the glass.

The reason is that there is no render development kit (RDK) at the moment. It is planned for a future release.
This is needed to develop a render plugin.
Please correct me if I am wrong, but I think the kit is needed for the communication of materials, textures, etc. between Rhino and the output plugin.
Until then, Mac users have to use standalone applications like KeyShot, Maxwell and many more.

I see.
I assumed that when the SDK was released, development of render plugins would also start.
For now it’s a good experience trying out some standalone applications to get a feel for each rendering software, though some are quite complex to even get a decent render…
Trying out Blender now as recommended by LewnWorx… searching for some good tutorials.
If KeyShot were a little more affordable…

The SDK is already released; we are waiting for the RDK :slightly_smiling:

Lemme know if you hit snags. It’s got its own learning curves, but fortunately there are a lot of good tuts out there.

The biggest issue for me seems to be units. My OBJs all come in 10x too big. My solution is to parent the entire import to a null in Blender and set its X, Y and Z scale to 0.1, and that fixes it.
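If you’d rather script that fix than do it by hand, something along these lines should work from Blender’s Python console (just a sketch: the file path and object name are placeholders, and newer Blender versions use a different OBJ import operator):

```python
import bpy

# Import the OBJ from Rhino (2.7x-era importer; the path here is a placeholder).
bpy.ops.import_scene.obj(filepath="/path/to/export_from_rhino.obj")
imported = list(bpy.context.selected_objects)  # the importer leaves its objects selected

# Add an empty at the origin to act as the null, and parent the whole import to it.
bpy.ops.object.empty_add(type='PLAIN_AXES', location=(0.0, 0.0, 0.0))
null = bpy.context.active_object
null.name = "RhinoImport"
for obj in imported:
    obj.parent = null

# Compensate for the 10x unit mismatch in one place.
null.scale = (0.1, 0.1, 0.1)
```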

There are lots of materials out there, but using them is a challenge until you understand the Blender notions of linking and embedding objects. Also, all objects must have at least one reference count or they will not be saved with the file, embedded or otherwise, so look up how to set a fake user on an object so its reference will be saved.
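If you end up with a lot of materials, you can set that flag on all of them at once from the Python console (a small sketch; use_fake_user is the same flag the “F” button toggles in the UI):

```python
import bpy

# Give every material in the file a fake user so it is kept on save,
# even if nothing references it yet.
for mat in bpy.data.materials:
    mat.use_fake_user = True
```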

What I do is import all my materials into a separate blender file, set them all to have a fake user so they will save, then import or link to the materials I need from my master materials file on a project-by-project basis. The key concept to get on linking vs importing is that if you modify a linked object you are modifying it in every file that links to it. For materials I’m not going to change, that’s fine, but if I need to modify one much for the project in question I’ll import it.

Links are drive-path dependent, so set up your directory structure with that understanding. I have a dedicated blender user files directory (i.e. not the apps folder) that I keep all my template-type stuff (mats, HDRs, etc.) in, and that path never changes, so I don’t run into vanishing items down the road. I keep a backup on a USB stick with the same relative pathing for mobile work, so I can just copy that directory structure to another machine if need be, open a project file, and have everything in place with no missing path items.
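The append/link choice can be scripted too, which is handy when pulling things out of that master library (a rough sketch; the library path and material name are made-up placeholders):

```python
import bpy

# Pull a material out of a master library .blend into the current file.
# wm.append copies it locally; wm.link (same arguments) references it from the
# library, which is why the library path needs to stay stable.
library = "/path/to/blender_user_files/materials/master_materials.blend"
mat_name = "BrushedStainless"

bpy.ops.wm.append(
    directory=library + "/Material/",
    filename=mat_name,
)
```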

A lot of my renders are product previs type studio lighting applications, so I have several “stages” set up with defined lighting rigs, cameras and whatnot on various layers in my startup file (Blender lets you save any file as a startup file, which acts as a master template). As a result I can export from Rhino, fire up Blender, import the OBJ, select which studio I need, link or import the materials I’ll need from my master material library, and hit render. Usually it doesn’t take more than a couple of minutes before I can hit render. It’s not quite as painless as KeyShot but pretty close, costs a lot less, and can do full-on animation, smoke, fluid, and soft and hard body physics (my architectural flybys have dandelions and grass moving in the wind, and they even cast shadows on the sidewalk; try that with KeyShot or Octane Render). It’s a little more hands-on a workflow than some solutions, but for me an infinitely more flexible one.

It’s got a built-in compositor and supports multi-pass rendering, which, once you learn how to use it and the compositor effectively, can save you as much as 80% off your render times. I multi-pass render everything I do, and the setup for the multi-pass stuff is built into my startup file, so it just adds one minor step of assigning each group a render layer after I import them. Background stuff goes to one layer, non-transparent objects to another, and anything with transparency in it to another. No sense in burning tons of AA and samples on an off-white, out-of-focus studio background, and even less in burning CPU/GPU time on refraction / IOR and scores of AA ray bounces etc. for stuff that isn’t even transparent. That trick alone shaved my frame render time on the grass and fields type renders from 8+ minutes per frame to under 30 seconds. Multi-passing my jewelry stuff took frame render times from 15 minutes to about 2 to 2.5. Worth the effort, once you get going, to dive into the compositor and fully leverage the multi-pass stuff.

Whoa, thanks for the info! I’m sure I’ll go back to this post after studying it a bit more.
Thanks for offering future support! (^^)

As will I, thanks!