I’m working through the Rhino 6 Level 2 training materials and came across this. It’s not an error yet, but it might become one: at least visually, there is a noticeable gap when zooming in. This seems to be due to Rhino 6’s filleted surfaces not being accurate enough, and even the DupEdge command does not recognize the inaccuracy. Does it matter; could it give me errors later on? And why can’t I select the surface edges directly instead of using DupEdge for this part of the Level 2 Training Materials? If someone does what the training says and creates fillets and/or chamfers with the Trim=No option during the command, I can see issues arising because of this. On the left is a grey surface created with the NetworkSrf command, and on the right in the picture is the filleted surface (Trim=No, as the training instructs during fillet creation).

ScoopKevin19.3dm (879.5 KB)
I think it’s just a visualization thing. The surfaces you see in Rhino are just a mesh approximation of the NURBS surfaces; the mesh could be made denser to look smoother:
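A quick way to see why a coarser render mesh shows bigger gaps: the deviation between a smooth curve and its polyline (mesh-edge) approximation shrinks rapidly as the segment count grows. A minimal sketch in plain Python; the quarter circle and segment counts here are just illustrative assumptions, not Rhino’s actual meshing parameters:

```python
import math

def max_chord_deviation(radius, segments):
    """Max distance (sagitta) between a quarter-circle arc of the given
    radius and its polyline approximation with `segments` equal chords.
    For a chord spanning angle 2a, the sagitta is r * (1 - cos(a))."""
    half_angle = (math.pi / 2) / segments / 2
    return radius * (1 - math.cos(half_angle))

# Same arc, two render-mesh densities: the "gap" you can see on screen
# drops by roughly two orders of magnitude here.
coarse = max_chord_deviation(10.0, 8)
fine = max_chord_deviation(10.0, 64)
print(f"8 segments:  deviation {coarse:.5f}")
print(f"64 segments: deviation {fine:.5f}")
```

The NURBS geometry itself never changes; only how closely the display mesh hugs it.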
That’s just the render mesh. Gaps like that will be visible until you actually Join the model, which tells Rhino to make the meshes line up. If you still see gaps after joining, THAT’S a problem.
Could Rhino solve this internally, knowing the surfaces share a common boundary?
Oh, wow! Yes, that actually worked, Diego. A pretty fast solution this time. Good to know it’s not only going to look right on export but can also be rendered well within Rhino 6 this way. So, thumbs up!
I went into those settings, chose the Custom option, and used this (which worked perfectly):
If the surfaces are joined, they share the mesh vertices along the common edge.
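That vertex sharing is the whole trick, and it can be illustrated without Rhino. In this sketch (plain Python, with a made-up sine-curve seam standing in for a real surface edge), two independently meshed sides of the seam disagree by a small but nonzero amount, while two meshes that reuse the same seam vertices coincide exactly:

```python
import math

def sample_seam(n):
    """Polyline approximation of the seam curve y = sin(x), 0..pi, n segments."""
    return [(i * math.pi / n, math.sin(i * math.pi / n)) for i in range(n + 1)]

def point_to_polyline(p, pts):
    """Distance from point p to the polyline through pts."""
    best = float("inf")
    for (ax, ay), (bx, by) in zip(pts, pts[1:]):
        dx, dy = bx - ax, by - ay
        t = max(0.0, min(1.0, ((p[0] - ax) * dx + (p[1] - ay) * dy) / (dx * dx + dy * dy)))
        best = min(best, math.hypot(p[0] - (ax + t * dx), p[1] - (ay + t * dy)))
    return best

# Unjoined: each surface meshes the shared edge on its own (here with
# different densities), so the two polylines disagree -> a hairline gap.
side_a, side_b = sample_seam(6), sample_seam(10)
gap_unjoined = max(point_to_polyline(p, side_b) for p in side_a)

# Joined: both meshes reuse the exact same seam vertices -> no gap at all.
shared = sample_seam(10)
gap_joined = max(point_to_polyline(p, shared) for p in shared)

print(f"unjoined gap: {gap_unjoined:.4f}")
print(f"joined gap:   {gap_joined:.2e}")
```

Both meshes still only approximate the true seam curve, but because they approximate it with the same vertices, the approximation errors match and nothing shows on screen.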
Yes, you and Diego were both correct. Thanks! It looks better now after joining and adjusting Rhino’s options to show a more refined/detailed mesh.
Well, it only “knows” that once you join them; otherwise it would have to try to figure that out with everything you do, and that would slow things down… a lot. That said, something like that for rendering models was a recent request of mine.
That’s what I thought. It wouldn’t need to be real-time; once you run a command, it could analyze the situation for the current state of the view. Basically the render situation.
Well, that’s the same thing: it would still slow down the meshing step, potentially a lot. Just increasing the mesh density until you don’t see obvious gaps is adequate for modeling anyway.