Rhino trouble with radius

The best way would be if the SDK gave you the center curve. That would make it easy to figure out which fillets are connected. Counting may not work because composite surfaces may produce multiple connected fillets when used as base surfaces.

Again, the center curve would make it easy to tell if the direction reverses.
But what you want to do is way too complicated. Just check the edge curves: if the direction changes by more than 45 degrees, end the chain. The fillets are supposed to be tangent, so their edges should be close to tangent. If the edge direction changes by more than 45 degrees, something is probably wrong. If it changes 180 degrees, that's completely wrong.
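Something like this might do, as a minimal sketch in RhinoCommon Python; `prev_edge` and `next_edge` are illustrative names, and the assumption that the chain runs from the end of the previous edge to the start of the next one would need to match how the script actually orders its edges:

```python
import Rhino

def chain_continues(prev_edge, next_edge, max_angle_degrees=45.0):
    # Compare edge tangents where the two fillet edges meet.
    # Assumption: the chain runs from the end of prev_edge to the start of next_edge.
    t_prev = prev_edge.TangentAt(prev_edge.Domain.Max)
    t_next = next_edge.TangentAt(next_edge.Domain.Min)
    angle = Rhino.RhinoMath.ToDegrees(
        Rhino.Geometry.Vector3d.VectorAngle(t_prev, t_next))
    # Tangent fillets should have nearly tangent edges; a change of more than
    # 45 degrees probably means something is wrong, 180 degrees is a reversal.
    return angle <= max_angle_degrees
```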

That’s a pretty loose tolerance. I think the angle should be Atan(abstol)/R, where R is the fillet radius and abstol is the document absolute tolerance.
The red fillet is on the black surface and the orange fillet is on the blue surface, and the blue and black surfaces are not tangent at that point. The normals differ by .89 degrees. That’s a lot. The tolerance needs to be tight if you want the fillets to join properly. You might want to mark the edges that are close to being in tolerance with a dot, so the user knows those edges will need to be matched for the fillets to join properly. Otherwise the chain will stop when the normal test fails.

If you find that both corners have the same new surface that passed the normal test, that surface is probably an existing same-size fillet or a blend. So you should stop the chain there if you don’t want to run over existing fillets.

I get that but the loop should have stopped at the last red fillet if the normal test had been tight enough.

If you have such a case please post it. Also one where _FilletSrf creates 2 or more connected fillets.

In radians or degrees? Why does the tolerance, e.g. 0.001, get smaller when the radius gets higher? Example:

R = 0.1 >>> 0.572958 degrees
R = 0.5 >>> 0.114592 degrees
R = 1.0 >>> 0.057296 degrees
R = 5.0 >>> 0.011459 degrees
R = 10.0 >>> 0.005730 degrees
R = 50.0 >>> 0.001146 degrees
R = 100 >>> 0.000573 degrees
R = 500 >>> 0.000115 degrees
R = 1000 >>> 0.000057 degrees
R = 5000 >>> 0.000011 degrees
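A quick sketch of where those figures appear to come from, assuming abstol = 0.001 and Atan(abstol)/R converted to degrees:

```python
import math

abstol = 0.001  # assumed document absolute tolerance for the numbers above
for r in (0.1, 0.5, 1.0, 5.0, 10.0, 50.0, 100.0, 500.0, 1000.0, 5000.0):
    # Atan(abstol)/R is an angle in radians; printing it in degrees
    # reproduces the list above (e.g. R = 0.1 -> 0.572958 degrees).
    print(r, math.degrees(math.atan(abstol) / r))
```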

Makes no sense to me. I’ve just tested with 0.01 degrees and errors occurred; I got some duplicates on the double-pipe example, which is a stop condition. The document tolerance equals what I pass to the fillet surface function (0.001). The measured normal deviations between surfaces and some fillets range from 0 to 0.008.

If I make them too tight, it never gets past the deviation test, resulting in zero candidates to be filleted.

Ok.

_
c.

Go back to the example that started this thread. If you merge the surfaces into composite surfaces, you will get more than one fillet when you run FilletSrf.

The angle tolerance needs to be smaller for larger radii so that the fillets will join and trim the base surfaces. The closer the base surfaces are to tangent, the closer the edges of the two fillet surfaces will be. With a one-degree tolerance you will find some fillets won’t join. It may create fillets that are as much as .05mm apart if both sides have new surfaces that are off tangent by 1 degree. If it is the more common case where you have one new surface and one old surface, you could loosen the tolerance by half. I was talking about the worst-case scenario, if you want joining the loop to absolutely never fail.

I think everything in the RollOverNakedEdge.3dm document would pass the test. That is why it all joined and trimmed effortlessly.

As I said before you could allow some fillets that are close to being in tolerance and mark them so that the edge tolerance could be checked and matched after the fillets are made.

First of all, notice that for small angles the tangent function is, for all practical purposes, linear. So you can make your tolerance for the normal test using a hard-coded number, something like x/R where x is around .06. The .06 is the worst-case scenario where there are normal deviations at both corners in opposite directions (as in the example below).

Second, the purpose of the tight tolerance is mostly to avoid joining and trimming failures. If you are willing to accept some problems with joining fillets and trimming using the loops as cutters, then you can use a looser tolerance and struggle with the joining and trimming problems later. Also, if the chain stops because of an out-of-tolerance condition, the user becomes aware of the problem. The user of the script can continue making the chain by clicking on the next surfaces (starting another chain), but will now be aware that the 2 chains may not join properly because the underlying surfaces have poor continuity.

Enclosed is a file that may help you understand.
In the file the green fillet is the first fillet. The fillet size is 100. The surface normals at the end of the green fillet are shown. The normal deviation on both corners is .01 degree. If your script allows these to pass the normal test the cyan fillet will be made and the 2 fillets (green and cyan) will fail to join. Their edges are .047 mm apart. I marked the centers of both fillets. Those points are where the center curves of each fillet would end. The center curves would also not connect as their ends are also out of tolerance.

If you did the same thing with a 1mm fillet the .01 normal deviation would not cause any problems in joining the fillets.

I just noticed I forgot to post this file. Here it is:
angle_tolerance.3dm (79.2 KB)

The explanation is in the previous post.


I did. When I run the fillet where it is required, I get only one. When I do a concave fillet from the outside with the merged cutter, I get two connected fillets which are joined; in the latter case the result is a brep with 1 face. All these cases would require special handling, first when it happens on the initial fillet picked by the user and second when it happens in the loop.

CompoundSrfInvolved.3dm (124.4 KB)

Remember that every fillet which gets created needs to look back at other fillets and the one it was created from (and its a and b points) to decide in which direction it should continue, or whether it should continue at all. Mix this with singularities and you have a nice can of worms to deal with, starting with just the initial fillet. If one fillet operation in the loop results in multiple unconnected fillets, each would add 2 new point pairs to the stack which have to be visited later, so the start of the initial fillet loop would never be reached. As I already said, the stop conditions you imagine to be working are not enough.

I changed it to a smaller tolerance, but when I use the ones you suggest it doesn’t find anything. This happens with very simple objects which can be joined in Rhino using 0.001 as the unit tolerance. I’ve created a few cases myself where I get a normal deviation from regular fillets ranging from 0 to 0.015. This is what Rhino does using either _BlendEdge or _FilletEdge.

You’re still underestimating how things work programmatically. Imagine if _FilletEdge changed the tolerance dynamically after each internal failure and tried to find a tolerance which is more suitable; it would run forever. Changing these values on the fly makes everything prone to errors and impossible to track down when things fail. A user would not understand why, either. The tests I did were all joinable; in some of them I could use the joined fillets as the trim object, in others I extracted the borders and it trimmed and joined OK.

I’ve tried your example with R100. If I first pick the white surface then the black, it never stops; not sure why yet. If I pick in reverse order, it creates 1 duplicate and stops. (I’ve got a check in there for the duplicates, but it’s only comparing current to last.)

I ran into this situation, then used _FilletSrf on the failed pairs of faces and things were working. A user would do the same, I guess, and would only encounter the problems once he trims and it does not join without naked edges.

Good example, btw. I’ve set my unit tolerance to 0.0001 and the angle tolerance to tolerance/R. Then it created a third, tiny sliver fillet between the two in your file. The case is a pain to track down as it seems to never find a stop condition.

Btw, when I rewrite the script to work with surfaces and polysurfaces, would it be logical for the user if only the brep faces of the initially picked (sub)surfaces are included in the search for tangent surfaces at the points (a, b)?

I’ve restructured my code to do the closest-face and normal comparison in one central place, but found that with 20000 surfaces in the file it just gets noticeably slower.
_
c.

Composite surfaces should not be handled at all if you want your script to be robust. Very tiny internal tangency errors will cause FilletSrf to produce a garbage fillet.

Each individual surface within the composite surface should be treated as a standalone surface if you want it to be robust.

If you want to avoid failures, you should be making only one fillet at a time - always.

The problems you are running into would disappear if the algorithm was correctly implemented. There is no need to do anything with singularities. Generally you will find one new surface that is tangent to both sides. That means end of the line. Occasionally you will find no new surfaces. That’s also a stop. If you understood the topology that results in a singularity you would understand this.

For detecting fillets that reverse direction, testing the center curve would be best, but testing one edge on the new and old should work as well. Or you could just do nothing; the worst that will happen is there will be one extra fillet that the user can delete.

You didn’t understand what I suggested.

If the fillet radius is smaller than 8 mm then all of those fillets would have passed the normal test that I suggested.

Don’t constantly try to blame your errors on me. Look at the example I posted: do you see the 2 cyan surface normal lines that I made? Do you understand that if only one side had a new surface with .01 degree angle deviation, then the fillet ends would have been about half as far apart as they are when both sides have .01 deviation? A robust normal test would take that into account.

If the abs tolerance is .0001 then the normal tolerance should be .012/R. If abs_tol is .001 then it would be .12/R. What I suggested before was easy to implement but too complicated for you to understand. So just do what you can understand, and once in a blue moon 2 fillets that don’t join will be created.
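If it helps, a tiny helper capturing those two data points; the linear scaling with abs_tol between them is an assumption on my part, not something stated explicitly here:

```python
def normal_tolerance_degrees(abs_tol, radius):
    # abs_tol 0.001  -> 0.12  / R degrees
    # abs_tol 0.0001 -> 0.012 / R degrees
    # i.e. 120 * abs_tol / R, assuming the scaling between the two quoted values is linear.
    return 120.0 * abs_tol / radius
```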

Your script should never make that tiny sliver. If the closest point and normal tolerances are sized correctly and everything else is done correctly, it would never happen.

BTW if you zoom in at the point -214,404 you will find a tiny version of the same thing that has a 1mm fillet. At 1mm the .01 tangent discontinuities don’t prevent the fillets from joining.

I don’t know why you think it matters. FilletSrf allows you to make fillets that use surfaces from the same brep or from different breps.


I’ve tried to explain why keeping just one of the multiple fillets returned from _FilletSrf does not work, even when choosing the one which is closest to the point pair.

I’ll see how I can detect that. The reason why I wrote the above is that if I get a return of 2 connected fillets (joined), internally they seem to count as a brep with 1 face. I was surprised that this condition was handled by itself, e.g. it did not generate coincident new (a, b) points from the mated edge, only the ones which were at the open ends of the fillet.

Nope! At the end you always also get the one(s) which you already had to generate the fillet at the start. I am filtering this one out, but only if 2 or more were found matching the position and normal of the fillet corner. After this you do have the state you imagine, but only if no fillet has been found which is blocking the loop. Up to this state everything is robust (including singularity cases).

After re-reading, I think I did. But sorry, I see something different from what you say. If I set my maximum normal deviation to a tolerance of 0.01 degree, it tries to run over the problematic edges in your example. If it is 0.009 degree, it stops there. I see the same behaviour with a radius of 8 or 100, and also if I use 0.012/R.

What I was trying to say is that with the tolerances you suggest, it would stop looping at edges which are usable from my point of view. See the file of the OP: there is a normal deviation of 0.13 degree which I’ve marked in red. Extract the fillet borders and trim; it seems to join OK.

Apple Watch_original.3dm (408.2 KB)

No, it is simply far off from real-world examples. If you use a document angle tolerance of 0.01 degree and create a G2 blend on some very simple surfaces, the normal deviation is higher than that, actually 0.0338. See and measure the G2 blend in this file:

G2_Blend.3dm (171.4 KB)

LOL, I thought you’d forgotten to remove a point from the file. Nice.

Because I use all surfaces in the doc atm. If it is limited to the 1 or 2 breps picked for the initial fillet(s), other floating surfaces or breps wouldn’t be part of the search for new candidates.
_
c.

@jim, it’s easier said than done. Try _FilletSrf with 4mm: FilletSrf16.3dm (109.5 KB)

@pascal, is this a bug? It only happens with that radius, which is half of the surface width.
_
c.

So far this looks buggy to me… If the radius is not half the width of the surfaces (different number or ExtendSrf) the result is clean…

https://mcneel.myjetbrains.com/youtrack/issue/RH-44714

thanks.

-Pascal

The Rhino code for composite surfaces is buggy. You should just run split at tangents and avoid them if you don’t want to be tripped up by bugs.

Your response makes it clear you have failed to implement the algorithm correctly. If you don’t comprehend the difference between ‘new’ and ‘old’ you are not going to get this right.

Would it be possible to lock the surfaces that are used to generate the fillet, and the fillet itself, before running closest_point?
If you did that, then closest_point would only return new surfaces, which would guarantee you can’t screw that part up.
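One way to get a similar effect without actually locking anything would be to keep the ids of faces already used (the base faces and the fillets made so far) and drop them from the closest-point candidates; a minimal sketch with illustrative names (`face_id`, `candidate_faces`):

```python
used_ids = set()

def register_used(face_id):
    # Call this for the two base faces and for every fillet surface created.
    used_ids.add(face_id)

def new_faces_only(candidate_faces):
    # candidate_faces: iterable of (face_id, face) pairs found near a corner point.
    # Anything already used in this chain is not a "new" surface.
    return [(fid, face) for fid, face in candidate_faces if fid not in used_ids]
```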

It’s not a state I imagine. It’s a necessary condition to make the process work. It’s a very simple process to determine if there are new surfaces at each corner.

Also, you don’t need to check for blocking fillets if all your script does is make fillets that are tangent and join end to end. The only thing required is to remove from consideration any new surface that appears at both corners a and b. You don’t even have to run the normal test on that surface if you are not planning to make any fillets that are not tangent.
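A minimal sketch of that filter, assuming the candidates found at each corner are collected as hashable surface ids (illustrative names):

```python
def split_shared_candidates(corner_a_ids, corner_b_ids):
    # A new surface showing up at BOTH corners a and b is most likely an
    # existing fillet or blend spanning the chain, so it ends the chain and
    # never needs to go through the normal test.
    shared = set(corner_a_ids) & set(corner_b_ids)
    a_only = [s for s in corner_a_ids if s not in shared]
    b_only = [s for s in corner_b_ids if s not in shared]
    return a_only, b_only, shared  # stop the chain if shared is non-empty
```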

The blue normal lines that you created where the fillet crosses that brep edge have angle deviation of 0. There is no place that I see where the normal test would report a deviation anywhere close to .13 degrees.

The blue line that you have marked with a dot saying “.0132979” is a single normal line. There is no good reason for you to be computing a normal at that point. There is only one surface there, and there are no angles to compute from just one normal.

You don’t need to tell me that G2 blends are not anywhere near as accurate as rolling ball fillets. I am well aware of that fact. The accuracy and predictability of rolling ball fillets is what makes it possible to create complex fillet solutions with very simple algorithms.

You would hope that the Rhino developers have already dealt with that. If the bounding box of a brep is not close to the test point, then one would hope that the closest point function does not bother searching the individual surfaces inside that brep.

BTW what are you using for the closest_point tolerance?

If you want to analyze fillet(s), you can offset the 2 surfaces by 4mm in the direction of the center of the fillet and then run the Intersect command on the offsets. The intersection curve is the center curve of the fillet. It appears that the intersection is screwed up because it lands right on a knot.

You could build the correct single fillet manually by using the isocurve of one of the offset surfaces as the center curve.
FilletSrf4mm.3dm (157.4 KB)
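A rough RhinoCommon Python sketch of that offset-and-intersect check, assuming two untrimmed surfaces and that the sign of each offset (towards the fillet center) is chosen per case:

```python
import Rhino

def fillet_center_curves(srf_a, srf_b, radius, tol):
    # Offset both base surfaces by the fillet radius towards the fillet center.
    # The sign depends on each surface's normal direction, so it may need flipping.
    off_a = srf_a.Offset(radius, tol)
    off_b = srf_b.Offset(radius, tol)
    if not off_a or not off_b:
        return []
    # The intersection of the two offsets is the center (spine) curve of the
    # rolling-ball fillet of that radius.
    ok, curves, points = Rhino.Geometry.Intersect.Intersection.SurfaceSurface(
        off_a, off_b, tol)
    return list(curves) if ok else []
```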

Jim, there is no need to be insulting in every one of your replies.

The search for the “old” surface is required because, if no new surface is found for that side of the fillet corner and the old surface continues, it is used again. I need 2 things to compare with.

No. Did you ever try to lock a subsurface of a brep? Read above why you cannot lock them.

ClosestPoint does not return multiple surfaces. It returns ONE point and, depending on the method used, one brep face, brep edge or brep vertex, as I already wrote…

This is getting exhausting. You would probably have to implement some on-the-fly joining and analysis.

Please look again: the blue line pointing out is the normal of the cutter, the green line pointing in is the normal of the fillet at that corner. If I run _GCon it prints:

Tangent difference in degrees = 0.0132979

If you look closer at the second file of my previous post, you’ll see that for every surface a blue normal is created at a fillet corner. Measure between them (blue / blue), then compare with (green / blue) at that point. The blue normals in that second file are only created when the normal angle to the fillet is below tolerance for that surface. I’ve created a red normal (at point a) where it is not. You say that G2 blend normals are not accurate; measure where the blend meets the planar surface, they are 0 there. Then compare one of the blend normals (the red one at point a) with the rolling ball fillet normal (the green one at point a). GCon reports: Tangent difference in degrees = 0.0338

You may also create a new fillet with _FilletSrf there and compare. It gives the same result.

I tried to investigate what happens: my fillets are created with the doc tolerance value of 0.001, and most of the time the result is equivalent to the _FilletSrf result (_SelDup). With a smaller fillet tolerance, I get slightly better normals and less normal deviation. But: if you look at the last bug report above, where _FilletSrf somehow creates 16 surfaces, 5 of them overlapping on the left, those cases appear more often. I can prevent them with a coarser tolerance, which then gives worse normals.

The doc unit tolerance (0.001) in my examples.

_
c.

OK Now I understand what angles you are measuring.

You are going to find this insulting, but what you are doing there is completely wrong. There is no reason for your script to be creating normals at that location, nor is there any reason for it to be comparing those normals. There are no new surfaces at that corner, therefore there is never a need to do a normal check at that corner, and you certainly should not be using the results of any normal test at that point to do anything.

Only corners with new surfaces need to be run through the normal check.

I thought you said you were using just exploded surfaces.
My point was that you need to do something to prevent doing the normal test (or any test) when there are no new surfaces found at a corner.

I guess I don’t really know what functions you are using to find new surfaces at the corners. I just assumed that since most of the time you were correctly finding them you must be doing something right.

That is what I do, after finding “new” surfaces at that corner. Maybe you misunderstood that.

When there is nothing, no normal test is done. After I created a kind of conceptual script working with single floating surfaces, I moved on to make it work with all sub-faces of 1 or 2 polysurfaces. If the initial fillet is done using subsurfaces of the same polysurface, only the faces of this polysurface are involved. Otherwise I use the sub-faces of 2 polysurfaces.

I’m searching for the closest surfaces (or now sub-surfaces of a polysurface) by comparing their minimum distance to a fillet’s corner points (a, b). This involves a closest point search.

Afterwards, the 1-2 surfaces found are filtered depending on the normals at the fillet corners: if a search result is not tangent at the corner, it would not be filletable. This works OK, as you can see in the videos.
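Roughly what that filter could look like in RhinoCommon Python, as a sketch rather than the actual script; `corner_pt`, `fillet_normal` and the angle tolerance are assumed inputs:

```python
import Rhino

def is_tangent_candidate(face, corner_pt, fillet_normal, max_angle_rad):
    # Pull the fillet corner point onto the candidate face and compare normals.
    ok, u, v = face.ClosestPoint(corner_pt)
    if not ok:
        return False
    n = face.NormalAt(u, v)
    # Normals of tangent surfaces are parallel or anti-parallel at the contact
    # point, so measure against both directions and keep the smaller angle.
    angle = min(Rhino.Geometry.Vector3d.VectorAngle(n, fillet_normal),
                Rhino.Geometry.Vector3d.VectorAngle(-n, fillet_normal))
    return angle <= max_angle_rad
```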
_
c.

I understand that. What I don’t understand is why your file (Apple Watch_original.3dm) contained surface normals at corners that did not have any new surfaces, and why you were asking about those normals.

That would certainly be useful for situations like the original model where you have 2 intersecting breps that you want to connect with fillets.

OK now I think I get it
You have been testing every surface in the document to find the surfaces closest to each corner.

This whole discussion started when I said there needs to be an SDK function that gives you the surface ID of every surface that is within tolerance of a 3d point. You jumped in and said there was such an SDK function, but now it turns out that you are coding that function yourself. It is admirable that you are writing the function, but this would be far easier, more efficient and faster if the SDK provided it.

Yes the videos you posted above have been impressive.

In that example the loop would only be a closed loop if the cutter is one joined polysurface and the watch is one joined polysurface.

Yes. Now I only need to test the subsurfaces of the 1 or 2 picked polysurfaces.

The brep.ClosestPoint() function I linked in that post can potentially give you 1 brep face back, together with the closest point near your test point (e.g. a fillet corner point). But if the closest point falls on an edge, you won’t get all faces connected to this edge. Hence the search (using a tolerance for the distance).
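For reference, a small sketch of how that overload behaves and how the adjacent faces could be gathered by hand when the hit lands on an edge; `max_dist` is an assumed search distance:

```python
import Rhino
from Rhino.Geometry import ComponentIndexType

def faces_near_point(brep, test_pt, max_dist):
    # Brep.ClosestPoint reports one closest point plus the single component it
    # lies on (face, edge or vertex); it never lists all faces around an edge.
    rc, cp, ci, s, t, normal = brep.ClosestPoint(test_pt, max_dist)
    if not rc:
        return []
    if ci.ComponentIndexType == ComponentIndexType.BrepFace:
        return [brep.Faces[ci.Index]]
    if ci.ComponentIndexType == ComponentIndexType.BrepEdge:
        # The hit is on an edge: collect the neighbouring faces ourselves.
        edge = brep.Edges[ci.Index]
        return [brep.Faces[i] for i in edge.AdjacentFaces()]
    return []
```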

Thank you.
_
c.

I cannot see a video posted; could you please tell me where it is? Is there a switch in the site settings that prevents me from seeing it?
Thanks!

Hi @steff, I have no idea why this could fail; I did not enable any special setting on Discourse for viewing videos. If you cannot see one at posts 116 and 121, it may be your browser. What do you use?

_
c.