[FEEDBACK] V7: Internal scaling. Provide better accuracy across functions and object sizes. Improve the working space

This is just an idea for the developers, or a petition.
Could the Rhino 7 WIP, for example, internally scale down big objects and scale up small ones for each operation?

Accuracy is one of the greatest strengths of Rhino, but we need more. It is not intuitive for users that you must stay near the grid and use the default settings; only after years of experience do you learn the correct working zone.

Some research: user comments

Some modelling tools need refinement and increased reliability (for example, booleans and solid fillets).

Also, if you are dealing with very small parts at a high level of accuracy, or with very large file sizes, other programs may be better suited. I design boats, and if you are not careful it is easy to lose your center line, which is a big problem when building symmetrical things. Using the gumball (the Rhino command to move an object through a pop-up with x, y, z coordinates) is not always accurate, and you need to keep an eye on your tolerances when using it.

It takes some time to learn how to use it; it is not very intuitive, and much practice is required before you are able to do complex shapes.

I’m having problems with tolerance. Does Rhino need to handle more precision in the workflow?

If I make a large vehicle, I want to render it in a big exterior environment and, at the same time, model the smallest details, working on the smallest parts of the motor (which is not so simple). "So it's quite good for basic modelling, but for detailed models and mechanisms it is probably not enough." This is also because radius injection is missing, but Grasshopper probably fills this gap.

I wish Rhino 7 would improve and expand the working zone (the greatest permissible distance from the origin). Usually I render outside of Rhino, but the goal is to work on a big vehicle with the same precision as smaller objects, without problems filleting nuts, for example.

I’m not talking about setting the absolute tolerance tighter than the project needs just to fit Rhino’s requirements; we don’t need intersections with many, many control points! I mean reducing the noise level, improving the accuracy of Rhino’s functions or the size of the grid, to provide and ensure greater consistency across object sizes.

As it stands, an object must not be too big or too small; otherwise it creates issues and unnatural workflows.

Maybe Rhino has a destructive workflow when you move and modify objects, and that is the reason for using doubles? In that case, consider keeping the object fixed at the centre, or making the workflow non-destructive (like Substance Designer's layers).

In games, Kerbal Space Program was able to cover more play space and handle positions better by jumping from single-precision floats (about 7 significant digits) to double-precision floating point. Rhino already uses 64-bit double precision, but can't handle that amount of space? We can only work in a limited zone.
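A quick sketch in plain Python of the gap described above (the 7,000,000 m position and the 1 cm step are made-up numbers for illustration, round-tripped through 32-bit floats via the standard struct module):

```python
import struct

def to_float32(x):
    # Round-trip through a 32-bit float to simulate single precision
    return struct.unpack("f", struct.pack("f", x))[0]

position = 7_000_000.0   # ~7,000 km from the origin, planetary scale
step = 0.01              # try to move by 1 cm

moved32 = to_float32(to_float32(position) + to_float32(step))
moved64 = position + step

print(moved32 - position)   # 0.0   -- the 1 cm step vanishes in 32-bit floats
print(moved64 - position)   # ~0.01 -- 64-bit doubles still resolve it
```

At that distance, adjacent 32-bit floats are 0.5 m apart; that is the kind of jump the poster is describing.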

Could internal automatic scaling of the object when performing an operation or applying a function improve how the software handles it?

So are you talking about modeling tolerance or the OpenGL display on huge models miles from the origin?

There may not be much they can do about the latter without a ton of work; it's just not how realtime graphics works, and there's no comparison to be made with games at all: they use a bunch of pre-processed tricks!

The limits of Rhino’s modeling accuracy are the limits of double-precision math on a digital computer. Again, nothing they can do about that without reinventing the wheel. Every other CAD program is the same or in fact worse; many of the “big guns” have, or used to have, fixed internal units that limited the scale and precision of what you could do regardless of what you told them.

Pick your tolerances appropriately and you should have plenty of “room.” I usually model with the absolute tolerance set to 0.0001" or 0.001 mm, which is ludicrous, but it does ensure fillets are precise enough to clean up easily. The worst-case scenario I ever had was some weird workflow stuff that resulted in geometry scattered literally a MILE across the file, modeled at 0.0001" tolerance. No issues related to that (well, at the time it was big enough to show display artifacts), and that was a very long time ago. 64-bit numbers have 15 or 16 significant digits. To be on the safe side, set your units and tolerances so that you’re using maybe 7 or 8 max.
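To make the 15-16 digit budget concrete, here is a minimal Python check (math.ulp needs Python 3.9+) of the spacing between adjacent doubles at growing coordinate magnitudes; that spacing is the “room” that sensible units and tolerances preserve:

```python
import math

# Absolute spacing between a double and the next representable double
for magnitude in (1.0, 1_000.0, 1_000_000.0, 1e12):
    print(f"{magnitude:>14g}  ulp = {math.ulp(magnitude):.3e}")

# Output:
#              1  ulp = 2.220e-16
#           1000  ulp = 1.137e-13
#          1e+06  ulp = 1.164e-10
#          1e+12  ulp = 1.221e-04  <- a 0.0001 tolerance stops being meaningful here
```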


Thanks Jim for the reply, and for jumping into the discussion. I'm just trying to help development by giving user feedback (be patient, my English is weak).

To be on the safe side set your units and tolerances so that you’re using maybe 7 or 8 max.

That is the problem I want to point out.

which is ludicrous, but it does ensure fillets

And by fillets, read also a lot of other Rhino functions.

Sometimes we need to tighten Rhino's tolerances for Rhino's own reasons, even when the project does not need it (for example, a car designer who will make a 1:5 clay model, a game developer, an RC model, etc.). What I usually do is keep a second Rhino open and run some operations at a different scale or tolerance, because those objects are way too small or too big for the project. One example is filleting, trimming and shrinking a bolt inside a 32 m sailplane glider project: I'm making the airfoil profile, and the bolt is very important. Another example was my Ferrari shape (Rhino 3), handling the engine parts where each millimetre counts. Or making a 150-foot sailboat with a very precise blade airfoil, etc.
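For what it's worth, that "second Rhino" trick can be scripted inside one document. Below is a minimal sketch, assuming rhinoscriptsyntax; run_at_scale and the factor of 1000 are my own hypothetical choices, not a Rhino feature:

```python
import rhinoscriptsyntax as rs

def run_at_scale(object_ids, operation, factor=1000.0):
    """Scale objects up, run an operation, scale the results back down."""
    origin = [0, 0, 0]
    scaled = [rs.ScaleObject(obj, origin, [factor] * 3) for obj in object_ids]
    results = operation(scaled)
    return [rs.ScaleObject(obj, origin, [1.0 / factor] * 3) for obj in results]

# Example: boolean-union a tiny bolt and nut at 1000x scale, where the
# document's absolute tolerance suits the geometry better:
# joined = run_at_scale(rs.SelectedObjects(), rs.BooleanUnion)
```

It is still the same workaround, though; the thread is asking Rhino to do this internally.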

There is always a solution, and if it looks like there is a limit, then look for a new solution at 90° to the problem, in a new plane.

I will try to put down some examples and (in my ignorance of how Rhino is coded) some solution ideas. CPU power is not the limit: I can imagine Rhino 7 running on 64 cores that sit idle, with cloud services on offer, waiting for tasks. Or you can compute with better accuracy in the background. For example, 64-bit is not a hard limit; in some space games you can combine two doubles or two floats to increase accuracy or precision (but I do not think that is a good solution here).
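The "two doubles" trick mentioned above is a real technique: error-free transformations such as Knuth's two-sum capture the rounding error of an addition in a second double, which double-double arithmetic then carries along. A minimal sketch in plain Python:

```python
def two_sum(a, b):
    """Knuth's error-free transformation: a + b == s + e exactly."""
    s = a + b
    b_virtual = s - a
    a_virtual = s - b_virtual
    e = (a - a_virtual) + (b - b_virtual)
    return s, e

s, e = two_sum(1e16, 1.0)
print(s)   # 1e+16 -- the 1.0 is lost in the rounded sum...
print(e)   # 1.0   -- ...but recovered exactly in the error term
```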

Problem examples:

  • I have a ship and I need to convert it to mesh polygons; the settings and results are very different from those for the nuts in the same project.
  • To make an intersection or fillet an object, the object can't be too small.
  • Functions have a comfort zone. If I export IGES, I need to shrink the surfaces because Alias sometimes does not pick up the trims. The problem is that if shrinking gives me overlapping points, I carry that through the whole project pipeline without noticing, and in the end I need to redo all the trims and the model because of it. The same shows up on the forum: a lot of threads get fixed thanks to Peter and you, taking up novice users' time and experts' reply time.
  • I do not need a tight tolerance in my game, but I need to add a lot of precision in Rhino to make it work. This should be done automatically by Rhino, not by the user, who more often than not is inexpert in how that particular function works. And later, when I export, I have too many digits (including in the smaller LODs) in the transforms, which reduces my game frame rate and increases creation time and the file size to ship. Sometimes I end up opening the file and removing all the redundant digits manually (a post-export rounding sketch follows this list).
  • Scale up or down automatically for each individual function. You need to be an expert to set up a big object, and it is a completely different setting (one you can't save) and experience from meshing a very small object (the fillet of a nut).
  • Making a spaceship for game development in Rhino: I use Maya and Alias, and Rhino is perfect for modelling a beautiful spaceship, but it has this limitation when I go down to making the nuts or the hangar containers. I end up making them in separate projects.
  • If I tighten the tolerance, Rhino puts a lot of points inside intersections; big objects become complex and heavy for no reason. Usually in industry, the bigger the object, the less tolerance you need. Having to make that decision only when starting the project is a problem: some projects will be approved and others will not, and it is not possible to simply change the tolerance along the way.
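As mentioned in the export bullet above, the manual digit cleanup can at least be automated. A hedged sketch: round_obj_vertices is a hypothetical helper (not an existing tool) that rewrites the vertex lines of a Wavefront OBJ export with a fixed number of decimals:

```python
def round_obj_vertices(in_path, out_path, digits=4):
    """Rewrite an OBJ file, rounding every 'v' (vertex) line to `digits` decimals."""
    with open(in_path) as src, open(out_path, "w") as dst:
        for line in src:
            if line.startswith("v "):  # vertex position line: v x y z [w]
                fields = line.split()
                rounded = [f"{float(f):.{digits}f}" for f in fields[1:]]
                dst.write("v " + " ".join(rounded) + "\n")
            else:
                dst.write(line)

# round_obj_vertices("ship_lod2.obj", "ship_lod2_small.obj", digits=4)
```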

What I mean is that all that 64-bit space is not optimised. And millennials can lose time fighting with a trim.

Some possible solutions:

  • Use spare cores to do the work in the background at much greater precision, more or less how HD works.
  • Provide the user with a simplified visual representation on screen that always looks perfect no matter the size, so Rhino stays fast as a user interface and does the hard work in the background at a different internal scale and tolerance.
  • Increase the level of success: study how many times a function is called and fails in execution, and improve its success rate.
  • Quality control: avoid giving the user results that do not succeed perfectly in execution or expectation. One solution could be for Rhino to scale the object up or down to fit inside the function's comfort zone; another is a visual red indication when the object is out of tolerance (for example, mark overlapping points of the same curve or surface in red).
  • Consider using a finer tolerance internally, so that we can increase or decrease it non-destructively when exporting. Substance Painter does this for textures: it works internally at 8K, you work at 1K, and at the end you scale up if you need to.
  • Change the tolerance of individual objects: smaller objects need a tighter tolerance. This could be per group or layer, or a new window with a drag-and-drop system where you can set a specific execution tolerance (a sketch follows this list). But it is better if Rhino does this automatically for us, and only when we export does it ask how many digits we want to export with.
  • The Mesh command's problems can be worked around in Grasshopper, but the settings and results look different (the Mesh command is better). Consider improving the tolerance settings of the Mesh command interface for game developers and external rendering.
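On the per-object tolerance bullet: until something like that exists, the document-wide setting can be swapped around a single operation. A minimal sketch, assuming rhinoscriptsyntax; with_tolerance is my own hypothetical wrapper, not a Rhino feature:

```python
import rhinoscriptsyntax as rs

def with_tolerance(tolerance, operation, *args):
    """Run one operation under a temporary document absolute tolerance."""
    previous = rs.UnitAbsoluteTolerance()   # remember the current setting
    rs.UnitAbsoluteTolerance(tolerance)
    try:
        return operation(*args)
    finally:
        rs.UnitAbsoluteTolerance(previous)  # always restore it

# Example: intersect two small breps at a tighter tolerance than the project's:
# curves = with_tolerance(0.0001, rs.IntersectBreps, brep_a, brep_b)
```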

You might look into Worksession to combine models that have different tolerance requirements. I’d say building them separately is the way to go.

-Pascal


Yes, in Rhino 6, and that is nice for isolating the problem: it makes you concentrate on that particular object at that scale. But I still consider Worksession a workaround for the real problem. With that solution the learning curve increases, especially for beginners, who need the functions to just work well.

I remember switching to Rhino because Alias booleans were never fixed. Rhino's functions work fine, but only in a limited region of space.