I’ve done my share of research on the concept of tolerances, and I think I understand them.
I’m working on architectural models intended for construction, so our precision needs are in the range of 1–5 millimeters. My question is which approach is better: using, let’s say, millimeters as the unit with a tolerance of 1, or using meters (our firm’s convention) with a tolerance of 0.001.
The result should be the same (I think), but the difference in floating-point representation makes me suspect there might be a processing difference between the two. Am I wrong?
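To make the intuition concrete, here is a minimal Python sketch of what I mean (the values are made up for illustration, not taken from our actual models): a tolerance of 1 is exactly representable as a binary double, while 0.001 is not, yet the relative precision of a double is the same in both conventions.

```python
# Hypothetical tolerance values for the two unit conventions.
tol_mm = 1.0      # millimetre units, tolerance of 1
tol_m = 0.001     # metre units, tolerance of 0.001

# 1.0 is exactly representable as a binary double; 0.001 is not.
print(f"{tol_mm:.20f}")  # 1.00000000000000000000
print(f"{tol_m:.20f}")   # 0.00100000000000000002

# The relative precision of a double (~16 significant digits) is the
# same either way, so a 10 m wall at millimetre precision is far inside
# what either representation can resolve.
wall_mm = 10_000.0   # a 10 m wall expressed in millimetres
wall_m = 10.0        # the same wall expressed in metres
print(wall_mm + tol_mm)  # 10001.0
print(wall_m + tol_m)    # 10.001
```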