# Zoom1To1Calibrate inhumane

why can't the bar represent a number that a normal human being can actually measure? just to put that into context: how is one supposed to measure 0.31 mm anywhere with a ruler? please set the bar to a reasonable length like 150 mm or similar.

most obviously the command can calculate and display the length automatically, so that should not be too hard…

I think you have it all backwards. The bar is in PIXELS because it is in screen space which has no idea what an inch or mm is. Your monitor has a given overall size and number of pixels, which then determines how big a single pixel is on your screen in actual, physical units. You then measure the bar with a ruler and you INPUT the actual measurement in the “length of bar” box. Rhino figures out the ratio between how many pixels the whole bar is and your physical measurement of it - which tells it the actual individual pixel size - and remembers it. Thereafter when you do a Zoom>1to1, things should appear actual size.
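The ratio computation described above can be sketched in a few lines. This is a minimal illustration of the idea, not Rhino's actual implementation; the function names and the 480 px bar width are assumptions for the example:

```python
def pixel_pitch_mm(bar_length_px: float, measured_mm: float) -> float:
    """Physical size of one screen pixel, in mm, derived from the
    user's ruler measurement of the on-screen calibration bar."""
    return measured_mm / bar_length_px

def px_for_1to1(model_length_mm: float, pitch_mm: float) -> float:
    """How many screen pixels a model-space length must span to
    appear at actual size on this particular monitor."""
    return model_length_mm / pitch_mm

# Example: a 480 px bar that measures 111.75 mm on the glass
pitch = pixel_pitch_mm(480, 111.75)

# A 150 mm line should then cover about 644 px on screen
print(round(px_for_1to1(150, pitch)))
```

Once the pitch is stored, every subsequent Zoom>1to1 is just a division like the second function, which is why the calibration only needs doing once per monitor.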

Edit - I measured the blue bar in pixels on two of my machines - by doing a screen cap, putting it into Photoshop, cropping the image to the edges of the bar and then looking at the canvas size - and the results are as follows:

Machine 1 - desktop - 718 pixels
Machine 2 - laptop - 1078 pixels

I was a bit puzzled by this, I thought they would be the same size in pixels, but then it dawned on me that there is also display scaling going on - while both are 4K screens, the desktop’s larger screen is at 150% and the laptop’s smaller screen is at 225%.

So,

Desktop: 718/1.5 = 478.67
Laptop: 1078/2.25 = 479.11

Therefore I think the bar is probably designed to be 480 pixels at 100% scaling.
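The arithmetic above can be checked directly: a screenshot reports physical device pixels, so dividing by the OS display-scaling factor recovers the logical (100%) size. A quick sketch of the two measurements (the 480 px design size is the poster's inference, not a documented value):

```python
# Screenshots capture physical device pixels; dividing by the OS
# display-scaling factor recovers the logical pixel size at 100%.
measurements = {
    "desktop (150% scaling)": (718, 1.50),
    "laptop  (225% scaling)": (1078, 2.25),
}

for name, (physical_px, scale) in measurements.items():
    logical_px = physical_px / scale
    print(f"{name}: {logical_px:.2f} logical px")

# Both land near 479-480, consistent with a 480 px design width.
```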

i think there is a misunderstanding going on.

the bar asks you to measure it, meaning you use an actual physical ruler to check the number precalculated by the command and adjust it to coincide with the dimension on the surface of the screen if needed. the units you can enter are all metric or imperial, not pixels.

the number usually appears to be correct because i believe there are certain standards regarding pixel densities relating to dpi (dots per actual inch), which results in the command being accurate. when i accept the value and measure the object in zoom 1to1, the object appears to be correct to the mm, more or less, making this manual calibration process actually redundant, in all my cases at least.

BUT
precisely measuring out the dimension of the value shown above, to be sure that it's accurate, is sheer impossibility. calibrating, if needed at all, only makes sense when it is also physically feasible, i would assume.

Yes.

No, that number really doesn’t mean anything except to indicate either the previously entered value, or some value that Rhino has cooked up using the actual pixel dimensions of the bar, the detected display size and scaling. That value is in physical units, and as the second possibility is a result of some kind of calculation, it’s likely to be fractional.

I don’t buy this argument, there are so many different monitors out there with different pixel sizes/densities, there is no “standard”. IMO it would be an error to assume one. So the manual calibration is not at all redundant.

I don’t understand this at all. Almost all screens in the world today are flat, and anyone can put a decent ruler on the screen and measure the bar to within 0.5 mm or less. Unless all you have at your disposal is one of these, in which case it might not be possible to be that accurate…

yes, that could still work, though it might not be so accurate, because there are no half-mm marks on any of the rulers i have lying around, which are quite a lot, from all sorts of countries, including japan (if that counts as an extra precise/sharp argument)

BUT measuring 0.31mm … not so much

It’s fairly easy to visually interpolate between 1mm lines (if they’re not too fat) and call it 0.5. Again this is purely a visual thing, not sure you really need to be all that accurate, but it’s possible.

As the calibration is stored on a per-scheme basis, a new default scheme will propose some precalculated value. In my case it’s 127.0 mm, which obviously does not correspond to the reality of my screen.

In the image, the left end of the ruler looks like it doesn’t line up with the bar, but that is because of camera perspective and the fact that the ruler is in front of the screen; it actually does line up correctly. It looks like 111 would be what I should enter, maybe 111.25.

So I put in 111 (no fraction) and OK, then I drew a 150 mm line and ran Zoom>1:1:

The result actually looks like 151mm.

I re-ran the calibration and set it to 111.5, and the result looks better, though still a bit too large. Calibrating with 111.75, it’s perfect.

I attribute the slight inaccuracy to possible rounding errors, and who knows what else.
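The roughly 1 mm discrepancy is consistent with a simple ratio error rather than rounding: if the bar's true on-screen length is 111.75 mm but 111 is entered, everything drawn at 1:1 is stretched by the ratio of the two. A quick check (pure arithmetic, not Rhino's internals):

```python
true_bar_mm = 111.75   # what the bar actually measures on the glass
entered_mm = 111.0     # what was typed into the calibration box

# Entering too small a value makes Rhino underestimate the pixel
# pitch, so anything drawn at 1:1 appears larger by the inverse ratio.
error_factor = true_bar_mm / entered_mm
apparent_mm = 150.0 * error_factor
print(f"{apparent_mm:.2f}")  # a 150 mm line appears about 151 mm
```

This matches the observation above: 150 × 111.75 / 111 ≈ 151.01 mm, which is exactly the ~151 mm measured after the first calibration attempt.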

Considering this is a once-in-a-monitor’s-lifetime thing it isn’t unreasonable to exercise a bit of unusual effort to make the measurement. I think that by using a machinist’s steel rule (for accuracy of the graduations) and a magnifying glass the length could be measured to within 0.2 mm or so. Of course this would require laying the monitor flat on its back so the ruler will stay put. Or perhaps even a vernier caliper would work.

Double-face tape…

Oh, sure. Or maybe cyanoacrylate glue!

well, as written, i changed quite a few screens in the past 2 years; i enjoyed 3 different screens not only at home but also at work.

i can measure in thirds of a mm by eye, maybe even quarters. even if i went to those extra lengths, trying to measure it with some tool, that still would not yield the accuracy needed to measure the bar accordingly. and by all means, even the 0.5 mm that @Helvetosaur proudly feels secure enough to master would be quite ridiculous, let alone the 0.01 mm department, and probably not necessary if the input widths shown were set to some normal numbers.

also, why have such a minuscule bar when a bigger bar, as wide as the screen, would yield a far lower tolerance?

by all means, this is just to get a feeling for how big things are at real size; no quarks are going to be hurt in the process. still, there is no need to torture poor guys like me into having sweats about not being able to measure that dumb bar accurately.