Texture size limit of 2GB: a bug or feature?

We used to apply 16384x16384 px textures with a file size of about 768 MB (.TIFF), and it worked OK.
Now we are eager to use larger textures, up to 32768x32768 px with a file size of 3 GB, but all attempts have failed. Just a long wait and… nothing. The texture is not applied to the surface, and no error is reported.

We did some investigating and found that there is a 2 GB limit on the texture file for it to work properly.
Moreover, we tried a 32768x32768 px texture with indexed colours (256) and a file size of 1 GB, and it works fine even with the Quadro K4000 3 GB, which reports a Max Viewport Size of 16384x16384.

Sysconf: Rhino 5 SR12 64-bit on Windows 7 SP1 Pro 64-bit; 32 GB RAM; Intel Core i7-3930; GPU: NV Quadro K4000 3 GB or NV GTX 1050 Ti 6 GB, driver 375.63

Hi Vic - I guess it is neither… it falls under ‘Limitation’…

-Pascal

Well, what kind of “Limitation”?
Is there any description of such limitations in the documentation? I have not found one yet.

I just checked in with the developer.
Textures are stored on the GPU… and the GPU and driver are what limit their physical size.
Rhino does not determine it in any way.
Rhino’s OpenGL settings page will show your limits.
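
If you want to double-check the same limit outside of Rhino, a small probe script like the one below will ask the driver directly. This is just one way to do it, using the third-party glfw and PyOpenGL Python packages; it has nothing to do with Rhino itself.

```python
# Query the driver's maximum texture dimension outside of Rhino.
# Requires the third-party packages: pip install glfw PyOpenGL
import glfw
from OpenGL.GL import glGetIntegerv, GL_MAX_TEXTURE_SIZE

def max_texture_size():
    if not glfw.init():
        raise RuntimeError("could not initialize GLFW")
    try:
        # Create a hidden 1x1 window just to get a current OpenGL context.
        glfw.window_hint(glfw.VISIBLE, glfw.FALSE)
        window = glfw.create_window(1, 1, "probe", None, None)
        if not window:
            raise RuntimeError("could not create an OpenGL context")
        glfw.make_context_current(window)
        return int(glGetIntegerv(GL_MAX_TEXTURE_SIZE))
    finally:
        glfw.terminate()

if __name__ == "__main__":
    print("GL_MAX_TEXTURE_SIZE:", max_texture_size())  # e.g. 16384 on a Quadro K4000
```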

Thank you John for your reply.
Just to clarify:

  1. When I applied the texture using the NV GTX 1050 Ti with 6 GB RAM, Rhino’s OpenGL page reported a 32768x32768 texture size, as well as viewport size. No success.
  2. With the default NV Quadro K4000 3 GB RAM, Rhino’s OpenGL page reported 16384x16384 for both values, but I DID apply the 32768x32768 texture with reduced color depth (indexed colors in Photoshop, down to 256) and an uncompressed TIFF file size of 1 GB. Was it downscaled or applied as is?

Do you (or the developer) have any advice or a workaround for the issue?
P.S. So, is the OpenGL driver the only thing that defines the limit?

Hi Vic,

Since I’m not really sure what you’re doing or wanting to do, other than “use really large textures”, I’m just going to explain what Rhino does when it loads textures…

But first, I’d like to point out that “size on disk” does not equate to “size in memory” for file formats that use any form of compression. Also, texture dimension limits do not really equate to physical memory limits (i.e. loading files that fall within the hardware limits can still fail simply because they take up too much of the remaining memory). Lastly, Rhino does not work with paletted images; it will convert any image data of less than 24 bits into 24-bit data. In other words, Rhino only uses 24-bit (RGB) and 32-bit (RGBA) image data.
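
To put rough numbers on that, here is a back-of-the-envelope sketch (plain Python, nothing Rhino-specific; the 32768 case is just your example run through the same arithmetic):

```python
# Uncompressed in-memory footprint of an image, which is independent of its size on disk.
def memory_bytes(width, height, channels=3):
    """channels=3 for 24-bit RGB, channels=4 for 32-bit RGBA."""
    return width * height * channels

# A heavily compressed RGB JPG still expands to width*height*3 bytes in memory.
print(memory_bytes(20, 20))                # 1200 bytes, no matter how small the file is

# A 32768x32768 texture, once expanded to 24-bit RGB:
print(memory_bytes(32768, 32768) / 2**30)  # 3.0 GiB
# As 8-bit indexed color the same pixels are only 1 GiB uncompressed on disk,
# but since Rhino converts everything to 24-bit, the in-memory cost is still 3 GiB.
```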

So…

  1. Rhino loads the image information from the file…

If the load fails (for any number of reasons, which can be memory limits, or limitations imposed by the library Rhino uses to process image files), then usage of that texture fails, and Rhino does nothing.

  2. If the load succeeds, the image dimensions are compared against the hardware limits, and if they fall within those limits, the image data is pushed up onto the GPU where it sits, waiting to be used.

  3. If the dimensions fall outside the hardware limits, Rhino resizes the image data to match the hardware limits (paying attention to aspect ratios), using a simple halftone downsample method… and then the resized data is pushed up onto the GPU where it sits, waiting to be used.
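
A rough sketch of those three steps, just to make the flow concrete (illustrative Python using Pillow; the function and its defaults are mine, not Rhino’s actual code or API):

```python
# Illustrative only: mirrors the three steps above, not Rhino's real implementation.
from PIL import Image

def prepare_texture(path, max_gpu_size=16384):
    # Step 1: load the image; very large files can fail right here.
    try:
        image = Image.open(path).convert("RGB")   # like Rhino, expand everything to 24-bit data
    except (OSError, MemoryError):
        return None                               # load failed -> the texture silently does nothing

    # Step 2: within the hardware limit -> the data is used as-is.
    width, height = image.size
    if max(width, height) <= max_gpu_size:
        return image

    # Step 3: too large -> downsample (keeping the aspect ratio), then use the resized data.
    scale = max_gpu_size / max(width, height)
    return image.resize((round(width * scale), round(height * scale)))
```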

With that said, if you have a situation where a texture isn’t working, then most likely it’s because step #1 above failed… And if I had to guess, it would be because of the limitations of the 3rd-party imaging library Rhino uses… but I would need to run some tests to really figure it out.

If you have an example that fails for you every time, please post it here and I’ll be glad to take a look at it and try to provide more insight on what’s actually happening.

That said, I am curious why you’re using such large textures, and what you think you’re getting from the larger resolutions. I understand that higher-resolution images give you much better fidelity in the display; however, after a certain point that’s no longer the case, and increasing texture resolution will result in no change in the final display output (with the exception of large zoom factors).

Thanks,
-Jeff

Hi Jeff.
Thank you for the detailed explanation of “things under the hood”.
As for the example, I’ll post it here on Monday, when I’m in the office. I hope you’ll get to test the imaging lib too.

P.S. We have to use large textures to get fine printing quality on large inflatable shapes: we make full-color patterns for special printers to get printed pieces of fabric that are sewn into the shape. For example, we have to make an inflatable globe 60 feet in diameter with full-surface printing of an “earth view from space”. If you need more details, you can check the aerodinamika.ru website to get an impression of such things.

Hi Vic,

We get around the large texture size limit issues by using tiled UV maps.
So instead of having one monster texture mapped to the whole object, we break it up into a series of UV groups with 8K textures. Everything runs a heck of a lot faster too.

We do this outside of Rhino, in our case with 3DCoat.

Steve


Hi Steve,
That’s an obvious solution; we will have to do something similar.
But our designer insists on applying a full-size texture and then Squishing it.

Hi Jeff,
The example we were talking about is on Google Drive due to its large size: a 3dm file and the corresponding TIF texture.
An object
A texture
The files have to be in the same folder to load properly.

Hi Vic,

It turns out this is a problem with the Windows SDK functions Rhino uses… The 3rd-party library Rhino is using loads your file just fine… but Rhino then uses some Windows routines to store, manage, and pass the image around internally… and the documentation for those says they are limited to 2GB in memory size, and no amount of research I’ve done so far has come up with any alternatives.

Unfortunately, this looks (to me) like it will require some pretty big architectural changes to Rhino’s image handling code… I’m pretty sure OpenGL can handle your file; the problem is that the contents of the file aren’t making it through the processing portions of the code, so by the time the display goes to use the image, it’s already too late. This is pretty much what I thought was happening, only I assumed it was the third-party library, not the Windows API.
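
For what it’s worth, the numbers line up with that 2GB figure if you assume the limit is a signed 32-bit byte count (that part is my assumption; the documentation just says 2GB):

```python
# Why a 32768x32768 texture trips a 2GB in-memory limit once expanded to 24-bit RGB.
width = height = 32768
rgb_bytes = width * height * 3       # 3,221,225,472 bytes = 3 GiB
limit = 2**31 - 1                    # largest byte count a signed 32-bit size can hold (~2 GiB)
print(rgb_bytes > limit)             # True: the expanded image no longer fits

# The 16384x16384 textures that have always worked are a quarter of that:
print(16384 * 16384 * 3 / 2**20)     # 768.0 MiB, comfortably under the limit
```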

I’m going to have to pass this along to John Morse (@JohnM), who heads up that part of the code. So let’s hope he has better news on this issue.

Thanks,
-Jeff

Hi Vic,

I’m looking into the way Rhino handles the reading of large bitmaps and will let you know what I find out.

Thanks John,
Looking forward to the results.

Hi Jeff,
Thank you for your investigation; even bad news gives us important information.
Just two notes:

  1. I suspected the Windows API too, because of the image preview tiles in Windows Explorer: Windows does not generate a preview tile if the image file is bigger than 2 GB.
  2. I bet you will have to make big architectural changes to Rhino’s image handling sooner or later, as more and more users take on large-scale projects.

Thanks again

Hi Vic,

You can almost certainly compress your image with very little loss of fidelity.
A high-quality JPEG compression will probably be barely distinguishable from the TIFF original at that scale.
Talk to your digital print people and ask what you can get away with. Unless the audience is right up against the globe, you’ll be able to get away with much smaller file sizes.

On that topic, I also doubt that your digital print providers will be terribly happy about having to RIP a 2.3GB file :wink:

Have a look at this page (these are the excellent printers we use) and in particular the sections on “Viewing distance and resolution” and “A note on Jpegs”: https://www.vivad.com.au/artwork-requirements

Cheers, Steve

Unfortunately, compression on disk has no bearing on images in memory. When compressed images are read into memory, they are uncompressed, and each RGB or RGBA pixel requires 3 or 4 bytes respectively. That’s why I mentioned earlier that “size on disk does not equate to size in memory”… a 10-byte, compressed 20x20 RGB JPG on disk will still require 1200 bytes of memory (20x20x3)… every time.

I also still don’t have a good understanding of why “large-scale projects” mean you have to use larger-scale images as textures… so I’m still interested in hearing about the different workflows.

Hi Jeff,

Hope you are well.

Yes, I understand that a 16K-pixel image has the same memory footprint when loaded into VRAM, regardless of image compression. However, it DOES make a big difference to processing time further downstream at the RIP if you compress the file.

I remember having this conversation with you years ago, and I don’t think I ever convinced you :wink:
OK. First, to answer why large-scale projects need whopping great textures.
Let’s imagine we have decided that, for a particular project, the audience viewing distance means we can get away with a final print resolution of, say, 30dpi.

For a 1 metre square object we only need about 1400 pixels square, so a 2K image is more than sufficient.
If we now bump that up to 5 metres square (and that’s small for the sort of work we do), we need about 7,000 pixels square, so we are already up to an 8K image. At 10 metres square we are at a 16K image.
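
Here is the back-of-envelope sum I’m doing, as a tiny Python sketch (the 1400 pixels-per-metre value is just the rounded figure from above; put your own viewing-distance number in there):

```python
# Rough texture sizing for printed objects; adjust PX_PER_METRE to the print
# resolution your viewing distance allows (this is a rounded rule of thumb).
PX_PER_METRE = 1400   # the rounded figure used above

def pixels_needed(size_m):
    """Pixels along one side to cover a face size_m across at PX_PER_METRE."""
    return round(size_m * PX_PER_METRE)

print(pixels_needed(1))    # 1400  -> a 2K texture is more than enough
print(pixels_needed(5))    # 7000  -> already an 8K texture
print(pixels_needed(10))   # 14000 -> a 16K texture
```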

Bigger objects need even bigger textures, which is beyond most video cards, so we break our objects up into multiple materials with tiled UV sets and process them separately. We’ve found that a bunch of 8K textures seems to be the sweet spot. Everything runs much faster, especially if we are doing any complex processing like baking.

Hope that helps, Steve

Hi Jeff,
At least it can be explained from a philosophical point of view: we have more computational power, and it keeps getting cheaper, so why not use it?
Right now we have to take several small tiles and scrupulously align them with each other by hand on a complex surface, say a giant leopard, and we have to get all the dots on its fur lined up without seams.

Hi Steve,
The file size does not really matter, because we have fast and cheap internet access here in Moscow, and we compress the TIF with LZH before transferring it to the print providers, so it is rarely more than 1 GB per file.
Moreover, they are really happy to have customers like us, with a large amount of printing.

Logged as RH-37095.