Powers-of-Two and Image Atlasing

  • Hello all,

    We are experiencing extremely long Enscape load times during the texture loading phase. This isn't a huge surprise, as there is no formal texture pipeline here and the images are everything under the sun. I am working on guidelines for using textures in SketchUp and Revit materials to improve Enscape performance; these would mostly apply to graphics imported for signage in the final rendering...

    It's not uncommon for our renderings to have several thousand textures, so that is also undoubtedly a factor. Moving these files from the network to a local SSD helps a little, but isn't really practical...

    I am curious to know any performance tips anyone would like to offer, but specifically I would like to know what benefit I can expect to see from:

    • Implementing powers-of-two image sizing
    • Rudimentary image atlasing, to the extent that SketchUp and Revit support it
    • Enforcing a maximum image size (2k, 4k, etc.)

    I also need a decent strategy for very large real-world graphics: e.g. a 40' banner whose artwork should be just a small file becomes quite large when sized to real-world dimensions...

    • Can I make these graphics some factor of the real-world size and expect them to render perfectly sharp?
    • What's that look like in practice?


  • Hello Setsudo,

    Currently, loading textures involves three time-consuming processes:

    1. Find the best distribution of the textures into a limited number of arrays (texture size and pixel format must be equal for all textures in an array)

    2. Scale/convert textures to assign to selected array

    3. Upload to GPU

    To accelerate the first step, you might want to group textures by size and format (number of channels). This way the best distribution can be found faster. This is one of the cheapest steps, though, so you will probably not win much time here.
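    As a rough illustration of that grouping (not Enscape's actual code; the record format here is hypothetical), here is a Python sketch that buckets textures by resolution and channel count, so you can see how many distinct size/format groups a material library produces:

```python
from collections import defaultdict

def bucket_textures(textures):
    """Group (name, width, height, channels) records by (width, height, channels).

    A texture that ends up alone in its bucket is a candidate for resizing or
    converting so it can share a texture array with others.
    """
    buckets = defaultdict(list)
    for name, width, height, channels in textures:
        buckets[(width, height, channels)].append(name)
    return dict(buckets)

# Hypothetical material list: name, width, height, channel count.
materials = [
    ("brick",    1024, 1024, 3),
    ("concrete", 1024, 1024, 3),
    ("logo",     1000, 1000, 4),  # odd size plus an alpha channel
]
groups = bucket_textures(materials)
print(len(groups))  # -> 2 distinct size/format groups
```

    The fewer distinct buckets, the less work the distribution step has to do.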

    For the second step, the best case is to have textures at power-of-two resolutions and grouped by size/format. Resizing and converting will then be skipped for textures that already match the chosen storage arrays perfectly. The maximum texture size is currently limited to 2048x2048 px. This is where you can win the most.
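    For example, a minimal helper (an illustration, not anything Enscape ships) that rounds a pixel dimension up to the next power of two and applies the current 2048 px cap:

```python
def pot_dimension(px, cap=2048):
    """Round a pixel dimension up to the nearest power of two, capped at `cap`."""
    pot = 1
    while pot < px:
        pot *= 2
    return min(pot, cap)

print(pot_dimension(1000))  # -> 1024
print(pot_dimension(1024))  # -> 1024 (already a power of two)
print(pot_dimension(3000))  # -> 2048 (hits the current cap)
```

    Sizing source images to these dimensions ahead of time means the scale/convert step can be skipped entirely.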

    For the third step you cannot do much except reduce texture sizes. If you have to upload 1000 textures, you will have to wait either way.

    This information is valid for version 2.5. In future releases we may introduce a cleverer way to choose texture sizes, to lose less quality and use less memory. If support for certain OpenGL features improves, we may get rid of grouping textures into arrays completely. So again, the information above may not hold for future releases.

    To sum up:

    Implementing powers-of-two image sizing

    Will help, especially if lots of textures have the same size and format.

    Rudimentary image atlasing to the extent that SketchUp and Revit would support it.

    Will help, but be aware that max texture size is 2048x2048 currently.

    Implementing a maximum image size 2k or 4k etc

    Yes; 2k is already the maximum, so a 4k cap would have no effect.

    Can I make these graphics some factor of the real-world size and expect them to render perfectly sharp?

    No, but this may change in future releases.

    What's that look like in practice?

    In future releases (not 2.6), Enscape will probably choose texture sizes taking the real-world size into account.
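    To put rough numbers on the banner example from the question (assuming the current 2048x2048 cap; this is just back-of-the-envelope arithmetic):

```python
def effective_ppi(world_inches, texture_px=2048):
    """Pixels of texture per real-world inch, once the texture is capped."""
    return texture_px / world_inches

banner_inches = 40 * 12  # the 40' banner from the question -> 480 inches
print(round(effective_ppi(banner_inches), 2))  # -> 4.27 px per inch
```

    So under the current cap the banner tops out around 4 px per inch regardless of the source file; authoring it larger than 2048 px on the long edge only adds load time, not sharpness.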

    I hope that helps.

  • Just as a side note: I would recommend only using PNG files when you need the alpha (transparency) channel; from my experience they seem to have a heavy impact on load times. I haven't tested it yet, but I suspect that a flat JPG plus a separate JPG mask as two files might have less of an impact.
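    A sketch of that split using Pillow (the library choice, compositing over white, and the file names are my own assumptions; I haven't measured the load-time difference either):

```python
from PIL import Image

def split_alpha(rgba_image):
    """Split an RGBA image into a flat RGB image plus a grayscale alpha mask."""
    rgba = rgba_image.convert("RGBA")
    alpha = rgba.getchannel("A")
    flat = Image.new("RGB", rgba.size, (255, 255, 255))  # composite over white
    flat.paste(rgba, mask=alpha)
    return flat, alpha

# Usage (assuming a hypothetical source file "sign.png"):
# flat, mask = split_alpha(Image.open("sign.png"))
# flat.save("sign_flat.jpg", quality=90)
# mask.save("sign_mask.jpg", quality=90)
```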