Posts by Micha

    Sean Farrell I'm not sure my personal assets come from the same source as the Enscape assets, but during my last project with V-Ray and models from Renderpeople I found that the people textures look better if I set the gamma of the texture to 0.6 instead of 0.45 (1/2.2). Could it be that the textures were created at the old Mac standard gamma of 1.8?


    Here's an example - on the left, standard gamma correction for 2.2 (blacks are very dark and the skin tone is too dark/saturated), and on the right, correction for 1.8 (softer colors).


    I ran into this issue in the past too. My impression was that the gamma correction of the asset textures is too strong, so colors and contrast come out quite hard. But maybe it's a question of user taste too. I would prefer a fresher, lighter look.
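
    To put the numbers above in context: 1/2.2 ≈ 0.45 and 1/1.8 ≈ 0.56, so a setting of 0.6 is close to the old Mac gamma. A minimal Python sketch, assuming a renderer whose texture gamma field g is applied as linear = encoded^(1/g) (the exact convention may differ per renderer):

        # Minimal sketch: linearize a texel value, assuming the renderer applies
        # linear = encoded ** (1 / g) for a texture gamma field g (hypothetical).
        def linearize(encoded, gamma_field):
            return encoded ** (1.0 / gamma_field)

        mid_gray = 0.5
        print(linearize(mid_gray, 0.45))  # ~0.214, fits a texture authored at gamma 2.2
        print(linearize(mid_gray, 0.6))   # ~0.315, close to a gamma 1.8 source (1/1.8 ~ 0.56)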

    ...to wait say 5 or 10 minutes (instead of seconds) to get ultra high quality still images.

    Buy two 2080 Tis and use V-Ray instead of Maxwell. ;) I hope Enscape in a premium quality mode will only need 50 s instead of 5 s. ;)

    Beautiful!!! I'm impressed. Looks like a photo. Is it all 3D?


    Are the caustics from the water? I'm surprised that there are no visible waves - the reflection looks so clean, but the caustics look like they come from water waves. (Most visible in the third image.)


    On the left side there is a large calculation error on the water. Maybe you could render a wider view and crop the error away in post.

    Very nice mood. It makes me hungry. :)

    Is the HDRI self-made, or is it available online? It's a perfect match for the situation.


    The Pepsi looks unexpectedly realistic for a VR rendering. If you find the time, I would be curious to see a screenshot - how did you set up the cola, the bubbles, and the glass? Is this model available online somewhere?

    This kind of reflection quality made me stop using Enscape, and now I'm waiting for improvements before I try it again. My final clients are not interested in the technical reason why a still image doesn't look good. Can you imagine a design presentation or design competition where people talk about the pixel errors instead of the design, or where there are notes on the image explaining why something looks wrong? If an image can't be used, an explanation doesn't help. I would be glad if the real-time argument could be set aside for still images. Good luck. ;)

    ... unacceptably high "realtime" losses again

    I'm not sure I understand you. Do you mean the workaround will cause a slowdown in the calculation? I don't know whether render times increase, but as far as I've seen, only still renderings are needed here, so this should be no problem.

    is there a possibility of reducing this problem of reflection with specific setting ...

    In your case you could work around it by mirroring the scene geometry behind the mirror plane and removing the mirror itself. That way "real" objects are seen instead of reflections. ;)
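
    A minimal sketch of the geometry behind the workaround (not an Enscape feature - the mirror plane, given by a point p0 and normal n, is hypothetical):

        import numpy as np

        # Reflect a vertex across the mirror plane so the "scene behind the
        # mirror" can be built as real geometry (plane point p0, normal n).
        def reflect_point(p, p0, n):
            n = n / np.linalg.norm(n)
            d = np.dot(p - p0, n)      # signed distance from the plane
            return p - 2.0 * d * n     # mirrored position on the other side

        # Example: mirror in the plane x = 0; a vertex at (1, 2, 0) -> (-1, 2, 0)
        print(reflect_point(np.array([1.0, 2.0, 0.0]),
                            np.array([0.0, 0.0, 0.0]),
                            np.array([1.0, 0.0, 0.0])))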

    ... looks like the environment is switched off in the orthographic mode (?), this means we do not get the sky reflections in the glass. Is it possible to maintain the environment settings so the orthographic has the same quality as the perspective?

    Maybe everything is working correctly. Orthographic means you are looking from an infinite distance with a nearly 0° view angle. That can mean you see only a very small portion of the background, so it looks like a constant color. The same happens for the reflections - every surface facing the same direction shows the same small part of the environment. If you look at your ortho shot, you can see that some windows have a white reflection on them, while windows facing another direction show a dark part of the environment, so it looks like there is no reflection. I'm afraid what you ask for isn't possible. Or am I wrong? I'm curious. ;)
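
    A minimal sketch of that argument, with a hypothetical flat facade: in an orthographic view every ray has the same direction, so the reflected direction is constant across the image, while perspective rays fan out and sample different parts of the environment.

        import numpy as np

        # Reflection of a view direction d about a surface normal n.
        def reflect(d, n):
            return d - 2.0 * np.dot(d, n) * n

        n = np.array([0.0, 0.0, 1.0])           # facade normal (hypothetical)
        ortho_ray = np.array([0.0, 0.0, -1.0])  # one shared ray direction in ortho
        print(reflect(ortho_ray, n))            # -> [0. 0. 1.], the same everywhere

        # Perspective rays vary across the image, so the reflection varies too:
        for x in (-0.3, 0.0, 0.3):
            d = np.array([x, 0.0, -1.0])
            d = d / np.linalg.norm(d)
            print(reflect(d, n))                # a different env sample per pixel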

    Interesting - it looks like heavy usage of RTX ray tracing. For example, the shadows look ray traced rather than done via shadow maps.


    In the YouTube videos there is an interior rendered at 4K in 86 s (on a 2060). This sounds like what I've been missing in Enscape - a high-quality output based on RTX, optimized not only for real time but for final stills. The big question for me would be: how much of a scene can be seen in mirrors? Are objects skipped like in Enscape?

    Could it be a problem of your viewing device? The eye distance of the virtual camera influences the stereo effect only. A large eye distance of the virtual camera causes a stronger depth effect. Funny side effect: if you chose a very wide eye distance, the stereo image would show a doll-house effect. But it should not cause an overlay problem.
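
    A minimal sketch of that relation, assuming the simple stereo disparity formula d = f * b / z (focal length f, camera eye distance b, depth z - the numbers are hypothetical):

        # Stereo disparity d = f * b / z: scaling the camera eye distance b
        # scales the disparity, which the brain reads as a smaller scene.
        def disparity(f, b, z):
            return f * b / z

        f, z = 0.05, 10.0               # hypothetical focal length and depth (m)
        print(disparity(f, 0.065, z))   # human-like 6.5 cm eye distance
        print(disparity(f, 0.65, z))    # 10x eye distance -> 10x disparity:
                                        # the "doll-house" miniaturization effect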

    In the Clara scene it looks like the ground isn't really flat - the street is higher on the curved side than on the straight one. Another problem could be that the pano isn't 100% leveled.


    If you want to know whether Enscape is using a wrong horizon or not, you could do a test. Attached is a "pano" with a horizontal line at 50% image height. I checked it here and it looks perfect - a large flat disk at height 0 matches the environment horizon line. But I found that a self-shot pano is higher in one direction than in the other. I thought it was aligned, but it is not. Maybe it helps if you fine-adjust the horizon line of your pano. But it could be difficult - for example, I use Affinity, and there I can adjust the horizon in 1° steps only, not finer.
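
    Such a test pano is easy to generate - a minimal sketch using Pillow (an assumption; any image tool works), producing a 2:1 equirectangular image with a black line at exactly 50% height:

        from PIL import Image

        # 2:1 equirectangular test pano with the horizon line at 50% height.
        w, h = 2048, 1024
        img = Image.new("RGB", (w, h), (255, 255, 255))
        for x in range(w):
            img.putpixel((x, h // 2), (0, 0, 0))
        img.save("horizon_test_pano.png")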

    I'm testing the use of self-made panoramas too, and I'm quite happy with the results. Could you post some images of your issues?

    I'm afraid you can't expect too much from manipulating the view in the render software. As far as I understand, not much is possible. When shooting the pano, you decide how the view will look. For example, if you shoot a pano in a park 70 cm above the ground, you can't expect to render an interior, look out of the window, and see the park as if from 3 m above the ground.


    Adjusting the horizon height is critical - the horizons of the model and the image need to match. Attached is an example where I mapped the pano to a sphere and moved the sphere up. It looks wrong.

    The second screenshot shows my virtual camera at approximately the same height as during shooting of the pano - approx. 4 m above the ground. It works.
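
    A minimal sketch of why the heights must match (distances and heights are hypothetical): a ground point at distance d appears below the horizon at the angle atan(h/d), so a pano shot at 4 m and a camera at eye level place the ground at different angles.

        import math

        # Depression angle of a ground point below the horizon: atan(h / d).
        def depression_deg(height_m, distance_m):
            return math.degrees(math.atan2(height_m, distance_m))

        d = 10.0                         # distance to a ground point (m)
        print(depression_deg(4.0, d))    # pano shot ~4 m up: ~21.8 deg
        print(depression_deg(1.6, d))    # camera at eye level: ~9.1 deg -> mismatch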