Posts by Micha

    Pete Chamberlain I sometimes wish someone from the team would think about features more as an advocate of the users; in this case, they would come to the conclusion that the currently implemented way isn't intuitive or useful and a solution is needed. I don't know why more users don't ask the same question again and again. Maybe other users think: the problem has already been reported, so why should I beg again?


    You have some professional early adopters who tell you that there is a serious problem and a fix is needed. If you ignore them, you waste a valuable resource for years. You also waste the time of other users who run into the same situation, get frustrated, look for a solution, write posts ... and get nothing. Believe me, the begging system of upvotes is a pain on the users' side and doesn't lead to well-adjusted software.


    The problem is that it is impossible to keep the head at standard height if you navigate with the mouse.

    That's still the case: the distance from a glass pixel to the respective background steers the diffusion intensity.
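    As a minimal sketch of that relationship (all names, the linear falloff, and the clamp are my own assumptions for illustration; Enscape's actual implementation is not public), the blur radius could simply grow with the glass-to-background distance:

```python
def frost_blur_radius(glass_depth, background_depth, strength=1.0, max_radius=8.0):
    """Blur radius grows with the distance between the glass surface and
    whatever lies behind it, then is clamped to a maximum.
    The linear falloff and parameter names are illustrative assumptions."""
    distance = max(background_depth - glass_depth, 0.0)
    return min(strength * distance, max_radius)

# An object 0.4 m behind the glass gets far less blur than one 5 m behind it.
near = frost_blur_radius(1.0, 1.4)
far = frost_blur_radius(1.0, 6.0)
```

    A rule like this would explain why an object standing only 40 cm behind the glass receives little blur, and why placing a distant object behind the pane once appeared to help.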

    In the past I observed that an additional object placed somewhere far in the background of the scene helped to get more blur. But I tested it again now, and this behavior isn't there anymore.


    Here is an example: this is the maximum possible frost effect. The man is approx. 40 cm behind the glass. More blur would be really useful.


    It's a well-known issue that has been ignored for a long time. It makes VR tours on the PC screen very uncomfortable. I suppose we will need to wait longer for better PC screen navigation, but I'm sure you will get an upvote accordingly. ;)


    Thank you for asking for it again; maybe one day it will be solved. It sounds like a must-have feature.


    2019:

    Camera movement too slow


    2018:

    Enscape - experiences of the last weeks (bugs and missed features)


    2017:

    Will the Oculus Remote work with Enscape?

    Macker In the past I found that the blur effect depends on the scene depth; maybe it helps to place an object far away in the background of the scene. But I'm not sure, since it has been some months since I last tried it. Also, adding a noise bump map can help to increase the blur, which is sometimes not strong enough. But this workaround can introduce some noise.

    So it's not a simple equation of taking 2x the geometry and getting 2x the rendering time. Memory is even more critical: if the resources don't fit into your memory, there's not much you can do.

    Theoretically this may be true; in practice, the user can remove the mirror glass of a small or medium bathroom scene and mirror the scene itself as a crude workaround. Everything is rendered perfectly and fast, without any technical problem. Unfortunately this workaround doesn't work for every scene, because of the mirror-room relations. Also, my impression was that Enscape can render really complex and heavy scenes, but fails to render a mirror in a small, simple bathroom.

    The problem was, and I think still is, that even in simple, lightweight scenes not all geometry is shown in mirrors, even if the user owns a card with 11 GB or more. So I save two seconds and my memory stays empty, but my client isn't happy. That's the reason why I mostly found no use for Enscape; but I see great potential in this engine, and so I'm waiting for the day the quality reaches 100%. My clients don't accept less, and they pay me per hour; seconds don't matter. ;)

    As you said, an option like this would allow the user to see and adjust the resulting web viewer look before uploading. It sounds essential for optimizing a scene for web use.

    Something like a "Render quality multiplier" would let the user decide how much "optimization" he wants when rendering.

    I like the idea of a render quality multiplier. For example, I have seen noisy, blurry metals too often in the showcase images, and I ask myself how some users can sell images like this. Pushing the quality above the VR/real-time quality is essential for still images and animations.

    I did some tests with the help of a friend who owns an RTX 2070, and what I realized is a big advantage for those who have RTX technology: compared to those who do not, the mirror reflections are a clear evolution, and the contact shadows are much more realistic. The same version and configuration were used for both renders.

    Nice comparison, thank you; also to tas_1985 . It could be interesting to fix the banding in the mirror reflection of the ceiling. Thomas Willberger

    From my clients' needs I can only say that parallel views in high quality are needed from every direction: side, front, left, right, and isometric views. The shown views with a strong telephoto lens are not usable. The walls need to go straight upward, and the mini map looks dull; the full render quality is missing.

    Yes, but not in the near future. We would like to improve the experience for everyone, like Sergio Fernando , and use RTX to get a speedup. But so far there is only a small visual difference, yet a good performance speedup. So far we also implement the same feature set without RTX, to ensure broad compatibility.

    But please don't forget your pro users; if they can't deliver, you will lose them. Buying the best hardware for the best quality is no problem for professional use. Speed was never a problem for me in Enscape, but quality very often was.

    The mirror reflections of the first image would not be acceptable for my clients. Still images need to be nearly perfect.

    What you get is primarily an increase in performance and a minor bump in fidelity. In contrast to game engines, Enscape already does some ray tracing, so the visual improvement is not so obvious.

    Is it planned to enable RTX features so that a big visual improvement can be seen? I'm thinking of ray-traced shadows, full scene visibility in reflections, and ray-traced area lights. Maybe not available in real time, but for stills and fly-throughs.

    Most of the time, however, this will result in a burned-out sky.

    I second your request. On the other hand, the burned-out sky could be caught by enhanced color mapping. Maybe the ground/sky relation is physically correct, but without a good "viewing device" like the human eye, the result will be as you described. Other engines provide the main parameters burn/exposure/contrast to get the job done. The burn control is missing in Enscape.
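    To sketch what a burn/exposure control does, here is a generic Reinhard-style roll-off (chosen purely for illustration; this is not Enscape's or any other engine's actual formula, and the parameter names are my own):

```python
def tone_map(value, exposure=1.0, burn=1.0):
    """Map a linear HDR luminance to [0, 1).
    'exposure' scales the input; a larger 'burn' pulls highlights down
    harder, so a bright sky rolls off instead of clipping to pure white.
    Reinhard-style curve, used here only as an illustrative assumption."""
    v = max(value * exposure, 0.0)
    return v / (v + burn)

# A sky pixel 10x brighter than diffuse white stays below 1.0
# instead of burning out:
sky = tone_map(10.0)
```

    With a control like this, the physically bright sky keeps detail on screen, which is exactly what a missing burn parameter prevents.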

    I very much like the curve control of the V-Ray frame buffer, which allows me to simulate an analog film curve with a very long, flat upper end, like in this extreme example here.
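    A classic way to get that long, flat shoulder is an exponential film-response curve; as a rough sketch (the exponential form and the parameter name are assumptions for illustration, not V-Ray's actual curve):

```python
import math

def film_curve(x, contrast=2.0):
    """Toy analog-film response: 1 - exp(-contrast * x).
    Highlights approach white asymptotically, giving the long, flat
    shoulder instead of a hard clip. Illustrative sketch only."""
    return 1.0 - math.exp(-contrast * max(x, 0.0))

# Doubling an already-bright input barely changes the output:
mid = film_curve(1.0)
bright = film_curve(2.0)
```

    The asymptotic shoulder means each additional stop of input brightness adds less and less output, which is what makes over-bright areas look filmic rather than clipped.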