Posts by Sean Farrell

    Is this not 98% of what you want?

    To do this, I took this studio HDRI:

    I increased the brightness of the skybox and got this result with materials:

    There is still some residual color from the materials emitting light, but the basic light influx is almost white. The studio lamp appears not to be white, so you may want to look around for a better one. Finally, you probably want to use a white background, since the studio looks odd. I noticed that increasing the highlight and shadow settings made the image look better, but also more artificial.

    As I wrote you in the PM, we are looking into improving our environment (sky, background & sun), and I have added your use case to the feature request. Maybe we can make this option work out of the box.

    Hello Gman,

    welcome to the forum.

    Enscape already uses ray tracing for a number of features, and we try to get the most out of the hardware. As things stand today, the RTX feature that will ship with Enscape 2.6 will only be available on compatible Nvidia cards. Many past and current Nvidia cards have RTX "enabled", but you only get the full benefit with the cards that have dedicated hardware to support the feature.

    Hello Nico,

    for 2.5 we improved the physics simulation used to model the walk mode. This makes the movement smoother, but unfortunately it also introduced some bounciness. The effect gets worse if you encounter low frame rates. For 2.6 we removed the frame-rate dependence and most of the bounciness. The next preview version should contain the full implementation of the new simulation.
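    In case you are curious what "removing the frame-rate dependence" means in general, here is a toy sketch (illustrative only, not Enscape's actual code): exponential smoothing where the blend factor is derived from the frame's delta time, so the trajectory depends only on elapsed time, not on how many frames it took. The time constants are made-up numbers.

    ```python
    import math

    def smooth_step(current, target, dt, tau=0.5):
        """Move `current` toward `target` with time constant `tau` (seconds).
        Deriving the blend factor from dt makes the result depend only on
        elapsed time, not on the frame rate."""
        alpha = 1.0 - math.exp(-dt / tau)
        return current + (target - current) * alpha

    def simulate(dt, total=1.0, tau=0.5):
        """Run the smoothing for `total` seconds at a fixed frame time `dt`."""
        x, t = 0.0, 0.0
        while t < total - 1e-9:
            x = smooth_step(x, 1.0, dt, tau)
            t += dt
        return x

    # 100 fps and 10 fps end up at (numerically) the same position,
    # whereas a fixed per-frame blend factor would overshoot or lag.
    print(simulate(0.01), simulate(0.1))
    ```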

    CampCots Unfortunately, for technical reasons it is not easy for us to replace things online. We are working on a solution to this issue, but for the time being, as MatthiasB described, using an intermediary is the way to go.

    So every time you upload a panorama or web standalone you will get a new link (URL). Instead of using our link and QR code directly, you use a "URL shortening" service. The key feature here is not that you get a short URL, but that you can change where that URL points. So when you upload a revised edition of your project, you just swap the target URL for the new one.

    Next, use a QR code generator to create a QR code for the shortened URL. (Some services even provide "dynamic" links, which roll the two steps into one.)

    If you want to be really fancy and operate your own website, you can even do the redirection using a stable link on your own site. How to do this can be quite technical; talk to your website administrator, they will know how to do it. Then make a QR code of that link.
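    To make the idea concrete, here is a minimal sketch of such a stable redirect, using Python's standard-library HTTP server. The paths and URLs are made up; in practice your web server (Apache, nginx, etc.) would do this with a redirect rule, and you would only ever edit the target URL.

    ```python
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Stable path -> current upload URL. Both values are hypothetical;
    # edit the target whenever you upload a new panorama or standalone.
    REDIRECTS = {
        "/projects/lobby-tour": "https://example.com/panorama/rev2",
    }

    def lookup(path):
        """Return (HTTP status, Location header) for a request path."""
        target = REDIRECTS.get(path)
        return (302, target) if target else (404, None)

    class RedirectHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            status, target = lookup(self.path)
            self.send_response(status)
            if target:
                self.send_header("Location", target)
            self.end_headers()

    # To run it: HTTPServer(("", 8000), RedirectHandler).serve_forever()
    ```

    The QR code then encodes the stable link (e.g. `https://yoursite.com/projects/lobby-tour`), and visitors always land on the latest revision.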

    Hello dongtuuyenblackpeony,

    welcome to the forum. There are a number of things to consider with VR. First, are you in "room scale" or "seated" mode? If you are in room scale, have you calibrated your headset's height properly? If you are in seated mode, you will be offset to the configured height; look into the settings and adjust it to a value that matches you. For 2.6 we will be adding some improvements to the walk mode, including a minor fix for the viewing height. I would love to hear your feedback when the next preview is released.

    CampCots As noted on the other thread, we currently need SteamVR to support Windows Mixed Reality. The technical reason is that we support the Oculus through its API and the Vive through OpenVR, which comes with SteamVR. The Windows Mixed Reality API does not expose an easy way for Enscape to render to Windows Mixed Reality devices. With Oculus and Vive making up more than 90% of the market share, we would rather make Enscape better in other areas and have you tolerate SteamVR than invest a good chunk of time re-implementing something that already kind of works.

    I have recorded a feature request, and we will revisit it, since Windows Mixed Reality devices are slowly gaining market share.

    LiTeL Good point. I occasionally have trouble positioning the camera precisely. I have recorded a feature request for this; it should not be too hard to realize. As an alternative to a slow button, a setting for the base fly speed might also solve this problem. We will usability-test which works better.

    CampCots Unfortunately, when using Windows Mixed Reality you still need Steam. The reason is that we build on OpenVR for the Vive and all Windows Mixed Reality devices, and using Steam is the only sensible way to get the drivers and management software.

    It is a real annoyance for professional VR users that you need to install game-oriented user-space driver software. "Let me show you this really serious first responder training, but first I need to launch this gaming platform, oh look, Beat Saber is on sale..." I really would like Valve, Vive, Microsoft, Oculus & Co. to provide a small "just drivers & management" package, but alas, games are driving innovation in this field.

    tobiasolsen No, it's the "weak" GPU in my workstation; it just can't submit the images quickly enough. I could trace it with the help of the Oculus dev tools. Normal apps work OKish, but Enscape is just too complex (on Ultra).

    The good news is that I will leave it in this state, since I was finally able to reproduce, without purposefully breaking things, a number of low-frame-rate VR deficiencies. This gives me the opportunity to make things better when models start to get too complex. Yes, I know it's weird that I am happy that things are not running smoothly on my machine; software engineer...

    Pieter I understand where you are coming from. Our goal is to make a product that is simple to use, and adding yet another slider does not help. On the one hand, you can achieve almost everything you would with an ambient light by setting the exposure and playing around with the settings in the image tab. On the other hand, you can really mess up your settings and think the renderer is broken. Keep in mind that most users of Enscape are not VFX artists, and as such we try really hard to balance usability with power.

    Hello neotronx,

    welcome to the Enscape forum.

    Our goal is to build a simple-to-use and realistic renderer. As such, we have a real-time global illumination solution to provide realistic ambient light. It's not perfect, but with each version it gets better and more accurate. If you have locations that are too dark, there is a good chance that they would be too dark in reality. One of the slightly confusing aspects is the auto-exposure control: it works really hard to normalize the image, but occasionally a bright or dark spot can throw it off balance. (Just like on a real camera.)
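    The off-balance effect is easy to see with a toy model of auto-exposure (not Enscape's actual algorithm; all numbers are made up): the exposure factor maps the mean scene luminance to a mid-grey target, so a single very bright spot drags the mean up and darkens everything else.

    ```python
    import statistics

    def auto_exposure(luminances, target=0.18):
        """Exposure factor that maps the mean scene luminance to a mid-grey
        target (0.18 is the common photographic mid-grey)."""
        return target / statistics.fmean(luminances)

    flat_scene = [0.2] * 99            # a fairly uniform room
    print(auto_exposure(flat_scene))   # ~0.9, close to neutral

    # One very bright spot drags the mean up, so the rest of the image
    # is exposed much darker.
    print(auto_exposure(flat_scene + [20.0]))
    ```

    Real auto-exposure controls are more robust than a plain mean (e.g. weighting the image center), but the failure mode is the same.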

    Have you already found our Material-ID and Depth export options? They are not quite what you want, but if you want to do post-processing, they are a big step forward. Part of the problem with your request is that, to a certain degree, some of the information does not exist in a simple, savable form. We already have a feature request for something along those lines, and I have added your up-vote.

    Nice to hear that you like our software.

    tobiasolsen Can you confirm something for me?

    Does your Oculus Quest (with native Quest apps) also have a tracking lag of approximately 100 ms? I have seen "wobbly stars" with the Oculus Rift S and am not sure if my setup is broken or if it's just the nature of inside-out tracking. And since the Quest and Rift S share the same basic tracking technology...

    Also, how bad would you describe the ALVR lag?

    Epix 3D As far as I know, we don't have an HTC Vive Pro Eye in the office yet. There is no good reason why it should not work with Enscape, but the devil is in the details. From the specs it looks like an HTC Vive Pro, and for a start it will just work like one. The eye tracking and the associated foveated rendering are something we are considering implementing.

    For what it is worth, I can report that the Oculus Rift S works fine with Enscape 2.5 (after an update of the Oculus software & driver). The only caveat is that you need a slightly more powerful GPU to run it. I tried it on a GTX 1080 (OKish) and an RTX 2080 (fine). I haven't had the time and opportunity to test it on other hardware configurations.