Posts by Sean Farrell

    I think I am starting to see what you mean. I am still mildly confused, because the teleport indicator existed beforehand and more or less already checked whether the teleport target location had enough space to place you; it just didn't try to place you on the ground.


    You can alleviate some of the issues in small spaces by reducing the spectator width and spectator height. I would be a bit careful with the height, though: since that is the offset to the ground in seated mode, a spectator height that does not roughly match the actual person can give you a false sense of scale. The indicator figure is exactly as large as the configured spectator, so if it looks very large, something is off.


    My advice to VR users is to first press the trigger lightly to get the teleport indicator, then aim at a spot on the floor where they want to teleport. This gives them the best feel for what will happen. Just squeezing the trigger will generally result in a lot of confusion and disorientation.


    Alternatively, I have heard good reports from people using views to move spectators around in the model. Before the presentation, a number of key views are configured, and during the course of the presentation the spectator is guided through the views and effectively never moves on their own. This is, so I am told, the most novice-friendly solution.


    Can you post or send me the types of geometry where you are having trouble moving around in VR? I will have a look and see if there are ways to make it less error-prone.

    Is the light supposed to come out of the side or the front?


    The light emitter does not have to match the shape perfectly for plausible results. If the light is supposed to come out of the side, you can use a few line lights along the edge. It's not a perfect solution, but it should bring you close to the desired result.


    Also, have you tried increasing the emissive power of the material? In some instances that is sufficient.

    Hello hollyconradsmith, thank you for the feedback.


    We had a number of users getting stuck under stairs and sloping walls; that is why we altered the teleport validation. We also made the teleport place the user on the ground when in walk mode, since falling makes many people uncomfortable.
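    The kind of validation described above can be sketched roughly like this. This is a hypothetical illustration, not Enscape's actual code; the toy "scene" only models a flat floor and a per-position ceiling, just enough to show why a spot under a stair is rejected and why a valid target is snapped to the ground:

```python
# Hypothetical sketch of a teleport validation: reject targets where the
# spectator capsule would not fit, and snap valid targets to the ground.
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class ToyScene:
    floor_y: float
    ceiling_y_at: dict  # x -> ceiling height, e.g. the soffit of a stair

    def clearance(self, x: float) -> float:
        """Vertical free space between floor and ceiling at position x."""
        return self.ceiling_y_at.get(x, float("inf")) - self.floor_y


def validate_teleport(x: float, spectator_height: float,
                      scene: ToyScene) -> Optional[Tuple[float, float]]:
    """Return the ground-snapped target, or None if the spectator
    would not fit (e.g. under a stair or a sloping wall)."""
    if scene.clearance(x) < spectator_height:
        return None
    return (x, scene.floor_y)  # place the spectator on the ground
```

    With a 1.2 m clearance under a stair at x = 2, a 1.8 m spectator is rejected there but placed on the floor everywhere else.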


    Can you please point out specific situations where you feel the teleport is not working as intended? I will look into it and see if we can improve the feature further.

    What you get is primarily an increase in performance and a minor bump in fidelity. In contrast to game engines, Enscape already does some ray tracing, so the visual improvement is not as obvious. The gains are things like higher texture resolution in reflections, more geometry in reflections, and the like. The performance improvements of RTX also open up options to improve the visual fidelity in future versions.


    RTX is enabled automatically for all "real" RTX cards. We disabled RTX for the cards that emulate it, like the GTX 1080, because it was slower than our own path-tracing solution.

    annevanzwol Tentatively yes. Using Virtual Desktop will always remain a hacky solution, and I really can't give you any hard guarantees; it remains officially unsupported. I ran the Oculus Rift S through the OpenVR code path and had some minor issues, such as the analog sticks not mapping cleanly.


    But upon review, we decided to relax the endpoint threshold on the teleport trigger; this should help with the Virtual Desktop setup.

    For one thing, we have the option to rotate the sky map image under Atmosphere > Horizon > Rotation.


    The two other options do not really work with sky maps; a sky map is just a 360-degree picture. To try out what scaling would do, you can use the field of view slider. That is the most you can do with a sky map. The problem is that if the background is at a different field of view than the geometry, it will move at a different speed as you move, and it will just look broken and disorienting.


    The best option is either to get a sky map that is at the elevation you require or to obscure the horizon with context.

    Hello omnifinity, welcome to our forums.


    I was wondering how the input is submitted to Enscape? When in seated mode, you can translate freely using the left analog stick. This is currently restricted in room scale, but it should be possible to unlock it. Our experience with room scale is that any translation, especially sideways, makes people nauseous, and there is a risk of people falling over; hence it is restricted.


    For the time being, you can try whether seated mode works for you. In that mode the height is restricted to the spectator head position in walk mode, but apart from that, there is no reason why you can't use it standing up.

    Hello annevanzwol,


    for the Vive and all Mixed Reality devices we use OpenVR to interface with them. The teleport is executed when either the trigger button is pressed or the trigger is fully pulled. To be specific, and you can pass this on to the developer of Virtual Desktop: the button that is monitored is OPENVR_BUTTON_TRIGGER, the trigger axis OPENVR_AXIS_TRIGGER needs to be above 0.999, and the controller needs to have the role TrackedControllerRole_RightHand.
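    The trigger condition above can be written out as plain logic. The constant name below is a stand-in mirroring the identifiers mentioned in this post; the real OpenVR SDK exposes its own enums and controller-state structs:

```python
# Stand-in for the 0.999 axis threshold mentioned above (hypothetical name).
TRIGGER_AXIS_THRESHOLD = 0.999


def teleport_triggered(button_pressed: bool, trigger_axis: float,
                       is_right_hand: bool) -> bool:
    """Teleport fires on the right-hand controller when the trigger
    button is pressed or the analog trigger pull exceeds the threshold."""
    if not is_right_hand:
        return False  # only the controller with the right-hand role counts
    return button_pressed or trigger_axis > TRIGGER_AXIS_THRESHOLD
```

    A controller that forwards the button press, or an axis value above 0.999, on the right-hand controller will trigger the teleport; anything else will not.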


    I hope that information helps. Supporting the Quest, even if not officially, sounds really useful.

    That is cool to hear. We are always looking to extend the standalone experience, maybe adding WebVR to the web standalone or building a standalone viewer that may be able to run on Android. But for the time being we have nothing interactive that will support the Quest natively.


    You can work with panoramas and combine them into an "interactive" presentation using one of the many panorama viewers. Here is a blog post from a while back. This should work with the Quest; there are a few apps, but I have not tried them yet. The nice thing with panoramas is that you get high-end visuals viewable on low-end hardware, like your phone.


    For the WiFi, you should be able to buy a rather cheap access point (AP), connect the laptop to it, and connect the Quest to the AP's WiFi. You can set that up "at home" and it should work out of the box at the client's site, since from the laptop's, AP's, and Quest's point of view they were just powered down and nothing changed. The only problem is that Windows does not like to connect to two networks, so your laptop will not have any internet while connected to the AP. (It is possible to configure two networks, but it is not trivial and should not be necessary in your case; just unplug the AP if you need internet.) The second issue is that Android devices, like the Quest, don't like WiFi networks without any connection to the internet, so you may need to be insistent and remove any networks that do have an internet connection. (I once had a WiFi camera remote that forced this situation.)

    As for the teleport issue: it is unfortunate that the key mapping or trigger is not properly forwarded. I read about the issue with Virtual Desktop, and that is why I favor the Rift S, since that is basically the same technology, just tethered to the PC. But there are workarounds you can use: either use views to teleport the spectator to predetermined locations, or double-click on the PC screen to teleport the spectator to the given location.

    Hello Solo,


    to be specific, we collect the following information about your system:


    - the version of Enscape

    - the fields in the feedback form (name, email, description)

    - some basic information about your Enscape license

    - your GPU types and driver versions

    - your CPU version

    - your Operating System version

    - the recent log files and dump file


    We are not collecting anything else, especially nothing from the CAD project you are currently running. That is why we often need to ask you for project files when we can't reproduce an issue. But if you can reproduce the issue in a different project, one you created just for testing purposes, that is totally fine.

    [...] My main complaint though is with how long it takes the asset library to load. Is there a good reason for this? [...]

    The asset library is cached locally. The first time you open it, or when something has changed, the assets need to be downloaded. After that initial wait it should open rather quickly. Are you experiencing the slow load on consecutive uses?

    Thank you for your feedback on this topic. I understand that having a ready-to-use studio environment would get you up and running faster. I have forwarded your request.


    In the meantime, building a studio environment can be quite simple. First you take an HDRI "skymap" of a studio environment. Then you model the ground and background planes in white with 100% roughness, a bit like so:



    You may need to adjust the orientation of the sky, ensure that the sun follows the sky, and adjust the exposure to the desired values. This should not take longer than 5 minutes.



    I hope we can have something that works for you soon.