Enscape already implements metal/roughness physically based rendering (PBR), which can be accessed through the material editor and the native material systems. Which specific feature are you missing?
Hello @demo3, welcome to the forums. That is unfortunate to hear.
First, do you have any "nonstandard" peripherals plugged in, like a Space Mouse, joystick, gamepad, or similar? If yes, please unplug them and see if the issue persists. If you find that one of these devices is causing the issue, make sure it is calibrated correctly.
Second, do the issues start when you move the mouse out of the window or switch the window focus? We may have already fixed your issue for 2.7. Once the first 2.7 preview version comes out, please try it and see if your issue is resolved.
If the issue persists, please update us with details, so we can have a look.
icarumba If you want Quest support, you should petition Oculus. The issue is less a technical one and more a legal and policy one. There is no reason why the Oculus software can't soft tether with a PC; the Virtual Desktop implementation shows that it works.
But so far, Oculus does not want the Quest to work with PC applications. To make this work you need to sideload Virtual Desktop, because the version in the Oculus app store cannot forward VR; Oculus forbids it through their terms of service. Using the sideloaded Virtual Desktop app is technically a terms-of-service violation and voids your warranty.
Just post here and our forum gnomes will pick it up and upvote the feature request on your behalf.
I think I am starting to see what you mean. I am still mildly confused, because the teleport indicator existed beforehand and more or less already checked whether the teleport target location had enough space to place you; it just didn't try to place you on the ground.
You can alleviate some of the issues that happen in small spaces by reducing the spectator width and spectator height. I would be a bit careful with the height though: since it is the offset to the ground in seated mode, a spectator height that does not roughly match the actual person can give you a false sense of scale. The indicator figure is exactly as large as the configured spectator, so if it looks "very large", something is off.
My advice to users in VR is to first slightly press the trigger button to get the teleport indicator, then aim at a spot on the floor where they want to teleport. This gives them the best sense of what will happen. Just squeezing the trigger generally results in a lot of confusion and disorientation.
Alternatively, I have heard good reports from people using views to move spectators around the model. Before the presentation, a number of key views are configured, and during the presentation the spectator is guided through the views and effectively never moves on their own. This is, so I am told, the most novice-friendly solution.
Can you post or send me the types of geometry where you are having trouble moving around in VR? I will have a look and see if there are ways to make it less error-prone.
Is the light supposed to come out of the side or the front?
The light emitter does not have to match the shape perfectly to give plausible results. If the light is supposed to come out of the side, you can use a few line lights along the edge. It's not a perfect solution, but it should bring you close to the desired result.
Also, have you tried increasing the emissive power of the material? In some instances that is sufficient.
Hello hollyconradsmith, thank you for the feedback.
We had a number of users getting stuck under stairs and sloping walls; that is why we altered the teleport validation. We also made teleportation place the user on the ground when in walk mode, since falling makes many people uncomfortable.
Can you please point out specific situations where you feel the teleport is not working as intended? I will look into it and see if we can improve the feature further.
What you get is primarily an increase in performance and a minor bump in fidelity. In contrast to game engines, Enscape already does some ray tracing, so the visual improvement is not as obvious. The improvements are things like higher texture resolution in reflections, more geometry in reflections, and the like. The performance gains of RTX also open up options to improve visual fidelity in future versions.
RTX is enabled automatically for all "real" RTX cards. We disabled RTX for the cards that emulate it, like the GTX 1080, because it was slower than our own path tracing solution.
OK, this is only for Revit 2019 and I have not tried it out myself. But you can use Substance in Revit 2019 and up, and it should be picked up by Enscape without issues (as long as the substances are not dynamic). If someone tries it out, I would really like to hear how well it works.
The relaxed triggers are going to be in 2.7, so you should be able to test them in the first 2.7 preview, which should come out in the next couple of weeks. (Currently, previews are for 2.6.1.)
annevanzwol Tentatively yes. Using Virtual Desktop will always remain a hacky solution and I really can't give you any hard guarantees; it will remain officially unsupported. I ran the Oculus Rift S through the OpenVR code path and had some minor issues, such as the analog sticks not mapping cleanly.
But upon review, we decided to relax the endpoint threshold on the teleport trigger, which should help with the Virtual Desktop setup.
For one thing, we have the option to rotate the sky map image under Atmosphere > Horizon > Rotation.
The two other options don't really work with sky maps; a sky map is just a 360-degree picture. To try out what scaling would do, you can use the field of view slider; that is the most you can do with a sky map. The problem is, if the background is at a different field of view than the geometry, it will move at a different speed when you move, which looks broken and disorienting.
The best options are to either get a sky map shot at the elevation you require or to obscure the horizon with context geometry.
I just checked, and now I understand the issue. You mean decoupling the motion input from the HMD heading, so that forward is the global forward, not the direction of view.
I will PM you with contact information and options on how to proceed.
Hello omnifinity, welcome to our forums.
I was wondering how the input is submitted to Enscape? When in seated mode, you can translate freely using the left analog stick. This is currently restricted in room scale, but it should be possible to unlock it. Our experience with room scale is that any translation, especially sideways, makes people nauseous, and there is a risk of people falling over; hence the restriction.
For the time being, you can try whether seated mode works for you. In that mode the height is locked to the spectator head position in walk mode, but apart from that there is no reason why you can't use it standing up.
I will look into it, but aren't the Quest controllers the same as the Rift S controllers? Why not "emulate" those? Also, I just checked: the Rift and Rift S controllers are basically the same and behave the same (sans the touch-sensitive area, which we don't use anyway).
For the Vive and all Mixed Reality devices, we use OpenVR to interface with them. The teleport is executed when either the trigger button is pressed or the trigger is fully pulled. To be specific (you can pass this on to the developer of Virtual Desktop): the button that is monitored is OPENVR_BUTTON_TRIGGER, the trigger axis OPENVR_AXIS_TRIGGER needs to be above 0.999, and the controller needs to have the role TrackedControllerRole_RightHand.
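The teleport condition above can be sketched as a small predicate. This is an illustrative sketch only; the function name and parameters are hypothetical and do not come from Enscape's actual code, but the logic mirrors the description: right-hand role, and either the trigger button event or a trigger-axis pull above 0.999.

```python
# Illustrative sketch of the teleport trigger condition described above.
# should_teleport and its parameters are hypothetical names, not Enscape's API.

TRIGGER_AXIS_THRESHOLD = 0.999  # the analog trigger pull must exceed this value


def should_teleport(is_right_hand: bool,
                    trigger_button_pressed: bool,
                    trigger_axis_value: float) -> bool:
    """Return True when a teleport should fire.

    is_right_hand:          controller has the TrackedControllerRole_RightHand role
    trigger_button_pressed: the trigger *button* press was reported
    trigger_axis_value:     analog trigger pull, in the range [0.0, 1.0]
    """
    if not is_right_hand:
        return False
    return trigger_button_pressed or trigger_axis_value > TRIGGER_AXIS_THRESHOLD
```

In practice this means a forwarding layer such as Virtual Desktop has to report either the button press or an essentially full trigger pull on the right-hand controller for the teleport to fire.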
I hope that information helps. Supporting the Quest, even if not officially, sounds really useful.
That is cool to hear. We are always looking to extend the standalone experience, perhaps adding WebVR to the web standalone or building a standalone viewer that could run on Android. But for the time being, we have nothing interactive that will support the Quest natively.
You can work with panoramas and combine them into an "interactive" presentation using one of the many panorama viewers; here is a blog post from a while back. This should work with the Quest; there are a few apps, but I have not tried them yet. The nice thing about panoramas is that you get high-end visuals viewable on low-end hardware, like your phone.
For the WiFi, you should be able to buy a fairly cheap access point (AP) and connect it to the laptop. Connect the laptop to the AP and the Quest to the AP's WiFi. You can set this up "at home" and it should work out of the box at the client's site, since from the laptop's, AP's, and Quest's point of view they were just powered down and nothing changed. The only problem is that Windows does not like to connect to two networks, so your laptop will not have internet access while connected to the AP. (It is possible to configure two networks, but it is not trivial and should not be necessary in your case; just unplug the AP if you need internet.) The second issue is that Android devices, like the Quest, don't like WiFi networks without an internet connection, so you may need to be insistent and remove any networks that do have one. (I once had a WiFi camera remote that forced this situation.)
For the teleport issue: it is unfortunate that the key mapping or trigger is not properly forwarded. I read about the issue with Virtual Desktop, and that is why I favor the Rift S, since it is basically the same technology, just tethered to the PC. But there are also workarounds you can use: either use views to teleport the spectator to predetermined locations, or double-click on the PC screen to teleport the spectator to that location.
To be specific, we collect the following information about your system:
- the version of Enscape
- the fields in the feedback form (name, email, description)
- some basic information about your Enscape license
- your GPU types and driver versions
- your CPU version
- your operating system version
- the recent log files and dump file
We are not collecting anything else, in particular nothing from the CAD application you are currently running. That is why we often need to ask you for project files when we can't reproduce an issue. If you can reproduce the issue in a different project, one you created just for testing purposes, that is totally fine.
[...] My main complaint though is with how long it takes the asset library to load. Is there a good reason for this? [...]
The asset library is cached locally. The first time you open it, or whenever something has changed, the content needs to be downloaded. After that initial wait, it should open rather quickly. Are you experiencing the slow load on consecutive uses?
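The caching behaviour described above can be sketched roughly as follows. This is a minimal illustration, not Enscape's actual implementation; the class and method names are made up for the example. The point is simply that a download only happens on first use or when the remote version changes, and every later open is served from the local copy.

```python
# Minimal sketch of a "download once, then serve locally" asset cache.
# AssetLibraryCache is a hypothetical name, not Enscape's actual API.

class AssetLibraryCache:
    def __init__(self, download):
        self._download = download   # callable that fetches an asset over the network
        self._cache = {}            # local store: asset id -> (version, data)

    def get(self, asset_id, remote_version):
        """Return the asset, downloading only on first use or when it changed."""
        entry = self._cache.get(asset_id)
        if entry is None or entry[0] != remote_version:
            data = self._download(asset_id)           # slow path: network fetch
            self._cache[asset_id] = (remote_version, data)
        return self._cache[asset_id][1]               # fast path: local copy
```

Under this model, a slow load on every open would suggest the cache is being invalidated or bypassed each time, which is worth investigating.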