In the menu at the top left (it may be slightly hidden) you can choose a device to enable VR mode.
As stated in another thread, I am having good results with the Oculus Quest in a wireless setup, including its inside-out tracking.
Unfortunately the most important button (teleport) is not mapped. Enscape is looking into it.
It seems that a custom mapping feature would enable the proper use of different headsets.
Until now, rendering images at 4K was sufficient. However, today a client requested a higher resolution. Even with my 11 GB RTX 2080 Ti I am not able to render much higher than 4K before Enscape crashes.
Would it be possible to implement a solution like you did for the panorama renders?
(making multiple renders and then stitching them into one image)
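To illustrate the idea, here is a minimal sketch (not Enscape code; the tile size and function names are my own assumptions) of how a target resolution beyond the GPU's limit could be split into tiles that are each rendered separately and then stitched:

```python
# Sketch of tiled high-resolution rendering, assuming a hypothetical
# maximum tile size the GPU can handle in one pass (4K wide here).
# All names are illustrative, not part of any Enscape API.

def tile_grid(width, height, max_tile=3840):
    """Split a target resolution into tile rectangles (x, y, w, h)."""
    tiles = []
    for y in range(0, height, max_tile):
        for x in range(0, width, max_tile):
            tiles.append((x, y,
                          min(max_tile, width - x),
                          min(max_tile, height - y)))
    return tiles

# An 8K (7680x4320) target splits into four tiles of at most 3840px,
# which together cover the full image ready for stitching.
print(tile_grid(7680, 4320))
```

Each tile stays within the memory budget, and the stitched result is the full-resolution image, much like the panorama workflow.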
So which is currently the best option for the best VR output in Enscape: keeping it wired or going wireless?
1) To showcase to a client
In a fixed setting I am very happy with the performance of my Vive Pro with wireless adapter. No cables to worry about or clients getting tangled up.
At client locations I am trying the Oculus Quest, streaming wirelessly from the laptop. Thanks to the inside-out tracking there is no need for base stations, cables, and so on. It is much easier and faster to deploy. Unfortunately, the trigger button does not work as it should. Enscape is currently looking into it. This would be the ideal solution.
Can you give any indication if the controller issue is solvable? I am eager to deploy this solution for my business and would appreciate any feedback.
Alternatively for the time being can you make it so that I can press a button on my laptop to activate the teleport? It is not ideal but at least workable.
What I mean is: the user points to a location with the teleport arc, and I trigger the teleport.
The way the controller is emulated is puzzling to me.
In the SteamVR 'waiting room' I see the Rift S controller. The grip button and the thumbstick are animated analogically: pressing a button halfway will also show it halfway in VR. The trigger button, on the other hand, is either pressed or not pressed.
In the Enscape executable I also see the Rift S controllers (the buttons are not animated, by the way). But when opening the key binding menu from the settings, it shows the Touch controller, and there is no option to choose the Rift S. Choosing another controller does not change the controller shown in Enscape anyway.
As you can see the teleport arc works as expected. But I cannot execute the teleport.
If there is any testing I can do from my end, please let me know. I am very motivated to get this working. My system, drivers, and SteamVR are all up to date.
The VD developer has looked into the Quest controller issue (not being able to do the teleport), taking into account your remarks. He made some changes and uploaded a beta version for me to test. Unfortunately, the issue is not solved. This is the feedback he gave:
The developer of Enscape will need to properly support this as I’ve done all I can from my end. If they need a key to test it, have them email me: email@example.com
I asked for any pointers for what to look at:
It’s hard to say, the developer needs to run it from Virtual Desktop with a Quest to see where the input is not behaving as expected.
Maybe it has something to do with SteamVR recognizing the Quest controller as an Oculus Touch controller which seems to have a physical click on the trigger button. This is not the case for the Quest controller.
It is so frustrating to have the perfect solution and yet be literally one click away. The Oculus Quest setup provides fast deployment and large play areas because of the inside-out tracking (no need for base stations and cabling), plus a wireless experience for the user. This solution would really benefit my business and most likely that of other Enscape users.
Please, can you look into it? Maybe a workaround with another button (A or B) to execute the teleport, or pushing the thumbstick forward and releasing it, like it works in SteamVR Home.
OK, thanks. I will pass this on to the VD developer.
As explained in the other thread, I have Enscape working wirelessly with the Oculus Quest. This makes roomscale demonstrations at client locations so much easier: one backpack with my laptop and my Quest. No base stations, no wires, no extension cords, no stands for the base stations, no PC with the wireless card for my Vive.
One issue to resolve.
The trigger button on the Quest controller will show the teleport arc, but I cannot click to do the actual teleport. This is how the Vive controller works: pulling the trigger shows the arc; clicking the trigger does the actual teleport. This last click seems to be missing with the Quest controller. In SteamVR you can do a controller test, which confirms that the pull action registers but the click action does not.
Tonight I chatted with the creator of Virtual Desktop and he tested the Quest and Rift and reported back that both show the same behavior in the SteamVR controller test. In other words both don’t show the click Enscape is waiting for.
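To make the failure mode concrete, here is an illustrative model (my own sketch, not Enscape code; the threshold value is an assumption) of the two-stage trigger behaviour described above: an analog pull shows the arc, and a separate digital click event executes the teleport. A controller that reports the pull but never the click can show the arc yet never teleport:

```python
# Illustrative two-stage trigger model: analog pull shows the teleport
# arc, a separate digital click event executes the teleport.
ARC_THRESHOLD = 0.1  # assumed: any pull beyond this shows the arc

def teleport_state(pull_value, click_event):
    """pull_value: analog trigger 0.0-1.0; click_event: digital click."""
    if click_event:
        return "teleport"
    if pull_value > ARC_THRESHOLD:
        return "show_arc"
    return "idle"

# Vive: a halfway pull shows the arc, a full press also raises the click.
print(teleport_state(0.5, False))  # show_arc
print(teleport_state(1.0, True))   # teleport
# Quest via Virtual Desktop: full pull, but no click event ever arrives,
# so the arc shows but the teleport never fires.
print(teleport_state(1.0, False))  # show_arc
```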
Can someone with a Rift explain how the teleport works in Enscape?
@Enscape. Do you use OpenVR?
I tried rebinding the controller keys in SteamVR, which seems possible, but then I cannot save the new bindings as I can for other apps.
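For reference, a custom SteamVR legacy binding entry for the trigger looks roughly like the fragment below. This is a sketch from memory of the default binding files; the exact action paths and the `controller_type` value may differ per headset:

```json
{
  "controller_type": "oculus_touch",
  "bindings": {
    "/actions/legacy": {
      "sources": [
        {
          "path": "/user/hand/right/input/trigger",
          "mode": "trigger",
          "inputs": {
            "pull":  { "output": "/actions/legacy/in/right_axis1_value" },
            "click": { "output": "/actions/legacy/in/right_axis1_press" }
          }
        }
      ]
    }
  }
}
```

If the Quest trigger never produces the `click` component, mapping the `press` action to another physical input (the A button, for instance) could be a workaround, provided SteamVR lets the binding be saved for Enscape.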
Any help would be appreciated. The solution is so close. Just one click away.
Happy to report I was able to solve the wifi setup for client locations.
At first, I connected a 5GHz wifi router to the laptop (wired) and then connected the Quest to the wifi signal. This resulted in a good connection between the Quest and the laptop. Unfortunately (as Sean Farrell mentioned above), Virtual Desktop needs an internet connection to start up. After that, the connection is only local (confirmed by the creator of VD). This means a cabled internet connection must be available at the client location, which is not always the case.
Today I tested a 5GHz wifi repeater with a UTP connection. The setup is simple: laptop wired to the repeater, and Quest connected to the outgoing 5GHz wifi signal. The incoming 2.4GHz wifi is the hotspot on my iPhone. VD starts up, and after that there is a fast local connection between the laptop and the Quest. Now I can run Enscape in VR on my Quest at client locations, wirelessly.
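A quick way to sanity-check this setup (the IP addresses below are examples, not from my actual network) is to verify that the laptop and the Quest end up on the same local subnet, which is what Virtual Desktop needs for the fast local connection after its one-time internet handshake:

```python
# Check whether two devices share a local subnet (example addresses).
import ipaddress

def same_subnet(ip_a, ip_b, prefix=24):
    """True if ip_b falls inside ip_a's /prefix network."""
    net_a = ipaddress.ip_network(f"{ip_a}/{prefix}", strict=False)
    return ipaddress.ip_address(ip_b) in net_a

# Laptop wired to the repeater, Quest on the repeater's 5GHz wifi:
print(same_subnet("192.168.2.10", "192.168.2.23"))  # True: local traffic
# Quest accidentally joined a different network (e.g. the phone hotspot):
print(same_subnet("192.168.2.10", "192.168.4.23"))  # False: will be slow
```

If the second case happens, streaming traffic would be routed through the hotspot instead of staying local, which kills performance.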
Now only to solve the key binding issue.
As stated in another thread, I think the Quest can easily run the web standalone in VR. It can be run from the native browser, so there is no need for streaming from a separate laptop. Just like https://moonrider.xyz
So the only thing that is needed is a VR interface.
The web standalone will likely not be supported on the Quest unless they do crazy mobile optimization or start baking lighting.
Optimization is likely not necessary. I am pretty certain the Quest can handle the web standalone. Enscape only needs to add the VR interface. Just like the website I mentioned above.
The web standalone is great for collaborating with clients during the design process. No need for special software/hardware. I really like it.
I do have two suggestions for improvement.
1. Please add the loading screen option like with the standalone executable. It is more professional if clients see my company logo. I don't mind if you add the Enscape logo as well.
2. Please enable navigation through WASD. 'Q' and 'E' are already used, so it is logical to have WASD as well.
My goal is to have the Quest work on client locations with Enscape.
Currently I use an HTC Vive Pro with the wireless adaptor. To set up at a client location, I have to take my desktop PC (with the Gigabit transmitter for wireless VR), the Vive, the base stations, stands, extension cords, etc., and then go through room calibration. The quality is great, and for longer (paid) sessions this is a good solution. However, for some quick demos it is just too much hassle.
The nice thing about the Quest is that it does not need base stations like the Vive. This means I can take my laptop and Quest in one backpack. No stands, extension cords, etc. Also, the room calibration is very fast.
I tried ALVR, but it was not very smooth and it detected only one controller. Today I tried Virtual Desktop (sideloaded) running from my laptop, and the performance is good enough for quick demos. It shows both controllers, so this seems a viable option.
2 issues to ‘solve’.
1. The trigger button shows the teleport arc, but it does not actually teleport. With the HTC Vive, pressing the trigger halfway shows the arc, after which a full press does the teleport. I tried to change the binding in SteamVR but without success. Any suggestions would be appreciated.
2. My current setup uses the home wifi and has the laptop connected to the router with a UTP cable. I would like to take a trustworthy wifi connection with me and not be dependent on the client's wifi facilities. Is it possible to connect a wifi router to the laptop with UTP and then use the wifi signal for the Quest? Or maybe there are other options. I am not an ICT expert. Any tips would be appreciated.
@Enscape. Ideally I would like to have a dedicated Quest app which runs on the headset directly, without the need for a fast laptop. It could be a sideloaded version, so that you don’t have to go through the Oculus store limitations. I think the headset is capable enough to handle the detail level of the web standalone. The ability to show roomscale VR at client locations using the Quest would really be a game changer. Can you please look into this? BTW, I am aware of the cabled solution Oculus will launch in November, but a wireless solution is so much better.
Noticed this as well with grass. I even had to crop this part out of my final video.
I wonder how the textures look in such a file. Is everything unique there?
TBH you don't want to mess with a 3D-scanned mesh in Sketchup (pun intended), because it looks like a broken egg. For texture painting, use Blender instead. For providing a lifelike context to your model, these scans are great. If you are looking for a clean model, draw it using basic shapes.
How do you get the scans into Enscape?
It seems that you can add elements, but can you also modify the existing building? Like cutting holes? Is this done in Sketchup?
My workflow is as follows:
My scan result is available as an OBJ mesh file and XYZ pointcloud. I prefer working with the OBJ as it is (i) easier to manipulate and (ii) Enscape will not display pointclouds.
I open the OBJ file in Blender (freeware) and use the Boolean modifier to cut out geometry and/or clean up the mesh. You can actually modify the mesh in Sketchup or Meshlab, but in my experience Blender 2.8 works much faster and smoother (using the EEVEE realtime renderer, which is the default). Then I export again as OBJ.
Depending on the file size, I either import the OBJ directly into Sketchup or use Transmutr to make an Enscape proxy model. Importing big mesh files directly can make Sketchup sluggish. You can choose to make a simplified, faces-skipped proxy model, which makes placement in Sketchup easy. If need be, you can also use Transmutr to reduce the number of polygons.
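Since the direct-import-vs-proxy choice depends on mesh size, a rough, illustrative way to decide is to count the vertices in the OBJ before importing. The threshold below is my own assumption, not a Transmutr or Enscape recommendation:

```python
# Count vertices in a Wavefront OBJ to decide on an import strategy.
# The 500k threshold is an assumption for illustration only.
import io

def obj_vertex_count(obj_file):
    """Count 'v ' (vertex position) lines in an OBJ stream."""
    return sum(1 for line in obj_file if line.startswith("v "))

def import_strategy(vertex_count, threshold=500_000):
    return "direct import" if vertex_count < threshold else "proxy via Transmutr"

# Tiny example OBJ: a single triangle (3 vertices, 1 face).
sample = io.StringIO("v 0 0 0\nv 1 0 0\nv 0 1 0\nf 1 2 3\n")
n = obj_vertex_count(sample)
print(n, import_strategy(n))  # 3 direct import
```

A real scan would of course be read from disk with `open(path)` instead of the inline sample.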