purposeplace.studio, I think you may be relieved to find out you are very close! In the Virtual Desktop menu there should be an option to "Switch to VR," which you should use when an application is looking for the VR headset. Then, when you are done with the VR application, you can return to this menu, choose "Switch to Desktop," and control the computer again.
Posts by vertigo1
I've only just begun to use what is officially documented; in your case it does sound like there could be some work involved in reconnecting maps.
I know that the custom asset editor allows Enscape material definitions to be saved and loaded (right-click on the names), but I am not sure whether there is a way to export those definitions from a model for use in the custom asset editor. That is where you would cut out most of the extra steps, if it is possible.
The time spent getting an asset to look great once, so that it can be used independently going forward, doesn't seem wasted if it is a piece that will be used often from scene to scene. I believe that is the entire purpose of the custom asset editor.
If you have the ability to "export selected" to one of the custom asset formats, then it should work; however, you will want to move the asset to the origin point before exporting, because custom assets place from the scene origin point.
Just wanted to chime in on this topic and offer some thoughts to those who have built up large libraries of 3DS Max assets.
Today (finally) I got around to using the Enscape Custom Asset tools, and I converted some decade-old 3DS Max models for use within Revit as Enscape assets. Great work on these tools! It took a little work and troubleshooting with unit conversions, as we were probably misusing the scene units in 3DS Max all those years ago. I also had some normals issues on one of the models I was testing in FBX format, but was able to resolve them by switching to OBJ format and tweaking the OBJ surface export settings. All that to say it worked very well for me, and I am excited by the possibilities it opens up for us!
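In case it helps anyone fighting the same units issue: OBJ is a plain-text format, so the geometry can be rescaled with a short script instead of re-exporting from Max. A rough sketch in Python (the 0.0254 inch-to-meter factor is just an example; use whatever factor your scene actually needs):

```python
# Rescale vertex positions in Wavefront OBJ text to fix a units mismatch.
# Only "v " lines carry positions; normals ("vn") are directions and
# UVs ("vt") are unitless, so those pass through untouched.

def rescale_obj(lines, factor):
    out = []
    for line in lines:
        if line.startswith("v "):
            parts = line.split()
            coords = [float(c) * factor for c in parts[1:4]]
            out.append("v " + " ".join(f"{c:.6f}" for c in coords))
        else:
            out.append(line.rstrip("\n"))
    return out

# Example: a model authored in inches, rescaled to meters.
inches_to_meters = 0.0254
print(rescale_obj(["v 10 0 0", "vn 0 0 1"], inches_to_meters))
# ['v 0.254000 0.000000 0.000000', 'vn 0 0 1']
```

To apply it to a real file you would read the lines in, run them through `rescale_obj`, and write them back out; faces, materials, and everything else pass through unchanged.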
In this way you can use your 3DS Max content inside of other applications if you are rendering with Enscape. In conjunction with the "Link Revit family to asset" functionality this could be extremely powerful for some users who cannot comfortably create the forms needed within Revit. I was creating families for an optometrist office several months ago and I would have had a much easier time creating those models in 3DS Max...I should have explored this functionality earlier.
If you are unfamiliar with these tools, this documentation covers it well:
Same here. We have alternated between RiftCat VRidge and Virtual Desktop since the Quest came out; they have slightly different approaches, so one may work better for you than the other. VRidge is free to set up and try with a time limitation, which can help you see whether it is something you can get up and running.
We never had success with ALVR, but I feel that is more due to the hardware we tried to run it on than anything else. ALVR is completely free, I believe, and it has been a good while since I tried it.
.png files have worked for me with previous versions with no problems, though it has been a while since I have been on a preview version. None of the .png bump maps were working. I opened them in Photoshop, saved them as .jpg, remapped them, and they worked fine.
I don't believe it was anything specific to any particular .png, because the ones in the Autodesk library were behaving the same way as the ones in my custom library.
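For anyone with a large material library facing the same thing, the Photoshop step can be batched. A rough sketch using Pillow (this assumes you have Python with Pillow installed; the folder path at the bottom is a placeholder):

```python
from pathlib import Path
from PIL import Image

def convert_bumps(folder):
    """Save a .jpg copy alongside every .png bump map in `folder`."""
    for png in Path(folder).glob("*.png"):
        # JPEG has no alpha channel, so flatten to RGB before saving.
        Image.open(png).convert("RGB").save(png.with_suffix(".jpg"), quality=95)

# convert_bumps(r"C:\Materials\Bumps")  # point this at your bump map folder
```

Then it is just a matter of remapping the materials to the new .jpg files.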
Try a .jpg file instead of .png for the bump map. This trick helped me with one of the older Enscape builds.
This worked. Thank you!
Left side of the image is realistic view in Revit, middle of image is Enscape render, right is material settings.
Reinstalled Enscape to see if something was wrong with the install, and this is still the result.
Title says it all. Bump Maps have no effect on the Enscape rendering in Revit 2021.
So to clarify, your solution still requires a desktop/laptop machine to run Enscape and stream the experience to the wireless Quest? It's great that this is possible...
...However... the biggest barrier to client engagement is the GPU requirement. It's not reasonable to have a dedicated VR-capable machine for each person in a meeting, and it diminishes the usefulness when people have to take turns to participate.
If the Quest could run an optimized/"gamified" EXE file, we could buy enough Quests for a small army of meeting participants and simply push the EXE to each.
Iris VR Prospect has had multi-user and untethered Quest support for a long time. The visual quality is just severely lacking but the functionality is impressive.
That hardware barrier will not go away anytime soon.
While it is not yet possible to fit the Quest with the hardware needed to accomplish real-time raytracing, if it were, the cost of the headset would be well over $2,000, and that would become the barrier to outfitting a room full of people with what is needed to reach the pinnacle of what we are after here. Our goals are the same, after all.
Yes, we have experimented with IrisVR, Insite, TheWild etc. They all can work natively on the Quest, and they all have the same visual fidelity issue due to the resources on the Quest being inadequate. Strip an Enscape file down to what would run on the Quest natively and we are going to be in a similar situation, but Enscape does not have the collaboration features that the other products have.
VRChat is probably the best bet for everything you are after today, but the application comes with an unprofessional slant, and it takes a lot of work optimizing scenes through Unity.
Consider this option: there is a cloud compute offering available today that gives access to a remote PC outfitted with a GTX 1080. In theory, steps similar to those outlined above can work with this cloud PC as well, and the service to access it costs $12-$15 per month, with new higher-spec offerings coming soon. If you can manage the steps above, Enscape supporting multi-user collaboration is now the only remaining hurdle: we would be able to give a client a login for the cloud PC to view the VR from their home or office, using only an Oculus Quest and a good internet connection.
This wireless VR setup today is just a stepping stone toward the ultimate goal. We have overcome some of the "wait your turn" issue by mirroring the VR participant's view to a large screen or projector in the same environment. It is not ideal, but it is what we have today, and our clients are finding it useful in sparking discussions that would otherwise not occur. The pandemic is another factor making it impossible to continue using Enscape to collaborate in that sort of environment, but since we already have the scenes set up for Enscape, we have been rendering to 360 panoramas, and clients have been using these effectively during Zoom meetings.
This one looks nice, but a little over your budget:
Will this feature allow us to bring assets from any of the supported software? Very powerful feature if so! Especially in conjunction with the "Replace Revit Family with Asset" tool.
+1 from me and can't wait to hopefully test in 2.8!
Wireless VR Update March 09, 2020
We have had many successful wireless VR Enscape demos using Virtual Desktop over the last month...but we have found a tool that is much simpler to operate and set up: VRidge by RiftCat. We have also experimented with ALVR, but no success with that one as of yet.
We have tried many different network setups for this now, and stand by our original assessment that the key component is a dedicated 5 GHz router hardwired to the rendering computer. If you are trying a wireless VR setup and you don't employ this piece of hardware, the experience can be sub-optimal.
Thanks for your descriptions, everybody, and congrats to everybody who has set up their desired connection. vertigo1, can you explain what you mean by clicking in the controllers?
Kaj Burival The analog sticks have a click input; this has to be engaged before movement of the sticks is registered.
annevanzwol The teleport worked for us without fuss, I believe it was one of the two trigger buttons.
landrvr1 We were using the sample architectural scene that comes with Revit to test with. It is a good middle-of-the-road test, by my estimation. Scene weight really won't have any bearing on the performance of the wireless streaming quality. The PC we were rendering with is a Threadripper with a 2080 Ti, so the quality, wired or wireless, would be similar from an appearance standpoint. The router is going to be the biggest factor in the wireless streaming quality (framerate) and the distance you can get out of it. We were using a Linksys WRT for the test.
Our office has been working on getting wireless streaming from a PC to Quest for the past 2 days and finally got a payoff today. It works!
Control mappings in Enscape were off a bit, for instance we had to click in the joysticks before they would work for movement.
Setup was not easy at our office due to our network being very restrictive. We brought in some extra equipment today to get it working. Key for us was a dedicated 5 GHz router hardwired to the PC that is doing the processing, and a shared internet connection feeding into the router so that Virtual Desktop can establish its tunnel.
Virtual Desktop was the winner for us (not free). We tried setting up ALVR (free), but ran into roadblocks with prerequisites on the PC we are utilizing.
One nugget of insight that would have helped us when we were struggling with setup:
Virtual Desktop requires an internet connection to initiate the tunnel, ALVR does not. This would have made ALVR ideal for our use due to our network restrictions, but we just couldn't get it to install properly on the machine we were using.
Keep in mind that the whole process relies on side-loading unofficial apps onto the Quest to get it up and running; the functionality could disappear tomorrow if one link in the chain fails. But this tech needs to be a thing permanently, and it needs to be more approachable: it really sells the technology.
We were able to connect an android phone to the stream as well, and through this we will be able to push the stream to another display via chromecast. Still kinda reeling from the experience of all of it and how well it worked.
I think someone was testing the Virtual Desktop tunnel on the Oculus Quest in another thread here a while back. There are two methods available: running Enscape and the host software on untethered local resources, or running them in the cloud.
- Oculus Quest
- Virtual Desktop app for Quest
- Good WiFi setup for both options, and good internet connection if using cloud resources
- Decent local PC that runs Enscape well that is hardwired to the network, or shadow.tech subscription and decent proximity to one of their edge nodes
- Some technical prowess to jump through all the hoops
Here is a guide on setting up cloud gaming; the general ideas are the same for getting Enscape up and going there. I haven't tried it myself, but I would like to soon.
I do not want to sound overly harsh, but is this the correct way to look at this issue from a computer graphics standpoint?
No surfaces in nature/on earth are perfectly planar either, yet here we are trying to calculate a "realistic" Fresnel on that surface. Pretty sure the qualities of the Fresnel falloff curve would change with the microstructure of the material as well, rather than following the linear gradient that seems to be in place in Enscape as of today.
The whole point of computer graphics is to fake it until you make it. What good are a few realistic variables when the formula as a whole is not yet complete?
There is no such material in nature/on earth with attributes 0 Roughness and 0 Reflection. That won't happen in reality.
The two Jonathan mentions a few posts above are the ones I have been looking at myself for gaming/VR. Make sure it has the GTX 1070 for a little more future-proofing, and personally I would avoid a 4K display, but I don't think that is an option on the ones mentioned (they are 144 Hz FHD, I believe).
What I find interesting is that a lot of the performance gains seem to be coming from AI. If I understand it correctly, they do all the fancy calculations at a low resolution and then use AI to filter/up-sample to meet the needs of the attached display.
I think it depends on Windows Mixed Reality being installed, and I believe that is Windows 10 only.
If you have a valid product key for Windows 7, it can be used to activate Windows 10:
Install and activate Windows 10 with a Windows 7 or 8 product key.