Posts by ViggoPaulman

    I would add to this

    - Bevel/Round Corners setting like in D5 Render
    - Precipitation effects (snow, rain)

    - Particle effects (water, fire, waterfall)

    - Hedge material

    - Composition grids for renders (rule of thirds, golden ratio)

    - Better and more detailed 3D assets

    This would be a dream come true. Sadly Enscape falls behind some of the competitors when it comes to all these features. But still, it has some features others don't have (batch rendering, material packaging, etc).

    Further tests on a newly acquired Meta Quest 3 (with ultra high graphics on a high-end machine) made me realize that DLSS severely degrades real-time viewing quality in Enscape scenes.

    Only later did I find a comment saying that the Enscape team recommends turning DLSS off if you intend to view your scenes in VR.

    DLSS, to be very blunt and simple, is an AI feature by NVIDIA: the scene is rendered at a lower resolution and a neural network reconstructs it at full resolution (newer DLSS versions can also generate extra frames in between the frames produced by the software, in this case Enscape), making your experience smoother. Now a question arises which I couldn't find an answer to anywhere: does DLSS affect still renderings in Enscape?

    As I understand, DLSS does affect video renderings, correct?
    But does it make any difference if you are rendering just a picture?

    Here are two renderings of the same scene with the same settings, one with DLSS and one without.

    As you can see, visually there seems to be no discernible difference.

    However, I have to add that rendering a still 4K picture with DLSS ON takes a bit less time, so I am assuming the GPU is doing its AI shenanigans in the background, "interpreting" and adding frames.

    But then again, there is no added value in the quality of the rendering.

    Here's the full pictures side by side.
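For anyone who wants to go beyond eyeballing the two renders, a per-pixel comparison makes the "no discernible difference" claim checkable. This is just a sketch using the Pillow library; the file names in the usage comment are placeholders, not the actual renders from this post:

```python
from PIL import Image, ImageChops

def max_pixel_difference(img_a, img_b):
    """Return the largest per-channel difference between two same-size images."""
    a = img_a.convert("RGB")
    b = img_b.convert("RGB")
    diff = ImageChops.difference(a, b)
    # getextrema() returns one (min, max) pair per channel; take the overall max.
    return max(hi for _, hi in diff.getextrema())

# Example usage (hypothetical file names):
# d = max_pixel_difference(Image.open("dlss_on.png"), Image.open("dlss_off.png"))
# print("identical" if d == 0 else f"max channel difference: {d}")
```

A result of 0 means the DLSS checkbox had no effect on the still image at all; a small nonzero value would indicate differences too subtle to see.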

    DLSS apparently makes rendering a bit faster; that much I understand. What I don't understand is: at what cost? In a still rendering I saw no difference. But I read that DLSS does affect animation exports (by interpreting/adding frames). As good as AI can get, as a professional I don't trust AI-generated imagery, because its interpretation almost always skews straight lines. AI has trouble with fingers in the art world, and in the architecture world it does not comprehend how important parallel/straight lines are.

    Can someone confirm whether DLSS is trading visual/design quality for faster render times?

    And does the checkmark change anything if I am rendering a still image?

    Thank you!

    Hello Demian,

    getting back to this thread after the holidays, so firstly, wishing you a very happy new year! 🎉🎇😊

    After a lot of trial and error and experiments, I'm happy to report that I finally seem to have found the right way to connect Enscape to the Meta Quest 3 headset.

    Your reply helped as well. In case someone else stumbles upon the same difficulty in the future, I'm leaving my findings here to make it easier.

    So, some things to really clarify:

    1. You no longer need Steam Link or SteamVR if you want to view your Enscape scenes in VR. The official Meta Quest app (as of January 2024 still called "Oculus") is all you need to start a 360 view. No secondary software or plugins required!

    2. Through the Oculus app you can connect your headset to the PC either wirelessly or wired, as you wish. If, however, you have a slow internet connection, or you're using WiFi, I recommend getting a 5 meter long high-speed cable (the official Meta Quest Link cable or an equivalent). The cable has a huge advantage over wireless because it essentially bypasses the processing inside the headset (which is much weaker than PC hardware). It turns your headset into a mere "viewer" rather than a computing device, because everything is rendered/calculated on your PC and transferred through the cable to the displays inside the headset.

    3. I don't recommend getting a 3 meter cable: it's too short. Get a 5 meter one.

    You want to ensure that (even for static viewing) you have a circular, obstacle-free area of 1.5-2 meters in diameter, purely for taking steps. You want to be at least 2 steps away from your desk/PC so you can comfortably turn around on the spot without bumping into something, and whether you want it or not, you will instinctively take a step or two within your scenes without realizing it. I tried 3 meters for static viewing, but it was too short, and on more than one occasion the cable jerked me back towards the computer.

    How to Connect and Run

    1. Download and install the official Oculus app on your PC.

    2. Turn on your headset.

    3. In the Quick Settings make sure "Use Air Link" is disabled. Otherwise it will not use the cable.

    4. Turn off WiFi.

    5. Plug in the cable and connect the headset to the PC. If you have the official Meta Quest cable (USB-C to USB-C), make sure the port you connect to on the PC is Thunderbolt/high-speed. The cable can transfer up to 5 Gbps of data. Official tests, however, show a transfer rate of only around 2-2.3 Gbps, which is still more than enough.

    6. Once connected go to Quick Settings again and under "Available PCs" you should see the name of your computer. Set up the connection.

    7. If successful, your Headset will transfer you to the Oculus dimension (grey environment). From there you can click on the Monitor icon to essentially duplicate your PC monitor onto your headset.

    8. Click on the Enscape button to start the software (I do this from within the headset, while projecting the PC monitor into the headset).

    9. Once running, click on the 360° button in Enscape. This will bring up a new window in your headset, which will also show on your desktop. Click on X to close this window. This is the most important step, because the first time Enscape shows up in your headset the quality is horrible. You have to force the headset to close the Enscape window it opened. It will close it but automatically reopen it (because the 360 button is still pressed in the Enscape menu). When it reopens, you will notice a vast improvement in viewing smoothness.
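The bandwidth figures in step 5 can be sanity-checked with some back-of-the-envelope arithmetic. The resolution and bitrate numbers below are assumptions based on published Quest 3 specs, not measurements:

```python
# Why ~2.4 Gbps over the cable is "more than enough" (step 5).
# Assumed figures: Quest 3 renders roughly 2064x2208 pixels per eye,
# at up to 90 Hz, 24 bits per pixel.
width, height, eyes, fps, bits_per_pixel = 2064, 2208, 2, 90, 24

raw_gbps = width * height * eyes * fps * bits_per_pixel / 1e9
print(f"Uncompressed video stream: ~{raw_gbps:.1f} Gbps")  # ~19.7 Gbps

# Raw video would not fit through the cable, so Link sends a *compressed*
# video stream instead. Its encode bitrate is configurable and typically
# sits in the hundreds of Mbps (500 Mbps is an assumed, adjustable value),
# leaving plenty of headroom on a 2.4 Gbps link.
link_gbps, assumed_encode_mbps = 2.4, 500
print(f"Headroom: ~{link_gbps * 1000 / assumed_encode_mbps:.1f}x the encode bitrate")
```

In other words, the cable is never asked to carry uncompressed video; the measured 2-2.3 Gbps matters only as headroom above the encoded stream.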

    Some additional things:

    • I played with the NVIDIA Control Panel settings, because initially I could not see my PC screen inside my headset. It was all black. Changing some NVIDIA settings helped to fix the problem.

    • I played with the headset frequency and resolution settings. Be careful: changing the frequency made the smoothness worse for me. Turning your head inside a 360 scene becomes jumpy, jagged and impossible to experience.

    • When viewing scenes, don't turn your head around very fast. Even with a cable, the software still has to transfer and render a lot of data on the spot, and depending on the complexity of your scene, this might take some time. Turning around fast, might make Enscape lag for a couple of seconds.

    I will later upload pictures to elaborate on #9. For now, happy to report that everything is working.

    Thank you again Demian! 🙂

    Hello Ilias,

    happy new year to you too! 🎉🎇

    Thank you for the clarification. Does this mean that the "Improved rendering quality for captures" statement between 204048 and 202715 is essentially just a copy-paste from the 3.5.6 service pack and does not mean that the rendering quality was further improved between 204048 and 202715?

    I paid over 500 euros for the new Meta Quest 3 and the official Link Cable to implement VR into projects.

    I've read on the forum that Steam and SteamVR are no longer required, so I'm trying to start the VR view through the official "Oculus" app but it's not working.

    I'm connected to the laptop with the Meta Link Cable. Laptop is connected to Internet through WiFi but I deliberately turned off WiFi from settings in the headset, to make sure it uses the Link Cable.

    • Test shows 2.4Gbps of transfer speed through the cable

    • CPU: i9 13th generation

    • GPU: NVIDIA RTX 4070 with 8 GB VRAM

    • RAM: 32 GB

    I click on the VR button, and it stays on the "Looking for Headset" forever and apparently doesn't see the headset... Moreover, the second I click on the button, the headset starts lagging and stuttering very badly.

    I was able to make this work in the past with SteamVR, but all that time I thought it was using the Link Cable, however I realized today, it was on WiFi all that time and the cable was only acting as a charger... SteamVR doesn't work without internet, is that correct? I cannot use the Cable with SteamVR.

    Then I found out you don't need SteamVR anymore and I tried to run Enscape through the Oculus app, but it was still using the SteamVR app from within Oculus. So I uninstalled SteamVR from the laptop.

    Now I can't even start the VR in Enscape... Someone please help.

    Enscape and SketchUp are all latest versions.

    Most of the time people use their own PBR textures in a professional workflow.

    Having to import a texture in SketchUp (1), then going to Enscape and adding the corresponding Height Map (2), then the Roughness Map (3), and potentially an Opacity Map (4) requires a lot of clicks and navigation through the File Explorer on Windows.

    It would be awesome if Enscape could automatically find and add an Albedo's corresponding Height/Roughness/Opacity maps and import them at the moment the texture is created. Doing this would save quite a lot of clicks and hassle with the File Explorer pop up window. Here's some brainstorming regarding the idea:

    • A texture's additional maps are almost always located in the same folder as the Albedo.

    • Most industry standard textures carry either a suffix or a prefix of the map name.

    It could work like this: a separate settings menu to tell Enscape how to think. Should it search for a suffix or a prefix (checkmark)? Text fields for every type of texture map Enscape supports (Normal, Bump, Displacement, Gloss, Roughness, Opacity etc.), where you would type all possible variations of a texture map name, for example _Normal, -normal, -norm for a Normal Map.

    Once this is set up, the moment a Diffuse/Albedo is imported/created in the host program (SketchUp, Revit etc), Enscape automatically scans the same location of the imported texture for additional maps and if there are any that match the search based on suffix or prefix, they get imported.

    Since a Height Map in Enscape has 3 options but only one of them may be selected at any given time, it might be a good idea to set a "preference" in the aforementioned settings menu to tell Enscape to "prefer" a Normal map over a Bump map when importing, or vice versa.
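To make the idea concrete, here's a rough sketch of how that suffix matching could work. Everything here is hypothetical (the suffix lists, the function name); it illustrates the logic of the proposal, not actual Enscape code:

```python
import re
from pathlib import Path

# Hypothetical suffix lists per map type -- in the proposed feature these
# would live in the Enscape settings menu; the values are just examples.
MAP_SUFFIXES = {
    "Normal":    ["_Normal", "-normal", "-norm"],
    "Bump":      ["_Bump", "-bump"],
    "Roughness": ["_Roughness", "-rough"],
    "Opacity":   ["_Opacity", "-alpha"],
}

def find_companion_maps(albedo_path):
    """Scan the Albedo's folder for files named <base name> + <known suffix>."""
    albedo = Path(albedo_path)
    # Strip a trailing albedo/diffuse tag from the base name, if present.
    base = re.sub(r"[_-](albedo|diffuse|color)$", "", albedo.stem,
                  flags=re.IGNORECASE)
    found = {}
    for candidate in albedo.parent.iterdir():
        if not candidate.is_file() or candidate == albedo:
            continue
        for map_type, suffixes in MAP_SUFFIXES.items():
            if any(candidate.stem.lower() == (base + s).lower()
                   for s in suffixes):
                # A "prefer Normal over Bump" preference would be applied here.
                found[map_type] = candidate.name
    return found
```

Importing "Wood_Albedo.png" from a folder that also contains "Wood_Normal.png" and "Wood_Roughness.jpg" would then automatically pick up both companion maps.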

    Thanks a lot for your suggestions and input! I wasn't able to find free time to test it all out yet. I think it'll have to wait until the weekend but I'll make sure to update the thread with my tests! :)

    The Quest devices are really affordable and generally offer a good experience beyond the teething issues during setup. Being wireless by default is pretty cool.

    There are definitely options that do things better - such as head and hand tracking that is harder to occlude - but for general Enscape moving around I don’t really know if that is a huge benefit.

    Would connecting the headset to PC with a cable instead of wireless help with the stuttering/speed? I wouldn't mind trading a wireless connection with a wired one if it would improve the performance.

    You're quite right, I do remember the tutorial I watched on YouTube led me to change some settings in the Oculus software, namely the OpenXR thingie (although I don't have any idea what it is or what it does). It was quite a hassle to set the headset up, and in the end, although I was impressed to walk right into my projects, I expected to see the same quality I see on the monitor screen, because my graphics card is more advanced than the recommended requirements. Alas, that was not the case, so now I'm left wondering whether the problem is in the headset, in Enscape or in my new laptop.

    I will uninstall Oculus and try Steam Link tomorrow, to see if it handles the scenes better.

    Regardless of all of this, do you believe Meta Quest was the best choice for viewing 3D projects in VR?

    From your quote I understood that Meta Quest carries a processor inside, so it's essentially a "computer" that goes back and forth with the PC it's attached to. Does that mean that there are VR headsets that don't carry processors and therefore would be a "better" choice?

    Hey Adam,

    forgive my ignorance, but doesn't that bring us back where we started? Whether SteamLink or Oculus PC software, there's still a 3rd wheel in the middle right?

    Or do you mean to say that Steam Link software is better than Oculus?

    I really wish the Enscape team could officially support Meta Quest and/or develop a separate app for Meta, so we could access the projects without flickering or stuttering.


    Thank you for the reply. I couldn't wait yesterday, so I went ahead and bought Meta Quest 3. Tested it yesterday with 2 of my most complex scenes.

    Firstly, the emotion - it is impossible to put into words what it feels like to jump right into a "picture" you made. I was quite in tears, because I tried VR on a reconstruction project of my old childhood house, which no longer exists. Walking in it again was out of this world...

    Now the problems - I did not use SteamVR. I used Air Link to connect the headset to my PC with the Oculus app, and it worked well. The laptop gets very loud, although the specs are much higher than the settings "recommended" by Enscape. It's true, you don't need SteamVR for Meta Quest and Enscape. But, I don't know why, the experience is very jumpy and jittery whenever I move my head.

    The quality of the 3D also seemed blurry/jagged, especially at the edges, although I have antialiasing and rendering quality maxed out. The textures, however, are crisp.

    I tested it only for 2 hours so I can't say much yet, I will do more tests, but it would be nice if Enscape could collaborate with Meta to get whatever is necessary for a smooth VR experience.

    You attached the same picture twice.

    I'd like to see your Exposure setting. I bet Auto Exposure is on. Like Alef Coelho said, this is most likely because the intensity of the artificial lights in your scene is too exaggerated. Turn off Auto Exposure (actually, I'd advise keeping it off whenever you're doing still renderings). If we're right, after turning it off your viewport will explode with overexposed brightness (with Exposure at 50%). From there, just adjust the intensity of your artificial lights. Try not to play with the Exposure % value much, or at least keep it around 50% (depending on your needs).

    Hey good people,

    Joining this thread because the Meta headsets are currently discounted and I'm really considering getting one.

    Problem is I have zero experience with VR, so I'm going into this completely blindly. Could someone explain what is meant by "tethering", "Airlink" and some other terms?

    From what I understand, Meta Quest headset (2 or 3) are the best choice for Enscape, is this true? I have my eyes on Quest 3 128 GB.

    And it seems there were 2 versions in the shop, which I checked today - with a battery and without a battery, is that true?

    If it's true, can anyone tell me the difference? As I understand if you buy the battery version, you have to attach it to the headset, stick the battery into your pocket and carry it around?

    Really confused here, would be very grateful for any extra input. YouTube and Google don't seem to be able to give me the answers I'm looking for.

    Demian Gutberlet said that Meta Quest 2 is now (as of November 2023) capable of running Enscape without SteamVR. Did something change in the headsets or did Enscape change something internally in the software? Would I be safe if I buy the latest Quest 3?

    Also, planning to run all this shenanigans on a new laptop:

    Alienware M18 R1

    NVIDIA RTX 4070 8GB

    32 GB RAM

    Intel i9 13th generation

    Windows 11

    Would this be enough to ensure lag-free walkthroughs at "Ultra High" graphics?

    Thank you!

    I am all for a UI redesign. Navigating the UI on a daily basis is painful. As a designer myself, I don't understand what's happening behind the curtains on the management floor. This needs to be addressed much earlier than the current roadmap objectives.

    UI, UX, 3D assets, and refined rendering features - 4 of the most crucial aspects that Enscape should fix before it's too late.

    1 year later and I still very much need this feature. I have projects with 50 views sometimes. Imagine having to apply a view preset to each view manually; it's a painful job.

    Given the recent polls regarding view presets, I feel my suggestions are much-needed quality-of-life features that could immensely improve the view presets and the user experience in general.

    Adding my voice to this. We just had 160 scenes in one file and 2 different presets per 2 scenes. I'm currently clicking on every single scene and setting the correct visual preset. To say this is a pain in the back would be an understatement. Please bring more useful features into Enscape!

    I had to make a separate thread about this, although there are several similar posts already.

    For those who didn't know, your 4K textures are useless if your 3D models are small objects! And no, even using 2K textures won't help you.

    Apparently Enscape decides on its own if it will use your high quality texture or mess it up and there's nothing you can do about it.

    I've been playing around with this for the past hour, and I'm left somewhat disappointed and speechless because this is a huge problem!

    In product 3D visualisation, the scale of the objects is much smaller than in the architecture industry. I discovered to my shock today that Enscape internally and automatically degrades textures depending on the scale of the mesh. Here's proof:

    This is a camera with a height of around 7.5 cm. All the textures and maps used are 4K.

    The maps and the albedo are very detailed and crisp. Here's what they look like in the SketchUp viewport:

    Now here's what an Enscape render of the same viewport looks like when the camera is its designed 1:1 size:

    Look at this! I am asking myself, is this acceptable behaviour in 2023? Now look what happens when I close the Enscape window, scale the 3D model up 100x in SketchUp without changing any textures, and start Enscape again:

    Suddenly Enscape decided to grace us with its capability of utilising the whole detail of the 4K texture. How is this acceptable? There is no way I can use this camera as a detailed 3D asset in a future scene. If I import it as a 1:1 model, Enscape will mess up its textures. I can scale it up, start Enscape, then scale it down while Enscape is still running. Then it will work, but if I close Enscape or decide to continue the work on a later day, I will have to start this dumb workaround process of scaling models up and down all over again! And if you have several 3D assets in your scene, then good luck playing that up-down game for another 2 hours.

    Why do other real time rendering programs such as D5 Render not have limitations like these, and Enscape does? This is pushing it too close to the edge in my opinion.

    Enscape team, please let us decide how to render our textures, not Enscape! This is unacceptable for a product visualisation pipeline! Essentially this means that Enscape does not support 4K textures. This same problem occurs when I use the 2K version of the Albedo. Even the 2K texture gets degraded if the 3D model is the size of a palm. It purely depends on the scale of the mesh, not on the texture size.
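For what it's worth, the behaviour described above is consistent with a texel-density cap: an engine keeping at most N texels per meter of surface and downsampling anything beyond that. This is purely my speculation about what Enscape might be doing internally; the cap value below is invented for illustration:

```python
import math

# Invented texel-density cap, purely to illustrate the observed behaviour.
MAX_TEXELS_PER_METER = 4096  # assumed engine limit, not a documented value

def effective_texture_size(texture_px, object_size_m):
    """Largest texture resolution a hypothetical engine would keep for an
    object of the given physical size, under the assumed texel-density cap."""
    allowed = min(texture_px, object_size_m * MAX_TEXELS_PER_METER)
    # Engines usually snap down to the nearest power of two.
    return 2 ** int(math.floor(math.log2(allowed)))

# A 7.5 cm camera with a 4K texture gets heavily downsampled:
print(effective_texture_size(4096, 0.075))
# The same model scaled up 100x keeps the full 4K texture:
print(effective_texture_size(4096, 7.5))
```

Under this model, the scale-up/scale-down workaround works precisely because the cap is evaluated only when Enscape loads the mesh, which would also explain why closing and reopening Enscape undoes it.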

    Demian Gutberlet please pass this on to your managing department. I cannot stress enough how crucial this is for any self-respecting 3D artist; I'm sure most of your users would agree with me.

    And if anyone has a miracle workaround, please let me know because I have no idea what to do in this situation. Thank you!

    We have a specific way of naming our furniture visualisation renderings. Each version of the furniture has multiple (at times a dozen) views to show it from different angles and the views have numbers at the end. The problem is, if I want to use the batch rendering window to render some of the scenes (see the checkmarks in screenshot) I cannot see the numbers and therefore I have no idea what exactly I am choosing to be rendered.

    In this example I have 9 scenes per version. Each version is carefully named XYZ 1, XYZ 2, XYZ 3 etc. I need to render only Scenes 1, 7 and 9. Because I can't stretch the window, I cannot see the end of the scene name, and therefore I have to manually count from top to bottom every single time.

    Enscape, please fix this in your next release! Allow us to stretch the sides of the batch render window!

    That mentioned workaround is a very bad idea. The created watermark will work ONLY for that specific aspect ratio. If you have to render in several different aspect ratios (which we do almost on a daily basis) you have to create separate “watermarks”.

    I don’t understand why we have to go through all this, instead of asking for a real composition grid.