You said you reduced the resolution, so I assume you are aware of the SteamVR video settings described here: https://enscape3d.com/communit…-virtual-reality-headset/ ?
The reason I am asking is that the better the card, the higher SteamVR thinks it should push the oversampling. On the RTX A6000 it wants to do 3x oversampling, and although that card is a beast, even that will bring it to its knees. It is generally advisable to run Enscape at the device's native resolution; in the case of the Vive Pro that is 1440x1600.
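If you want to pin the resolution yourself rather than let SteamVR auto-scale it, the supersampling can also be overridden in SteamVR's settings file. A minimal sketch, assuming the `steamvr.vrsettings` file format (the exact path, `Steam\config\steamvr.vrsettings`, and the key names `supersampleManualOverride` / `supersampleScale` may differ between SteamVR versions; the same options are exposed in the SteamVR video settings UI):

```json
{
  "steamvr": {
    "supersampleManualOverride": true,
    "supersampleScale": 1.0
  }
}
```

A scale of 1.0 corresponds to the headset's native resolution, which is what is recommended above.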
I agree that 90 fps is desirable, but in practice that will never be reached. As long as the frame rate stays above 45 fps, it tends to have little to no effect on fidelity. This comes from the fact that the driver interpolates every second frame. Only once you go below 45 fps does it start to really feel bad.
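The 45 fps threshold falls out of simple frame-time arithmetic. A small sketch, assuming a 90 Hz headset (the Vive Pro's refresh rate):

```python
# Frame-time budget for a 90 Hz headset.
REFRESH_HZ = 90
frame_budget_ms = 1000 / REFRESH_HZ  # ~11.1 ms per displayed frame

# When the driver interpolates every second frame, the renderer only has
# to produce half the frames (45 real fps), doubling the per-frame budget.
interpolated_budget_ms = 1000 / (REFRESH_HZ / 2)  # ~22.2 ms

print(round(frame_budget_ms, 1))         # 11.1
print(round(interpolated_budget_ms, 1))  # 22.2
```

Below 45 rendered fps the driver can no longer fill every second refresh with an interpolated frame, which is why it starts to feel bad only past that point.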
You are running the Vive Pro wired, right? This is a known issue with some, but not all, users who have tried the wireless link. Somehow the wireless link breaks the frame-to-frame interpolation and reduces tracking fidelity. We are not sure what is happening here and are in contact with HTC about the issue.
The unfortunate reality is that in many cases medium or draft quality is needed when models become too complex. We are working hard to make the VR experience as good as possible.
Hello norooya,
VR starts in the same place and mode you were in outside of VR, so whatever you use as "sitting height" is irrelevant to Enscape. On top of that comes room scale vs. sitting: seated mode will always keep the center of the VR headset where the camera was before, but in room-scale mode the center is roughly the center of the "play area", so if you are sitting and we assume walk mode, your head will be approximately at chest height of the "walking" spectator.
rmoore97 Does this happen with batch render or single render?
Jason H I was about to suggest the opposite. I have seen a case where the batch render has the wrong camera angle. Can you please send in a feedback report:
https://enscape3d.com/communit…sing-the-feedback-button/ so we can sort this out.
I am assuming this is an indoor scene with no light at all? What happens if you turn off auto exposure? If the scene is effectively black, the auto exposure works REALLY HARD to compensate for the total lack of light. The real-time rendering in this case is "wrong", and you just luck out that it actually looks better. I strongly advise you to place some lights in the scene; this should solve most of your issues. If it looks OK with auto exposure off and the slider setting somewhere in the middle, you have a reasonably well-lit scene.
The program's response was only: "Unable to convert source file." What is wrong?
Unless you show us the actual model file, little can be said about your problem. You can always reach out to our support; they will try to help you out and may react more quickly than the forum.
Custom asset import error: high poly count. I realize you caution against anything north of 20,000, but I did not have any performance issues and have busted that limit on every job.
There should not be any hard technical limitation on the maximum poly count, except what your hardware can do and how low an fps you can tolerate. [UI actions are also slowed down.] The 20,000 is an arbitrary but reasonable limit for assets. We want to prevent customers from unwittingly downloading a super-high-poly espresso cup and placing it 100 times in the scene. If you reach out to support with your problem and model, we can have a look at what is causing the hard failure.
There is a technical limitation that Enscape can only render 4 layers of transparent material. In most cases this is sufficient. But if you have glass with depth, you get two layers; add a second sheet of glass and you are at 4 layers, and with the water you end up at layer 5. As a workaround you can give the sheet of glass zero width, which yields only one transparent layer. Visually, with sheets of glass, you will not see the difference.
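The layer counting above can be sketched as a little arithmetic. This is a hypothetical illustration only (the surface names are made up; the limit of 4 is the one described in this post):

```python
MAX_TRANSPARENT_LAYERS = 4  # Enscape's limit, as described above

# Each transparent surface a view ray crosses counts as one layer.
# A glass pane modeled with real thickness has a front AND a back face.
scene = {
    "glass pane (with depth)": 2,         # front + back face
    "second glass pane (with depth)": 2,  # front + back face
    "water surface": 1,
}
print(sum(scene.values()))  # 5 -> one layer too many, the 5th is dropped

# Workaround: model the second pane as a zero-width plane (1 layer).
scene["second glass pane (with depth)"] = 1
print(sum(scene.values()))  # 4 -> within the limit
```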
I had a look at your model, and the problem is that for many faces the winding order is wrong. You are seeing them inside out, which results in messy lighting artifacts. To see what I mean, turn on Face Orientation in Blender's 3D viewport. Faces should be light blue; dark blue faces point inward, and for red faces Blender can't make out whether they point inside or outside.
It used to be that the face normals dictated "inside" and "outside", and thus the winding order. I tried to do that in Blender 2.90 and was not able to switch a face's winding order... It should be possible, since many game engines have this issue (I found especially many posts about Unity), but the old process described there no longer exists.
I am sorry I can't help you further; maybe asking in the Blender forum how to change the face winding order will get you unstuck.
three.js and babylon.js are both built on top of WebGL. Our rendering engine is quite advanced and surpasses three.js and babylon.js in many aspects. I understand your frustration with the quality of the web standalone, but we make a conscious decision about which features can run in the web standalone so that it can still render an entire building complex. We are working on making the web standalone better all the time, but certain features are just not available through a browser. As technology advances we will certainly take every advantage we can get.
Welcome to the Enscape forum.
Offsite VR is always a challenge. The best experience is IMHO a sufficiently beefy laptop (a high-end "gamer" laptop) and an Oculus Rift S. The Rift S, with its inside-out tracking, is really simple to set up, and the laptop's performance is comparable to a desktop's. Although not officially supported, some users have had some success setting up a wireless Oculus Quest. The Quest is only supported in tethered mode, as we don't have a standalone version for the Quest. Even the Quest 2 will not have sufficiently beefy hardware to handle Enscape. So in the end you can't skip the PC (laptop) hardware if you want a fully interactive walkthrough.
A halfway solution is rendering a number of stereoscopic panoramas and viewing them with one of the many panorama-viewing apps for the Quest.
Enscape's rendering in VR is (almost) no worse than on a normal screen. Only some of the effects become more obvious, because the screen resolution per viewing angle is lower. We are always trying to improve quality and performance, but as things stand, if you want real-time global illumination you have to accept some visual artifacts. Switching to draft mode will remove most of the visual artifacts and replace them with low-quality effects; depending on your preferences this may feel "better".
The VR sickness should not be caused by the visual artifacts, as you can see when you switch the quality mode to draft. It is a special case of motion sickness, which happens to many people and can be trained away with enough exposure. We strongly advise not to walk/fly in VR at all and to only use teleport, since this significantly reduces motion sickness. Some people feel safer sitting, and some have less nausea in room scale; you may want to experiment a bit with this.
EGIE Well I am not offended.
Here is the thing: we checked and double-checked that we do everything right. The data is based on photogrammetry data for physically based rendering. Unless the company that created the data made a mistake (implausible), the person who stepped into the photogrammetry rig almost certainly had that skin color.
Same lighting, different materials, different (auto) exposure settings. Many "fake problems" can be attributed to the auto exposure / auto contrast doing weird things, especially with unnaturally reflective surfaces, like the perfectly white material you get in SketchUp by default.
If you choose anything but lossless as the format, the video is encoded directly from RAM. Each frame is rendered, downloaded from VRAM, and then encoded into the video stream, which is saved to disk whenever it is convenient for the encoder.
So unless you run out of space for the final video, disk size should not play a role. Your RAM may play a role, as the frame and temporary video data need to fit into memory, but in most cases this should not be an issue.
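The streaming shape of that pipeline is the reason RAM barely matters: only one frame (plus the encoder's buffer) is in flight at a time. A hypothetical sketch of that flow (the function names and the in-memory sink are made up for illustration; the real encoder works on pixel data and flushes to disk):

```python
import io

def render_frames(n):
    # Stand-in for the renderer: yields one frame at a time.
    for i in range(n):
        yield f"frame-{i}"  # in reality: pixel data downloaded from VRAM

def encode_stream(frames, sink):
    # Encodes frames one by one; the whole video never sits in RAM at once.
    written = 0
    for frame in frames:          # only ONE frame is in RAM at a time
        sink.write(f"[{frame}]")  # stand-in for an encoded packet
        written += 1              # the real encoder flushes to disk itself
    return written

sink = io.StringIO()  # stands in for the output file on disk
print(encode_stream(render_frames(3), sink))  # 3
```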
If you want to do a studio scene: I have had reasonably good experience using HDRI "sky boxes" with indoor scenes. Here is an example of a small-scale photo-studio-style setup: Studio environment/setup
For something larger scale, you could use this kind of skybox: https://www.cgtrader.com/3d-mo…a9-4cdf-9cc2-869bb6250eac
I had a look at this. It is mostly intentional, and partly a bug.
What is happening here is that we add a tiny amount of ambient light from the sky. This simulates the light that would get into the building even through many bounces. Since this is a closed environment, that is somewhat wrong; if you don't want it, simply use night time to remove the effect.
The next thing that is happening, especially with no light sources, is that the auto exposure works REALLY HARD to compensate for the dark scene. If you turn off auto exposure you will get a near-black image.
The third thing: there is a minor bug that "accumulates" light in corners, and it appears you have found the perfect geometry to show this effect. It is normally not visible, but because of the auto exposure it is cranked up to 11.
To get a more realistic scene in your case, use night time and manually adjust the exposure; you should get adequate results.
Can you try the following:
I like your grass, Herbo; it clearly looks like most grass around here (Germany). But when I first saw the Enscape grass I had to think of St. Augustine grass, the staple grass you find in parks and gardens in the US.
Could you please elaborate on what the issue is? Is the framing wrong? Is the result different between batch render and immediate render? Is a visual preset linked to the scene?