Posts by Macker

    If you transition quickly from day to night, the auto exposure seems to react too slowly, resulting in an initially too-dark night portion of the exported video. To solve this, a "look ahead" on the auto exposure for video export would be needed.

    That sounds exactly like what is happening. Does Enscape not evaluate exposure frame by frame? It would be good to be able to control how many frames it averages over.
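
    To illustrate what I mean by averaging versus looking ahead, something like this is what I'm picturing (a rough Python sketch, definitely not Enscape's actual auto exposure code; the luminance values and window size are made up):

        # Rough sketch of the idea, not Enscape's actual auto exposure code.
        # Compares a trailing average (reacts late) with a centred "look ahead"
        # average (starts adapting before the day-to-night cut).

        def trailing_exposure(luminance, window=30):
            """Average only over past frames - lags behind sudden changes."""
            out = []
            for i in range(len(luminance)):
                past = luminance[max(0, i - window + 1):i + 1]
                out.append(sum(past) / len(past))
            return out

        def lookahead_exposure(luminance, window=30):
            """Average over past *and* future frames - anticipates the change."""
            half = window // 2
            out = []
            for i in range(len(luminance)):
                span = luminance[max(0, i - half):i + half + 1]
                out.append(sum(span) / len(span))
            return out

        # Fake per-frame scene luminance: bright day, then a quick fade to night.
        frames = [1.0] * 60 + [1.0 - 0.9 * i / 30 for i in range(30)] + [0.1] * 60
        print(trailing_exposure(frames)[65], lookahead_exposure(frames)[65])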

    Hi all,


    I'm animating a day to night shot. In the viewport I set the daytime shot up as I want it, then I move to around 8pm and set up the nighttime shot how I want it. Both look excellent.


    Then I render it out (with auto exposure on) and whilst the daytime shot looks fine, the night shot looks much darker than when I was setting it up in the viewport. Why is this?

    Hi all,


    This begins with a question, which could end up as a suggestion if there's no suitable way to achieve it...


    I'm putting together an animation in Enscape which requires hundreds of animated people (just greyed out/semi-transparent), and obviously Enscape doesn't do this. It's something I could quickly and easily achieve in 3ds Max, which raises the question: can I export the Enscape camera paths in such a way that I can import them into Max? The alternative for me at the moment is having to track the Enscape footage (uuggghhhhhhh!!!), which is something I really, really do not want to do.


    Is the current camera path export format a proprietary one? Or is it based on FBX and possible to import into Max, i.e. with a simple change of file extension?

    If this isn't possible, what is the likelihood of you guys implementing a camera path export that is compatible with other software?
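
    For context, if the exported keys were readable as plain time/position data, this is roughly the sort of thing I'd hope to be able to script on my end (purely illustrative Python; the keyframe values are invented and I have no idea what Enscape's actual export format contains):

        # Illustrative only: turns a list of (frame, x, y, z) camera keys into a
        # MaxScript snippet that keyframes a free camera in 3ds Max. The keyframe
        # data below is invented; Enscape's real export may differ entirely.

        keys = [
            (0,    0.0,  0.0, 1.7),
            (50,  10.0,  2.0, 1.7),
            (100, 10.0, 12.0, 3.0),
        ]

        lines = ['cam = Freecamera name:"EnscapePath"', "animate on ("]
        for frame, x, y, z in keys:
            lines.append(f"    at time {frame} cam.position = [{x}, {y}, {z}]")
        lines.append(")")

        with open("enscape_cam.ms", "w") as f:
            f.write("\n".join(lines))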


    Kind regards,

    Chris

    Hi all,


    I'm currently working on a large site and I'm already concerned about the size of the files I'm creating, which I'm sure will eventually give me issues in Enscape. I'd love to be able to see a breakdown of which textures are taking up the most memory on the GPU so that I can look at reducing them. Is this something that could be implemented?
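
    For what it's worth, at the moment I'm doing a rough estimate by hand along these lines (a quick Python sketch assuming uncompressed RGBA8 textures plus roughly a third extra for mipmaps; real usage with compressed formats will be lower):

        # Rough VRAM estimate per texture: assumes uncompressed RGBA8 (4 bytes per
        # texel) plus ~33% extra for the mip chain. Compressed formats (BC/DXT)
        # would be considerably smaller, so treat this as an upper bound.
        import glob
        from PIL import Image  # pip install pillow

        def estimate_mb(path):
            w, h = Image.open(path).size
            return w * h * 4 * 1.33 / (1024 ** 2)

        textures = sorted(glob.glob("textures/**/*.jpg", recursive=True),
                          key=estimate_mb, reverse=True)
        for path in textures[:20]:
            print(f"{estimate_mb(path):8.1f} MB  {path}")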

    Kind regards,

    Chris


    Okay, got it working! Once you've started the Oculus app on the PC and enabled VR Headset in the Enscape menu, you have to press the SketchUp button in the Quest environment while you've got the headset on... I was missing that last step.

    Again, I'm using the Link method with a USB3 cable and NOT the wireless hack referenced above.

    I can say this: the Link method looks utterly fantastic for a VR experience. Great visual fidelity - shadows, reflections, etc. Navigation and the menu options work great so far...

    Well, that's a positive on its own. How's the tracking (given that it doesn't use base stations)?

    Would you be kind enough to give the wireless hack a go?

    Kind regards,

    Chris

    Actual displacement/tessellation appears to yield better results than parallax occlusion (particularly when it comes to accurate lighting), but I'm not very familiar with either technique. This also reminds me of the need for rounded edges. Is that also on the (tentative) agenda, Demian?

    Tessellation can yield better results, but it is quite computationally expensive. Parallax occlusion mapping (when done right) can look every bit as good without nearly as much overhead.

    At this point I'd settle for either, as I'm really struggling to make open-coursed brickwork (amongst other things) look any good in Enscape.
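
    For what it's worth, my (possibly shaky) understanding of parallax occlusion mapping is that it just marches the view ray through the height map per pixel rather than adding any geometry, something like this conceptually (a CPU-side Python sketch of the idea, not shader code and certainly not how Enscape would implement it):

        # Conceptual sketch of parallax occlusion mapping: step along the view ray
        # through a height field and return the UV where the ray first dips below
        # the surface. Real implementations do this per pixel in a shader.
        def parallax_occlusion_uv(height_at, uv, view_dir, depth_scale=0.05, layers=32):
            """height_at(u, v) -> 0..1 height; view_dir points from camera into the surface."""
            # UV offset per unit of normalised depth travelled along the ray.
            step_uv = (view_dir[0] / abs(view_dir[2]) * depth_scale,
                       view_dir[1] / abs(view_dir[2]) * depth_scale)
            u, v = uv
            for i in range(layers + 1):
                t = i / layers                      # current ray depth, 0..1
                cu = u + step_uv[0] * t
                cv = v + step_uv[1] * t
                surface_depth = 1.0 - height_at(cu, cv)
                if t >= surface_depth:              # ray has gone under the surface
                    return cu, cv
            return u + step_uv[0], v + step_uv[1]

        # Toy height field: a single raised bump in the middle of the tile.
        bump = lambda u, v: max(0.0, 1.0 - 8.0 * ((u - 0.5) ** 2 + (v - 0.5) ** 2))
        print(parallax_occlusion_uv(bump, (0.4, 0.4), (0.5, 0.3, -0.8)))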

    Hi All,


    I'm fairly sure that this has probably been asked (or at least something similar)...


    I'm importing some reasonably heavy geometry from 3ds Max, which renders really nicely in Enscape and doesn't lag too much. However, it does lag quite badly in SketchUp. Is it possible to create a low-poly version/proxy that I can use in SketchUp, but let Enscape load the high-poly original? In much the same way as the objects in the Enscape Asset Library load, but with my own assets.

    Kind regards,

    Chris

    Thanks! Yeah, I tracked the footage in AE and brought the tracked camera and tracking data into 3ds Max via .fbx (I think? It feels like a lifetime ago now!).


    Honestly, the camera tracking and aligning in Max was by far the most difficult part of the project, because the tracked data is at a random scale/orientation and as soon as you move it in Max all the keyframes go mad, etc. If we were to do this again, I'd give serious thought to buying some high-end tracking software or outsourcing the tracking. It really is a skill all of its own, and makes you want to tear your hair out!
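
    If anyone else ends up fighting the same thing, the key realisation for me was that you have to apply one scale/rotation/translation to every keyframe rather than moving the already-animated camera afterwards - something like this (a minimal numpy sketch; the scale, rotation and offset values are just placeholders):

        # Minimal sketch: align a tracked camera path with the scene by applying one
        # uniform scale, rotation, and translation to *every* keyframe position,
        # instead of moving the already-animated camera object afterwards.
        import numpy as np

        keyframe_positions = np.array([      # tracked positions at arbitrary scale
            [0.00, 0.00, 0.00],
            [0.12, 0.03, 0.01],
            [0.25, 0.05, 0.02],
        ])

        scale = 40.0                                   # placeholder: metres per track unit
        angle = np.radians(90.0)                       # placeholder: rotate about Z
        rotation = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                             [np.sin(angle),  np.cos(angle), 0.0],
                             [0.0,            0.0,           1.0]])
        offset = np.array([10.0, 5.0, 1.5])            # placeholder: world-space origin

        aligned = (keyframe_positions * scale) @ rotation.T + offset
        print(aligned)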