Posts by landrvr1

    Good to see this thread pop back up. Again, the mechanics seem simple: every frame of animation in which there is object movement would require a brief pause to let the image resolve, and then the frame is saved. That pause time could even be set manually, since individual animated objects (and the scene as a whole) would vary in the time they need to resolve.
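    The pause-then-save loop could be sketched roughly like this. Purely illustrative: `advance_to_frame` and `capture` are made-up stand-ins for whatever API the engine would actually expose, and the settle time is the manually set pause described above.

    ```python
    import time

    def render_animation(scene, num_frames, settle_time=0.5):
        """Step through an animation, letting the progressive renderer
        settle for a fixed time before saving each frame."""
        frames = []
        for i in range(num_frames):
            scene.advance_to_frame(i)       # move animated objects (hypothetical API)
            time.sleep(settle_time)         # brief pause to let the image resolve
            frames.append(scene.capture())  # save the settled frame
        return frames
    ```

    A per-object settle time could be layered on top by taking the maximum resolve time of whatever objects moved that frame.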

    Ray tracing support is the most interesting and needed addition to TM. The lack of photorealism - especially in deep interior spaces where you don't have the benefit of a sunlight source - keeps me from fully adopting the program. However, with a multi-billion-dollar company behind the development, it's only a matter of time until they increase the visual fidelity - no doubt Unreal Engine 5 will help with that. lol.

    Okay, the jerky framerate is definitely the result of enabling RTX. Once that's turned off, it goes back to an acceptably smooth experience. Hopefully that will be fixed.

    The cove lighting is still broken, however, with odd light leaks:

    I'm pretty disappointed with Enscape at this point. The last stable version in which everything seemed to work well - with a very high, smooth framerate - was 2.6.0+11215. I'm paying the same subscription rate as everyone else, yet I can't take advantage of any of the new features that come with updates. The jerky framerate in 2.7.1+20886 is simply unacceptable, given that we do a lot of live virtual tours for customers. All the versions between 2.6.0 and 2.7.1 would freeze upon opening the render window. Exposed cove lighting - mentioned in my previous thread - remains an unfixed issue (and the thread was never addressed after my last post).

    I just re-installed 2.6.0+11215 and the framerate is smooth as glass - at least 60 fps.

    Which leads to another problem: the Enscape devs really need to add the ability to view the FPS in the Enscape render window. Our company constantly tests different hardware configs, and being able to see the frame rate is important.
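    The kind of counter I mean is simple to compute from frame timestamps. A generic rolling-average sketch (nothing Enscape-specific - you'd call `tick()` once per rendered frame):

    ```python
    import time
    from collections import deque

    class FpsCounter:
        """Rolling-average FPS over the last `window` frame timestamps."""

        def __init__(self, window=60):
            self.times = deque(maxlen=window)

        def tick(self):
            # Record the moment a frame was presented.
            self.times.append(time.perf_counter())

        def fps(self):
            # Average rate over the recorded window: N-1 frame intervals
            # spread across the elapsed time between first and last tick.
            if len(self.times) < 2:
                return 0.0
            elapsed = self.times[-1] - self.times[0]
            return (len(self.times) - 1) / elapsed if elapsed > 0 else 0.0
    ```

    A rolling window smooths out the single-frame spikes that make an instantaneous 1/frame-time readout jump around.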

    At this point I'm open to suggestions and trying whatever you suggest with 2.7.1.

    Here are my machine specs.

    I've tried both versions on SketchUp 2019 and 2020.

    This cannot happen because .max is a proprietary format. Nobody knows how to read it, and even if we did, it would be forbidden.

    Everyone's software is in a proprietary format, and plenty of 3rd-party plugin companies like Chaos with V-Ray have access to Autodesk's APIs, dev kits, etc. It's not an issue of 'knowing how to read it'. That's not the problem at all, in fact.

    The basic challenge with a realtime render engine living inside Max (or any other parametric or procedural 3D modeling program such as Maya, Solidworks, etc.) is the unbelievable number of variables you could introduce that the realtime engine would have to IMMEDIATELY take into account. Adding a simple TurboSmooth modifier to the stack within Max would probably blow up a realtime engine, lol. NURBS, patch grids, point clouds, displacement, FFDs, etc. etc. These are all massively expensive operations that even a stack of RTX 8000s couldn't process quickly and efficiently...yet, anyway. Even V-Ray GPU, which is optimized for RT cores, is far from delivering instantaneous realtime images.

    There are other issues as well - mostly around modifiers - that make a realtime engine living within Max an impossibility right now. It's the precise reason Chaos didn't offer Lavina within Max itself. It's not computationally feasible.

    Enscape for Max will never happen.

    Working within Max's parameters is a total nightmare, and the restrictions are enormous. Ask yourself why Enscape, Twinmotion, Lumion, and Unreal have NEVER supported a direct .max import or any kind of 'live link' feature. Unreal, after more than two years of development, finally got Datasmith going as an export plugin from Max into Unreal, and I believe it took another year to work out a huge number of bugs. It's still buggy. Even then, note that it's still not a direct import of the .max format into Unreal. Having Enscape sit within Max presents those same challenges, and lots more.

    Even V-Ray's realtime engine - Project Lavina - cannot live within Max as a plugin, and must remain as an external player. I wish it were different...!

    I've never understood why panoramic images can't be rendered locally on your computer instead of first being uploaded to Enscape's server.

    I'm using Yulio to host the panoramas AFTER I have to download them from the Enscape server to my machine. lol.

    Very nice! My only comment is that you may want to change the height of the drawers under the worksurface. It's going to be quite difficult pulling the chair up to the desk to work...! Your knees may not fit. Perhaps just make the drawers pencil-tray height?

    You may need to check that the worksurface is at least 29" high from the floor.

    Adjusting the FOV after the image has already been generated only causes a different kind of distortion. The incorrect spatial relationship between objects remains the same. It's no different than taking a still photo of a distant subject, cropping out everything but the subject, and enlarging the image. It will always look odd.
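    A quick pinhole-camera sketch shows why: narrowing the FOV (or cropping and enlarging) scales everything in the frame uniformly, while the relative sizes of near and far objects are fixed by the camera position alone. The `projected_size` helper here is a made-up illustration, not anything from Enscape.

    ```python
    def projected_size(obj_size, depth, focal_length=1.0):
        # Pinhole camera: on-screen size scales with focal_length / depth.
        return obj_size * focal_length / depth

    # Two same-sized objects, one near (2 m) and one far (10 m).
    ratio_original = projected_size(1.0, 2.0) / projected_size(1.0, 10.0)   # 5.0

    # "Zooming" via a longer focal length (narrower FOV) scales both
    # projections equally, so the near/far ratio is unchanged:
    ratio_zoomed = projected_size(1.0, 2.0, 4.0) / projected_size(1.0, 10.0, 4.0)

    # Only physically moving the camera back changes the relationship:
    ratio_backed_up = projected_size(1.0, 22.0) / projected_size(1.0, 30.0)
    ```

    The zoomed ratio stays at 5.0 no matter the focal length, which is exactly why a post-hoc FOV change can't repair the spatial relationships baked into the image.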

    Does it go away when you turn off your ceiling? Is it always at the same spot on the screen, or does the effect move when you pan around?
    If it disappears with the ceiling off, then the problem is most likely a ceiling plane that isn't thick enough!

    An app for the Oculus store from Enscape would go a long way towards a great setup and user experience. You could each fire up a Quest and experience the environment at the same time. This is the way Prospect Pro works, and we've been using it quite a bit as of late - connecting several people at once who are all in their homes. The issue here is that the Enscape scene may have to be substantially reduced in quality for stand-alone mode.

    However, I've been pushing to simply get away from a VR experience whenever possible. For large groups, it's never going to be that effective. There's simply no substitute for broadcasting your scene on a huge monitor and taking folks around the space. In keeping with the latest 'at home' virtual meetings, we've been firing up Microsoft Teams and doing walkthrough screen shares. Teams has an incredibly good compression algorithm that keeps frame rates very high indeed.