Posts by Macker

    Hi folks,


    I'm not sure if this has been suggested before, but my reason for it is this: we currently own a lot of renderpeople.com 3D assets that we use in our 3ds Max and SketchUp visuals. However, we cannot at the moment take them into Revit, because Revit lacks proper UV coordinate support.

    Would it be possible to allow users to save things to the asset library (in SketchUp) so that they can be loaded in other software (Revit)?

    Kind regards,

    Chris

    In 3ds Max I can set a portrait aspect ratio (larger height than width), see how this looks in the viewport, and then render it out. I want to be able to do the same in Enscape.

    Hey guys,


    I'm sure this would be a (relatively) easy thing to implement... I'd love to be able to take portrait shots straight out of Enscape, like I can in 3ds Max.

    Hey guys,


    I'm wondering whether or not any of you have found any good-looking trees (and shrubs/bushes) to use in Enscape? I'm used to using very high-poly trees in 3ds Max/V-Ray, not highly optimised ones that would be suitable for real-time. I am after quality, not quantity.


    Are there any Enscape tree packs available to purchase, or any trees at all that you can recommend?

    Kind regards,


    Chris.

    Hey guys,


    Having just come from a meeting where we used Enscape, I thought I'd write this whilst the idea is fresh in my head.


    We were showcasing the interior of a school/college that we've designed to the head of the school. In the meeting we were discussing everything from floor and wall finishes to doors and windows, and we had a table full of brochures for the products we were specifying: carpets, vinyl flooring, etc.


    It quickly became apparent that a GREAT feature to have in VR would be the ability to click (using your Oculus/HTC controller) on an object and assign it a different material - these materials could be ones that I've set up prior to the meeting. For example, we had specified grey doors in our original model, but the client wanted to see what they'd look like with a timber finish - something that required us to go back into SketchUp, find the right wood texture, and then tweak its colour to match our sample (all of which could have been done beforehand).

    These material "choices" could then be saved into the .exe export that we send to the client, so that they can play around with different combinations themselves - though only from a select set of predetermined materials that I have created.
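
    To illustrate the idea (purely a sketch - the object names, material IDs and structure below are hypothetical, not anything Enscape exposes), the export would only need a mapping from each clickable object to its designer-approved materials plus the currently selected one:

        # Hypothetical sketch of the material-preset data for the VR override idea.
        # None of these names come from Enscape; they only illustrate the concept.
        material_presets = {
            "classroom_doors": {
                "options": ["grey_laminate", "oak_veneer", "walnut_veneer"],
                "selected": "grey_laminate",   # what the model was exported with
            },
            "corridor_floor": {
                "options": ["vinyl_light_grey", "carpet_blue"],
                "selected": "vinyl_light_grey",
            },
        }

        def apply_choice(presets, obj, material):
            # Only accept materials the designer pre-approved for that object.
            if material in presets[obj]["options"]:
                presets[obj]["selected"] = material

        apply_choice(material_presets, "classroom_doors", "oak_veneer")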

    Hey guys,


    Whilst the Auto Exposure is very good, it isn't perfect, and a number of my renders have large areas of shadow clipped to black, which cannot be rescued in post. Can we please have an option to output at a higher bit depth, or even in an HDR format (.exr is the industry standard)?
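
    As a rough illustration of why this matters (a minimal numpy sketch, ignoring gamma/tonemapping, and nothing to do with Enscape's actual pipeline): deep-shadow values quantised to 8 bits collapse to zero and no exposure lift in post can bring them back, whereas a float buffer - which is what .exr stores - keeps them:

        # Minimal sketch: why clipped shadows can't be recovered from 8-bit output,
        # but survive in a float/HDR buffer such as .exr. (Gamma/tonemapping ignored.)
        import numpy as np

        shadow_radiance = np.array([0.0005, 0.001, 0.003])  # deep-shadow linear values

        as_8bit = np.round(shadow_radiance * 255) / 255      # quantised to 8 bits
        as_float = shadow_radiance.astype(np.float32)        # what an .exr would keep

        boost = 8.0                                          # exposure lift in post
        print(as_8bit * boost)    # roughly [0, 0, 0.031] - most of the detail is gone
        print(as_float * boost)   # roughly [0.004, 0.008, 0.024] - detail preserved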

    Thank you,


    Chris

    Hey guys,


    When producing animations I get noise, which is fine and totally acceptable (it even looks good on some surfaces). However, the noise pattern appears to be static, so when the camera moves it has the effect of "crawling" across surfaces. Would it be possible to randomise/jitter the noise so that it doesn't have the same pattern on every frame? I appreciate this would make it look like it's sparkling/scintillating, but I think that would be far less distracting than it crawling across surfaces.
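
    To show what I mean by jittering (just a toy numpy sketch, nothing to do with how Enscape actually generates its noise): if the pattern is reseeded or offset per frame it no longer sits still in screen space, so there's nothing for the eye to track as the camera moves:

        # Toy sketch of static vs per-frame jittered noise (not Enscape's actual noise).
        import numpy as np

        height, width, frames = 4, 4, 3

        # Static: the same pattern on every frame -> appears to "crawl" under camera motion.
        static_pattern = np.random.default_rng(seed=0).random((height, width))
        static_frames = [static_pattern for _ in range(frames)]

        # Jittered: reseed (or offset) the pattern each frame -> no fixed screen-space pattern.
        jittered_frames = [np.random.default_rng(seed=f).random((height, width))
                           for f in range(frames)]

        print(np.array_equal(static_frames[0], static_frames[1]))      # True
        print(np.array_equal(jittered_frames[0], jittered_frames[1]))  # False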


    Thanks,


    Chris

    We already have a machine with a 1080 Ti and it's still the bottleneck. It doesn't make economic sense to keep building additional computers (each of which would require another Enscape license) instead of simply adding additional graphics cards to a single PC.

    I tried turning off both auto-exposure and auto-contrast on some new footage that I've exported and there is still flickering; it's definitely the GI/lighting solution. The test animation was 60 fps; I feel like 120 fps is overkill, especially when Enscape only uses one GPU at a time.

    Hey guys,


    We are using Enscape quite a lot at the moment and need a lot of frames rendered out. What is the best way to make this happen? This is a very clear bottleneck for us. With offline rendering we would simply build a render node with a large amount of computational power - but with Enscape only using a single GPU, I'm not sure how this would work.
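
    For comparison, the offline-rendering approach is just to split the frame range across machines/GPUs and stitch the frames back together afterwards - a rough sketch of that split is below (the instance counts and frame numbers are made up, and as far as I know Enscape exposes no batch/frame-range interface for this):

        # Rough sketch of splitting an animation's frame range across N render instances.
        # Purely illustrative - not an Enscape feature, just how offline render farms divide work.
        def split_frames(total_frames, instances):
            chunk = -(-total_frames // instances)  # ceiling division
            return [(i * chunk, min((i + 1) * chunk, total_frames))
                    for i in range(instances)
                    if i * chunk < total_frames]

        # e.g. a 30-second clip at 60 fps spread over 3 machines/GPUs:
        print(split_frames(30 * 60, 3))   # [(0, 600), (600, 1200), (1200, 1800)]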

    Hey guys,


    Just put together this "test" animation for a meeting on Friday to showcase a new college we're working on - the final animation is due this Wednesday. We've been pleasantly surprised by the quality, though there are obvious things, such as GI flicker, that we'd like to see reduced. We'd also definitely like more control over animation paths and parameters (such as fog amount, depth of field, etc.), and the ability to save multiple camera paths (similar to scenes in SketchUp).

    https://we.tl/KBwTywIiyz

    Let me know what you think.

    Excellent. Surely then, this is an argument for SLI? Or at least for being able to select which card to render on? Otherwise you're limiting yourself to one GPU per machine.

    Hey guys,


    Quick question: how does Enscape render videos out? I assume it's all GPU-based? Is it possible to have multiple instances of SketchUp/Enscape open and rendering out all at the same time?

    Thanks

    Quote

    Regarding performance, VR has higher requirements for several reasons, so we tune down certain parameters. You can still tune the overall quality to ULTRA in the main tab to get some of the quality back, but be aware you need a strong machine.


    Ultra still doesn't load in things such as grass. Surely the option should be there to obtain the maximum quality possible - we are looking to really wow our clients. Hardware isn't what's holding us back; we are able to invest in it.


    Quote

    3- We're working on that, see above link. Will be released in one of the next releases.



    Is there a Beta with this functionality available? I'm in the process of producing an animation right now.

    Quote

    4- Screen-space shadows and SSAO have nothing to do with each other.



    I didn't say they did. For years we have used AO as a means of grounding objects in architectural visualisations - and though it's an old technique (now that computing power lets us brute-force the lighting offline, it's less necessary there), it is still worth having in a VR experience.
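
    For anyone unfamiliar with the technique, the idea is simply that an occlusion factor scales down the ambient/indirect term, so contact areas darken and objects look grounded. A one-line sketch of the textbook shading maths (generic, not Enscape's shader):

        # Generic sketch of how an AO term grounds an object in the shading equation
        # (the textbook form, not Enscape's actual shader).
        def shade(albedo, direct_light, ambient_light, ao):
            # ao is near 0 where the surface is heavily occluded (contact areas), 1 in the open.
            return albedo * (direct_light + ao * ambient_light)

        print(shade(0.8, 0.2, 0.5, 1.0))   # open surface:   0.56
        print(shade(0.8, 0.2, 0.5, 0.2))   # contact corner: 0.24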