Posts by Adam.Fairclough

    That is indeed the case. It's an obvious recommendation, but you may want to try to keep such objects in focus while the rest can be blurred.

    Just as an example:


    Even in this example the DOF has no awareness of the object - you can see artefacts and weirdness - the amount of blur across that glass is down to the surface behind it - in this case the surface of the table.

    You can see the same effect here too, but the wall is the area in focus.

    With that in mind, placing a cluster of other objects behind would help render it with a more appropriate level of blur.

    I don't think this can be avoided due to how Enscape is set up to handle transparent objects - a technical challenge for any realtime renderer.

    You could work around it by rendering a second image with a solid material for the glass - then using the depth output in Photoshop to create the depth of field, rather than doing that in Enscape.

    But even then, that may create some strangeness through the glass as what is behind it would not be blurred.
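    The depth-map trick above boils down to blending between a sharp frame and a pre-blurred copy of it, driven by the depth value at each pixel. A toy sketch of that per-pixel blend (pure Python, names are illustrative, not any actual Enscape or Photoshop API):

```python
# Hypothetical per-pixel depth-of-field blend: lerp between a sharp
# pixel and a blurred pixel by normalised depth (0 = near/in focus,
# 1 = far/fully blurred). This is the operation a depth-based lens
# blur performs across the whole frame.
def dof_blend(sharp_px, blurred_px, depth):
    return tuple(round(s + (b - s) * depth) for s, b in zip(sharp_px, blurred_px))

near = dof_blend((200, 60, 60), (120, 120, 120), 0.0)  # stays sharp
far = dof_blend((200, 60, 60), (120, 120, 120), 1.0)   # takes the blur
```

    This is also why the glass looks wrong: the depth buffer stores one depth per pixel, so a transparent surface can only inherit the depth of whatever is behind it.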

    You say that it's not specific to any job/model, what does it do if you start a new model in SU and just have a floor plane and a cube? Is it still choppy/blurry there?

    Are you connected with a cable or are you trying to do it wirelessly? (Wireless requires a more controlled environment and good network hardware.)

    I have found another render that proves the better quality of the old Enscape grass (before 03/2018).

    The grass bed is lower than the path in all of those pictures - which is one of the ways to handle the grass.

    I'm running Revit through Parallels on a MacBook Pro M1 and, as has been well documented, Enscape is incompatible due to the limitations of Parallels' OpenGL support. Is there any chance that the upcoming Enscape for Mac release will somehow allow Mac-based virtual machine Revit users to work around these issues?

    Assuming no because this is a less-common setup, but it would be great to know if at some point there will be a feasible solution to running Enscape/Revit on an M1 machine.

    Also - I assumed that I would at least be able to run an Enscape .exe in the Parallels environment (again, this is on an M1 machine), however it appears that does not work either.

    That sounds like a question for Parallels - they need to improve their OpenGL support.

    Enscape is moving towards Vulkan too, which Parallels also doesn't support.

    While compatibility is being sorted out with M1, does anyone know if a temporary solution like using a Blackmagic eGPU would work?

    I'm not sure if you mean for the older machines or the new ones, but Apple don't support eGPUs on the M1 Macs currently - and it's uncertain if they will in future.

    Perhaps when / if they rejuvenate the Mac Pros with their own chips then we will see support.

    For regular scene textures (except displacement maps), only the standard 8-bit precision per channel is used - going higher there (e.g. using EXR) doesn't make sense for regular renderings.
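    For context, the gap between the two precisions is large - this is just the arithmetic of step counts per channel, nothing Enscape-specific:

```python
# Distinct values a single channel can hold at each bit depth.
# 8-bit is plenty for colour data, but a height map benefits from
# the much finer gradation of 16-bit to avoid visible stepping.
levels_8 = 2 ** 8    # 256 steps per channel
levels_16 = 2 ** 16  # 65,536 steps per channel
```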

    That’s great thanks, good to see displacement maps use it.

    Also, is there an upper limit on the resolution that Enscape will use for textures?

    If you have a SketchUp subscription, you can have it installed in 3 places at any one time.

    Alternatively you could export from SketchUp in .IFC, which is something Revit can open easily.

    Very interesting, Adam. I have never made use of normal maps before 😬 but I like the way you have so clearly described how to do this. I will try it!

    Normal maps are the newer version of a bump map, so they are the ones you should use most often - only when you need deeper relief might you also need a displacement map.

    I don't have relief either with my displacement map (for very rough concrete). It's a PBR texture from ShareTextures. Just the shadow, but flat, with no relief.

    Here is a quick guide as to how to get it to work - I'll use a ShareTextures example: Brick Wall 5 in 1K (a sensible choice for real time).

    Also worth noting that ShareTextures seem to use 8-bit JPEG files - ideally we would want 16-bit PNG files for this.

    I'm going to extract these into a folder and open the normal map and the displacement map in Photoshop.

    When the normal map is open, I'm going to open the Channels panel and press the plus button.

    This will create an alpha channel

    Now go to the tab with the displacement map in it and press Ctrl-A (select all) and Ctrl-C (copy) to copy the displacement map.

    Go back to the normal map tab and paste the displacement map into the black alpha channel

    If you click the eyeball next to RGB in the bottom right, it will show all channels combined - in this example it looks a little more red than it did.

    Save the file as a tiff.

    Load this into Enscape's height slot and choose "displacement"

    This will load both the displacement AND the normal information in

    Notice that Enscape shows the alpha map we loaded in as the black details

    This will give you a nicer result than using just a normal or displacement map alone, but as is always the case with parallax occlusion mapping (which is what the displacement here is), it does not change the silhouette of the object.
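    The Photoshop steps above can also be scripted - here is a minimal sketch using Pillow, with synthetic stand-in images in place of the real ShareTextures files (in practice you'd `Image.open()` the downloaded normal and displacement maps):

```python
from PIL import Image

# Stand-ins for the two downloaded maps (illustrative only):
# a flat normal map colour and a greyscale height gradient.
normal = Image.new("RGB", (256, 256), (128, 128, 255))
height = Image.radial_gradient("L")  # 256x256 greyscale gradient

# Equivalent of the channel-paste step: put the height map into
# the alpha channel of the normal map, then save as TIFF so one
# file carries both the normal and the displacement information.
r, g, b = normal.split()
packed = Image.merge("RGBA", (r, g, b, height))
packed.save("brick_packed.tif")
```

    Loading the resulting TIFF into the height slot then behaves the same as the hand-made Photoshop file.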

    RTX 3080 Ti

    We specifically upgraded to this GPU hoping for better performance and quality using RTX but honestly, most of the time I end up disabling Hardware-accelerated Ray Tracing, the NVIDIA Denoiser and NVIDIA DLSS because of crashes. When I say crashes, I mean hard crashes - SketchUp just closes.

    Are you using the Proxy versions inside of SketchUp?

    It's not going to affect performance until you've used up that memory.

    My laptop workstation has 6GB and I've got some full masterplan sized projects that get close to that - with raytraced shadows enabled I will get messages telling me I'm running out of memory.

    It's going to depend on the complexity of your project and how well you optimise it to work within that memory budget.

    I don't know what the logic is for what counts as a duplicate, but if you try importing the same Enscape texture into a model multiple times, then renaming each one, the plugin merges them.

    (I just gave it a go, as I'd never given any thought to how it works.)

    I guess even just merging duplicate RGB triplets means there are 16 million+ possible combinations, so the chances of erroneous merging must be small.
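    To illustrate the guess above (this is a toy model, not the plugin's actual logic): an 8-bit RGB triplet gives 256³ possible keys, and deduplicating on that key only merges assets whose triplets match exactly:

```python
# Toy dedupe keyed on the full RGB triplet - hypothetical names,
# not the plugin's real data model.
possible_keys = 256 ** 3  # 16,777,216 distinct 8-bit triplets

textures = [
    ("brick_a", (180, 90, 60)),
    ("brick_b", (180, 90, 60)),  # identical triplet -> merged with brick_a
    ("slate",   (40, 40, 45)),
]
merged = {}
for name, rgb in textures:
    merged.setdefault(rgb, name)  # keep the first asset seen per key
```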

    The plugin is super useful, one of my most used!

    I use these settings regularly.