Posts by Lucas Cunningham
-
Enscape isn't exporting element-specific transparency overrides. I have also tried this in one of the 2.7 pre-releases.
This image shows the appearance without the blue selection graphics.
These overrides are necessary so you don't have to create and assign entirely new materials when making views with semi-transparent materials.
Thanks
-
The Oculus Quest is a revolutionary device that eliminates much of the friction of VR. Because of this, the device has been selling out consistently, and its user base is huge and always growing. It would be great for Enscape to find its way into that market. The Quest is far easier and more impressive to take to a client, as the whole setup basically fits in a shoebox.
WebXR standalones would be a great place to start; they are:
- Built on existing Enscape features
- Easily accessible
- Easy to manage
- Not subject to Oculus Store approval
- Open to supporting future standalone devices
Of course, it would be great to also see native support for the device; however, I understand that the current iteration of the hardware may be too limiting.
Market influence
AEC: (chart taken from an InsiteVR blog post)
Entire VR industry:
NOTE: Some of the 2.8M headsets are other standalones, but it is safe to say that Quest sales make up the majority of that figure; the device is backordered until March right now.
-
I'm a little late here, but one thing to note is that using Oculus Link, compared to a traditional PCVR headset, can use slightly more GPU resources because the image must also be encoded before it is sent to the device. The technology is still pretty solid in my opinion.
-
Can we separate the rendering resolution for pictures and videos? I usually render stills in 4K and videos in 1080p, meaning I have to change my settings all the time. This small change would be much appreciated, thanks.
-
If you're using Revit, then check out Dynamaps. I've yet to try it personally, but it looks pretty capable.
-
I think it would be a good way to address Oculus Quest standalone compatibility rather than having to develop for a whole new OS.
Oculus Link Beta also came out this week and it works pretty well; the VR industry is always evolving.
-
WebVR and WebXR are both APIs for creating web-based VR experiences, and I think they would be a great way to get models onto standalone VR devices like the Oculus Quest. Adding more functionality to the web standalones would be great because they are a really accessible way of viewing a model.
-
Can you add an option to batch render the panoramas based on saved views as well? We find ourselves using panoramas just as often as stills, and it would be great to also have this functionality.
-
+1 for the ultra/extreme still settings (I think I've seen this referred to as print quality before). Most are fine with letting one of these renders run for a couple of minutes to produce an image that is as close to photorealistic as possible.
-
I would love to be able to create a Dynamo script that utilizes Enscape tools; unfortunately, there is no API, so all interaction is manual clicking. (Which means we have to rely on manual entry and not glorious scripts and automation.)
Here's some pseudo-code to illustrate:
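A rough sketch of what a Dynamo Python node might look like if such an API existed. The "EnscapeAPI" object, its methods, and their arguments are all made up for illustration, since no Enscape API exists today:

```python
# Rough sketch of a Dynamo Python node driving a hypothetical Enscape API.
# "EnscapeAPI" and all of its methods are invented for illustration only --
# no such API exists today.

view_names = IN[0]          # list of Revit view names fed in from the Dynamo graph
output_folder = IN[1]       # where the renderings should land

results = []
for name in view_names:
    # Hypothetical calls: open the view in Enscape, then render a still
    EnscapeAPI.ActivateView(name)
    EnscapeAPI.SetResolution(3840, 2160)
    path = EnscapeAPI.RenderImage(output_folder, file_name=name)
    results.append(path)

OUT = results               # hand the rendered file paths back to the Dynamo graph
```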
An Enscape API would open up a ton of productivity opportunities, and maybe help when you have to re-render all of your content due to a design change.
Thanks and keep up the good work,
Lucas
-
iModels will be a way to bridge 3D content and design data into one common location from multiple design applications like:
- SketchUp
- Revit
- Microstation
The benefit of this is that a program could support all these platforms and only have to maintain one connection.
This new platform (iModel Hub/iTwin Services) could definitely use a simple-to-use, high-quality visualization/VR integration from a company like Enscape.
Here is their SDK page also: https://www.bentley.com/en/software-developers/sharing-deliverables/i-model-sdk
Thanks
-
iModels are going to be a common data source (Revit and Bentley models federated in one cloud location) for projects moving forward for a lot of large firms; it would be great to see Enscape leveraging information from there to create the virtual environment.
Here is a link to their API reference: https://imodeljs.github.io/iMo…modeljs-common/rendering/
Thanks,
Lucas
-
I would like the ability to create renderings like images, panoramas, and videos from an EXE.
Opening the EXE is often way faster than opening the Revit model; additionally, running just the EXE is smoother and uses fewer computer resources.
EXEs are really easy to deal with and would make it so that somebody without Revit experience, like somebody in marketing, could generate graphics without needing a Revit license or the know-how to operate the program.
Understandably, entering "render mode" would then require an Enscape license, but being able to make renderings from a standalone without the need for modeling software would make Enscape even more desirable.
Thanks.
-
Diagnosing problem areas continued
Oculus/Facebook just released a really in-depth article on how they developed the Insight system and how it works. You can find it here if you have some reading time: https://ai.facebook.com/blog/powered-by-ai-oculus-insight/
Knowing that the system relies on SLAM (Simultaneous Localization and Mapping), I think I have found a way to diagnose problem areas for the tracking system: we can download SLAM-based applications for our smartphones that let us see which areas will register.
Android link: https://play.google.com/store/…t.visualslamtool&hl=en_US
Apple link: https://apps.apple.com/us/app/tape-measure/id1271546805?ls=1
It is worth noting that these systems use different sensors and SLAM engines; however, the underlying concepts are the same.
These apps place points at high-contrast areas that SLAM systems are able to detect. If you notice that there aren't very many points, that would be cause for concern, because the Insight tracking system won't be able to create reference points.
^poor tracking (few reference points)
^good tracking (many reference points)
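If you would rather check a photo of your room on a computer instead of a phone app, here is a minimal sketch along the same lines. It assumes opencv-python is installed and that "room.jpg" is a photo of your play space; Shi-Tomasi corners are just a stand-in for whatever features Insight actually tracks:

```python
# Minimal sketch: count trackable corners in a photo of the room.
# Assumes opencv-python is installed and "room.jpg" is a photo of the play space.
import cv2

image = cv2.imread("room.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Shi-Tomasi corner detection: roughly the kind of high-contrast features
# a SLAM system can use as reference points.
corners = cv2.goodFeaturesToTrack(gray, maxCorners=500,
                                  qualityLevel=0.01, minDistance=10)

count = 0 if corners is None else len(corners)
print(f"Detected {count} trackable features")

# Draw the detected points so weak areas (few dots) are easy to spot.
if corners is not None:
    for point in corners:
        x, y = point.ravel()
        cv2.circle(image, (int(x), int(y)), 5, (0, 255, 0), -1)
cv2.imwrite("room_features.jpg", image)
```

If the annotated image shows large blank regions with no dots (a plain wall, a big window), those are the spots most likely to give the headset trouble.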
-
dfersh I'm not too sure what the best way to do precise height *calibration is.
Maybe try setting the touch controllers on the ground and walking around a bit during the floor position setup portion of the Guardian setup. This should theoretically give the tracking system more points of reference for where the floor is.
The Oculus Insight tracking system is a SLAM (Simultaneous Localization and Mapping) system that relies on finding unique edges for tracking, so rooms that are very basic, with just plain colors, also aren't the best for it. Read more on that near the bottom of this article: https://uploadvr.com/how-vr-tracking-works/
One thing I haven't done but would like to try would be to make some stickers with distinct patterns on them to act as markers to assist the tracking system if it is having difficulties.
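If you want to try that experiment, here is a quick sketch that generates a printable high-contrast pattern. It is only my assumption that random black/white blocks would help; Insight has no official marker support, and the idea is simply to give the SLAM system more unique edges to lock onto:

```python
# Quick sketch: generate a printable high-contrast pattern to stick on a
# plain wall. This is an experiment idea only -- Oculus Insight has no
# official marker support; the goal is just to add more unique edges
# for its SLAM tracking to reference.
import numpy as np
from PIL import Image

rng = np.random.default_rng(seed=42)   # fixed seed so every printed sticker matches

# 8x8 grid of random black/white cells, scaled up to 480x480 pixels
# (roughly 4x4 cm when printed at 300 dpi).
cells = rng.integers(0, 2, size=(8, 8), dtype=np.uint8) * 255
pattern = np.kron(cells, np.ones((60, 60), dtype=np.uint8))

Image.fromarray(pattern, mode="L").save("tracking_marker.png")
print("Saved tracking_marker.png -- print at 300 dpi for roughly 4x4 cm")
```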
The new tracking system is far easier to set up, but unfortunately we have run into a few more precision issues once it is up and running.
EDIT: Changed "calculation" to "calibration"
-
I think the issue is with the new Oculus Insight tracking system itself. We had the same problem when trying to use the headset under a pop-up outside: the ambient sunlight wrecked the tracking system and messed with the floor height and controller tracking. If the issue arises in a room with open blinds or curtains, close them and see if that resolves it. Basically, the sunlight overexposes the cameras in charge of the tracking and the system loses its reference points.
It is strange, though, that you mention it is a problem specific to the Enscape application.
-
I figured out the issue was actually related to the text and layout scaling (probably related to the DPI you mentioned). On our display it was set to 300% in the Windows settings; when I set it back to 100%, it displayed in full definition.
Thanks Demian Gutberlet