Thought I'd take a moment and explain something if I may:
Every camera - whether virtual or real - needs to establish a focal length/field of view. This is no different with a 360° image. Otherwise, you'd have no image, lol!
I use the term FIELD OF VIEW in these cases because that's what 3D software such as V-Ray or Enscape typically uses. Another term is FOCAL LENGTH. While the two are distinct concepts in the world of film and photography, they are connected: for a given sensor size, one determines the other.
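To show how the two are connected, here's a quick sketch of the standard relationship, FoV = 2·atan(sensor width / (2 × focal length)). The function name and the full-frame 36 mm sensor width are my own assumptions for illustration, not anything specific to V-Ray or Enscape:

```python
import math

def fov_from_focal_length(focal_length_mm, sensor_width_mm=36.0):
    """Horizontal field of view (degrees) for a given focal length.

    Assumes a full-frame 36 mm sensor width. Note that different
    cameras and engines may measure FoV horizontally, vertically,
    or diagonally, so numbers won't always match between tools.
    """
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

print(round(fov_from_focal_length(18), 1))   # wide-angle: 90.0
print(round(fov_from_focal_length(50), 1))   # "normal" lens: 39.6
print(round(fov_from_focal_length(200), 1))  # telephoto: 10.3
```

You can see the trade-off right there: double the focal length and the field of view roughly halves.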
In conventional photography, where you create a full 360° pano by shooting stills and stitching them together, a longer focal length/lower field of view will bring objects CLOSER to the camera but require more stitching, because you've generated more still photos. With a 360° lens - or a render engine - what's happening with a longer focal length/lower field of view is that the same amount of visual data is present in the image, but the final output looks much more distorted (until it's unpacked in a viewer).
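To make the "more stitching" point concrete, here's a rough estimate of how many stills a full 360° row takes. The function name and the 30% overlap figure are my own assumptions (stitchers need some overlap to match features between frames), not values from any particular tool:

```python
import math

def shots_for_360(horizontal_fov_deg, overlap_fraction=0.3):
    """Rough count of stills needed to cover a full 360° row.

    Assumes each frame overlaps its neighbor by overlap_fraction,
    so each shot only contributes its non-overlapping portion.
    """
    effective_deg = horizontal_fov_deg * (1 - overlap_fraction)
    return math.ceil(360 / effective_deg)

print(shots_for_360(90))  # wide lens: 6 shots per row
print(shots_for_360(40))  # longer lens: 13 shots per row
```

So narrowing the field of view from 90° to 40° roughly doubles the shot count, which is exactly the extra stitching work described above.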
When creating a pano, the Enscape camera is set to a very short focal length/high FoV. In V-Ray and other engines, you can control this for the final output.
Still waiting on this fix, by the way.