Tracking image/horizontal panorama

  • Most phones with built-in cameras have a panorama/tracking app that captures lots of slices as you move the camera and stitches them together into one long image. It's normally used by spinning on the spot to create panoramas, but it inspired this idea:


    It would be cool if Enscape could create a long 'tracking' image, stitching together vertical strips as it moves the camera from point X to point Y. This could be an easy and intuitive way to create long, 'flat' elevations and sections without having to adjust image width/height, trim a larger image, or play with FOV settings. (The sketch below shows roughly what I mean.)
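
    Roughly, the idea amounts to the following: interpolate the camera position along the X-to-Y path, render a narrow vertical strip at each step with a fixed viewing direction, and paste the strips side by side. This is only an illustration in Python/Pillow; render_strip is a made-up placeholder, since Enscape doesn't expose a rendering call like this.

```python
# Conceptual sketch only: Enscape has no public scripting API for this,
# so render_strip() is a stand-in that just returns a dummy strip.
from PIL import Image

def render_strip(position, strip_width_px, height_px):
    # Placeholder for "render a narrow vertical view from this camera
    # position with a fixed viewing direction".
    shade = int(255 * (position[0] % 1.0))
    return Image.new("RGB", (strip_width_px, height_px), (shade, shade, shade))

def tracking_image(start, end, num_strips, strip_width_px=8, height_px=1080):
    """Stitch strips rendered while the camera translates from start to end;
    the viewing direction never changes, only the position."""
    result = Image.new("RGB", (num_strips * strip_width_px, height_px))
    for i in range(num_strips):
        t = i / max(num_strips - 1, 1)  # 0..1 along the path
        position = tuple(s + t * (e - s) for s, e in zip(start, end))
        result.paste(render_strip(position, strip_width_px, height_px),
                     (i * strip_width_px, 0))
    return result

# e.g. tracking_image((0, 0, 1.6), (30, 0, 1.6), num_strips=300).save("tracking.png")
```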

  • I suppose it will not work for scenes with depth. The phone panorama function only works if you don't move the camera; rotation alone is OK. Maybe it would work for flat scenes.


    I did a test last summer, and here is a shot I took from a moving car. :)



  • It's a good, simple way to output architectural elevations, sections and plans without having to change the FOV, set up the perfect camera position, and set the canvas size. It could be set up with two clicks.


    Technically you could do it right now by setting the canvas size to a narrow vertical strip and making a "lossless" video, then stitching the resulting frames together. But it would involve a bit of trial and error (or math that is beyond me) to get the frame rate and width of the strips to line up properly (see the sketch after this post).


    Currently the panorama feature keeps the camera position static and changes the direction the camera is pointing. It renders a strip, then stitches that strip into a combined image. The only change here would be that the camera direction stays static while the camera position changes.


    The 'preview' of this would be like sitting on the scanning light of a photocopier as it moved - it's designed for output rather than being interactive.
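
    To make the trial-and-error part above concrete, here is a rough sketch of how the strip count and the stitching could be worked out; it is plain Python with Pillow, not anything Enscape provides, and it assumes a flat facade at a known distance and a narrow horizontal FOV. For seamless tiling, the camera should advance between frames by exactly the width of facade one frame covers.

```python
# Sketch of the manual workaround described above: render the walkthrough
# as a "lossless" video with a very narrow canvas, dump the frames, then
# paste them side by side. File names and numbers here are assumptions.
import math
from pathlib import Path
from PIL import Image

def required_frames(path_length_m, distance_m, hfov_deg):
    """Each frame covers a slice of facade equal to the camera's horizontal
    footprint at the facade distance; the camera must advance exactly that
    far per frame, so the path length divided by it gives the frame count."""
    slice_width_m = 2.0 * distance_m * math.tan(math.radians(hfov_deg) / 2.0)
    return math.ceil(path_length_m / slice_width_m)

def stitch_frames(frame_dir, out_path="elevation.png"):
    """Paste the extracted video frames side by side into one long image."""
    frames = sorted(Path(frame_dir).glob("*.png"))
    w, h = Image.open(frames[0]).size
    result = Image.new("RGB", (w * len(frames), h))
    for i, f in enumerate(frames):
        result.paste(Image.open(f), (i * w, 0))
    result.save(out_path)

# e.g. a 30 m facade viewed from 2 m away with a 5-degree horizontal FOV
# needs about required_frames(30, 2, 5) == 172 frames / strips.
```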

  • But don't forget: it will not work if the relationship between foreground and background objects changes. Only very slight changes can be ignored. It would be best to talk about this limitation first. ;)

  • Do you mean that if (e.g.) a tree is close to you in the foreground it will appear stretched along the direction of travel? Sure, you might have to be careful with the foreground and perhaps add a cutting plane too, but I imagine it would work like the inverse of a photo-finish camera.

  • The problem is the background. Objects far away barely change their position in the image; through the slit of your scanner you see nearly the same thing at every moment. Only the foreground changes. How will you scan a constant background? And the relation of the foreground objects to the background keeps changing, so how will you show the background in relation to the foreground? ... In the end you get something like my experiment: the foreground is scanned correctly, but the background is nearly constant and smears into a stripe (there should be a sea and trees there). See the small check after this post.


    Here is an example, two images: the red object stays in the middle of the image while the foreground moves. How should the combined image look?


    You could try shooting some images of a scene with a moving camera and try to combine them into a single view. I don't think it will work.
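
    A small back-of-the-envelope check of that parallax point (assumed numbers, plain Python): the apparent shift per frame falls off with 1/depth, which is why the distant sea and trees come out as a near-constant stripe while the nearby foreground scans correctly.

```python
# Quick parallax check: how many pixels does an object appear to move per
# frame as the camera translates sideways? Numbers are arbitrary; the point
# is the 1/depth falloff.
import math

def pixel_shift_per_frame(depth_m, camera_step_m, image_width_px, hfov_deg):
    """Apparent horizontal shift (in pixels) of an object at depth_m when the
    camera moves camera_step_m parallel to the image plane."""
    view_width_at_depth = 2.0 * depth_m * math.tan(math.radians(hfov_deg) / 2.0)
    return camera_step_m * image_width_px / view_width_at_depth

for depth in (2, 10, 50, 200):  # metres
    print(depth, round(pixel_shift_per_frame(depth, 0.2, 1920, 60), 1))

# Roughly 166 px at 2 m, 33 px at 10 m, 6.6 px at 50 m, 1.7 px at 200 m:
# the nearby facade you are tracking moves a full strip width per frame,
# while distant objects barely move and get repeated in every strip.
```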