
camera match to a stitched photo?


Spacelord

If you match the camera's FOV and aspect ratio to the center photo, then everything on either side of it in the stitched photo is not rendered.

So how do you adjust the camera to compensate for that?

Do you adjust the aspect ratio to fit everything?

 

I had thought about trying the cylindrical lens shader in mental ray.

I might give it a go.


I tried the second method and yes, it is messy.

 

How wide is the stitch compared to the middle image, and how much of the model sits in the stitched area?

 

Basically, I use the middle image to first line up the model, then adjust the aspect ratio to that of the stitched image, without changing the camera lens.
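For scale, here is a rough Python sketch of how much wider the camera has to see to cover the whole stitch, assuming a flat (rectilinear) stitch that keeps the centre photo's pixel scale; the 48-degree figure and the pixel widths are made-up example values:

    import math

    def hfov_for_full_stitch(hfov_mid_deg, w_mid_px, w_stitch_px):
        # HFOV needed to cover the whole stitch, given the HFOV matched to the
        # middle photo. Assumes a flat, rectilinear stitch that keeps the
        # middle photo's pixel scale at the centre.
        t = math.tan(math.radians(hfov_mid_deg) / 2.0) * (w_stitch_px / w_mid_px)
        return math.degrees(2.0 * math.atan(t))

    # hypothetical numbers: 48-degree middle photo, 3000 px wide, 7000 px stitch
    print(hfov_for_full_stitch(48.0, 3000, 7000))   # ~92 degrees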

 

jhv


The ideal method would be to ensure that everything you need to render lives in the center portion of the stitched image so you can view match to one photo. You can then render your CG elements to that aspect ratio, bring them into Photoshop, and locate them in your stitched image using the extents of the center photo as a guide. If you need to render content for all pieces of the stitch, it's much tougher because the optics are much harder to figure out. You may end up having to stitch your CG as well... I've had to do that before and it's not fun, especially if you do a lot of post (render mattes, passes, etc.).
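If you go the centre-photo route, the final placement is just a paste at the centre photo's offset inside the stitch. A minimal Pillow sketch of that step (file names and the offset are hypothetical; in practice you'd do the same thing in Photoshop as described above):

    from PIL import Image

    # hypothetical file names
    stitch = Image.open("stitched_background.jpg").convert("RGBA")
    cg = Image.open("cg_render.png").convert("RGBA")  # rendered at the centre photo's size

    # pixel position of the centre photo's top-left corner inside the stitch
    # (assumed values, measured by hand)
    centre_offset = (2150, 320)

    stitch.alpha_composite(cg, dest=centre_offset)
    stitch.convert("RGB").save("composite.jpg")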


You don't mention how the images were stitched. If it is a plain flat, rectilinear stitch, you should know the horizontal and vertical FOV (I use PTgui for stitching, where you always know those parameters).

Given those FOVs I set up the camera accordingly (in LightWave) and rarely have any problems. It is very important not to crop the stitch until the composite is ready, though, or else you'll often have big problems.

If your stitch is cylindrical or spherical you'd need to set up a cylindrical or spherical camera, or map the background on a cylinder or sphere.
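If your package wants a focal length rather than an FOV (3ds Max cameras are typically driven by a lens value plus a film/aperture width), the conversion for a rectilinear camera is a simple tangent. A small Python sketch, assuming the common 36mm film width default; substitute whatever film back your camera actually uses:

    import math

    def focal_from_hfov(hfov_deg, film_width_mm=36.0):
        # focal length that gives the requested horizontal FOV on a given film width
        return film_width_mm / (2.0 * math.tan(math.radians(hfov_deg) / 2.0))

    def hfov_from_focal(focal_mm, film_width_mm=36.0):
        # inverse: horizontal FOV of a given lens on a given film width
        return math.degrees(2.0 * math.atan(film_width_mm / (2.0 * focal_mm)))

    print(focal_from_hfov(90.0))   # 18.0 mm for a 90-degree view
    print(hfov_from_focal(50.0))   # ~39.6 degrees for a 50 mm lens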


I've tried two ways of stitching in Photoshop, Perspective and Cylindrical; I think Perspective works better.

The main problem is that the photos don't have much for me to work from; there are no buildings, only a road, grass and trees to work out the perspective/FOV. I think I'll just end up using the photos as a background and creating most of the stuff in the foreground.


Maybe it will be easier if you try some better stitching software, like PTgui (Pro) or Hugin (free)?

At least it will give you the hfov and vfov directly, and the correct crop.

It will be trickier if you don't have any existing buildings in your photos, but on the other hand you are also more free to "interpret"?

Attached is an example with a 104x60 degree stitch that has been cropped somewhat after compositing.


Thanks Bjorn,

I'm trying PTgui; the values it gives are a focal length of 8.366mm and a focal length multiplier of 4.448. What's the multiplier for? Do I need to take it into account when setting the focal length?

I also remember hearing that the 3ds Max camera will never match a real camera's FOV, but I guess this should give me a starting point.

 

cheers


You don't have to bother with the focal lengths, because those are a property of the source images anyway. The numbers you need are the panorama settings, which can be set from the tab of the same name, or interactively from the panorama editor window. The multiplier is like a crop factor, for 35mm-equivalent focal lengths, but you don't need it.

I don't know about Max cameras, but in LW I set the camera output size to the same as the exported non-cropped (very important to not crop it!) panorama. Then I set the hfov, and the vfov will follow automatically.

And then moving the camera around usually works fine for aligning buildings with background.

I also use the Photo Match in SketchUp on such stitched images, and when I import the scene with cameras into LW I get the correct locations, but often need to adjust the focal length to make it fit exactly. That is a process that requires existing buildings, though.


No. The FOV is not directly linked to the width/height.

You may have a hfov anywhere between 2 and 120 degrees, depending on the focal length, and still have the same aspect, 4:3 or 3:2 or whatever.

But if you have a given hfov of, say, 100 degrees, and a size of 4000x2000 pixels, you can calculate the vfov from those numbers. When degrees map evenly onto pixels (as in an equirectangular stitch) it is simply vfov = (hfov/w)*h, so (100/4000)*2000 = 50 degrees. For a flat, rectilinear image the relation goes through the tangent instead, tan(vfov/2) = tan(hfov/2)*(h/w), which gives about 62 degrees for the same numbers.
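Both relations as a quick Python sketch (the linear one holds when degrees map evenly onto pixels, e.g. an equirectangular stitch; the tangent one is for a flat, rectilinear image):

    import math

    def vfov_linear(hfov_deg, w_px, h_px):
        # exact when angle is proportional to pixels, e.g. an equirectangular stitch
        return hfov_deg * h_px / w_px

    def vfov_rectilinear(hfov_deg, w_px, h_px):
        # tangent relation for a flat, rectilinear image
        t = math.tan(math.radians(hfov_deg) / 2.0) * (h_px / w_px)
        return math.degrees(2.0 * math.atan(t))

    print(vfov_linear(100.0, 4000, 2000))       # 50.0 degrees
    print(vfov_rectilinear(100.0, 4000, 2000))  # ~61.6 degrees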

Focal length is really not telling you anything unless you know the physical size of your sensor, which for most 3D programs probably is set as default at the 35mm standard 24x36mm, for a 3:2 aspect ratio. But most computer formats/screens aren't 3:2 ratio anyway, and then the focal length is kind of useless.

FOV will, however, tell you the exact field of view regardless of your aspect. If you increase the height of the image you will get a higher vfov while the hfov stays the same.

Unfortunately the 35mm standard on film cameras has "taught" us to think of a 50mm focal length as a normal lens, instead of thinking of the FOV as a standard, which for that lens on 36x24mm film would be roughly hfov = 40 degrees and vfov = 27 degrees (about 47 degrees across the diagonal).

PTgui then uses the multiplier to calculate the equivalent focal length your lens would need on 35mm film for the same FOV, which gives about 37mm (8.366mm x 4.448). You could use that in a 3D program with the "film size" set to 35mm, but I find it much more predictable to use the hfov and vfov.
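As a sanity check, the multiplier arithmetic in Python (8.366mm and 4.448 are the values PTgui reported above; note the resulting FOV is that of a single source frame on a 36x24mm back, not of the whole stitch):

    import math

    sensor_focal_mm = 8.366   # focal length PTgui reports for the source photos
    crop_factor = 4.448       # PTgui's "focal length multiplier"

    equiv_focal = sensor_focal_mm * crop_factor   # ~37.2 mm in 35mm terms

    # FOV that equivalent lens covers on a full 36 x 24 mm film back
    hfov = math.degrees(2 * math.atan(36.0 / (2 * equiv_focal)))  # ~51.6 degrees
    vfov = math.degrees(2 * math.atan(24.0 / (2 * equiv_focal)))  # ~35.8 degrees
    print(equiv_focal, hfov, vfov)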

