Saturday, April 16, 2011

Matchmoving Test + SynthEyes Tips



Here's a little test I did with SynthEyes and Blender.
Some tips that may improve your matchmove workflow:

* Be intentional when you shoot footage. Minimize blurry camera movement. If you can, get as much light into the scene as possible so you can use a faster shutter speed (less motion blur) and a smaller aperture (more of the image in focus).

* When exporting the image sequence out of QuickTime Pro (or FCP/others), USE THE NATIVE frame rate the video was shot at (usually 29.97, 23.976, 24.0, or 30.0 fps). This avoids the blurry, blended, or duplicated frames that frame-rate conversion can introduce.

* Before bringing any footage into SynthEyes, delete the very first image of the export (named something like name_000.png). This removes the chance of everything playing back one frame offset (Blender's image timeline starts at frame 1, while the matchmoved video starts at frame 0). If everything in your scene looks right but is one frame out of sync, try moving all of your animation back by one frame (a quick script for this is below).
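
If you'd rather not drag keys around by hand, that one-frame shift can also be done with a quick Python snippet in Blender 2.5x. This is just a rough sketch against the 2.5x API (run it from the Text Editor with Alt+P); the OFFSET of -1 is an assumption, so flip it to +1 if your sync is off the other way.

import bpy

OFFSET = -1  # assumption: the solve plays one frame late; use +1 if it's early

# Shift every keyframe (and its handles) in every action by OFFSET frames
for action in bpy.data.actions:
    for fcurve in action.fcurves:
        for kp in fcurve.keyframe_points:
            kp.co.x += OFFSET
            kp.handle_left.x += OFFSET
            kp.handle_right.x += OFFSET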

In SynthEyes:
* Try the Auto Solve (Big green button)
* Go to the camera menu and turn on 'Calculate Lens Distortion', re-solve in the Calculator section
* Clean up trackers (Shift+C), then hit GO in the Solver (calculator) menu
* Repeat last step again
* Set up the three point ground plane
* If you need to add 'Supervised Trackers' on your own points, do so (hold down the C key and left-click). Search around for "SynthEyes supervised tracker tutorial" for a walkthrough.
* Each time you re-solve, keep an eye on the error number in the summary panel - under .6 hPix is good.
* Drop in test geometry by going to the 3D Objects panel (cube icon), selecting a tracker in the video frame, and choosing which 3D primitive to put in the scene.
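
SynthEyes handles that last check in its own viewport, but it's worth repeating it once the solve lands in Blender (see the export step below). Here's a rough bpy sketch that drops a small cube on every imported tracker, assuming your exporter brings the trackers in as empties (the naming and scale values are assumptions, so adjust them to your scene):

import bpy

# Put a small cube at every tracker empty so you can eyeball how well
# the solve sticks to the footage. Collect the empties first so adding
# cubes doesn't disturb the iteration.
empties = [obj for obj in bpy.data.objects if obj.type == 'EMPTY']

for empty in empties:
    bpy.ops.mesh.primitive_cube_add(location=empty.location)
    cube = bpy.context.object
    cube.scale = (0.05, 0.05, 0.05)  # assumption: tweak to your scene scale
    cube.name = 'check_' + empty.name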

Once everything looks good, export with the Blender script. Open the exported script in Blender 2.49's text editor, go to File > Run Python Script, and once you see the camera come in, save the file and reopen it in Blender 2.5x.
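
After the round trip into 2.5x, a quick sanity check from the Text Editor or Python console can confirm the tracked camera survived the import and that the scene frame rate matches the footage. A minimal sketch; the 29.97 fps value is an assumption, so swap in your clip's native rate:

import bpy

scene = bpy.context.scene

# Assumption: footage shot at 29.97 fps (30 / 1.001). Use your native rate.
scene.render.fps = 30
scene.render.fps_base = 1.001

# List every camera and whether it carries the matchmoved animation.
for obj in bpy.data.objects:
    if obj.type == 'CAMERA':
        anim = obj.animation_data
        has_keys = bool(anim and anim.action and anim.action.fcurves)
        print(obj.name, 'animated:', has_keys)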

That's it!
