Native 360VR Solving

You can track and solve 360VR shots directly within SynthEyes, as-is, using its native 360VR solving capability. The workflow is as follows (a brief sketch of the underlying spherical-image geometry appears after the list).

- Open the shot, marking it with a 360 VR mode of "Present."

- Do automatic and/or supervised tracking of the 360VR shot.

- Solve the shot, producing a 3D camera path and orientation, plus 3D locations for the tracked features.

- Set up a coordinate system using the 3D locations, orienting the overall scene: "Which way is up?!"

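Unlike a conventional pinhole shot, a 360VR frame covers the entire sphere around the camera, so each 2D tracker position corresponds to a viewing direction rather than a point on a flat image plane. The sketch below (plain Python/NumPy, not SynthEyes code; the pixel and axis conventions are assumptions chosen for illustration) shows how a pixel in an equirectangular frame maps to a unit direction, which is the kind of bearing a spherical solve works from:

```python
import numpy as np

def equirect_pixel_to_ray(x, y, width, height):
    """Map an equirectangular (lat-long) pixel to a unit viewing direction.

    Conventions assumed here for illustration: pixel origin at the top-left,
    Y-up world axes, longitude 0 straight ahead along +Z. Real tools,
    including SynthEyes, may use different conventions.
    """
    lon = (x + 0.5) / width * 2.0 * np.pi - np.pi     # -pi..+pi across the frame
    lat = np.pi / 2.0 - (y + 0.5) / height * np.pi    # +pi/2 at top, -pi/2 at bottom
    return np.array([np.cos(lat) * np.sin(lon),       # X (right)
                     np.sin(lat),                     # Y (up)
                     np.cos(lat) * np.cos(lon)])      # Z (forward at lon = 0)

# The exact center of a 4096x2048 frame looks straight ahead: [0, 0, 1].
print(equirect_pixel_to_ray(2047.5, 1023.5, 4096, 2048))
```
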
If you wish to perform automatic tracking, you will almost always have to do some preliminary roto-tracking work to identify which areas should and should not be tracked. The following areas must be excluded:

- The vehicle, drone, rig, etc. holding the camera.

- The sky or clouds.

- Any moving objects or actors in the scene.

- Anything else that is not stationary and rigidly connected together.

The roto-tracking step can be very rough, as it is intended only to delimit areas for tracking, unlike the super-precise rotoscoping required to matte objects in and out of a scene.
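
As an illustration of how coarse this delimiting can be, the sketch below (plain NumPy, a conceptual stand-in for the spline roto you would actually draw in SynthEyes; the latitude thresholds are arbitrary assumptions) marks a band at the top of an equirectangular frame as sky and a band at the bottom as the rig under the camera, producing a simple do-not-track mask:

```python
import numpy as np

def rough_exclusion_mask(width, height, sky_deg=30.0, rig_deg=25.0):
    """Rough garbage mask for an equirectangular frame (True = do not track).

    Blocks a latitude band at the top (sky/clouds) and one at the bottom
    (the rig, drone, or vehicle beneath the camera). The threshold angles
    are arbitrary example values.
    """
    rows = np.arange(height)
    lat_deg = 90.0 - (rows + 0.5) / height * 180.0    # +90 at top row, -90 at bottom
    row_excluded = (lat_deg > 90.0 - sky_deg) | (lat_deg < -90.0 + rig_deg)
    return np.repeat(row_excluded[:, None], width, axis=1)

mask = rough_exclusion_mask(4096, 2048)
print(mask.shape, mask.mean())   # fraction of the frame excluded from tracking
```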

Note that you can do rigid-body tracking of 3D objects in 360VR shots just like you can in SynthEyes for regular shots. Similarly, the solver handles 360VR tripod shots, where the camera rotates but does not translate.
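
For intuition about the tripod case: when the camera only rotates, a single rotation per frame should map every tracker's viewing direction on a reference frame onto its direction on the current frame. The sketch below (plain NumPy using the standard SVD-based best-fit rotation, not SynthEyes' actual solver) demonstrates that fit on synthetic data:

```python
import numpy as np

def fit_tripod_rotation(dirs_frame1, dirs_frame2):
    """Best-fit rotation R such that R @ d1 ~ d2 for matched unit bearings.

    dirs_frame1, dirs_frame2: (N, 3) arrays of unit viewing directions for
    the same trackers on two frames. A nodal (tripod) camera only rotates,
    so one rotation should explain all correspondences. This is the classic
    SVD solution (Kabsch/Wahba), shown for intuition only.
    """
    H = dirs_frame1.T @ dirs_frame2          # 3x3 correlation matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # guard against a reflection
        Vt[-1, :] *= -1.0
        R = Vt.T @ U.T
    return R

# Self-check with a synthetic 10-degree pan about the vertical axis.
theta = np.radians(10.0)
R_true = np.array([[ np.cos(theta), 0.0, np.sin(theta)],
                   [ 0.0,           1.0, 0.0          ],
                   [-np.sin(theta), 0.0, np.cos(theta)]])
d1 = np.random.randn(50, 3)
d1 /= np.linalg.norm(d1, axis=1, keepdims=True)
d2 = d1 @ R_true.T
print(np.allclose(fit_tripod_rotation(d1, d2), R_true))   # True
```

A real solver must also handle noise and reject mistracked features, but the rotation-only core of a tripod solve is essentially this small.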

If you are tracking long sections of 360VR footage (thousands of frames), be sure to see the section Solving Long Shots, which applies to both regular and 360VR shots.

 

©2024 Boris FX, Inc.