Nuke Tracking Training Video

3D tracking really simplifies many compositing tasks, whether it’s removing unwanted objects or adding additional elements. Having to switch applications in order to obtain the necessary camera data has always been time consuming and has made alterations difficult. Now Nuke comes with its own camera tracker, so 3D tracking can be done exactly where you need it! Focus on Nuke’s camera tracker and learn the other tasks relevant to a good track: grain/noise removal, lens distortion and 2D tracking.

After analyzing the point cloud and solving the camera movement, learn how to set up some geometry and recreate the scene with 3D and UV projections. Then animate a new camera movement, turn the setup into a stereoscopic one, and end up with a nice red & blue anaglyph as an alternative stereoscopic workflow.


Preparing the Footage

In order to get the most out of the trackers in Nuke it is important to remove any distractions: grain and noise can throw the tracker off. Nuke offers many tools to achieve this, and we show quite a few of them here, but note that many other applications can clean up footage before it is even imported into Nuke. Rolling shutter is also an issue for 3D tracking, and removing it is a job The Foundry’s applications do quite well.
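Grain reduction itself is done with Nuke’s own nodes in the video, but the idea behind one simple approach, a temporal median, can be sketched in plain Python. This is a hypothetical illustration, not Nuke’s implementation: each frame is a flat list of pixel values, and the shot is assumed static over the sampled frames.

```python
import statistics

def temporal_median(frames):
    """Replace each pixel by the median of its values across frames.

    Grain varies randomly from frame to frame, while the underlying image
    (on a static shot) does not, so the median suppresses the noise.
    """
    return [statistics.median(samples) for samples in zip(*frames)]

# three noisy samples of the same (static) two-pixel image
frames = [[0.50, 0.20], [0.55, 0.18], [0.48, 0.21]]
print(temporal_median(frames))  # [0.5, 0.2]
```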




2D Tracking

How does Nuke’s 2D tracker work, and when would one use it? 2D trackers can be helpful for keeping roto shapes in position, and it is even possible to patch areas by connecting the tracker to a CornerPin node.
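To illustrate the idea (not Nuke’s actual API — the track data here is hypothetical, just per-frame (x, y) feature positions in pixels), matchmoving a roto point with a 2D track boils down to adding the tracker’s movement since the reference frame:

```python
# hypothetical 2D track: frame number -> tracked feature position in pixels
track = {1: (100.0, 200.0), 2: (103.0, 198.5), 3: (107.5, 196.0)}

def matchmove(point, frame, track, ref_frame=1):
    """Offset a point by how far the tracked feature moved since ref_frame."""
    dx = track[frame][0] - track[ref_frame][0]
    dy = track[frame][1] - track[ref_frame][1]
    return (point[0] + dx, point[1] + dy)

# a roto control point drawn on frame 1 follows the feature on frame 3
print(matchmove((50.0, 60.0), 3, track))  # (57.5, 56.0)
```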


Lens Distortion

How do we reduce the amount of lens distortion in our footage? This is simple when using a lens grid, but there are other ways too, such as analyzing straight lines in the sequence or using the lens distortion estimation in Nuke’s camera tracker. It is important to address the lens information prior to tracking so that your CG objects appear to stick to the scene properly.
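As a rough illustration of what a lens distortion node computes — a one-parameter radial model; Nuke’s actual tools estimate more parameters — here is a sketch in plain Python, using normalized coordinates around the image centre:

```python
def distort(x, y, k1):
    """Apply simple one-parameter radial distortion (barrel for k1 < 0)."""
    r2 = x * x + y * y
    s = 1.0 + k1 * r2
    return (x * s, y * s)

def undistort(xd, yd, k1, iters=20):
    """Invert the model by fixed-point iteration (no closed-form inverse)."""
    xu, yu = xd, yd
    for _ in range(iters):
        r2 = xu * xu + yu * yu
        s = 1.0 + k1 * r2
        xu, yu = xd / s, yd / s
    return (xu, yu)

# round trip: undistorting and re-distorting recovers the original point
xu, yu = undistort(0.5, 0.3, -0.1)
print(distort(xu, yu, -0.1))  # approximately (0.5, 0.3)
```

Footage is undistorted before tracking, the CG is rendered over it, and the distortion is reapplied at the end so the composite matches the original lens.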




3D Camera Tracking

Even though it is a compositing application, Nuke provides its own camera tracker! So how do we solve a scene? What can our point cloud tell us? And how do we produce results that are suitable for our purpose?




Geometric Restoration

Now that the camera movement is solved, we will use our tracking points to place some simple geometry in the scene. We will move the pieces to the correct position in 3D space and check that everything lines up with our original shot.
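“Lining up” can be quantified: project a solved 3D point through the solved camera and compare the result to the 2D track. A minimal sketch of that check, assuming a pinhole camera at the origin looking down −Z (the real solve carries a full camera transform per frame):

```python
def project(point, focal=50.0, haperture=36.0, width=1920, height=1080):
    """Project a camera-space 3D point to pixel coordinates (pinhole model)."""
    x, y, z = point
    # perspective divide; z is negative in front of the camera
    u = focal * x / -z                         # millimetres on the sensor
    v = focal * y / -z
    px = width / 2.0 + u / haperture * width   # sensor mm -> pixels
    py = height / 2.0 + v / haperture * width  # square pixels: same scale
    return (px, py)

# a point on the optical axis lands at the centre of the frame
print(project((0.0, 0.0, -10.0)))  # (960.0, 540.0)
```

If the projected points drift away from their 2D tracks over the shot, the solve (or the lens information) needs another look before any geometry is placed.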



Using the geometry we just created, we will use our tracked camera to project the footage onto it. This gives us the possibility to stabilize the shot and create a fairly smooth camera movement, even though we started out with a very shaky shot.




Detailing in 3D

In order to make our projected texture stick to our geometry we will switch to UV projections. That also gives us the option to displace the geometry, so we can create a facade that has a contour rather than just a flat card. There are some tools in development that will help create more detailed geometry for scene reconstruction; we will update this video with a free patch once we are allowed to discuss them with the general public.
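The displacement idea is simple enough to sketch in plain Python (hypothetical data: the card is a list of (x, y, z) vertices in its own space with the normal along +Z, and each vertex gets a height value sampled from the texture):

```python
def displace(vertices, heights, scale=1.0):
    """Push each vertex along the card's +Z normal by its height value."""
    return [(x, y, z + h * scale) for (x, y, z), h in zip(vertices, heights)]

# a two-vertex slice of a flat card gains contour from the height values
card = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
print(displace(card, [0.2, 0.5], scale=2.0))  # [(0.0, 0.0, 0.4), (1.0, 0.0, 1.0)]
```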


Find Best Frame (FBF)

Since a texture can often not be created from just one frame, it is necessary to stitch a couple of frames together. Rendering them out in UV space gives us the possibility to match, merge and paint them. Some of these topics lead toward advanced scene reconstruction techniques.
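One simple way such a merge can work, sketched with hypothetical data (each UV render is a flat list of (value, alpha) texels, where alpha marks which texels that frame actually covered): per texel, take the first frame that has coverage.

```python
def stitch(renders):
    """Per texel, take the value from the first render with alpha > 0."""
    merged = []
    for texels in zip(*renders):
        value = next((v for v, a in texels if a > 0.0), 0.0)
        merged.append(value)
    return merged

# frame A covers texel 0; frame B fills in the texel A missed
frame_a = [(0.8, 1.0), (0.0, 0.0)]
frame_b = [(0.7, 1.0), (0.4, 1.0)]
print(stitch([frame_a, frame_b]))  # [0.8, 0.4]
```

In practice the seams are then painted out by hand, which is exactly why rendering in UV space is so convenient.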


Stereoscopic Alternatives

The scene now consists only of textured geometry, which gives us the freedom to animate a new camera movement. This camera can be paired with a second, offset camera to create stereoscopic images. If you have red and blue glasses, you can view the setup as an anaglyph.
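The anaglyph mix itself is easy to sketch (hypothetical data: each eye is a flat list of (r, g, b) pixels): the red channel comes from the left-eye render, green and blue from the right.

```python
def anaglyph(left, right):
    """Red from the left eye, green and blue from the right eye."""
    return [(l[0], r[1], r[2]) for l, r in zip(left, right)]

left  = [(0.9, 0.2, 0.1)]   # left-eye pixel
right = [(0.3, 0.6, 0.8)]   # right-eye pixel
print(anaglyph(left, right))  # [(0.9, 0.6, 0.8)]
```

The colored filters in the glasses then separate the two views again, one per eye.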



Get the most out of Nuke’s camera tracker: learn how to deal with lens distortion, interpret the point cloud, set up geometry accordingly and texture it. You will be able to exploit your 3D setup either to create a smooth camera movement or to animate your own camera, and even turn your footage into a stereoscopic shot. NOTE: This video is not meant to be watched in 3D from your computer; it does, however, instruct you how to do so. You can use glasses with red on the left and green on the right to view the top image in 3D.