Tackling Common Issues in CameraTracker

Ever wonder how to tackle common issues in CameraTracker? This post explores the various issues you may run into when tracking with CameraTracker.



Nuke’s CameraTracker is easily one of the most user-friendly workflows for tracking a shot.

Still, that doesn’t mean it will always get the result you’re looking for, depending on the project you’re working on. Read on for a list of common issues and suggestions on how to fix them.

I’ll be updating this post with proper images/screenshots to better illustrate each problem. For now, I’ve highlighted keywords in case of TLDR (too long; didn’t read) syndrome.


Colour Space

When receiving shot footage, you need to be aware of its colour space. This is especially true when dealing with footage shot on an Arri Alexa or RED Epic, where the Log colourspace results in low/poor contrast and a dull-looking image.

A quick Grade or ColorCorrect node to bump up the contrast/saturation can help CameraTracker track better, with higher confidence, compared to the raw footage.
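To see why this helps, here is a minimal sketch of the kind of linear remap a contrast bump performs. This is a simplified stand-in, not Nuke’s actual Grade formula, and the values are illustrative only:

```python
def expand_contrast(value, blackpoint=0.1, whitepoint=0.6):
    """Remap a normalised pixel value so blackpoint -> 0 and whitepoint -> 1.

    A rough stand-in for what a Grade node's blackpoint/whitepoint knobs
    do: log footage squeezed into a narrow range gets stretched out,
    giving CameraTracker stronger features to latch onto.
    """
    remapped = (value - blackpoint) / (whitepoint - blackpoint)
    return min(max(remapped, 0.0), 1.0)  # clamp to [0, 1]

# A flat log image spanning only 0.1-0.6 now uses the full 0-1 range:
print(round(expand_contrast(0.10), 2))  # 0.0
print(round(expand_contrast(0.35), 2))  # 0.5
print(round(expand_contrast(0.60), 2))  # 1.0
print(round(expand_contrast(0.90), 2))  # 1.0 (clamped)
```

Values squeezed into a narrow log range get stretched across the full range, so the pattern matching has much stronger gradients to lock onto.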


Tracking Markers

Markers are not always essential, depending on the shot, but whenever possible, set up markers on location if you have the chance (especially when you are working on your own short film project).

Depending on the environment, objects with contrasting patterns like signage or brick tiling can double up as markers too.

If you need to work with a green/blue screen set, remember to put up really contrasting markers (the opposite colour on the colour wheel). CameraTracker loves contrast, be it in pattern or colour, so remember to think ahead if you’re planning to track the camera movement.
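The "opposite colour on the colour wheel" is just a 180° hue rotation. As a quick illustration (plain Python with the standard colorsys module, not a Nuke-specific tool), the complement of a pure green screen works out to magenta:

```python
import colorsys

def complementary(r, g, b):
    """Rotate hue by 180 degrees to get the opposite colour on the wheel."""
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    return colorsys.hls_to_rgb((h + 0.5) % 1.0, l, s)

# Pure green screen -> its complement is magenta, a strong marker colour:
print(complementary(0.0, 1.0, 0.0))  # roughly (1.0, 0.0, 1.0)
```

Which is why magenta/pink tracking markers show up so often on green screen sets.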

There is a good write-up by John Montgomery about ideal tracking markers for use on set over at fxguide (although the article is over a decade old, it is still useful):


Motion Blur & Depth of Field

Blurry shots can be really problematic when it comes to tracking, as the fine details are largely absent, which can confuse CameraTracker.

Heavy motion blur usually breaks CameraTracker’s tracking, which is something you want to take note of.

Other than that, rack focus and shallow depth of field can wreck CameraTracker too.

My only advice for this scenario is to persevere: try your best to identify which parts of the shot are trackable, then combine the tracking data into a seamless solve.


Focal Length Changes

Frankly, I haven’t tried tracking a shot with focal length changes, aka zooming.

If you have the exact focal length values used in the shot, you can try using them as a starting point and slowly guide CameraTracker in handling the change of focal length.
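As a sanity check on such a shot, the standard pinhole relation converts a known focal length and film back (sensor) width into a horizontal field of view, which you can compare against the solved camera. A small sketch:

```python
import math

def horizontal_fov(focal_length_mm, sensor_width_mm):
    """Horizontal field of view in degrees, from the pinhole camera model:
    fov = 2 * atan(sensor_width / (2 * focal_length))
    """
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# e.g. a 50 mm lens on a 36 mm-wide (full-frame) film back:
print(round(horizontal_fov(50, 36), 1))  # 39.6
```

If the solved focal length implies a field of view wildly different from what the lens report says, something has gone wrong in the solve.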


Resolution & Compression Noise

Usually my general rule of thumb before doing any camera tracking is to ensure the footage has enough resolution.

1280×720, aka 720p, is the bare minimum that I will consider before I start tracking. Anything less will affect the overall accuracy of the final solved camera. Again, it also depends on the quality of the footage you’re about to track, which brings up the next issue that is often paired with low resolution, aka compression noise.
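That rule of thumb is trivial to encode. The threshold below is just my personal preference, not something CameraTracker enforces:

```python
MIN_WIDTH, MIN_HEIGHT = 1280, 720  # personal 720p floor for tracking

def worth_tracking(width, height):
    """Return True if the plate meets the bare-minimum resolution."""
    return width >= MIN_WIDTH and height >= MIN_HEIGHT

print(worth_tracking(1920, 1080))  # True
print(worth_tracking(960, 540))    # False
```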

Compression noise refers to the blocky artifacts that you often saw back in the MPEG-1 days, aka VCD, especially in high-motion scenes.

Another form of noise is film grain or digital sensor noise. Do try to reduce the amount of grain/noise whenever possible (I usually export a new sequence file after degraining/denoising and track from there).
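As a toy illustration of why denoising first helps (and not what Nuke’s Denoise node actually implements), a per-pixel temporal median over a static region suppresses random grain while stable detail survives:

```python
import statistics

def temporal_median(frames):
    """Per-pixel median across a list of frames (each a list of floats).

    Random grain flickers from frame to frame, so the median across
    time suppresses it while stable image detail is preserved.
    """
    return [statistics.median(pixels) for pixels in zip(*frames)]

# Three noisy frames of the same (static) region:
f1 = [0.50, 0.80, 0.20]
f2 = [0.52, 0.79, 0.95]  # grain spike on the last pixel
f3 = [0.49, 0.81, 0.21]
print(temporal_median([f1, f2, f3]))  # [0.5, 0.8, 0.21]
```

In practice this only holds for locked-off regions; a moving camera needs proper motion-compensated denoising, which is exactly what the dedicated denoise tools handle for you.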

Just remember that it is possible to receive really bad footage that is simply unsuitable for CameraTracker. If that happens, do request a better-quality version (or politely decline the job/task, since it will be a complete waste of time).


Moving Objects & Reflections

This is pretty straightforward: remember that we are tracking camera movement, so any form of motion, like moving objects/subjects, needs to be rotoscoped and used as a mask.

Still, there are situations where masking removes a lot of information at a given time and causes CameraTracker to get confused whenever a tracked pattern goes AWOL.

Also, do remember to factor in reflective/refractive objects like mirrors, water surfaces, glossy coatings, etc.

What else should I do when it still fails?

Tackling troublesome shots can be a pain in the neck when the solved camera motion doesn’t closely match up with the live-action footage.

If you’re lucky, you might get away with a completely bizarro point cloud that doesn’t make sense while the camera motion is roughly there. It is handy to learn what you can get away with, or outright cheat on, as long as the tracked camera data is still usable in production.

Do try not to lose your motivation as you handle troublesome shots, and if you have access to other tracking software like PFTrack, SynthEyes or 3DEqualizer, do give them a try and see if they can solve the shot where CameraTracker failed.

Lastly, the reason I love CameraTracker is that it is already part of Nuke (well, NukeX) and the tracked results can be used immediately for test compositing.

Anyway happy tracking and nuking with CameraTracker!
