Nuke 1001 – Practical Compositing Fundamentals (3DCG)


Practical Compositing Fundamentals for Everyone

In this tutorial, you’ll learn practical compositing fundamentals and the typical process that a Nuke compositor faces in a small project. By the end of this tutorial, you will be able to do a quick composite (aka slap comp) and fine-tune the final result with flexibility, based on feedback from supervisors/directors/clients.

For this project, the task involves creating a believable rack focus shot of hundreds of teapots. Given the nature of teamwork in the industry, the 3DCG renders have been done by the 3D team, and it is the compositor’s role to take the various render elements and composite them into the desired result.

Again, the goal of this tutorial is to show the fundamentals of compositing. Remember that every facility has its own compositing pipeline, but they all share common fundamentals in any compositing task, be it 3DCG, live action or motion graphics.

Compositing for 3DCG

While I’m not a dedicated compositor at my current workplace (my current job involves creating FX elements such as fire, smoke, debris or blood), any FX artist needs to know the process of creating a “slap comp” to see if their FX are integrated nicely into the scene.

As mentioned above, we’ll need to create a rack focus animation of the hundreds of teapots, and this can be achieved through compositing (with the help of the AOVs (Arbitrary Output Variables) aka render elements from the 3D renderer).

Before we proceed: the most important skill of a compositor (and pretty much any job in this world) is problem solving and being resourceful. To keep things simple for this tutorial, everything is prepared in a “perfect case scenario”, but do keep in mind that in the industry, things often go awry in my experience (I can’t comment on Western-pipeline studios, but it is a common occurrence for Japanese-pipeline studios, as my workplace is an outsource studio for many Japanese AAA projects).

Basically, prepare to die I mean sacrifice lots of time and money when you’re just starting out in this industry.

So with that out of the way, let’s proceed to the actual compositing in Nuke!

(Well, this tutorial is applicable to other compositing software like After Effects, Fusion, Flame or gasp Shake, except you will need to search for the equivalent functions in the respective software.)


Recreate After Effects Settings in Nuke (and vice versa)


Note: I will update this post regularly with more settings with the proper graphic comparison between both software.

Recreate After Effects Settings in Nuke? Why?

Lately, I’ve been practicing an odd pipeline that I developed myself to improve my productivity (although my colleagues think I’m crazy).

After Effects (AE) is the main (and only) compositing software for the project that I’m currently involved in, and there are several reasons why I loathe it during production:

  1. Heavy use of multichannel EXRs, which are a pain to manage in AE (the Create ProEXR Layers will forever haunt me).
  2. No easy way to share and reuse masks across multiple layers.
  3. The roto tools are not as robust as in Nuke.
  4. You can’t see which effects are being applied to a layer until you select it.
  5. The unpremultiply/premultiply process is still a chore in AE, unlike in Nuke.
  6. Lastly, but not related to AE itself, the in-house motion blur plugin that we must use (read: forced) suffers from a memory leak which gobbles up all the RAM allocated to AE.

So my odd pipeline involves Nuke as the first step in compositing my FX work, then transferring it to AE once the cut is approved by the director for use by the compositing team.

To ensure that the settings between both software produce the same result (or as close as possible), I’ve gone through much trial and error, and the workflow has proven ready for actual production (at least I haven’t been caught yet–).

WARNING: Make sure your AE project settings are set to 32-bit Float with an sRGB (or similar) working space and Linearize Working Space checked, to match the results as closely as possible to Nuke.
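To see why that checkbox matters, here’s a minimal sketch (plain Python, not tied to either app) of the standard sRGB transfer functions; “Linearize Working Space” applies a conversion like the first one per channel so your math happens in linear light, just as Nuke does by default. The exact internals of AE’s color management are an assumption here; the formulas themselves are the standard sRGB ones.

```python
def srgb_to_linear(v):
    """Standard sRGB decoding: display-encoded value (0-1) to linear light.
    This is the kind of conversion 'Linearize Working Space' performs."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def linear_to_srgb(v):
    """Inverse transform: linear light back to sRGB display encoding."""
    if v <= 0.0031308:
        return v * 12.92
    return 1.055 * (v ** (1.0 / 2.4)) - 0.055
```

Note that mid grey is not 0.5 in linear light: `srgb_to_linear(0.5)` is roughly 0.214, which is why blurs and merges done on non-linearized footage look different between the two apps.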

Levels (AE) and Grade (Nuke)

The Grade node in Nuke is similar to AE’s Levels effect.

  • Grade Blackpoint is Levels Input Black
  • Grade Whitepoint is Levels Input White
  • Grade Lift is Levels Output Black
  • Grade Gain is Levels Output White
  • Grade Gamma is Levels Gamma

As for Offset, the function is similar to the Offset found in AE’s Exposure effect.
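The mapping above can be checked against the formula printed in the Grade node’s own help text. Here’s a sketch of that formula in plain Python, with the parameters named after the Grade knobs so the Levels correspondence is explicit (treat the exact formula as quoted from Nuke’s tooltip, not verified against source code):

```python
def grade(x, blackpoint=0.0, whitepoint=1.0, lift=0.0, gain=1.0,
          multiply=1.0, offset=0.0, gamma=1.0):
    """Sketch of Nuke's Grade formula, per the node's help text:
        A = multiply * (gain - lift) / (whitepoint - blackpoint)
        B = offset + lift - A * blackpoint
        output = (A * x + B) ** (1 / gamma)
    Levels mapping: blackpoint = Input Black, whitepoint = Input White,
    lift = Output Black, gain = Output White, gamma = Gamma."""
    a = multiply * (gain - lift) / (whitepoint - blackpoint)
    b = offset + lift - a * blackpoint
    return (a * x + b) ** (1.0 / gamma)
```

With the defaults it is an identity; raising `blackpoint` to 0.25 maps 0.25 to black exactly like raising Input Black in Levels, and dropping `gain` to 0.8 maps white to 0.8 like lowering Output White.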

Hue/Saturation (AE) and HSVTool (Nuke)

The HSVTool parameters match those used in Hue/Saturation. I made a huge mistake in initially using HueShift in Nuke to recreate the settings of Hue/Saturation.

Time Remapping (AE) and TimeWarp (Nuke)

Both of them do the same thing, except that in Nuke you need to set the TimeWarp filter to None to get the same result as Time Remapping in AE.
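Setting the filter to None means no frame blending at all: each output frame simply picks the nearest source frame, which matches AE’s Time Remapping with frame blending off. A toy sketch of that behaviour (the function name and speed convention are mine, not from either app):

```python
def retime_nearest(frames, speed):
    """Constant-speed retime with no frame blending: each output frame
    picks the nearest existing source frame, like TimeWarp with its
    filter set to None. speed=0.5 is half speed, speed=2.0 double."""
    out = []
    t = 0.0
    while int(t + 0.5) < len(frames):
        out.append(frames[int(t + 0.5)])  # round to nearest source frame
        t += speed
    return out
```

At double speed every other frame is dropped; at half speed frames are repeated rather than interpolated, which is exactly why the two apps only match when Nuke isn’t blending.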

Opacity (AE) and Mix (Nuke)

The Opacity function of a layer in AE is the same as the Mix slider of a Merge node in Nuke.
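In both cases it is just a linear dissolve between the untouched background and the merged result. A single-value sketch (assuming a premultiplied “over” merge; the function name is mine):

```python
def merge_over_with_mix(a, b, a_alpha, mix):
    """'Over' merge of premultiplied A over B, then the Mix slider
    dissolves between the plain B input and the merged result --
    the same thing lowering a layer's Opacity does in AE."""
    over = a + b * (1.0 - a_alpha)       # premultiplied over
    return b * (1.0 - mix) + over * mix  # mix = AE Opacity / 100
```

So Mix 0.5 on a Merge and Opacity 50% on a layer land on the same pixel value, provided both apps are working in the same (linearized) space.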

Matte Choker/Refine Hard/Soft Matte (AE) and Erode (Nuke)

Both of them do the same thing, except I found Erode to be more intuitive compared to Matte Choker. Be careful though, as Matte Choker is not 32-bit ready in AE, so it will clamp the pixel values to 16-bit Half Float.

Alternatively, you can use Refine Hard/Soft Matte, as it is 32-bit ready, although it can be too slow for my liking.
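Under the hood, eroding a matte is just a morphological minimum filter: each pixel takes the darkest value in its neighbourhood, shrinking the white areas. A 1D sketch of the idea (dilating would use `max` instead; real Erode nodes also offer soft filtered falloff):

```python
def erode_1d(matte, size=1):
    """Minimal sketch of a hard matte erode: replace each pixel with
    the minimum over a (2*size + 1) window, shrinking white areas.
    Working in plain floats avoids the 16-bit clamping that AE's
    Matte Choker imposes."""
    n = len(matte)
    out = []
    for i in range(n):
        lo, hi = max(0, i - size), min(n, i + size + 1)
        out.append(min(matte[lo:hi]))
    return out
```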

Box/Fast/Gaussian Blur (AE) and Blur (Nuke)

Seriously, you should stick to either Box or Fast Blur in AE if you’re planning to transfer settings from Nuke, as Gaussian Blur is pretty limited. Stu Maschwitz’s A Tale of Three Blurs explains each blur filter in AE in depth.

More to come.

Nuke Tips (of the month) – Vector Blur or 3D motion blur?


Jingle Blur, Jingle Blur, Jingle all the way!

After working full time as an FX artist at a subsidiary of a large Japanese studio, I’m surprised by how problematic motion blur can be at the compositing stage if the subject moves along the Z axis, which is where 3D motion blur triumphs.

For my personal project, I’ll analyse every shot and see if the use of 3D motion blur is necessary to improve the overall photorealism.

So in this very brief Nuke Tips, I’ll explain the difference between vector blur and 3D motion blur during compositing.

Wheels of blurry

For this demo, I’m using Octane Render for 3ds Max to generate the velocity pass. You’ll need to look up how to output a velocity pass in your preferred renderer’s documentation. The default 3ds Max scanline renderer supports native velocity output through the G-Buffer when saving as EXR files.

In Nuke, the Vector Blur node allows you to blur a subject either through Transform node values or by piping in a velocity pass from an external renderer.

Sadly, there is no way to output the velocity pass together with the motion-blurred result in Octane Render, so be aware that this is not a glitch or bug. Just output the velocity pass with a really high max velocity value if your object moves really fast. I used a value of 512 for this demo, and make sure to output it as 32-bit float EXR!

For the 3D motion blur, I simply enable Motion Blur in the Octane Render settings, as long as your object has animation (you also need to enable object motion blur and movable proxy in the object properties).

Refer to the following screenshots for the settings that I used for this demo:



Do take note that the motion blur also blurs the alpha channel, so if you’re dealing with a complex composite, make sure to render the background as a matte to reduce background bleeding in the beauty render.

Blurring the edges in Nuke


Please refer to the Node Graph in the screenshot above for how to set up the Vector Blur using the Velocity pass from Octane Render. In my example, I shuffle the Velocity pass into the Forward channels, but you can shuffle it into the Motion, Velocity, Distance or your preferred channels.

Also, do read the official documentation on using Vector Blur at The Foundry site, as I won’t explain every single setting in this article.

Remember to set the calculation method to Forward and enable Alpha premultiplication, since the Velocity pass from Octane Render has been premultiplied. Failure to do that will create streaky/jaggy edges.
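Conceptually, a forward vector blur averages samples taken along each pixel’s velocity vector. Here’s a deliberately tiny 1D sketch of that idea (my own toy code, nothing like the real node, which also scatters across pixels and handles occlusion) to show why the pass needs sub-pixel float precision:

```python
def vector_blur_1d(img, vel, samples=8):
    """Toy 1D vector blur: for each pixel, average samples taken
    along its forward velocity. Fractional velocities shift which
    pixels get sampled, which is why quantizing the velocity pass
    below float precision degrades the blur."""
    n = len(img)
    out = []
    for i in range(n):
        acc = 0.0
        for s in range(samples):
            t = s / (samples - 1) if samples > 1 else 0.0
            j = int(round(i + vel[i] * t))        # sample position along the vector
            acc += img[min(max(j, 0), n - 1)]     # clamp to the image edge
        out.append(acc / samples)
    return out
```

With zero velocity the image passes through untouched; a bright pixel with a nonzero vector gets smeared into a partial value, exactly the streak you expect from motion blur.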

So Vector Blur or 3D Motion Blur?

Typical 3D motion blurring!

This is where things start to go awry: the wheel rotates really fast, and the vector-based blur fails to capture the curved motion. Also take note of the shadow on the ground, which remains crisp, as there is no velocity information since it remains static in the scene.

The velocity pass in motion. A reminder that only two channels are output in the velocity pass, aka the Red and Green channels.

Pros or Cons?

Vector Blur

  1. Flexibility in controlling the amount of blurriness during compositing.
  2. It works great… but only in the X and Y axes.
  3. Anything less than 32-bit Float means a less-than-accurate blurring process. BE WARNED. (Although you can still get away with 16-bit Half Float depending on the situation.)
  4. Can’t do rotational blurriness accurately (hence the subject for this demo is a wheel!).
  5. Can be problematic when dealing with various overlapping subjects/limbs/etc. I did not prepare examples for this case, but just imagine an arm crossing over the chest: Vector Blur will fail to blur the arm edges accurately where it overlaps the chest.

3D Motion Blur

  1. Accurate, because it blurs in actual 3D space!
  2. You still need to worry about the shutter angle changing during production through client/supervisor/director feedback, AKA re-rendering.
  3. Depending on your choice of renderer, 3D motion blur is EXPENSIVE or nearly FREE in rendering time costs.
  4. Can be problematic when dealing with various overlapping subjects/limbs/etc. too! This is more of a headache for the compositor though, depending on the situation.

Quick Intro to 2D Tracking in After Effects and Nuke

Another Intro to 2D Tracking in After Effects and Nuke

Here’s my latest video covering 2D tracking in After Effects and Nuke.

I’ll update this post with a short transcript of the overall video, but for now here’s a list of what I’ve covered in this video:

  1. What footage should one choose for practice.
  2. The concept of tracking in Position (Translation in Nuke), Rotation and Scale.
  3. Linking the tracking data to the chosen layer/element.
  4. Checking for accuracy (although I did not explain this in depth).

I chose to use the same footage for the demo in both After Effects and Nuke, so you can see how to approach it in both compositing software.

Do not limit yourself to either software and hopefully this video is helpful to you!

Tackling Common Issues in CameraTracker


Headbutting Common Issues in CameraTracker

Nuke’s CameraTracker is easily one of the most user-friendly workflows when the need arises to track a shot.

Still, that doesn’t mean it will always get the desired result you’re looking for, depending on the project you’re working on. Read on to identify the list of issues and suggestions to fix them.

I’ll be updating this post with proper images/screenshots to better illustrate each problem. For now, I’ve highlighted keywords in case of TLDR (too long; didn’t read) syndrome.
