Learning Houdini for Project DQ

In this post, I’ll be covering the overall goal of Project DQ: to serve as a learning platform for using Houdini for all FX tasks.

As the demand for Houdini skills on FX shots has risen over the past several years, I realised it was time to pick it up or get left behind.

While I had no exposure to Houdini in actual production at my prior workplace, I tried to adapt what I’ve learned in Maya, 3ds Max and RealFlow to Houdini… it should be a frictionless learning curve, right?

The Uncanny Valley in Learning New Software

3ds Max 5 screenshot. Not my screenshot, but look at that glorious Windows XP theme… and ICQ, Photoshop 7, Outlook Express, IE6, etc. running in the background.

Back in 2003, I started playing around with 3ds Max 5 as my very first 3D Digital Content Creation (DCC) tool, creating prefabs for Max Payne 2 mods (which never really saw the light of the internet–). 3ds Max remained my main 3D package (excluding a short-lived period with ZBrush and SketchUp) until 2011, when I needed to use Maya 2012 for a university assignment… which took me longer to accomplish anything compared to what I could do in Max.

So there went my half-hearted attempt at learning Maya; I ended up using Max for the majority of my assignments, including my final year project… at least creating a cube is straightforward in both packages.

Well, that all changed when I started working at my ex-workplace, where Maya was the main DCC for the majority of the work. I remember spending more time asking my colleagues, referring to the docs and googling for solutions on navigating the Maya UI for the first two months! That was not a productive period for me, but as I became accustomed to the Maya workflow, I could see why it is still the primary backbone DCC for most studios.

Still, I’m more of a Max person, but I recognise Maya’s strengths, and the transition wasn’t that bad since the two are quite similar in both UI and UX!

Now back on topic, what about Houdini?

Hmm, looks familiar yet different.

I have a Network within a Network within a Network

The initial “culture shock” for me when using Houdini for the first time was the concept of navigating its networks.

A typical Nuke node graph

The Relationship Editor in RealFlow

As you can see in the screenshots above for both Nuke and RealFlow, the majority of the nodes are laid out as-is in a single flat graph (with a few exceptions like Gizmo, Group and Precomp in Nuke).

Now let’s take a look at the Houdini 16 Network Editor (the UI differs slightly in older versions).

The obj Network View in Houdini 16

Not every node can simply be placed anywhere. If you look at the dropdown at the top left of the screenshot, we have:

  • ch (CHOPs aka Channel Operators)
  • img (Compositing)
  • mat (Materials)
  • obj (Objects)
  • out (Outputs)
  • shop (Shader Operations)
  • vex (VEX Expressions)

Depending on the types of nodes, you’ll be managing a minimum of one network (obj) and up to seven major networks! That’s excluding the possible dozens of subnetworks–

While most networks are created manually, some, like the popsolver, are actually pre-made networks.

More info about the network types can be found here: http://www.sidefx.com/docs/houdini/network/flags.html

Just remember that some nodes can be created in any network while others are restricted to a specific level.

Contrast this with… 3ds Max and Maya.

You don’t get to see what’s going on, as most of the operations are hidden from you and, with that in mind, destructive, whereas Houdini is near non-destructive. Note “near non-destructive” rather than “non-destructive”, as not everything is fully revertible unless you progressively version your scene file!

Maya Node Editor

Maya Hypergraph (Connections)

Maya does have features similar to the Network Editor, namely the Node Editor, Hypergraph and Hypershade, although each has its own specific UI and is not unified like Houdini’s Network Editor.

The Slate Material Editor was introduced in 3ds Max 2011

The nearest equivalents in 3ds Max are the Slate Material Editor and Particle Flow, although the Max Creation Graph introduced in Max 2016 has a layout similar to VOPs.

Variables and Relationships are the Spice of Houdini Life!

If you have played around with programming languages like JavaScript or Python, you’ll find that variables and relationships are the backbone of the majority of your workflow in Houdini! If you have no idea what they are about… you’ll have a steeper mountain to climb when troubleshooting stuff in Houdini.

The following are samples of global and user-created variables:

Snippet of HScript Global Variables from Houdini Online Docs

An example of user-made variables in VEX (Attribute Wrangle)

3ds Max and Maya users might recognise some of the variables’ names and usage but, in reality, rarely use them in their daily work, as the majority of the tools are already written to use such variables without any input from the artist (unless you’re writing scripts).

In Houdini, you can easily manipulate a parameter using both variables and relationships! Relationships are super useful if you need to reference any valid parameter! The following screenshot shows the relationships between different parameters through expressions.

A typical Relationships in Houdini

The same is not easily replicated in Max… although Maya does have a similar feature, albeit limited.

Here’s the list of pages that you will be visiting often when you start dipping your toes into writing expressions/scripts in Houdini:

http://www.sidefx.com/docs/houdini/network/expressions.html

http://www.sidefx.com/docs/houdini/expressions/index.html

http://www.sidefx.com/docs/houdini/hom/index.html

http://www.sidefx.com/docs/houdini/vex/index.html

Easily Identify Data

By middle-clicking (MMB) a node, you’ll get an informative popup window about the various data flowing through that node

This is something I wish Nuke had, so you could just middle-click (the default Houdini shortcut) on a node and identify the channels/layers at a particular point downstream. Houdini is like a best buddy who is transparent about what they are holding (if they are not your slave labour–).

Unlike a certain kleptomaniac ex-leader, being informed of the data being processed is very helpful and a must-have in a node-based workflow. At least Nuke does let you know what’s going on in a node stream through a subtle colour label on the nodes.

Time to put those Lotus 1-2-3 skills to use! Except you can only edit the attribute values (no formulas allowed)

You can view Point, Vertex, Primitive and Detail attributes using the Geometry Spreadsheet. Shown here are the Detail attributes of a Whitewater sim cache.

Enter the Geometry Spreadsheet, a handy feature that displays the values of the various attributes (points, vertices, primitives and detail) of a specific node. While it is not possible to write formulas/expressions à la traditional spreadsheet software like Excel, you can directly edit the values of the attributes (valid ones, of course).
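Conceptually, the spreadsheet is just a table: one row per element and one column per attribute. Here is a rough plain-Python illustration of that idea (no hou module involved, and the attribute values below are made up for the example):

```python
# Each dict mimics one row (one point) in the Geometry Spreadsheet.
points = [
    {"ptnum": 0, "P": (0.0, 0.0, 0.0), "pscale": 0.10},
    {"ptnum": 1, "P": (1.0, 0.5, 0.0), "pscale": 0.20},
    {"ptnum": 2, "P": (2.0, 1.0, 0.0), "pscale": 0.15},
]

def attribute_column(rows, name):
    """Pull one attribute 'column' out of the rows, spreadsheet-style."""
    return [row[name] for row in rows]

pscales = attribute_column(points, "pscale")  # [0.10, 0.20, 0.15]
```

Inside Houdini the spreadsheet builds this table for you per node, which is exactly why it is so handy for debugging.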

Max/Maya users? You either write your own scripts to identify the data belonging to an object or use third-party solutions, as far as I know.

Mingling Python and the Holy Scripting

Bridgekeeper: WHAT is the airspeed velocity of an unladen swallow?

King Arthur: What do you mean? African or European swallow?

WRONG PYTHON

Back on topic, Houdini has great Python integration, which means if you have silly or crazy ideas, you can write your own Python scripts to boost your productivity! Even the Shelf tools are secretly handy one- or two-click Python scripts.

The Explosion Shelf Tool is nothing more than a four-line Python script

If you don’t know Python, you can easily torment learn it or harass ask politely for help from someone who knows Python!

But if you have basic Python knowledge, you can write the following “Search and Replace” script…
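Here is a minimal sketch of the idea, operating on a stand-in dictionary instead of a live scene. Inside Houdini you would instead walk the node tree and read/write each parameter’s expression via the hou module (check the hou docs for the exact calls); everything below is just an illustration.

```python
# Stand-in for a Houdini scene: parameter path -> expression string.
# In Houdini you'd gather these from the node tree instead.

def replace_in_expressions(parm_expressions, find, replace):
    """Return updated expressions plus the list of parms that changed."""
    updated, changed = {}, []
    for path, expr in parm_expressions.items():
        new_expr = expr.replace(find, replace)
        if new_expr != expr:
            changed.append(path)
        updated[path] = new_expr
    return updated, changed

parms = {
    "/obj/box1/tx": 'ch("../null1/tx") * 2',
    "/obj/box1/ty": '$F * 0.5',
}
updated, changed = replace_in_expressions(parms, "null1", "null2")
# Only the tx expression referenced null1, so only it changes.
```

The win over a blind text-editor replace is that you can scope the change to expressions only, log exactly which parameters were touched, and add extra conditions before committing anything.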

Well, you might wonder why I don’t simply save the scene file as a text file (think Maya ASCII format) and do the search and replace in a text editor. Consider the possible situation where I accidentally replace an innocent word/number that I didn’t intend to and screw up the whole work.

Not to forget that Python allows you to write multiple conditions that, when coupled with the built-in Python functions in Houdini, can be really powerful!

Or maybe I just need to level up my Notepad++ skills…

Maya Script Editor Listener GIF

The Script Editor in Maya prints out the commands for most parameters, which you can use for scripting

The one thing that I miss a lot from Max/Maya is the listener, which is a great tool if you’re starting out with scripting, as it prints out the commands for the majority of the actions you perform. I briefly covered using the listener in a prior article: Coding in VFX and Animation

Basically, prepare to dig through the Houdini Python docs to find the functions that you need!

List of Python functions usable in Houdini

Or ask the friendly community over at the official Houdini forum or Odforce!

Just be prepared for a lack of responses if your question sounds like an FAQ; at the very least, attach an example file/video/pics/GIF for others to inspect.

Chanting the Mantra

Mantra is easily one of the best built-in renderers in any 3D package, although the latest Maya and Max come bundled with the Arnold renderer (and Arnold is also available for Houdini).

Easily the greatest part of Houdini is the ability to manage dozens if not hundreds of render nodes

Presumably you can manage the multiple renderers available for Houdini, like RenderMan, Redshift, and Arnold, without first switching to another renderer as in Maya/Max. This is very useful in scenarios where you need to output render elements with different renderers depending on the project requirements.

…I can’t think of an equivalent for Maya/Max. There is the Render Layer Editor in Maya (replaced by Render Setup from Maya 2017 onwards), which acts like Photoshop group layers: you add objects onto a layer with optional render overrides, but you are still at the mercy of the current renderer. As for Max… I don’t think there is any way to achieve similar functionality natively. Scene States in Max doesn’t really work for me (simply due to its unfriendly interface), and it is better to save several different Max scene files that can be XRef’d when outputting render elements for compositing.

Actually, Mantra has a lot of parameters to fool around with! Not every setting is shown in this GIF.

While my only production experience is with V-Ray (in both Max and Maya), Mantra has a similar workflow, so the learning curve is not that steep. The overall approach to render quality settings is similar to Arnold’s, unlike V-Ray’s. Generally, I find this approach more accessible, as you have fewer parameters to play with to achieve the final look.

So far I’ve yet to render scenes that require large amounts of geometry/textures/particles/etc., so I can’t speak to Mantra’s overall performance against other renderers, but I can vouch that it is very stable and rarely crashes on me (excluding some silly setups that choked Mantra while I was exploring Houdini…).

The Extra Image Planes setup can be intimidating for newcomers

I have to say, though, that the Extra Image Planes setup is not user-friendly, unlike in Maya/Max. There have been many times where I accidentally included or excluded passes because the checkboxes are not that obvious due to the UI design.

I probably need to write a custom Python panel to simplify the process.

You’re a Wizard, Houdini!

Not the type of wizard I’m referring to. If only using high-end 3D software were as easy as using a software wizard!

Again, my experience with Houdini has mainly been replicating what I already knew in other 3D packages, and I’ve yet to scratch the surface of what it can truly do! While I’m not trying to paint a rosy picture of using Houdini, there are still areas where I wish it were more accessible (like modelling and shader creation).

To wrap things up, Houdini is already becoming a must-have skill for any FX artist in the VFX and gaming industries.

As someone who never had any actual production experience with Houdini and still managed to pick it up (after destroying thousands if not millions of neurons) to generate the FX work that I previously did in other 3D packages, remember that there is no harm in learning a new skill to broaden your skill set!

Nuke Tips – Kronos, MotionBlur, Oflow, or VectorBlur?

The last draft of this article dates back to Jan 25, 2016… time to revive it!

Kronos, MotionBlur, Oflow, or Vector Blur? I’m blur…

The great thing with Nuke is that we have many ways to skin a cat problem.

I’ll be focusing on adding motion blur to CGI FX elements like fire, blood and debris. Yes you can abuse I mean use Kronos and Oflow for adding motion blur instead of slowing down footage!

Time to explore the various methods and see which ones make or break depending on the situation.

Don’t confuse these with MotionBlur2D and MotionBlur3D, which are not designed to generate motion blur by analysing image sequences: MotionBlur2D uses Transform animation while MotionBlur3D uses camera animation to generate motion blur.

Blood FX using RealFlow

Blood FX using RealFlow as seen in Gantz: O

With Gantz: O premiering on 14 Oct 2016 in Japan, I’m proud to have been part of the production team as an FX artist for the film.

While there were many challenges throughout the production that I wish could have been handled better, there is one thing the film is not afraid of showing visually… blood! Well, 3D fluid simulation instead of fake blood.

For this tutorial, the scene file and settings are based on my own personal R&D to generate blood squirts and slashes for the many shots in the film that I was tasked with.

So why the personal R&D?

Well, there was an in-house blood asset, BUT it was really heavy to sim (on an Intel Core i7 3930K 3.5 GHz), and this prompted me to improvise a new setup that sims faster, for a quicker WIP turnaround for the FX supervisor and director.

I’ve noted down the settings that you need to modify to shape the blood, along with the simulation variables that depend on the situation.

What will this tutorial cover?

  • An in-depth explanation of the blood creation in RealFlow (setting up the SPH emitter, meshing, caching and the graph editor)
  • A quick glance at importing the BIN mesh into a DCC like 3ds Max or Maya.

If time permits, I’ll update this tutorial with the look development of the blood, aka material setup and rendering using V-Ray and compositing in After Effects/Nuke.

Before we start

Make sure that you know the scale of your scene setup, as RealFlow’s internal scale system differs from 3ds Max’s and Maya’s.

Maya was the primary software used in production, with the scene scale set to centimetres (cm), while RealFlow uses metres (m) as the default scene scale.

This means we need to export the necessary data from Maya to RealFlow and vice versa.

To keep things simple, we won’t worry about scene scale conversion, as the majority of the shots were done with the default scale of the respective software.

The pipeline goes like this:

Export Characters and Background (Maya with cm unit) -> Blood Simulation (RealFlow with m unit) -> Importing generated Blood Mesh from RealFlow into Maya

This ensures the blood that you see in RealFlow is exactly the same in Maya once you import the blood mesh.
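If your own pipeline does need a conversion between the two scales, the arithmetic is just a factor of 100 (1 m = 100 cm). A quick sketch, purely for illustration:

```python
# Simple unit helpers for moving values between a Maya scene in
# centimetres and a RealFlow scene in metres (1 m = 100 cm).

def cm_to_m(value_cm):
    """Maya centimetres -> RealFlow metres."""
    return value_cm / 100.0

def m_to_cm(value_m):
    """RealFlow metres -> Maya centimetres."""
    return value_m * 100.0

# A 180 cm character becomes 1.8 m on the RealFlow side:
height_m = cm_to_m(180.0)
```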

Getting started

For the production of Gantz: O, every shot had its own unique blood sim, which means I couldn’t reuse the same sim for other shots (heh), so if possible, make sure to have your own character/creature animation exported to RealFlow.

Download the project files here (RealFlow 2014).

  1. Create a standard SPH particle emitter. I used the Circle emitter for my blood sims in Gantz: O.
  2. Use the following settings as shown in the diagram. The blue labels show the required changes from the default emitter values.
    bloodfx_emitter_settings
  3. Position the emitter where the blood needs to emit from (e.g. a katana slice wound, a gunshot impact, etc.)
  4. Fiddle with the Circle parameters. A higher V/H random means a more violent burst. I usually use a value of 0.1 for Ring Ratio.
  5. Animate the Speed. Refer to the following diagram:
    bloodfx_speed_graph
  6. Before we start the actual simulation, let’s refer to the following diagram for the list of Daemons that I used:
    bloodfx_daemons
  7. Quick descriptions of the Daemons listed above:
    • Drag Force adds drag to the blood so it doesn’t travel with zero resistance. Keyframe it appropriately for high-speed shots. The default value is 0.1, but I find it slows down the emission too much, so I used 0.01.
    • Sheeter is needed to create the tendrils after the blood is emitted. The min cavity size is very sensitive, so you want to adjust it in increments of 0.1. My setup uses a value of 1.0. Make sure to enable Create Tendrils and set an appropriate number for the @ count value. I use 10, as RealFlow will attempt to create approximately 10 tendrils, but it can go higher or lower depending on the simulation. Also remember to up the @ strength value (I used 50) so the blood will have nice, wavy, “blood-looking” tendrils.
    • Noise Field is really critical to break up the shape of the blood in world space, and I usually animate it to last 10-15 frames after the blood is emitted before the noise fades out to 0. Make sure to set this Noise Field to affect Force. Leave Space and Time scale at the default 1.0. Refer to the following diagram for more details.
    bloodfx_noisefield_graph
    • Gravity is a 2013 British-American science fiction film co-written, co-edited, produced and directed by Alfonso Cuarón. Ahem, gravity means what goes up must come down… well, more like down in whichever direction the Gravity daemon is pointed. Remember to up the Gravity strength appropriately if the blood is falling too slowly, as the default value of 9.8 can feel really slow. I used 98.0 for my setup.
    • k_Age is situational and currently disabled in my setup; it is meant to kill the particles after a certain amount of time (frames). Super useful if the blood flies out of frame in your shots, where you can kill the particles for a faster simulation.
    • k_Isolated helps to kill *isolated* aka stray particles after a certain amount of time. The value is in seconds, so the default value of 1.0 is more than sufficient.
  8. For geometry collision, I prefer to use standard geometry inside RealFlow unless you have a specific custom mesh, which you can import into RealFlow as an OBJ or SD file. In the screenshot below, I used a Cross geometry as a collider to break up the shape of the blood.
    bloodfx_collision
  9. The following settings for Liquid – Particles Interaction are pretty much my go-to values when I arrange geometry to collide with the blood emission:
    bloodfx_liquidparticleinteraction

    • Collision Distance and Distance Tolerance: Keep both of them at the same value (this affects the “collision” thickness of the geometry).
    • Collision Normal: Leave it at Both.
    • Collision Tolerance: Leave it at 0.0.
    • Particle Friction: 0.001 is just sweet if all you need is to break up the blood emission. What about floors/walls/etc.? This is where you up the value to 0.4 and higher so the blood flow will stick/stop on the geometry surface. Remember you can keyframe the friction (e.g. if you need to animate the blood flowing and slowly stopping on a floor).
    • Bounce: Leave it at 0.0 (I personally have never used it).
    • Sticky: Leave it at 0.0; you can try upping the value, although Particle Friction does a better job.
    • Roughness: Leave it at 0.001 (think of it as a randomiser for the friction).
  10. Before we start simulating, let’s go over the Simulation Options (refer to the following screenshot):
    bloodfx_substeps
  11. If possible, always stick to substeps as high as possible for your sim, as using low substeps for testing is a complete waste of time: the particles are susceptible to bursting violently.
  12. Easier said than done; there are times when you need to speed up your sim due to a very tight deadline *ahem*, so I’ve prepared the following settings with brief notes:
    • Quick Sim (MIN: 50, MAX: 150) – Use this if the blood sim doesn’t interact with any objects in the shots.
    • Decent Sim (MIN: 100, MAX: 300) – All purpose sim that is stable for most situations. Slower sim time obviously.
    • Beauty Sim (MIN: 300, MAX: 333) – The beautiful part of these settings is that the blood sim rarely goes wrong. Also *beautiful* (read: very slow) sim times…
  13. Once you have decided on the min/max substeps, hit Simulate and hope all goes well! There is always the Reset button to go back to square one in case the simulation goes awry or doesn’t look the way you want.
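For quick reference, the presets in step 12 can be jotted down as a tiny lookup helper. The preset names and the helper function below are my own shorthand for these notes, not anything from RealFlow:

```python
# Min/max substep presets from the notes above.
SUBSTEP_PRESETS = {
    "quick":  (50, 150),    # blood doesn't interact with any objects
    "decent": (100, 300),   # all-purpose, stable for most situations
    "beauty": (300, 333),   # rarely goes wrong, *beautiful* sim times
}

def pick_substeps(has_collisions, final_quality=False):
    """Pick a (min, max) substep pair based on the shot's needs."""
    if final_quality:
        return SUBSTEP_PRESETS["beauty"]
    return SUBSTEP_PRESETS["decent"] if has_collisions else SUBSTEP_PRESETS["quick"]
```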

It’s meshing time!

Whenever possible, do not mesh during simulation as it will further slow down the overall sim time!

For meshing, there is no magic number, as it all depends on the situation of the shot. I’ll briefly go through this process, as it is a matter of experimenting with the values to get the right look for the blood.

bloodfx_particlemeshsettings

Mesh

  • Type: Use Weighted isotropic (default value)
  • Weight normalization: No (default value)
  • Auto polygon size: No (default value, AND PLEASE DON’T ENABLE IT, as it is hard to predict the shape that RealFlow will automatically generate)
  • Polygon size: 0.03 (this should be a nice balance for most situations, as anything lower means a bigger mesh file size for diminishing returns in the overall shape. A higher value will result in a smaller mesh file size with a jagged look to the blood, which is not desirable unless the blood movement is very fast)
  • Smooth: 50.0 (default value; I never touch it as I prefer to use the Filters, which I cover below)

Filters (take note to use the same value for Thinning, Relaxation and Tension)

  • Filter: Yes (self-explanatory; it does add a minuscule amount of meshing time but is worth it)
  • @ Thinning: 0.3 (think of it as sharpening the edges of the fluid by shrinking the mesh, which is useful if you want that elongated, stringy blood)
  • @ Relaxation: 0.3 (helps to round off the edges and slightly stretches the blood mesh)
  • @ Tension: 0.3 (smooth out high frequency noise aka uneven surface of the blood)
  • @ Steps: 32 (think of it as iterations; a higher value results in thinner-looking blood. Refer to the animated GIF below)

bloodfx_particlemeshsteps

Importing the Blood Mesh into Maya

You need the RealFlow Connectivity Plugins for your DCC of choice before you can import the BIN mesh sequence.

bloodfx_mayaimport

The above screenshot was taken in Maya with V-Ray.

To import the BIN mesh, it is as easy as clicking Import BIN Mesh (the last icon in the toolbar/shelf) and navigating to the blood mesh directory:

bloodfx_mayabinsequence2

And here we go, the blood mesh inside Maya.

bloodfx_mayaimportedmesh

Future updates

As mentioned at the beginning of this article, I hope to further delve into the look development of the Blood FX using V-Ray in Maya and compositing the final result in Nuke.

The header image is lit using a Tokyo Dome HDRI taken from hdrlabs.com with added motion blur using Nuke’s Vector Blur.

Hope you enjoy the bloody tutorial and do watch Gantz: O!