Tuesday, June 30, 2009

Zbrush to Houdini - Texture and Normal Map Workflow

This is a quick set of notes/instructions explaining, for complete Houdini/ZBrush newbies (that would be me), how to move a model from ZBrush to Houdini, and to ensure that textures and normals make it along the way. This includes creating a very simple shader network that takes the normal and texture map files created by ZBrush and uses them in Houdini. Houdini will be used to create the UV coords and export them in an .obj file. Prior to this evening, I wasn't sure how hard this was going to be, and I was stunned at how easy it is (or perhaps I'm finally starting to grasp how the various parts of all of this are supposed to work).

Create the model in ZBrush. If you're using ZSpheres, make sure to subdivide up a number of levels *first*, then resume modeling.

[Image: After subdivision]

The reason is so that you have at least one level of subdivision, and to "tighten up" the initial subdivision level 1 geometry. Without this, things may appear a little weird when you export the model to Houdini (or Maya, Modo, et al.). I'll use the "Super Average Man" model supplied with ZBrush 3.1 for this. Don't forget to make the tool into a PolyMesh3D. Do your ZBrush editing/sculpting as you normally would.


Bring the subdivision levels back down to level 1, and export the model as an OBJ file. The default export settings seem to be OK for this.

[Image: Exporting]



Bring the model into Houdini. A simple way is to create a Geometry node in your scene, drop down a File SOP, and import the new .obj file you just created.


At this point we have the base model in Houdini. We need to create UV coords that we can then make use of in ZBrush for the Normal and Texture maps. One simple way to create some UV coords is to add a UV Unwrap SOP to your file node.
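The UV Unwrap SOP does real unwrapping (flattening the mesh into islands), but the basic idea - mapping each 3D point to a coordinate inside the unit UV square - can be illustrated with a much cruder stand-in. The `planar_uvs` helper below is a hypothetical Python sketch of a simple planar projection, not Houdini's algorithm:

```python
# Hypothetical sketch: a planar UV projection (NOT the UV Unwrap SOP's
# algorithm, which unwraps the mesh into islands). It maps each point's
# X/Y position into the unit UV square by normalizing against the
# bounding box.

def planar_uvs(points):
    """points: list of (x, y, z) tuples -> list of (u, v) in [0, 1]."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    min_x, max_x = min(xs), max(xs)
    min_y, max_y = min(ys), max(ys)
    span_x = (max_x - min_x) or 1.0   # avoid dividing by zero on a flat axis
    span_y = (max_y - min_y) or 1.0
    return [((p[0] - min_x) / span_x, (p[1] - min_y) / span_y)
            for p in points]

uvs = planar_uvs([(0.0, 0.0, 0.0), (2.0, 1.0, 0.0), (1.0, 2.0, 0.5)])
print(uvs)  # [(0.0, 0.0), (1.0, 0.5), (0.5, 1.0)]
```

Whatever the projection, the point is the same: every point ends up with a (u, v) pair inside the unit square that the texture and normal maps can later be addressed with.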


Switch to UV view and you can see what's happened: Houdini has unwrapped your geometry into the unit UV square. Make sure that the UV Unwrap node is toggled as the render/display node, and then export the geometry in OBJ format. This saves the geometry, which is unchanged, along with the UV coordinates added by the UV Unwrap SOP.
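The OBJ format stores UVs as "vt u v" records, with faces referencing them after a slash ("f pos/uv ..."). A quick way to sanity-check that the export actually carried the UVs along is to look for those records; `has_uvs` below is a made-up helper for exactly that:

```python
# OBJ stores positions as "v x y z", UVs as "vt u v", and faces as
# "f v/vt ..." records. This (hypothetical) check confirms an exported
# file actually contains UVs and that the faces reference them.

def has_uvs(obj_text):
    lines = obj_text.splitlines()
    vt_count = sum(1 for line in lines if line.startswith("vt "))
    faces_ref_uvs = any(line.startswith("f ") and "/" in line for line in lines)
    return vt_count > 0 and faces_ref_uvs

sample = """\
v 0.0 0.0 0.0
v 1.0 0.0 0.0
v 0.0 1.0 0.0
vt 0.0 0.0
vt 1.0 0.0
vt 0.0 1.0
f 1/1 2/2 3/3
"""
print(has_uvs(sample))  # True
```

If this kind of check comes back false on your exported file, ZBrush won't pick up the UVs on import.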

[Image: Exporting with UVs]





In ZBrush, import the newly saved obj file, making sure that you're at subdivision level 1.


Note that in the Tool palette's Texture options, the EnableUV button is disabled - meaning that ZBrush picked up the UV map from the new obj file.


At this point we can create the Normal map. Open up the ZMapper plugin. Make sure that the Object Space.nmap option is selected.


Click the Normal/Cavity Map tab (bottom far left tab in the UI), and, with the default options, click the Create NormalMap button on the far right. This will take a few seconds while the normals are calculated. Once it's finished you can exit ZMapper. Now we have a normal map selected into the Texture area of our tool. Select the map, and export it as a tif file.
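It helps to know what's inside that tif. An object-space normal map (the Object Space option we selected above) stores a unit normal per texel in the model's own coordinate frame, remapping each component from [-1, 1] into the [0, 1] color range. The `encode_normal`/`decode_normal` helpers here are hypothetical names, but the remapping itself is the standard convention:

```python
import math

# An object-space normal map stores a unit normal per texel by remapping
# each component from [-1, 1] into the [0, 1] color range:
#   rgb = (n + 1) / 2,  decoded back as  n = rgb * 2 - 1.

def encode_normal(n):
    return tuple((c + 1.0) / 2.0 for c in n)

def decode_normal(rgb):
    n = tuple(c * 2.0 - 1.0 for c in rgb)
    length = math.sqrt(sum(c * c for c in n))
    return tuple(c / length for c in n)  # renormalize after quantization loss

rgb = encode_normal((0.0, 0.0, 1.0))   # a normal pointing straight "up"
print(rgb)                             # (0.5, 0.5, 1.0)
print(decode_normal(rgb))              # (0.0, 0.0, 1.0)
```

This is why the map has that characteristic rainbow look rather than the uniform blue of a tangent-space map: each region of the surface points a different way in object space, so each region gets a different color.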





Let's verify that the normals are OK by creating a shader network and a shader to use. Create a SHOP Network in your geometry container.


Enter the SHOP Network node, and add a "VOP VEX Displacement SHOP" node.



This will use the normal map data to displace the geometry. Make the node a Material by selecting it and hitting "Shift" + "C" - this wraps your displacement node inside a Material node and attaches it to a sub-output node.



Enter the displacement node, and enter the following network:

[Images: the displacement VOP network]


All that's happening is the creation of a UV parameter - a special parameter that is a vector type and has its node name and Parameter Name set to "uv". Case matters: "UV", "Uv", or "uV" won't work - it's got to be "uv". This will store the current UV coordinate from the geometry as it's being evaluated for rendering. Make sure that it's set to invisible, as we do not want to promote it in our Material parent node.

This is pushed into a UV Transform node. We need to flip the V coordinate to properly handle the data output by ZBrush. There is an option in ZMapper to do this as well, but it's worth noting what can be done in Houdini without altering the ZBrush data.
The transform output is converted from a vector to floats, and then sent to a Bump Map node, which creates the displacement; the u and v coords go to the "s" and "t" inputs of the Bump Map node. Create another parameter for specifying the normal map file name by middle-clicking on the "tmap" input of the Bump Map node.
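The flip the UV Transform node performs is trivial, but worth being explicit about, since ZBrush and Houdini disagree on where V = 0 sits on the texture. Assuming that vertical origin is the only difference, the whole operation is just v' = 1 - v (the `flip_v` helper is a made-up illustration):

```python
# ZBrush and Houdini disagree about where V = 0 sits on the texture,
# so the V component is mirrored before the map lookup: v' = 1 - v.
# (Hypothetical helper; in the network this is one UV Transform node.)

def flip_v(uv):
    u, v = uv
    return (u, 1.0 - v)

print(flip_v((0.25, 0.1)))  # (0.25, 0.9)

# Applying the flip a second time brings V back to where it started
# (up to floating-point rounding), which is why it doesn't matter
# whether ZBrush or Houdini does it - as long as exactly one of them does.
u2, v2 = flip_v(flip_v((0.25, 0.1)))
```

If you flip on both ends (in ZMapper *and* in the network), the two flips cancel and the map reads upside down again.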

Finally connect the "dispN" output of the Bump Map to the "N" input of the final output node ("output1").

Jump up one level to the Material node, right-click, and select "Promote Material Parameters".


This should make the texture map parameter visible here. Enter the tif normal map file you created earlier from ZBrush.



Back in the Geometry container network, add a Material node to your geometry, and select the material you just made in your SHOP network.


At this point you should be able to render with Mantra and see the effects of the normal map, despite the low res geometry.





At this point I'll go back to ZBrush and poly-paint the model.





When that's done, go to the Tool palette's Texture sub-palette and click Col>Txr. This will create a texture map, laid out according to the UV coords we made earlier, and colorized according to what you've painted on the model/tool. Export this texture as a tif for use in Houdini.
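Col>Txr is essentially a bake: per-vertex polypaint color is written into a texture at each vertex's UV position. The toy `bake` function below is a hypothetical Python sketch of just the mapping step - ZBrush's real bake also fills in the texels between vertices:

```python
# Toy sketch of the idea behind Col>Txr (hypothetical helper, NOT
# ZBrush's algorithm): splat each vertex's polypaint color into the
# pixel its (u, v) coordinate lands on. The real bake also fills the
# space between vertices.

def bake(vertices, width, height):
    """vertices: list of ((u, v), (r, g, b)) -> {(x, y): color} image."""
    image = {}
    for (u, v), color in vertices:
        x = min(int(u * width), width - 1)
        y = min(int(v * height), height - 1)
        image[(x, y)] = color
    return image

img = bake([((0.0, 0.0), (255, 0, 0)), ((0.99, 0.99), (0, 0, 255))], 4, 4)
print(img)  # {(0, 0): (255, 0, 0), (3, 3): (0, 0, 255)}
```

The key point is that the same UVs drive both maps, which is why the texture and the normal map line up when Houdini reads them back.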

[Image: Export]





To use this in Houdini we need to modify our SHOP material. Go back to the Material node and edit it. Add a new "VOP VEX Surface SHOP" node and attach it to suboutput1's "next" input.




Enter the surface node and create the following network.
[Images: the surface VOP network]



We'll start with a setup similar to the previous normal displacement network. Add a uv parameter, flip its V component, and separate out the individual U and V components. These go into a Texture node, into the "s" and "t" inputs. The output of the texture, the color, is sent into the "diff" (diffuse color) input of a Lambert node (you could use something else; this is just a simple example). The "clr" output is then connected to the "Cf" input of the final output node. Render with Mantra, and voila! A textured model!
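For the curious, the core of what the Lambert node contributes is the textbook Lambert term: scale the diffuse color by the cosine of the angle between the surface normal and the light direction. The sketch below is a hypothetical illustration of that term (assuming unit-length vectors), not Mantra's actual implementation, which also loops over lights, handles attenuation, and so on:

```python
# Textbook Lambert term (hypothetical sketch, assuming unit vectors):
#   clr = diff * max(N . L, 0)
# The max() clamps surfaces facing away from the light to black.

def lambert(diffuse, normal, light_dir):
    n_dot_l = max(sum(n * l for n, l in zip(normal, light_dir)), 0.0)
    return tuple(c * n_dot_l for c in diffuse)

print(lambert((1.0, 0.5, 0.25), (0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))
# light head-on -> full diffuse color: (1.0, 0.5, 0.25)
print(lambert((1.0, 0.5, 0.25), (0.0, 0.0, 1.0), (0.0, 0.0, -1.0)))
# light behind the surface -> (0.0, 0.0, 0.0)
```

Since our texture color feeds the "diff" input, every texel ends up shaded this way, and since the displacement network has already perturbed N via the normal map, the low-res geometry picks up all the sculpted detail in the shading.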




Credits:
The normal map stuff was gleaned from this post at odforce for starters. I found a tutorial on ZMapper elsewhere. The bits about initial subdivision in ZB I picked up from the 3D Buzz ADP tutorials.

Monday, June 29, 2009

Finished Felix

The Death Droid (or Felix the Chicken, as my wife likes to call him) is now done, for now at least.

Some shots:

[Images: renders of the finished droid]

Thursday, June 18, 2009

More work on the arms

Our little droid now has arms, with rigging in place.



Here's an example of the back curving forward. The Line SOP that drives this gets resampled into a number of points, which in turn get used to place the various spine elements and shoulder structure.

[Image: the back curving forward]