normal map?

User Avatar
Member
313 posts
Joined: July 2005
Offline
Using ZBrush has become a new trend in building 3D models! Does anyone know of a RenderMan shader that can render ZBrush's normal maps in Houdini?

There are actually not many tutorials about ZBrush and Houdini. Does anyone have experience rendering a ZBrush model in Houdini?

Thank you for your attention!
User Avatar
Member
2199 posts
Joined: July 2005
Online
I tried it out a few months back and posted the results over at ZBrush. So has Dajuice.

http://206.145.80.239/zbc/showthread.php?t=20310&page=18&pp=15 [206.145.80.239]
The trick is finding just the right hammer for every screw
User Avatar
Member
112 posts
Joined: July 2005
Offline
Simon, I was wondering whether the VOP setup you have is for a displacement map (grayscale) or a normal map. Aren't they two different things? In ZBrush, you can create a grayscale displacement map in addition to normal maps, which store normal data as RGB vector values.
www.tirgari.com
User Avatar
Member
2199 posts
Joined: July 2005
Online
That VOP setup was Dajuice's; I did it by rewriting the displacement shader that ships with Houdini. However, I think both methods were for using greyscale maps.
You could easily write one that reads in the RGB values and uses them as the displacement vector. It wouldn't be much different from the VOP network in that previous link, but instead of multiplying N by the map amount you would just convert the RGB value to a vector and add it to P directly.
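Simon's suggestion can be sketched in plain Python (not VEX); the `amount` scale and the function names here are made up for illustration:

```python
# Illustrative sketch: decode an RGB map texel into a vector and add it to
# the point position P directly, as described above. The [0, 1] color range
# is remapped to [-1, 1] before use; "amount" is a hypothetical user scale.

def rgb_to_vector(rgb):
    # Map each channel from [0, 1] into [-1, 1]
    return tuple(2.0 * c - 1.0 for c in rgb)

def displace(P, rgb, amount=1.0):
    v = rgb_to_vector(rgb)
    return tuple(p + amount * d for p, d in zip(P, v))

P = (0.0, 0.0, 0.0)
texel = (0.5, 1.0, 0.5)       # encodes pure +Y in map space
print(displace(P, texel))     # -> (0.0, 1.0, 0.0)
```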
The trick is finding just the right hammer for every screw
User Avatar
Member
252 posts
Joined: July 2005
Offline
Be very careful here. There are two different types of normal maps: object space normal maps and tangent space normal maps.

If you use an object space normal map, you just use those normals directly as your render normals, with no addition or anything. This is only useful for objects that remain still and in the same orientation in which they were created.

The other type of normal map is based on the tangent and binormal coordinate system of your geometry. You can't simply add the RGB of your normal map to the normal in your render; you have to convert the vector from tangent space to whatever space you are shading in before applying it to your normal.
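The conversion Craig describes can be sketched in plain Python; the (T, B, N) frame and the sample values here are made up for illustration:

```python
# Sketch of a tangent-space conversion: the map texel is decoded from
# [0, 1] to [-1, 1], then expressed in the surface's (T, B, N) frame
# before it can be used as a shading normal. Vectors are plain tuples.

def normalize(v):
    m = sum(c * c for c in v) ** 0.5
    return tuple(c / m for c in v)

def tangent_to_world(rgb, T, B, N):
    t, b, n = (2.0 * c - 1.0 for c in rgb)   # decode from color range
    # world vector = t*T + b*B + n*N
    return normalize(tuple(t * T[i] + b * B[i] + n * N[i] for i in range(3)))

# A "flat" texel (0.5, 0.5, 1.0) decodes to (0, 0, 1): the unperturbed normal.
T, B, N = (1, 0, 0), (0, 0, 1), (0, 1, 0)    # example frame, N points up (+Y)
print(tangent_to_world((0.5, 0.5, 1.0), T, B, N))   # -> (0.0, 1.0, 0.0)
```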

Also, be very careful: different software calculates this tangent space differently. What looks good in Maya may not look good in FX Composer, RenderMonkey, or other HLSL-capable software/hardware. This is something we are currently dealing with here at EA, and it could be a problem in a software renderer too. (Someone wrote a plugin, available online, that lets Maya render a Maya-created normal map in Mental Ray; it might give some hints about the tangent space creation algorithms.)

That being said, the correct and easiest way to work with ZBrush in a non-real-time environment like Houdini's software renderer is to use it to get a standard black and white height map. You can then use that as a standard bump map for small detail on your lower-res mesh, or as a displacement map if desired (displacement is much more expensive, but will affect your silhouette nicely if your displacements are large). Standard bump map and displacement map algorithms calculate your normals for you, and you have a lot of flexibility for touching them up and modifying your geometry afterwards. Normal maps simply take those calculations farther to improve real-time performance, but they are really tied to the geometry used to create them and to the specific tangent space algorithms.
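The bump-map calculation the shader does for you can be sketched in plain Python; the finite-difference form and the `strength` parameter are illustrative assumptions, and real shaders work in shading space:

```python
# Rough sketch of standard bump mapping: the shader perturbs the normal
# from the height map's local slopes at render time. This is the work that
# a normal map bakes into the texture ahead of time.

def bump_normal(height, x, y, strength=1.0):
    # height is a 2D list of greyscale values; central differences give the slope
    dhdx = (height[y][x + 1] - height[y][x - 1]) * 0.5
    dhdy = (height[y + 1][x] - height[y - 1][x]) * 0.5
    n = (-strength * dhdx, -strength * dhdy, 1.0)
    m = sum(c * c for c in n) ** 0.5
    return tuple(c / m for c in n)

flat = [[0.0] * 3 for _ in range(3)]
print(bump_normal(flat, 1, 1))   # flat height map -> unperturbed normal
```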

I put in an RFE for Houdini to create and support Normal Maps a while ago, so hopefully Houdini will catch up with the rest of the 3D world in this regard at some point.

Craig Hoffman
User Avatar
Member
7709 posts
Joined: July 2005
Offline
Given a piece of geometry and a normal map, isn't it up to the surface shader to apply the normal map?
User Avatar
Member
252 posts
Joined: July 2005
Offline
Sure. But whether they work exactly as expected depends on what you are doing. If you have a nicely shaded surface and you add a tangent space normal map that was generated from a B/W height map with the NVidia Photoshop plugin, you will get good results, because you are just perturbing normals, not recreating the shading of a higher-res mesh.

But when using a program like ZBrush, NVidia's Melody, ATI's NormalMapper, or Maya to take a highly detailed high resolution model and a low resolution model and generate the normal map that makes the low resolution model shade like the high resolution one, you need to make sure you are using the same tangent space math to get the results you expect.

Honestly, many people may not see the artifacts, so it may not matter to you. But standard B/W bump maps in a software renderer are just safer and easier to use, as well as simpler to interpret and touch up. Normal maps are great for real-time rendering, but why bother with them if you are using Mantra?

-Craig
User Avatar
Member
7709 posts
Joined: July 2005
Offline
Have you tried this?
http://odforce.net/forum/index.php?showtopic=723 [odforce.net]
User Avatar
Member
252 posts
Joined: July 2005
Offline
Not really, although I have done similar things in the distant past. Houdini supports texture baking now with an undocumented “mantra -u” option, so you don't have to jump through those hoops.

But there is still the problem of artifacts at seams (textures need to be dilated out to fix this). I have an RFE in for that too.

By the way, this still wouldn't be enough to create a normal map. You need to shoot rays from the low-res geometry and keep the normal from where each ray intersects the hi-res geometry. There are probably other ways to do this in Houdini, but it isn't built in like it is in other packages.
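The bake Craig describes can be approximated in plain Python. Here the high-res mesh is stood in for by a made-up analytic height function over a flat low-res plane, so its normal can be sampled directly; a real baker would instead shoot a ray from each low-res texel along its normal and keep the normal at the hit point on the high-res mesh:

```python
import math

# Conceptual normal-map bake: for each texel of the low-res surface, sample
# the high-res surface's normal and store it as an RGB value in [0, 1].

def hires_height(u, v):
    return 0.1 * math.sin(6.28 * u)      # hypothetical surface detail

def hires_normal(u, v, eps=1e-4):
    # Central differences of the height function give the slope
    dhdu = (hires_height(u + eps, v) - hires_height(u - eps, v)) / (2 * eps)
    dhdv = (hires_height(u, v + eps) - hires_height(u, v - eps)) / (2 * eps)
    n = (-dhdu, -dhdv, 1.0)
    m = math.sqrt(sum(c * c for c in n))
    return tuple(c / m for c in n)

def bake(res=4):
    # Encode each sampled unit normal as an RGB texel in [0, 1]
    return [[tuple(0.5 * (c + 1.0) for c in hires_normal((x + 0.5) / res,
                                                         (y + 0.5) / res))
             for x in range(res)] for y in range(res)]

texels = bake()
```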

-Craig
User Avatar
Member
12429 posts
Joined: July 2005
Offline
Here is some info; feel free to add more:

http://www.odforce.net/wiki/index.php/ShaderUnwrap [odforce.net]
http://www.odforce.net/wiki/index.php/NormalMapping [odforce.net]
Jason Iversen, Technology Supervisor & FX Pipeline/R+D Lead @ Weta FX
also, http://www.odforce.net [www.odforce.net]
User Avatar
Member
330 posts
Joined: July 2005
Offline
Is normal mapping a way to get non-linear displacements?
User Avatar
Member
112 posts
Joined: July 2005
Offline
You don't get displacement with a normal map…the surface geometry doesn't actually change.

Normal maps seem to act more like sophisticated bump maps.

In ZBrush, for example, you can basically create a high poly model with lots of detail on the surface, then export that surface detail as a normal map. The surface normal of each polygon on the detailed model is defined by a 3-component vector (X, Y, Z) pointing in that polygon's normal direction. Normal maps take the X, Y, Z values and convert them to RGB color information, since RGB is also a 3-component vector.
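The XYZ-to-RGB packing described above is just a linear remap; a tiny round trip in plain Python (illustrative only):

```python
# Pack a unit normal from [-1, 1] into the [0, 1] color range, and back.

def encode(n):
    return tuple(0.5 * (c + 1.0) for c in n)

def decode(rgb):
    return tuple(2.0 * c - 1.0 for c in rgb)

n = (0.0, 0.0, 1.0)            # a normal pointing straight out in map space
print(encode(n))               # -> (0.5, 0.5, 1.0)
print(decode(encode(n)))       # -> (0.0, 0.0, 1.0), recovered exactly
```

That (0.5, 0.5, 1.0) value is why tangent space normal maps are mostly that familiar pale blue: flat areas encode the unperturbed normal.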

Then you can have a very low poly version of the same model and apply the normal map to it. At render time, the renderer reads the RGB values from the map back as surface normal information and applies them to the low poly model. The low poly model renders with as much surface detail as the original high poly model, but the actual geometry remains low poly.
Edited by - Dec. 22, 2004 17:41:05
www.tirgari.com
User Avatar
Member
330 posts
Joined: July 2005
Offline
good explanation, thanks

i can see why it's good for games
User Avatar
Member
252 posts
Joined: July 2005
Offline
Normal maps give the exact same effect as bump maps: nothing more, nothing less. They just bake more of the bump map calculation into the texture rather than making the shader calculate everything necessary for bump mapping, which is why they are good for real-time rendering.

They do not displace, although real-time parallax mapping can look like displacement mapping, except around the edges, since it doesn't really displace.

-Craig
User Avatar
Member
112 posts
Joined: July 2005
Offline
In Houdini, you can actually set up a simple network to demonstrate the basic concept behind normal mapping… not a true normal map system, but merely one to show how the data flows.

Create a grid (maybe with a higher poly count), apply a VEX Mountain SOP to the surface, wire the grid into a Point SOP, and add normals to the grid points. Then, in the color parameter of the Point SOP, have the R, G, and B values reference the nx, ny, and nz components of the normal. Your grid will take on a rainbow range of colors, roughly similar to what normal maps look like!

Again, the above is merely a rough demo of the concept behind normal maps. One other thing to be aware of: the example deals with point normals, not poly normals, which are what normal maps deal with.
www.tirgari.com
User Avatar
Member
14 posts
Joined: Dec. 2008
Offline
I am trying normal mapping with the example scene from the Houdini Exchange: http://www.sidefx.com/exchange/display.php?projectid=252&old=on&hide=on&sort=lastModified [sidefx.com]
The tangent space one doesn't look right, though in the viewport it looks as expected. Any idea why it doesn't work?
EDIT: VEX compiler errors… let's see

Attachments:
zbrush.jpg (76.2 KB)

User Avatar
Member
86 posts
Joined: Aug. 2010
Offline
Hi, I got a really cool digital asset in the freebies section that allows me to use tangent space normal maps from ZBrush. However, in Houdini 12 I've noticed that it only works when I'm rendering with micropolygons (it doesn't work with raytracing). Anyone know anything about this?
the xon can