How to do face weighted normals in Houdini?

Member
23 posts
Joined: April 2016
Hi. I'm a newbie and I'm looking for a way to do face weighted normals in Houdini for game assets. (http://wiki.polycount.com/wiki/Face_weighted_normals)
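As I understand it, the idea is that each face's normal contributes to a shared vertex normal in proportion to that face's area, so large faces dominate and the narrow bevel faces barely bend the normal. Roughly:

$$N_P = \operatorname{normalize}\Big(\sum_{f \ni P} A_f \, \hat{n}_f\Big)$$

where the sum runs over the faces $f$ touching point $P$, $A_f$ is the area of face $f$, and $\hat{n}_f$ is its unit normal.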

The left box was made in Houdini with a PolyBevel and nothing else. The right box was made in Blender, beveled, and then had its normals edited with a face weighted normals script.

Attachments:
face weighted normals.jpg (16.6 KB)

Staff
5156 posts
Joined: July 2005
I think you want to add vertex normals; the easiest way to do that is with a Normal SOP, then adjust the Cusp Angle to suit.
Member
23 posts
Joined: April 2016
That method gives me hard edges, which I don't really want, but it could be useful for some other things I guess. What I'm really looking for are smooth edges that kind of look like a high poly (subdivision) mesh when it's really just a simple mesh.

Here's another example that maybe makes more sense. The left mesh uses face weighted normals and the right mesh uses a Normal SOP with a cusp angle of 30. The geometry is identical.

If there's no such thing built in I guess I could write a script for it instead.

Attachments:
face weighted normals2.png.jpg (10.9 KB)

Staff
327 posts
Joined: July 2005
HenrikVilhelmBerglund
That method gives me hard edges, which I don't really want, but it could be useful for some other things I guess. What I'm really looking for are smooth edges that kind of look like a high poly (subdivision) mesh when it's really just a simple mesh.

I don't think any built-in node provides the behaviour you described, but a custom tool can be built using the Attribute Wrangle SOP. Attached is an example implementation.

Attachments:
custom_point_normal.hip (58.6 KB)

Member
23 posts
Joined: April 2016
That looks great, thanks! Now I just need to find out what makes that work… :shock:
Staff
327 posts
Joined: July 2005
HenrikVilhelmBerglund
That looks great, thanks! Now I just need to find out what makes that work… :shock:

Wrangle nodes let you run a snippet of VEX code on your data. The Attribute Wrangle SOP in this example runs over points, so the code is evaluated once for each point. Also, “v@N” means a vector attribute called “N” on our data (in this case, points). Here is an easier-to-read version of the code:

vector nml = { 0, 0, 0 };
// iterate over each primitive that references the current point
foreach (int pr; pointprims(@OpInput1, @ptnum))
{
    // use the primitive's area as its weight in the weighted sum
    float w = primintrinsic(0, "measuredarea", pr);
    // accumulate the primitive's normal, evaluated at the primitive's
    // center, scaled by the weight
    nml += w * prim_normal(@OpInput1, pr, 0.5, 0.5);
}
// normalize the weighted sum
v@N = normalize(nml);
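To try it, drop an Attribute Wrangle right after the bevel, leave Run Over set to Points, and paste the snippet in.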
Edited: May 2, 2016 16:35:13
Member
333 posts
Joined: Oct. 2012
Derrick, thanks so much for this. Very useful!
Member
4189 posts
Joined: June 2012
This would be a nice addition to the presets menu in the Attribute Wrangle - very useful! 8)
Member
23 posts
Joined: April 2016
derrick
Wrangle nodes let you run a snippet of VEX code on your data. The Attribute Wrangle SOP in this example runs over points, so the code is evaluated once for each point. Also, “v@N” means a vector attribute called “N” on our data (in this case, points).

Thank you. Makes a lot of sense now!
Member
1743 posts
Joined: March 2012
It won't be in Houdini 15.5, but in the major release of Houdini after that, there'll be an option in the Normal SOP for this.

There'll be a new “Weighting Method” parameter, which you can set to “By Face Area” instead of the default “By Vertex Angle”, or to “Each Vertex Equally”, the near-equivalent of what the Facet SOP does. It can't be backported, because there are a bunch of changes in that code, e.g. parallelizing it, that make it produce very slightly different results from previous versions of Houdini, and slightly changing normals tends to significantly change results for some types of simulations.
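For anyone wondering what vertex-angle weighting actually does, here is a rough wrangle sketch of the idea, running over points; it's only an illustration, not the Normal SOP's actual implementation:

vector nml = { 0, 0, 0 };
foreach (int vtx; pointvertices(0, @ptnum))
{
    int pr = vertexprim(0, vtx);       // polygon owning this vertex
    int n = primvertexcount(0, pr);
    int i = vertexprimindex(0, vtx);   // this vertex's index in that polygon
    // positions of the neighbouring vertices within the same polygon
    vector prev = point(0, "P", vertexpoint(0, primvertex(0, pr, (i + n - 1) % n)));
    vector next = point(0, "P", vertexpoint(0, primvertex(0, pr, (i + 1) % n)));
    // the corner angle at the current point is the weight
    float w = acos(clamp(dot(normalize(prev - v@P), normalize(next - v@P)), -1.0, 1.0));
    nml += w * prim_normal(0, pr, 0.5, 0.5);
}
v@N = normalize(nml);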
Writing code for fun and profit since... 2005? Wow, I'm getting old.
https://www.youtube.com/channel/UC_HFmdvpe9U2G3OMNViKMEQ
Member
2 posts
Joined: July 2018
This is the technique
Ant
Member
1 post
Joined: Sept. 2021
The calculations provided don't quite give the desired result, which is to have the vertex normals face in the direction of the largest face.

Taking only the largest face isn't always desirable though: on corner pieces, where the surrounding faces are all the same size, the normal should take all of them into account.

The best approach is to take the weighted normals of all the faces, but only use faces whose area is above a certain threshold relative to the largest face (I use 0.666). That way, as the shape approaches something more even, the normals revert to a plain face average.

Here is my code for that:

// requires two spare parameters on this wrangle: a "snap" toggle and a
// "tolerance" float (create them with the "Create spare parameters" button)
vector nml = { 0, 0, 0 };
vector offsets[];
float weights[];

// iterate over each primitive that references the current point
foreach (int pr; pointprims(@OpInput1, @ptnum))
{
    // use the primitive's area as its weight
    float w = primintrinsic(0, "measuredarea", pr);
    vector offset = w * prim_normal(@OpInput1, pr, 0.5, 0.5);

    // store the primitive's weighted normal and its weight for filtering below
    append(offsets, offset);
    append(weights, w);
}

float mx = max(weights);

for (int i = 0; i < len(offsets); ++i)
{
    // only keep faces whose area is close enough to the largest face
    float ratio = weights[i] / mx;
    if (ch("snap") < 1 || ratio >= ch("tolerance"))
    {
        nml += offsets[i];
    }
}

v@N = normalize(nml);

Attachments:
houdini_byFaceArea.png (703.4 KB)
houdini_byFaceAreaWithTolerance.png (678.8 KB)
houdini_T8B2jfQjKf.png (729.2 KB)
houdini_weightedNormals.gif (3.2 MB)

Member
2 posts
Joined: July 2018
Ideally it would be good if this could work with the bevel history, so that the newly introduced surfaces are ignored and only the original surfaces contribute to the face weighted vertex normals. I'm thinking of the AMTnormals lite plug-in for Maya; something that works the way it does would be great.
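A rough sketch of that idea in a point wrangle, assuming PolyBevel was set to put its new polygons into a primitive group (the group name "bevel" below is just an example):

// skip faces created by the bevel when accumulating the weighted normals
vector nml = { 0, 0, 0 };
foreach (int pr; pointprims(0, @ptnum))
{
    if (inprimgroup(0, "bevel", pr))
        continue;   // ignore the newly introduced bevel surfaces
    float w = primintrinsic(0, "measuredarea", pr);
    nml += w * prim_normal(0, pr, 0.5, 0.5);
}
// points surrounded only by bevel faces fall back to a plain average
if (length2(nml) == 0)
{
    foreach (int pr; pointprims(0, @ptnum))
    {
        nml += prim_normal(0, pr, 0.5, 0.5);
    }
}
v@N = normalize(nml);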
Ant
Member
8 posts
Joined: March 2022
jflynnxyz
The calculations provided don't quite give the desired result, which is to have the vertex normals face in the direction of the largest face.

Taking only the largest face isn't always desirable though: on corner pieces, where the surrounding faces are all the same size, the normal should take all of them into account.

The best approach is to take the weighted normals of all the faces, but only use faces whose area is above a certain threshold relative to the largest face (I use 0.666). That way, as the shape approaches something more even, the normals revert to a plain face average.

Amazing, thanks a lot for this!