Why does Ambient Occlusion in Houdini do this?!

Member
319 posts
Hi,

This is a problem I've had before when doing AO in Houdini and have never properly solved. Why does it produce black areas where geometry meets? I have a pre-fractured brick wall for use in a DOPs simulation, and I want to render out the simulation using AO, but huge black areas appear where each brick is fractured. Although each brick is fractured into two pieces, each brick is aligned and pointing in the same direction, so I don't think this should happen; it doesn't seem to happen in other packages either. Image of the render attached.

Thanks for any tips!

Attachments:
Black_Areas.jpg (48.2 KB)

Staff
2619 posts
Joined: July 2005
Dean_19
Hi,

This is a problem I've had before when doing AO in Houdini and have never properly solved. Why does it produce black areas where geometry meets? I have a pre-fractured brick wall for use in a DOPs simulation, and I want to render out the simulation using AO, but huge black areas appear where each brick is fractured. Although each brick is fractured into two pieces, each brick is aligned and pointing in the same direction, so I don't think this should happen; it doesn't seem to happen in other packages either. Image of the render attached.

Thanks for any tips!

Do the bricks have shared vertices? Is the normal being interpolated across the faces of the bricks?

You can either unique the points or turn off smooth shading of the geometry. I think that should help.
Member
319 posts
The bricks are just polys that have been cookied into two sections. I haven't done anything to the normals. I seem to have solved the problem by adding a Resample SOP to the network. Is this a problem others usually encounter?
Member
4271 posts
Joined: July 2005
Dean_19
The bricks are just polys that have been cookied into two sections. I haven't done anything to the normals. I seem to have solved the problem by adding a Resample SOP to the network. Is this a problem others usually encounter?

All the time. As Mark said, it's probably due to polygons having shared points. Try the Facet SOP and turn on Cusp Polygons.
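
If you'd rather do it from Python than by hand, something like this rough sketch should work (untested; the object path is made up and the Facet SOP parameter names are from memory, so double-check them in your build):

import hou

geo = hou.node("/obj/brickwall")      # hypothetical object containing the fractured wall
last_sop = geo.displayNode()          # whatever SOP you are currently rendering

# Append a Facet SOP so the hard edges get per-face (cusped) normals.
facet = geo.createNode("facet", "cusp_normals")
facet.setFirstInput(last_sop)

facet.parm("cusp").set(1)             # "Cusp Polygons" (assumed parm name)
facet.parm("angle").set(20)           # cusp angle in degrees (assumed parm name)
# Alternatively, facet.parm("unique").set(1) to unique the points instead.

facet.setDisplayFlag(True)
facet.setRenderFlag(True)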

Attachments:
fused.jpg (6.5 KB)
unfused.jpg (6.4 KB)

if(coffees<2,round(float),float)
Member
12603 posts
Joined: July 2005
Dean_19
I haven't done anything to the normals.

Wolfwood and Mark have hit the nail on the head for sure; you must pay attention to your normals in order to tell Mantra whether you want your surface to be shaded smoothly (fused vertices causing interpolated surface normals) or not (unique vertices giving a faceted look).
Jason Iversen, Technology Supervisor & FX Pipeline/R+D Lead @ Weta FX
also, http://www.odforce.net
Member
319 posts
I'm a bit confused. I solved the problem by using a Resample SOP, so what has this done to the polys to stop the black areas? I'm not sure exactly what ‘interpolated normals’ are. In Wolfwood's example the black areas are present in the image named ‘fused’, so does that mean the black areas appear after the vertices have been fused together?
Member
4271 posts
Joined: July 2005
Ambient occlusion works by firing lots of rays from the surface, distributed around the surface normal. In the case of polygons, points can be shared between polygons. When the points are shared, the normals for those points are interpolated between the polygons; that way you get a nice smooth look. The problem with ambient occlusion is that these interpolated normals can work against you, which is why you have black areas in my fused example.
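
To make that concrete, here is a little stand-alone Python toy (nothing Houdini- or Mantra-specific, just the geometry of the situation): it scatters occlusion-style sample directions around a normal on the top face of a box and counts how many immediately point back into the surface. With the true face normal none do; with a normal averaged 45 degrees over the edge, roughly a quarter of them do, and that is what shows up as the dark band.

import math, random

def hemisphere_samples(normal, count=1000, seed=0):
    # Uniform directions on the hemisphere centred on `normal`.
    rng = random.Random(seed)
    nx, ny, nz = normal
    samples = []
    while len(samples) < count:
        x, y, z = rng.gauss(0, 1), rng.gauss(0, 1), rng.gauss(0, 1)
        length = math.sqrt(x * x + y * y + z * z)
        if length < 1e-9:
            continue
        d = (x / length, y / length, z / length)
        if d[0] * nx + d[1] * ny + d[2] * nz < 0.0:
            d = (-d[0], -d[1], -d[2])   # flip into the hemisphere around the normal
        samples.append(d)
    return samples

def false_occlusion(samples):
    # Fraction of rays pointing into the top face of the box (z < 0).
    return sum(1 for d in samples if d[2] < 0.0) / float(len(samples))

face_normal = (0.0, 0.0, 1.0)      # true top-face normal (like Ng)
s = 1.0 / math.sqrt(2.0)
fused_normal = (s, 0.0, s)         # averaged with the side face at a fused edge point

print("rays fired into the surface, face normal:  %.2f" % false_occlusion(hemisphere_samples(face_normal)))
print("rays fired into the surface, fused normal: %.2f" % false_occlusion(hemisphere_samples(fused_normal)))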

Attachments:
fused.jpg (9.8 KB)
unfused.jpg (10.1 KB)

if(coffees<2,round(float),float)
Member
4271 posts
Joined: July 2005
In the following image the cluster of normals represents what happens at render time: a bunch of rays being shot from the surface, distributed around the surface normal.

Look at the Fused box on the left. Since the normals are interpolated across the adjacent polygons, a lot more samples at the edges will be fired in the wrong direction.

In the images I posted above, the dark areas on the sides of the box are due to a lot more samples being fired underneath the box instead of off to the side.

Attachments:
scatter.jpg (26.9 KB)

if(coffees<2,round(float),float)
Member
319 posts
Hmm, I see. I tested it by rendering a box and indeed you are correct. I used a Facet SOP and turned on Cusp Polygons; by turning on point numbers I can see that this simply adds unique vertices for each face. This does solve the problem, but I am still a little confused. I was always under the impression that having overlapping vertices was bad from a modelling perspective, or is this only the case if you are modelling from a low-poly cage and intending to subdivide? I tend to prefer to model in 3ds Max and do effects work in Houdini, and I've spent a lot of time welding together overlapping vertices to prevent rendering artefacts in Max.

Does Houdini's AO work differently from other packages, then? I haven't had this problem in, for example, 3ds Max using the Light Tracer (which is simply an ambient occlusion ‘quick setup’ lighting solution).

Attachments:
lightTracer.jpg (65.1 KB)

Member
319 posts
Where do you turn the display of point normals on in Houdini?
Member
7046 posts
Joined: July 2005
What I did to avoid this problem is modify the Occlusion VOP so that it (optionally) uses the geometric normal (Ng) instead of the “regular” normal (N). Ng is never interpolated, so you don't have to do the point-unsharing stuff, which is an extra and sometimes annoying step.

This also works well with displacement shaders (use Ng instead of N to displace along), since unsharing the points will result in cracks where the unshared edges are displaced.
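
A tiny bit of arithmetic (plain Python, not shader code) to illustrate the crack Peter mentions: take a point on a 90-degree edge shared by a top face and a side face and displace it 0.1 units "along the normal". If the point has been unshared, each copy moves along its own face normal and the two copies end up apart; if it stays fused, the single averaged normal keeps the surface watertight, which is exactly the interpolated normal that upsets occlusion.

import math

def displace(point, normal, amount):
    return tuple(p + amount * n for p, n in zip(point, normal))

edge_point  = (1.0, 0.0, 1.0)   # a point on the shared edge
top_normal  = (0.0, 0.0, 1.0)
side_normal = (1.0, 0.0, 0.0)
amount      = 0.1

# Unshared (cusped) points: each copy moves along its own face normal.
copy_top  = displace(edge_point, top_normal, amount)
copy_side = displace(edge_point, side_normal, amount)
gap = math.sqrt(sum((a - b) ** 2 for a, b in zip(copy_top, copy_side)))
print("unshared copies end up %.3f units apart -> a visible crack" % gap)

# Fused point: one point, one averaged normal, no crack.
s = 1.0 / math.sqrt(2.0)
print("fused point displaces to", displace(edge_point, (s, 0.0, s), amount))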

Cheers,

Peter B
Cheers,

Peter Bowmar
____________
Houdini 20.5.262 Win 10 Py 3.11
Member
319 posts
Hey Pete, it's Dean, the guy who talked to you about exploding the building for my project!

Hmm… modifying the Occlusion VOP, so if I go into the VOP net of my AO shader and rewire it to use Ng instead of N, I won't get this problem?
Member
7046 posts
Joined: July 2005
Yep, that should do it! Drop by the lab sometime :)

Cheers,

Peter b
Cheers,

Peter Bowmar
____________
Houdini 20.5.262 Win 10 Py 3.11
Member
319 posts
I'd be there in a flash to digest your pearls of Houdini wisdom, but I've had to go back to my parents' in Worcester as I've had to move out of my house and don't move into the new one until the 10th of July (the start of our last term)!

Did you get the email I sent you with some VEX questions? I only sent it earlier today, but I'm not sure my uni email is working properly; I've emailed a few people lately and no one has responded, including Phil, who is usually quite prompt at replying…!
Member
4271 posts
Joined: July 2005
The important thing to remember when doing renders is that there is no “always works” solution.

In some cases using Ng or unfused points is exactly what you want (like machined edges and such), but in other cases using the interpolated normals (N) may be what you want instead. The key is to understand what is going on and how to use it to your advantage.

Attachments:
NotInterpolated.jpg (11.0 KB)
Interpolated.jpg (10.9 KB)

if(coffees<2,round(float),float)
Member
319 posts
I understand. I've learnt some valuable stuff! Thanks a lot, guys, much appreciated!
Member
4271 posts
Joined: July 2005
One last picture. Interpolation of normals happens at render time, when the polygons are diced up into micropolygons. Below is an example of what happens.

The interpolation of normals (or any attribute, for that matter) is important for giving a nice smooth look, but it can work against you when you want a nice hard edge somewhere.
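
A rough plain-Python sketch of what that dicing-time interpolation means (just the idea, not how Mantra actually does it): blend the four point normals of a quad at each micropolygon centre and renormalise. With cusped (identical) point normals every micropolygon keeps the face normal; with fused (averaged) point normals the micropolygons near the edge lean over the side, which is where the occlusion samples start going astray.

import math

def lerp(a, b, t):
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def normalize(v):
    l = math.sqrt(sum(x * x for x in v))
    return tuple(x / l for x in v)

def diced_normals(n00, n10, n01, n11, steps=4):
    # Bilinearly interpolate the four corner normals at micropolygon centres.
    out = []
    for i in range(steps):
        for j in range(steps):
            u, v = (i + 0.5) / steps, (j + 0.5) / steps
            out.append(normalize(lerp(lerp(n00, n10, u), lerp(n01, n11, u), v)))
    return out

up = (0.0, 0.0, 1.0)
s = 1.0 / math.sqrt(2.0)
leaning = (s, 0.0, s)   # point normal averaged with the neighbouring side face

print("cusped quad:", diced_normals(up, up, up, up)[:2])
print("fused quad: ", diced_normals(leaning, leaning, up, up)[:2])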

Attachments:
normals.jpg (14.6 KB)

if(coffees<2,round(float),float)
Member
319 posts
So the normals of the geometry are interpolated whether the points are fused or not, since it happens at render time anyway?

But the difference is that the renderer knows what you are trying to achieve (i.e. hard or soft edges) by looking at the initial state of the normals before the geometry is cut into micropolys?
Member
4271 posts
Joined: July 2005
Dean_19
So the normals of the geometry are interpolated whether the points are fused or not, since it happens at render time anyway?

But the difference is that the renderer knows what you are trying to achieve (i.e. hard or soft edges) by looking at the initial state of the normals before the geometry is cut into micropolys?

Your geometry can either have user-created normals that you have added/smoothed/tweaked, or, if you don't have any normals on your geometry when you render, Houdini/Mantra will create some for you. It's at this phase that fused/unfused points come into play. If a point is shared between a couple of different polygons, that point's normal will be an average of the different polygons' normals. If your point isn't fused, then the normal is just based on that polygon's normal and resembles Ng.

Once the points on the polygon have either user-created normals or automatically generated normals, the polygon is split up into micropolygons. These micropolygons get assigned two types of normals. The first is Ng, which is the true normal of the micropolygon without any interpolation. The second is N, which is interpolated from the points that make up the polygon. At this stage the renderer doesn't have any knowledge of whether your polygon's points are fused or not.
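
For what it's worth, the "automatically generated normals" step above boils down to something like this (illustrative Python only): a fused point's normal is the normalised average of the face normals of every polygon touching it, while an unfused point only sees its one face and so matches Ng.

import math

def normalize(v):
    l = math.sqrt(sum(x * x for x in v))
    return tuple(x / l for x in v)

def point_normal(face_normals):
    # Average the face normals of every polygon sharing the point.
    summed = [0.0, 0.0, 0.0]
    for n in face_normals:
        for k in range(3):
            summed[k] += n[k]
    return normalize(summed)

top, side = (0.0, 0.0, 1.0), (1.0, 0.0, 0.0)

print("unfused point (one face):", point_normal([top]))        # equals the face normal, like Ng
print("fused point (top + side):", point_normal([top, side]))  # leans 45 degrees over the edge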
if(coffees<2,round(float),float)
Member
7865 posts
Joined: July 2005
Dean_19
Where do you turn the display of point normals on in Houdini?

It's the button right underneath where you turn on point numbers, i.e. in the viewport's right vertical toolbar, about the fourth button from the top. The catch is that you won't see anything if you haven't added point normal attributes. There are a few ways to compute default normals for your geometry; I personally prefer using a Facet SOP and turning on just Pre-compute Normals. By doing that on geometry that has fused and unfused points, you get the pictures that Wolfwood has been posting.