From displacement map to actual geometry + creating a world/object space normal map on a lowpoly created from the highpoly

User Avatar
Member
75 posts
Joined: Feb. 2019
Offline
Hey All!

Here is my newest challenge:
I have a simple subdivided, latticed mesh with UVs on it, and I assigned a Principled Shader to the mesh with an externally created displacement map.
Now I can see the displacement on the surface in the scene view, and it looks OK, but I want to create real, actual geometry from this displacement-driven surface. How?
I need some kind of “freeze displacement into geometry” magic node.

I'm not there yet, but I will also need the ability to bake a world/object space normal map from the “frozen” highpoly onto the lowpoly, so I can create my tangent space normal map in any external software (if I have to).

Hip file attached, but I didn't attach the input map used for the displacement.
Any map will do for testing; I just need the magic “convert displacement to real geometry” solution.

Thank You for your help in advance!
Edited by ostyascukor - July 15, 2019 18:01:20

Attachments:
displacement_vs_realgeo.jpg (403.8 KB)
lattice+uv+displacement+wsnormal.hip (291.5 KB)

Member
900 posts
Joined: Feb. 2016
Offline
hi, you can use an Attribute from Map node to import the color information onto the geometry and then use that to drive the displacement, something like this in a VEX wrangle:

float mult = chf("mult");
@P += @N * @Cd.r * mult;
Edited by Andr - July 16, 2019 02:23:45
Member
75 posts
Joined: Feb. 2019
Offline
Sorry, but what is a VEX wrangle? (Do you mean an Attribute Wrangle?)

It would be great if I could directly use the map that is connected as displacement, with the actual material settings (such as effect scale and offset).
My displacement map comes “through” a Principled Shader.

OK, here is my final goal, so maybe with this info you can help me more efficiently:

I want to use subdivided, latticed, UV'd meshes together with a displacement map + base color (even if it's just a constant color used as a material ID) + tangent normal map (a tangent map “baked” on a simple plane, like a tileable “2D” texture, that only shows on the latticed, subdivided mesh).
Once I have my final mesh set (for example rock walls with a stony ground), I want to “freeze” each of my highpoly meshes with its own displacement map (with the settings coming from the material).
Next I want to merge those meshes into one continuous surface (like a convert-to-VDB) and then optimize the merged mesh into a lowpoly.
After that I want to UV my lowpoly mesh, and then calculate:
- the highpoly displacement-modified normals, combined with the tangent space normal map (also added in the material), into one world/object space normal map based on the lowpoly and its UVs
- each base color of each material (even the constant-color ones) into one base color map based on the lowpoly and its UVs

But as a first step, I would like to “freeze” the geometry driven by the displacement map (added and set by the material) to get a workable (mergeable) highpoly mesh that I can convert into a lowpoly mesh.

Thanks for the fast reply and the help!
Member
75 posts
Joined: Feb. 2019
Offline
Another problem (shot attached) is displacement map stepping:
I have a 16-bit PNG as the displacement map, and I still get stepping (the map works correctly in Substance Designer and Marmoset Toolbag).
Is there any trick, like a specific file format (in ZBrush there is, or was, a trick: only the PSD format worked properly for applying a displacement map to a surface), or anything else?
Edited by ostyascukor - July 16, 2019 05:07:19

Attachments:
disp_stepness.jpg (191.7 KB)

Member
900 posts
Joined: Feb. 2016
Offline
I never figured out how to get rid of this problem, but I'm not a rendering guy.
Are you seeing it in the viewport only, or in the render too?

Try playing with the texture filtering settings first; I think I was getting good results with Gaussian at a width of 20-30. Or maybe it was another kind of filter (like Hanning). Do some testing.

And if that doesn't do the trick, maybe try other formats like OpenEXR.
Edited by Andr - July 16, 2019 11:12:50

Attachments:
filtertexture.JPG (53.1 KB)

Member
474 posts
Joined: July 2005
Offline
Hi,

I have no idea how to get the attributes from the Principled Shader directly. But if you just want to displace a geometry along its normal, you can use the colormap() function to get the color information for each vertex by its UV. After that you can save this information onto the vertex's point. Each point can potentially be shared by more than one vertex, but this should be no problem, since the color/UV will be the same.
Then you can use this point information for displacement, as Andrea described.

Ok this can be a first step.
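A minimal sketch of that idea in a point wrangle (the uv attribute is assumed to be promoted to points, and the `map`/`scale` parameter names are just examples, not part of any existing setup):

```vex
// Point wrangle: sample the displacement map at each point's UV
// and push the point along its normal by the sampled value.
vector uv = v@uv;                            // assumes uv promoted to points
vector clr = colormap(chs("map"), uv.x, uv.y);
@P += normalize(@N) * clr.x * chf("scale");
```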
Edited by Aizatulin - July 16, 2019 17:51:40

Attachments:
lattice+uv+displacement+wsnormalX.hipnc (298.7 KB)

Member
75 posts
Joined: Feb. 2019
Offline
Thank you for both answers!

Sorry, I forgot to mention that I already “solved” the displacement-to-real-geometry problem (I just “stole” a very simple method that doesn't need any scripting).

I tried moving my UVs together with the real geometry deformation (displacement). It's not as fast as the material version in the viewport (is what we see in the viewport in real time OpenGL displacement?), but it's surprisingly fast, enough to set up my mesh almost in real time.

Now I have two big questions:

1. How can I tell Houdini to bake the normals and base color (or any info) from the highpoly to the lowpoly?
In the output network I see I can assign a UV object (I guess that's the lowpoly) per object, but I can't point inside that object to say “this is the lowpoly and this is the High Res Object”.
So my question is: how can I get my lowpoly (created with a PolyReduce node) out of the highpoly “object” network into another object network, so I can (hopefully) bake my high res info onto the UVs of the lowpoly mesh? I even made output nodes, but that doesn't help; the baker only sees the object node that contains both the highpoly and the lowpoly meshes (shot attached).

2. I have a tangent normal map (part of my “texture set” along with the displacement map) that should be applied to the highres mesh (yes, on the high res, not the lowres mesh).

Question: is there a way to bake the highpoly mesh's displaced positions (which I already have as actual geometry) together with the tangent space normal map applied on the highres mesh, so that during the baking process the tangent space normals come out as world space normals?
I attached a shot that shows how this works in Modo.
I assign the tangent map as a (tangent) normal map in the material, and then bake the “normal output” into a world space normal map, which contains not just the mesh normals but also the added tangent normal map converted into “world space”.

It's important to do it like that, because I rotate my high res objects (plus the UVs can be rotated on the high res object), and I need the freedom to create my lowpoly (not highpoly) tangent normals in specialized software (or just to make more than one tangent version). It's also important to get this info from the high res to the lowres mesh.

Thank you for your patience!
(Sorry for the word “combinated”; yes, it should be “combined”.)
Edited by ostyascukor - July 17, 2019 03:05:45

Attachments:
lowpolyhighpoly.jpg (478.2 KB)
t2_ws.jpg (799.6 KB)

Member
474 posts
Joined: July 2005
Offline
Hi,

First of all: I'm not very experienced with normal mapping, but I'm also interested in this topic right now.
For your question 2:
You can try using PolyFrame to get tangentu, tangentv and N (the normal), which together should represent the tangent space.
I will try this myself later, but it will take some time.
The other thing is that if you want to transform your geometry, you have to convert your normal map into a height map, which is not trivial (I don't have an easy solution).
From what I have found, a naive approach can be to integrate/accumulate your normal map row by row and column by column with border constraints. You can use the normal map's X and Y components for this.
The problem will be that the intersections won't necessarily match, so you may have to take the average or something, and add the delta at this position to each following point of the row/column. I will try this myself soon.
This is only an idea.
Member
75 posts
Joined: Feb. 2019
Offline
Hey Aizatulin!

I don't want to use my normal map to transform (deform) my geometry. I have a displacement map doing that.

As far as I know, you can't transform a normal map into a height map.
I mean you can, but it will give a false result, because a normal map is a “fake” that only represents the per-pixel divergence of the face normal from the default (in a normal map the default is (0.5, 0.5, 1), or 128-128-255 in RGB, and the red and green channels represent the 2D divergence), whereas a height map represents the rise and fall of the surface.
A normal map “bends” the mesh normals with its tangent values; a height map (displacement map) pushes and pulls the surface and has nothing to do with the original mesh normals (it only pushes vertex positions, which of course can affect the normals).
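For reference, the encoding described above is just a remap between color values in [0, 1] and normal components in [-1, 1]; in a wrangle it would read:

```vex
// color -> tangent space normal: [0, 1] -> [-1, 1]
// the flat color (0.5, 0.5, 1) decodes to the unbent normal (0, 0, 1)
vector n = @Cd * 2.0 - 1.0;

// normal -> color: [-1, 1] -> [0, 1]
@Cd = n * 0.5 + 0.5;
```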

What I need regarding the normal map is what I attached in my second shot.
Basically it takes the normals of the mesh and mixes them in somehow (sadly I don't know how; maybe I should compare the mesh's world space normal map with and without the tangent map applied, to identify the mixing method). But the important thing is not just the mix: it's converting the tangent space normal map into world space normals, taking each tangent map pixel on the real mesh into account.
And you can only do that by baking the highres mesh with the tangent normal map on it, to get back a world space normal map that contains:
- the high res mesh normals (not the positions)
- the tangent space normal map as a world space “normal map”, mixed with the high res mesh normals

Please also don't forget to mention any useful info about baking from a high res to a low res mesh in Houdini, if you have any ;-)
Member
474 posts
Joined: July 2005
Offline
As far as I know, you can't transform a normal map into a height map.
I mean you can, but it will give a false result, because a normal map is a “fake” that only represents the per-pixel divergence of the face normal from the default (in a normal map the default is (0.5, 0.5, 1), or 128-128-255 in RGB, and the red and green channels represent the 2D divergence), whereas a height map represents the rise and fall of the surface.
A normal map “bends” the mesh normals with its tangent values; a height map (displacement map) pushes and pulls the surface and has nothing to do with the original mesh normals (it only pushes vertex positions, which of course can affect the normals).

Substance Designer, for example, has a “Normal to Height” function, which converts a tangent space normal map into a height map. I have no idea if this is implemented in Houdini, but I think it is not, so my idea in my previous post was to rebuild it in Houdini. I don't know how the Substance Designer version is implemented, but I think the main idea is to accumulate the X/Y values of each texel of the normal map along some paths to get a height profile.
Anyway, this approach can't be 100% accurate, since the normal map is discrete and there is numerical inaccuracy. But it can be interesting if you don't have a height map already.
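Just to make the accumulation idea concrete, a rough detail wrangle version could look like this (the `rows`/`cols` parameters, the row-major point order, and sampling the map into Cd beforehand are all assumptions for illustration, not a finished tool):

```vex
// Detail wrangle: integrate the X slope of a tangent space normal map
// row by row over a grid whose points carry the decoded map in Cd.
int rows = chi("rows");
int cols = chi("cols");
for (int r = 0; r < rows; r++)
{
    float h = 0.0;                      // height resets at each row's border
    for (int c = 0; c < cols; c++)
    {
        int pt = r * cols + c;
        vector clr = point(0, "Cd", pt);
        vector n = clr * 2.0 - 1.0;     // decode color to [-1, 1]
        h += -n.x / max(n.z, 0.001);    // slope implied by the normal
        setpointattrib(0, "height", pt, h, "set");
    }
}
```

The same loop run column by column gives a second height profile; averaging the two (and fixing the border mismatch mentioned above) is the part that still needs work.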



What i need here about normapmap is what i attached as shot secondly.
It's basically gets the normals of the mesh and mix it somehow (sadly don't know how, but maybe some i should try to compare the mesh world space normalmap with and without the tangent map on it to identify the mixing method), but the more important thing is not just the mix, but it's converting the tangent space normal map into world space normal considering the tangent map pixel on the real mesh.
And You can only doing that only by baking the highresh mesh with tangent normalmap on it to get back a world space normalmap that contains:
-high res mesh normals (not the positions)
-the tangent space normalmap as world space “normalmap” but mixed with the high res mesh normals

Please also don't forget to mention any useful info about baking from high res to low res mesh in Houdini, if You have any ;-)

I haven't done this yet, but I think there should be posts related to it (I will probably try it myself soon, but I don't know when). I'm not sure about this, but if you create a (tangent space) normal map using a low poly geometry and a high poly geometry, you use the tangent space from the low poly geometry and project its normals onto the high poly geometry (which can be, for example, a height-map-transformed version of a subdivided copy of the low poly geometry). At the projected point there is a normal, which becomes the new normal used for the normal map. If this normal can be represented in tangent space, it can also be represented in object/world space. The tangent space should be obtained with PolyFrame (tangentu, tangentv, with style: Texture UV Gradient), and it is used for the low poly geometry.

There is an official video: “H15 Masterclass | Mantra Rendering and Texture Baking”
https://vimeo.com/143618200 [vimeo.com]
I haven't watched the whole video yet, but the baking part starts at around minute 50.

If you want to go deeper here is an interesting paper:
Morten Mikkelsen: Simulation of Wrinkled Surfaces
http://image.diku.dk/projects/media/morten.mikkelsen.08.pdf [image.diku.dk]
Member
75 posts
Joined: Feb. 2019
Offline
“if you create a (tangent space) normal map using a low poly geometry and a high poly geometry, you use the tangent space from the low poly geometry and project its normals onto the high poly geometry”

I attached a shot that maybe explains better what I'd like to get (sorry if you already got the point).
I don't want to use the tangent space map from the lowpoly geometry.
My tangent space normal map (the one I want to use on the highres mesh) is mesh “independent”, because it was converted from the displacement map (without any high or low res mesh; it was just a texture conversion from displacement to tangent).
I need to use this tangent normal map (created from the displacement map) because I don't want to use an ultra-high-res highpoly to carry every micro detail over to the lowpoly mesh. A bump map or tangent normal map is perfectly enough to show the details, and the displacement map is only needed to give the high res mesh a more detailed shape than just a simple subdivided mesh.

The video you linked also shows that I somehow need to separate my highpoly and lowpoly meshes.
My problem is still that I don't know how to set up my node network to export my lowpoly and get it back into a separate Geometry node (in the Object network). I'm not even sure whether I have to export my lowpoly and import it back into a separate Geometry node, but if I do, how can I automate that just to bake from the highres to the lowres mesh?

Thank You for both the pdf and the video!
Edited by ostyascukor - July 18, 2019 18:34:38

Attachments:
disp_tangent_bake_ws.jpg (583.1 KB)

Member
474 posts
Joined: July 2005
Offline
Ok

Let me summarize:

You have both a normal map and a height map (tangent space), which are related to each other.
You have a low res geometry and a high res geometry (both untransformed and transformed with the height map).
Now you want to transform your normal map (tangent space) to world space using the existing (transformed) high res geometry.

I haven't done this yet, but in theory this should work.

Loop over all texels of the normal map (tangent space).
For each texel, use the UV coordinates to get the vertex/point of the high res geometry with the closest UV (uvsample() looks like the function that can do this).
That vertex/point may even be interpolated from the prim/(intrinsic) uv value; uvsample() will return any (interpolated) attribute at that position.
You can use PolyFrame to calculate tangentu and tangentv (attribute gradient for UV), for example, which can be used with the (non-intrinsic) UV coordinates (needed for uvsample()).
(tangentu, tangentv, N) should represent the tangent space matrix.
Now reconstruct the normal from the color value and apply the tangent space matrix to it. The new normal should then be at least in object space. Finally, transform the normal back into color space.
I have tried all of this at object level (I haven't tried it in a shader).
The result still looks a bit weird, and it seems this tangent space matrix is not the right transformation…
Maybe my file can help you a bit, or maybe someone else here has more experience with this.
But if you put in your own normal map, make sure to replace the file name string and choose the right resolution for the grid.
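In case it helps, the core of those steps might look something like this in a point wrangle, after a PolyFrame SOP has written tangentu and tangentv (the `map` parameter name and the uv-on-points assumption are placeholders, and as said above, the exact tangent space transform may still need fixing):

```vex
// Point wrangle: decode a tangent space normal map texel and move it
// to object/world space using the PolyFrame tangents, then re-encode.
vector uv = v@uv;                        // assumes uv promoted to points
vector clr = colormap(chs("map"), uv.x, uv.y);
vector nt = normalize(clr * 2.0 - 1.0);  // tangent space normal in [-1, 1]
matrix3 tbn = set(normalize(v@tangentu),
                  normalize(v@tangentv),
                  normalize(@N));        // rows: tangent, bitangent, normal
vector nw = normalize(nt * tbn);         // tangent -> object/world space
@Cd = nw * 0.5 + 0.5;                    // re-encode for a world space map
```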

Attachments:
normalmap_transformation.hipnc (1.7 MB)

Member
75 posts
Joined: Feb. 2019
Offline
“Now you want to transform your normal map (tangent space) to world space using the existing high res geometry (transformed).”
And bake it onto (the UVs of) the low res mesh.

Sorry, but I can't confirm or deny your detailed explanation.
I have no idea how it should work, or how it works, for example, in Designer (there you can bake any tangent map + mesh (you don't need a high res, of course) to any other mesh as a world space normal map; I don't remember the exact name offhand, but I have used it).
Maybe you (or anybody with this kind of knowledge) could examine the math behind that kind of solution (for a programmer I guess it's not a big challenge).

Thanks for the feedback!
Edited by ostyascukor - July 20, 2019 09:13:50
Member
75 posts
Joined: Feb. 2019
Offline
Hello again!

I found this very interesting presentation from 2015, and it fits here perfectly:
https://www.slideshare.net/VesselinEfremov/authoring-of-procedural-rocks-in-the-blacksmith-realtime-short-51753674 [www.slideshare.net]

I made my own extraction just to see if I understood it right (shot attached).
The only extra thing is that I still need my world space normal map, but now from this lowres mesh with the normals “stolen” from the subdivided mesh, plus the tangent normal applied on this lowres mesh with the modified normals.

Basically I don't have a question now; I would just like confirmation that:
- baking out the world space normals of my lowres mesh + the tangent map applied to it will give a correct result, so I can “recompile” my world space normal map in any external “tangent creator” (one that understands world space normal maps, of course)

My only other “question” is what will happen to my lowres mesh with the custom normals (stolen from the subdivided mesh) if I export it, e.g. as FBX (I guess it should keep its custom normals). Please confirm whether it's possible to export from Houdini as FBX with custom normals (created in the geometry network).

Thank you in advance for your answer!
Edited by ostyascukor - July 21, 2019 08:16:25

Attachments:
displacement+tangent.jpg (1.1 MB)

Member
75 posts
Joined: Feb. 2019
Offline
Hey All

http://www.sidefx.com/docs/houdini15.0/examples/nodes/out/baketexture/BakeHirestoLowres [www.sidefx.com]
How can I get this example file?

It's not there in the local Houdini render node examples.

Or rather, I still have the question: how can I take one primitive from a geometry and load it into the Bake Texture render node (as UV Object 1), and take another primitive from the same geometry and load it into the same node (as High Res Object 1)?
I hope I don't have to save (export) my lowres primitive to disk and then load (import) it into another geometry just to bake.
That would kill the flexibility and the speed of the baking process.

Plus, is there a way to use 16-bit PNG instead of 8-bit PNG?

Thank You
Edited by ostyascukor - July 26, 2019 08:08:03
Member
900 posts
Joined: Feb. 2016
Offline
You could look at the GameDev Toolset; there are a couple of nodes that simplify the baking process.
The traditional ROP baker node is a little bit annoying to use (it works at object level only, and you need to RMB to turn the display flag on for both the low and high poly objects).

https://www.sidefx.com/tutorials/game-tools-maps-baker/ [www.sidefx.com]


Also, if you want to load the example files from the docs, you should be able to load them from the in-software help instead of the online version.
Edited by Andr - July 26, 2019 08:09:54
Member
75 posts
Joined: Feb. 2019
Offline
Hey Andr

I tried those game tools, and yes, there is a baker node that (should) allow baking within one geometry from one primitive to another.
However, I get this error message when I try a simple high-to-low res bake with base color + normals:

Error running callback:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "opdefgamedev:op/sop_simple_baker::2.0?PythonModule", line 6, in render
File "CPROGRA~1/SIDEEF~1/HOUDIN~1.258/houdini/python2.7libs\houpythonportion\ui.py", line 850, in decorator
return func(*args, **kwargs)
File "CPROGRA~1/SIDEEF~1/HOUDIN~1.258/houdini/python2.7libs\hou.py", line 51183, in render
return _hou.RopNode_render(*args, **kwargs)
OperationFailed: The attempted operation failed.
Error: Error rendering child: /obj/displace_test/sop_simple_baker1/ropnet1/rop_games_baker/render_plane
Error rendering child: /obj/displace_test/sop_simple_baker1/ropnet1/rop_games_baker/shell2
Error rendering child: /obj/displace_test/sop_simple_baker1/ropnet1/rop_games_baker/render_plane
Error rendering child: /obj/displace_test/sop_simple_baker1/ropnet1/rop_games_baker/render_plane
Error rendering child: /obj/displace_test/sop_simple_baker1/ropnet1/rop_games_baker/render_plane

I attached the hip file + a zip file with diffuse + height images (in the hip file, the nodes using the textures are highlighted in red).
Edited by ostyascukor - July 26, 2019 09:25:30

Attachments:
lattice+uv+displacement+wsnormal.hip (1.3 MB)
test.zip (15.0 MB)

Member
900 posts
Joined: Feb. 2016
Offline
Oh, I was experiencing the very same problem with the Simple Baker.

Instead, try the Maps Baker node of the GameDev tools, which is a different node (and different tech).
The tutorial I linked is about it.
Member
75 posts
Joined: Feb. 2019
Offline
Thank you, that's better!
Is there a way to get 16-bit PNG or TIFF output?

Maybe I should write something “magic” here to get 16-bit PNG?
$HIP/render/${HIPNAME}_$(CHANNEL).png

One more very important thing is to bake my tangent map (which is on the highres mesh) back onto the lowres mesh as a world space normal map.
Is there a way to assign a texture to the high res mesh as a tangent normal map?
Member
900 posts
Joined: Feb. 2016
Offline
16-bit TIFF (.tif16)
from here:
https://www.sidefx.com/docs/houdini/io/formats/image_formats.html#16-bit-image-formats [www.sidefx.com]