Ambrosiussen
Hi Ron!
First of all, thank you for taking the time to share some ideas with us!
Like you said, we currently do not have a solution that does exactly what you describe in that way. But we do have a way to achieve the same effect using some of our other tools.
If you merge multiple pieces of geometry together, their UVs overlap, which is what you'd like to solve. To fix that, you would run the merged geometry through the UV Layout node. This ensures your UVs no longer overlap, but your textures would no longer match. This is where the second tool comes in. With our baking tools you can either bake high-poly geometry to low-poly geometry, or do something called "texture reprojection", which transfers texture data from one mesh to another based on proximity (thereby generating a new texture for you). Your low-poly mesh would be the one whose UVs were run through UV Layout, and the high-resolution mesh would be the original.
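The idea behind proximity-based transfer can be sketched in plain Python. This is a conceptual illustration only, not the actual Houdini baker (which samples the source surface and writes a new texture); the function names and the point/color data are made up for the example:

```python
# Minimal sketch of proximity-based attribute transfer (the idea behind
# "texture reprojection"): each point on the target mesh takes its data
# from the nearest point on the source mesh.

def nearest_index(point, source_points):
    """Index of the source point closest to `point` (brute force)."""
    def dist_sq(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(range(len(source_points)),
               key=lambda i: dist_sq(point, source_points[i]))

def transfer_by_proximity(target_points, source_points, source_colors):
    """For each target point, copy the color of the nearest source point."""
    return [source_colors[nearest_index(p, source_points)]
            for p in target_points]

# Tiny example: two source points with known colors.
src_pts = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
src_col = ["red", "blue"]
tgt_pts = [(1.0, 0.0, 0.0), (9.0, 0.0, 0.0)]
print(transfer_by_proximity(tgt_pts, src_pts, src_col))  # ['red', 'blue']
```

The real baker rasterizes into the low-poly UV layout instead of copying per point, but the proximity lookup is the same concept.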
UV Layout - http://www.sidefx.com/docs/houdini/nodes/sop/uvlayout.html
Baker - https://vimeo.com/244246886
If you have a specific reason why you'd like to composite the textures together rather than laying out your UVs, please do let us know.
Paul
The drawback to this method is that it wastes texture space on anything in the scene that is a duplicate. For instance, say you have a game level with duplicated props (boxes, shovels, barrels, etc.) and even buildings that share tiling textures. If you repack the UVs on the merged mesh and transfer the texture data, you've essentially given every actor a unique texture, meaning you lose a ton of texture space in your atlas.
The alternative would be to merge everything and, in the process of merging, reposition the UVs of any actors sharing the same material together into the same section of the resulting atlas.
This second option is the most useful in many cases for game development, especially for optimizing levels for VR and Oculus Quest or mobile games. This is something I'm currently exploring. It would be tricky to use the Mosaic COP, since I can't see any way to keep track of metadata to know which texture went where and how to properly match the UVs. It also doesn't work with textures of different sizes, which isn't great either. I guess the best solution would be a Python approach to atlasing the textures?
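A Python approach could handle both problems: different texture sizes, and keeping the metadata about which texture went where. Here's a minimal shelf-packing sketch under those assumptions; real packers add padding for mip bleeding and smarter placement, and all names here are hypothetical:

```python
# Minimal shelf-packing sketch for atlasing textures of different sizes.
# Input: {name: (width, height)} in pixels. Output: {name: (x, y)} pixel
# placements plus the atlas height actually used -- exactly the metadata
# needed to know "which texture went where" so UVs can be matched later.

def shelf_pack(textures, atlas_width):
    placements = {}
    x, y, shelf_height = 0, 0, 0
    # Sort tallest-first so each shelf wastes less vertical space.
    for name, (w, h) in sorted(textures.items(), key=lambda t: -t[1][1]):
        if x + w > atlas_width:       # texture doesn't fit: start a new shelf
            x, y = 0, y + shelf_height
            shelf_height = 0
        placements[name] = (x, y)
        x += w
        shelf_height = max(shelf_height, h)
    return placements, y + shelf_height

textures = {"wall": (512, 512), "barrel": (256, 256), "shovel": (128, 256)}
placements, used_height = shelf_pack(textures, atlas_width=1024)
print(placements, used_height)
```

The returned placement dictionary is the metadata the Mosaic COP doesn't expose: with it, each mesh's UVs can be offset and scaled into its texture's rectangle after the pack.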
The approach we are currently looking into for mobile and VR titles would be to input dozens of meshes, combine them into a single merged mesh, atlas the textures associated with the assets, and correctly line up the merged mesh's UV data to the new texture atlas layout.
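The last step of that pipeline, lining up UVs with the atlas, comes down to converting each texture's pixel placement into a normalized UV rectangle and then offsetting/scaling the mesh's UVs into it. A sketch, with an assumed `(x, y, w, h)` pixel placement format purely for illustration:

```python
# Sketch of the final pipeline step: turn a texture's pixel placement in
# the atlas into a normalized UV rectangle, then fit a mesh's 0-1 UVs
# into that rectangle.

def placement_to_uv_rect(x, y, w, h, atlas_w, atlas_h):
    """Pixel rect -> normalized (u_offset, v_offset, u_scale, v_scale)."""
    return (x / atlas_w, y / atlas_h, w / atlas_w, h / atlas_h)

def fit_uvs(uvs, rect):
    """Offset/scale 0-1 UVs into the given normalized atlas rectangle."""
    u0, v0, su, sv = rect
    return [(u0 + u * su, v0 + v * sv) for (u, v) in uvs]

# A 256x256 texture placed at pixel (512, 0) in a 1024x1024 atlas.
rect = placement_to_uv_rect(512, 0, 256, 256, 1024, 1024)
print(fit_uvs([(0.0, 0.0), (1.0, 1.0)], rect))
```

Run per mesh, keyed off the packer's placement metadata, this would align every merged mesh's UVs to the new atlas without making duplicated actors unique.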