Ok. In Maya you project any old UVs onto your geometry (e.g. planar Y). Then you project a texture onto it using a shader with a projection node set to perspective. Then you select the object and the shader and press “convert to file texture”. That's it. Super simple baked texture.
How do you do this in Houdini? I've had a look at the odforce wiki, but it all seemed very complicated for such a common workflow, and I didn't see anything about baking the projected texture. Is there a simpler way of doing it? I used the UVTexture SOP to create a simple projection, but I came unstuck when trying to bake the results of this projection down onto the geometry.
Any help would be greatly appreciated.
Camera projected texture baking
- lloydwood
- Member
- 104 posts
- Joined: Nov. 2007
- Offline
- jimc
- Member
- 295 posts
- Joined: Oct. 2008
- Offline
It sounds like 2 questions:
1) how to project UV's using the Camera
2) how to bake the texture to an image
I haven't tried 1, but there are several UV SOP nodes that do different things. Maybe try UVProject rather than UVTexture; if I recall correctly, UVProject has more options. It's definitely possible.
For 2, assuming I'm understanding this correctly: once you have UV coords assigned to the geometry, you can create a Mantra output node and add a custom command-line parameter with “-u” and the path to your geometry object. It is a bit awkward to use, no doubt. You will also need a camera object called “cam1”: the camera's settings don't seem to matter, but if you call it anything else the bake doesn't seem to come out correctly, or at least that was my experience. You also need to provide the resolution of your output image (make sure to override the camera settings in the Mantra node).
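For anyone wondering what a perspective (camera) UV projection actually computes, here is a minimal plain-Python sketch, no Houdini required. It projects a world-space point through a Z-facing camera and maps the film back to the 0..1 UV square; `project_uv` and its parameters are illustrative names, not Houdini API, and the focal/aperture defaults are only meant to mimic a typical Houdini camera.

```python
def project_uv(p, cam_pos, focal=50.0, aperture=41.4214):
    """Project world-space point p through a camera at cam_pos looking down -Z.

    Returns (u, v) with the camera's film back mapped to the unit square.
    The focal/aperture pair is assumed to resemble Houdini camera defaults.
    """
    x = p[0] - cam_pos[0]
    y = p[1] - cam_pos[1]
    z = p[2] - cam_pos[2]
    if z >= 0:
        raise ValueError("point is behind the camera")
    # Perspective divide: screen position scales with focal / aperture.
    scale = focal / aperture
    u = (x / -z) * scale + 0.5
    v = (y / -z) * scale + 0.5
    return u, v

# A point straight down the camera axis lands at the UV centre.
print(project_uv((0.0, 0.0, -5.0), (0.0, 0.0, 0.0)))  # (0.5, 0.5)
```

The same divide-by-depth is what makes the projection “perspective” rather than planar: points further from the camera axis, or closer to the camera, spread further across the UV square.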
- lloydwood
- Member
- 104 posts
- Joined: Nov. 2007
- Offline
Thanks jimc.
I managed to work it out. It turns out I don't actually need to bake the texture down. I was getting the texture swimming across the deforming geometry it was applied to, hence trying to figure out how to bake it.
What I actually needed to do was apply the UVTexture SOP earlier in the network before the geometry was deformed. I also promoted the uv point attributes to vertex attributes. Now it all works as expected and the projected texture sticks to the deforming geometry.
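The fix above can be sketched in a few lines of plain Python (illustrative only, not Houdini API): UVs computed from the rest positions are stored on the points, so a deformation applied afterwards moves `P` but leaves `uv` untouched, and the texture sticks. The `planar_uv` and `deform` functions are hypothetical stand-ins for the UVTexture SOP and whatever deformer follows it.

```python
def planar_uv(p):
    # Hypothetical planar-Y projection: UVs come from the XZ position.
    return (p[0], p[2])

def deform(p, t):
    # Some animated deformation applied *after* UVs are assigned.
    return (p[0], p[1] + 0.5 * t, p[2])

points = [(0.25, 0.0, 0.75), (0.9, 0.0, 0.1)]
uvs = [planar_uv(p) for p in points]        # UVs baked from the rest pose
moved = [deform(p, t=1.0) for p in points]  # deformation happens afterwards
print(uvs)  # [(0.25, 0.75), (0.9, 0.1)] -- unchanged, so the texture sticks
```

Had `planar_uv` been evaluated on `moved` instead, the UVs would change every frame with `t`, which is exactly the swimming described above.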