This has been asked a few times before, but I wanted to see how other people and studios are dealing with the differences between applications' preferred scene unit scales.
The rest of our pipeline is currently modeling, animating, and rendering in decimeters.
Houdini defaults to meters, and I've been a little bit afraid to change that.
In the past I've experienced problems where some values did not get adjusted properly, and some values have no physical correlation at all (looking at you, flame height on the pyro solver), so I don't know what I'd need to do to balance those at a different world scale.
The thing that worries me most is that a shelf example put down at the default of meters will look different from the same example after I change the scene scale from meters to decimeters.
So my solution has been to scale assets as they come into Houdini into meters, then scale them back again on the way out. This happens automatically without any artist intervention and it works reasonably well.
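For what it's worth, the round trip is conceptually just a uniform scale by a fixed conversion factor on the way in and its inverse on the way out. A minimal sketch of that idea, using plain tuples rather than the Houdini `hou` API (the helper names here are hypothetical, not anything from our actual pipeline tools):

```python
DM_PER_M = 10.0  # pipeline models in decimeters; Houdini scene works in meters

def to_houdini(points):
    # dm -> m on the way in: a 10 dm coordinate becomes 1 m
    return [(x / DM_PER_M, y / DM_PER_M, z / DM_PER_M) for x, y, z in points]

def from_houdini(points):
    # m -> dm on the way out, undoing the import scale exactly
    return [(x * DM_PER_M, y * DM_PER_M, z * DM_PER_M) for x, y, z in points]
```

As long as everything flowing through is plain geometry, this round-trips cleanly; it's the unit-bearing quantities (densities, intensities) that break, as below.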
One problem that we've encountered with that is that volume density is off by a factor of 10. I'm a little surprised by this, since I'd have hoped that renderers have a unit scale as well, for aggregating density along a ray. So we have the option of adjusting the shader when we export it, or adjusting the voxel values directly. I don't really want to do either, but I think that maybe adjusting the voxel scalar values on the way out is preferable to having to change the shader.
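If it helps anyone reason about the factor of 10: optical depth is density integrated along the ray, so when geometry coordinates are multiplied by some scale s, every path length through the volume scales by s too, and density has to be divided by s to render identically. A sketch of the voxel-side fix, assuming the voxel values are available as a NumPy array (which is an assumption about the export tooling, not how our pipeline actually stores them):

```python
import numpy as np

def rescale_density(voxels: np.ndarray, geo_scale: float) -> np.ndarray:
    """Compensate voxel density for a uniform geometry scale change.

    Optical depth = density x path length along the ray. Scaling the
    geometry by geo_scale scales every path length by the same factor,
    so dividing density by geo_scale keeps the accumulated optical
    depth, and therefore the render, unchanged.
    """
    return voxels / geo_scale

# Going from decimeters to meters multiplies coordinates by 0.1,
# so density must come back up by a factor of 10 to compensate.
dm_voxels = np.full((4, 4, 4), 2.0)
m_voxels = rescale_density(dm_voxels, 0.1)
```

That matches the factor-of-10 discrepancy we're seeing, which is why I lean toward baking the correction into the voxels rather than into the shader.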
Other headaches with a scene scale different from the rest of the pipe: lights and cameras don't like to be scaled in Houdini, so we have to scale the translates instead, and light intensities and falloffs all need to be adjusted as well.
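The adjustment itself is simple on paper: translates scale linearly with the unit change, and for a point light with inverse-square falloff the intensity has to scale with distance squared so surfaces receive the same illuminance. A hypothetical helper sketching that (the function and its signature are made up for illustration, not part of any Houdini API):

```python
def rescale_light(translate, intensity, scale):
    """Move a point light into a new unit scale without changing its look.

    The position scales linearly. With inverse-square falloff, a light
    that is `scale` times farther away must be `scale**2` times brighter
    to deliver the same illuminance at the (also rescaled) surface.
    """
    new_translate = tuple(t * scale for t in translate)
    new_intensity = intensity * scale ** 2
    return new_translate, new_intensity
```

Lights with non-physical falloff curves wouldn't follow this rule, which is part of why the bookkeeping gets annoying.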
Anyway. Does anyone have a silver bullet for me? Or should I just get used to Houdini's settings and their quirks while working in decimeters? If anyone has experience working this way, have you run into issues?
Thanks,
Ben
