proper sRGB viewer LUT

Member
30 posts
Joined: Oct. 2013
I've been doing some color-critical work lately, which inspired me to try to achieve a 100% viewer match across multiple apps. In my mind, if it looks correct in Nuke, then it's correct; Nuke's viewer LUTs and colorspace transforms are very straightforward. It was rather difficult for me to achieve a 100% match in the Houdini viewer - the mismatch is especially noticeable in the shadows - but I finally got it.

I realized that my confusion stemmed from the subtle yet noticeable difference between viewing linear images through a simple 2.2 gamma curve (the default in Houdini, as in most other 3D apps) and viewing them through an actual sRGB LUT. Nuke uses an sRGB LUT by default, which I always thought was the right way to go, given that my monitor is properly calibrated to sRGB and I have other apps like Photoshop set up to use an sRGB ICC profile.
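
To see how different those two curves really are, here's a quick numerical comparison (a minimal Python sketch, just plugging a few dark linear values into a plain 2.2 gamma curve and into the piecewise sRGB encoding from the spec):

def gamma22(x):
    # simple power-law encoding, i.e. what a plain "gamma 2.2" viewer LUT applies
    return x ** (1.0 / 2.2)

def lin_to_srgb(x):
    # piecewise sRGB encoding: linear segment below 0.0031308, power curve above
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * x ** (1.0 / 2.4) - 0.055

for lin in (0.0005, 0.002, 0.01, 0.05, 0.18):
    print("linear %.4f -> gamma 2.2: %.4f, sRGB: %.4f" % (lin, gamma22(lin), lin_to_srgb(lin)))

Near black the two curves differ by a factor of several (the 2.2 curve lifts deep shadows much more), while around mid-grey they nearly agree, which is why the mismatch shows up mostly in the shadows.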

This of course led me to try to get a proper linear-to-sRGB LUT working in Houdini, which was kind of a pain. There's a method in Nuke for LUT generation using a CMSTestPattern node, but the resulting .blut file was way off when I imported it into Houdini (I think that had something to do with needing a 1D prelut before the blut's 3D transform, but I could never quite figure it out). The other options were to generate a LUT with the OCIO libraries from the shell (I'm no programmer and I couldn't quite crack how to get OCIO working in a custom environment), or to create a LUT using a CHOPs or COPs network - it was a simple matter to save out a LUT from a COPs net, but I couldn't figure out how to program in the specific sRGB transfer function. (I still consider myself a new Houdini user.)
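
For what it's worth, I imagine the OCIO route would look roughly like this (a rough, unverified sketch using the PyOpenColorIO bindings and their Baker class, assuming $OCIO points at a config that names its colorspaces "linear" and "sRGB" the way the nuke-default config does - I never got my environment to a point where I could actually run it):

import PyOpenColorIO as OCIO

# requires the $OCIO environment variable to point at a valid config
config = OCIO.Config.CreateFromEnv()

baker = OCIO.Baker()
baker.setConfig(config)
baker.setFormat("houdini")      # bake to Houdini's .lut flavour
baker.setInputSpace("linear")   # colorspace names depend on the config in use
baker.setTargetSpace("sRGB")

# bake() returns the LUT file contents as a string
with open("lin_to_srgb.lut", "w") as f:
    f.write(baker.bake())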

Long story short: this all led me down the rabbit hole to Wikipedia, grabbing the sRGB transfer function and simply coding an ASCII tab-delimited LUT with Python. It took some trial and error. A 10-bit (1024-step) LUT got me on the right track and worked reasonably well, but it was still noticeably different from the Nuke LUT. When I upped the resolution to a 14-bit LUT (16,384 steps), I finally had a perceptually perfect match between what I was viewing in Houdini and in Nuke.
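
In case it helps anyone, the script boils down to something like this (a simplified sketch - the exact header/layout Houdini wants isn't reproduced here, but the attached zip has the actual file I ended up with):

STEPS = 2 ** 14  # 14-bit resolution: 16384 samples

def lin_to_srgb(x):
    # standard sRGB encoding curve from the spec/Wikipedia
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * x ** (1.0 / 2.4) - 0.055

# output filename is arbitrary
with open("linear-to-srgb_14bit.lut", "w") as f:
    for i in range(STEPS):
        lin = i / (STEPS - 1.0)
        v = lin_to_srgb(lin)
        # one sample per line, same value for R, G and B, tab-delimited
        f.write("%.6f\t%.6f\t%.6f\n" % (v, v, v))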

So, I lay this all out in case anyone else has had difficulty getting LUTs to work in Houdini. The resulting LUT is attached if you need it and find it useful. This whole thing made me wonder whether it's simply better to stick with gamma 2.2, whether that's more efficient for the graphics processing, or whether any of it really matters. I assume the sRGB spec is there for a reason, and if it's what we calibrate our monitors to, it's best to match it everywhere, in every app, to simplify and unify the grading process. Yes, it's a subtle difference, but it was bugging me.

Attachments:
linear-to-srgb_14bit.zip (131.8 KB)

Member
7709 posts
Joined: July 2005
I think this would make a good enhancement request: Houdini could ship with an sRGB LUT for users who wish to use it.

https://www.sidefx.com/index.php?option=com_content&task=view&id=768&Itemid=239 [sidefx.com]
Member
463 posts
Joined: Aug. 2014
I'd like to thank you for the LUT, Keith!
Member
10 posts
Joined: Sept. 2006
I know it has been a while, but I thought this would fit in this topic.

I am having the color grading department generate a LUT so we can use it in the 3D department.
In V-Ray for Maya and in Nuke the LUT works like a charm.

In Houdini it does not, because in V-Ray there is an option to convert the image from linear to log before applying the LUT.

It seems that Nuke, Maya and DaVinci Resolve all use these LUTs the same way.

Any idea how to have this 1D (lin-to-log) LUT applied before the 3D LUT?