I'm creating a color calculator for my students and I'm trying to get my object colors to appear in the viewport with their true linear values. I could be very wrong about this; please correct me if I am. But it seems as if Houdini is assuming that all of its input color values are in sRGB space and thus “decodes” each value before performing operations. I was able to “encode” the result of my rendering shader by raising the resulting color to the 2.2 power, which ultimately places the color back in linear space. The rendering results look good. However, I'm trying to get the viewport (“OpenGL”) shader to do the same.
Other than writing a new OpenGL shader, is there another way to do this?
I am including two images.
The first is a snapshot of the display window with the swatch (in linear space) and the display, which is in what I believe to be sRGB space.
BTW, the calculator is performing (as shown) an “encode” operation, which basically shifts a linear color to sRGB color space by taking it to the 2.2 power.
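For reference, here's a minimal Python sketch of the two gamma-2.2 power operations I'm talking about (the function names are my own, and which direction counts as “encode” seems to depend on convention; the exact sRGB curve is actually piecewise, but the plain 2.2 power is the common approximation):

```python
def gamma_compress(c: float, gamma: float = 2.2) -> float:
    """Raise a linear value to 1/gamma (the usual linear -> display transform)."""
    return c ** (1.0 / gamma)

def gamma_expand(c: float, gamma: float = 2.2) -> float:
    """Raise a display value to gamma (the usual display -> linear transform)."""
    return c ** gamma

# The two operations are inverses, so applying both returns the original value.
value = 0.25
assert abs(gamma_expand(gamma_compress(value)) - value) < 1e-9
```

In my calculator's terms, raising to the 2.2 power corresponds to `gamma_expand` above.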
The second image is a rendering of that calculation, which appears to be in linear color space.