linear workflow

User Avatar
Member
2624 posts
Joined: Aug. 2006
Offline
Hi guys,
I am over at a large studio using Maya and MR of all things, but they have a very interesting workflow: they don't gamma correct the HDR files they use; instead they use a linear workflow and gamma correct in post.
In Maya they strip off the gamma values for textures and then use a camera shader (mia_exposure_simple) so they correctly view a gamma-corrected image.
This of course gets me thinking: what sort of shaders do we Houdini users have to handle a linear workflow? Maybe you are using a linear workflow and could share some issues or problems you came across.

PS. Maya still crashes all day long. I think the senior developer needs some Houdini love.
Gone fishing
User Avatar
Member
156 posts
Joined: July 2005
Offline
Hey circusmonkey, we'll be transitioning to a linear workflow for Houdini “soon”, so I'm glad you brought this up. I'm very interested in seeing what people have to say. I put soon in quotes because our nuke lead refuses to work with Houdini again if we can't get it working; however, he hasn't contributed to helping set it up, so we'll see when, but it will definitely come.
“In the beginning the Universe was created. This has made a lot of people very angry and has been widely regarded as a bad move.” - Douglas Adams
User Avatar
Member
4140 posts
Joined: July 2005
Offline
There's nothing in Houdini that prevents anything related to a linear flow. First off, unless you go out of your way, shaders are linear. Texture maps, typically sRGB, need to be linearized before using (same as with any renderer). All you need to do is set the mplay you're doing your shader tests in to a gamma of 2.2 (lower right of the interface). Then light/shade. Get it to look the way you want in that context, and you have a linear workflow - the images rendered from that should get read in as ‘linear’ in nuke, and you're done. The goal is to have everything - the maps, the render and the final image - all linear. If you mplay your rendered image in a default mplay, it should appear ‘darker’ than you would expect. This is because mplay (and most other image viewers) makes some assumptions about what they're displaying and where. Nuke is more anal about assumptions of colourspace.
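A minimal sketch of that loop in plain Python (function names are illustrative, and this uses the common gamma-2.2 approximation rather than the exact sRGB curve): linearize the map before it enters the shader, do the lighting math on linear values, and only apply the display gamma in the viewer at the very end.

```python
def to_linear(srgb):
    """Linearize an sRGB-encoded value before it enters the shader
    (gamma-2.2 approximation of the sRGB curve)."""
    return srgb ** 2.2

def to_display(linear, gamma=2.2):
    """What mplay does when you set its gamma to 2.2."""
    return linear ** (1.0 / gamma)

# An sRGB texture value of 0.5 is actually quite dark in linear light
# (about 0.218), which is why an unlinearized render looks wrong...
tex = to_linear(0.5)
# ...lighting math happens on linear values...
lit = tex * 2.0          # e.g. a light of intensity 2
# ...and the viewer re-applies the display gamma at the very end.
shown = to_display(lit)
```

Viewing the render at gamma 1 skips the `to_display` step, which is exactly why a correct linear render looks ‘darker’ than expected in a default mplay.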

If I'm matching to a live plate, I'll typically throw a BG grid in there with a linearized plate constantly displayed on it, and render against that. You need to be careful here - again if you mplayed that plate, it would look dark unless you set the gamma in mplay to 2.2. Don't have a cineon or sRGB plate there, or something's out of the linear pipeline.

In theory you could keep everything sRGB and then reverse the gamma; the downside is that the lighting model in the renderer isn't really behaving properly (even though you might get a decent image from it). The renderer is linear unless told to do otherwise, so that's a part of your flow that isn't matching the others. This tends to show up in PBR renders and anything relying on proper falloff.

Also, when you get into calibrated monitors and software, that adds another workflow possibility. I'm sure other shops have different approaches, but the theory is the same. One thing I can tell you, though, is to tread carefully in this world. Bringing up display gammas and linear workflows tends to bring out the worst in people. Some get defensive, some get aggressive, and many get awfully arrogant about it. It's well defined, but when you follow it back to the ultimate issue of all of our retinas collectively responding to light frequency, it starts to become a tad subjective. Careers float on these subjective opinions.

Cheers,

J.C.
John Coldrick
User Avatar
Member
258 posts
Joined:
Offline
Truthfully, this is the exact same workflow that someone would have to follow to get a linear flow working correctly in Maya; the only difference is that in mental ray they are using a tone mapper, in this case mia_exposure_simple, to render into at gamma 2.2. This tone mapper then either needs to be deleted or rendered in non-tone-mapped buffers.

s
User Avatar
Member
26 posts
Joined: Nov. 2008
Offline
Hi!

I'm very interested in the current state of a linear workflow in Houdini. We're using Softimage 7.5 at the moment, which offers a completely hassle-free workflow when it comes to rendering with linear lighting. You don't have to care about de-gammaing your sRGB textures or correcting your viewports to get the right result, and it's a great combination with nuke / OpenEXRs.

Many months ago in this thread there was word about something like this in Houdini “soon”.
So, are there any advances in this direction already?

Regards
Ceterum censeo Autodesk esse delendam!
User Avatar
Member
1533 posts
Joined: March 2020
Offline
hi

we are using a linear workflow here at blackginger; one or two issues i have with linear workflow in houdini:

we need gamma (LUT) control on texture nodes, LIKE XSI, this is really the way to go. at the moment i have to add a color correct in my shader to down-gamma, and i'm not sure it's doing what it's supposed to, seems a little off!

color swatches, i've set mine to 2.2, but when i middle-mouse on a swatch to sample colors, they aren't sampling the correct values, BAD..

Exposure control for HDR images, how long have we known about high dynamic range?

Tone mapping, we could really do with this too!

viewport color display, not sure if you can change this? but when using 2.2 gamma swatches, the displayed color on geometry in the viewport is still at gamma 1.

energy conserving shaders, i find with gamma 2.2 that not having energy-conserving shaders makes bad lighting models a lot more apparent, especially when trying to match live action elements!

having separate control over gamma in mplay on a background image would be great, that way i don't have to save out a separate proxy with 0.45 gamma.

also, i do find that setting the gamma to 2.2 in mplay doesn't quite match what is displayed in nuke, this is on the same monitor at the same time btw. XSI (again, sorry) ships with an sRGB LUT, and if i render from XSI with this, it matches nuke's display… something SESI might want to look into doing

overall i think LWF shouldn't really be something that you have to worry about (but you should know about it), it should just be there and it should work!

jason
HOD fx and lighting @ blackginger
https://vimeo.com/jasonslabber [vimeo.com]
User Avatar
Member
26 posts
Joined: Nov. 2008
Offline
Thanks for your instant reply!

Very promising to see that LWF seems to be a widely used feature, even though Houdini doesn't help us much when it comes to actually enjoying its advantages.
In XSI (or “Autodesk Softimage” as it is called now) LWF opened a whole new world to our rendering / compositing and made it possible to easily and reliably exchange objects and shaders between scenes and even projects without having to use lighting hacks over and over again to get realistic or matching lighting.
Ceterum censeo Autodesk esse delendam!
User Avatar
Member
320 posts
Joined: Aug. 2007
Offline
jason|slab
hi

we need gamma (LUT) control on texture nodes, LIKE XSI, this is really the way to go. at the moment i have to add a color correct in my shader to down-gamma, and i'm not sure it's doing what it's supposed to, seems a little off!

jason

Hey Jason, try this function on your texture maps and see if it helps. This was borrowed from Dan Maas's Siggraph paper “What the Ri Spec never told you” which has some good info on a linear workflow inside prman.


// decode from sRGB luma to linear light
float sRGB_decode_f(float f)
{
    float lin;
    if (f <= 0.03928)
        lin = f / 12.92;
    else
        lin = pow((f + 0.055) / 1.055, 2.4);
    return lin;
}

vector sRGB_decode(vector c)
{
    vector d;
    d.r = sRGB_decode_f(c.r);
    d.g = sRGB_decode_f(c.g);
    d.b = sRGB_decode_f(c.b);
    return d;
}
www.alan-warren.com
User Avatar
Member
1533 posts
Joined: March 2020
Offline
nice i'll give it a try as soon as i can
thx alan

jason
HOD fx and lighting @ blackginger
https://vimeo.com/jasonslabber [vimeo.com]
User Avatar
Member
2624 posts
Joined: Aug. 2006
Offline
we need gamma (LUT) control on texture nodes, LIKE XSI, this is really the way to go. at the moment i have to add a color correct in my shader to down-gamma, and i'm not sure it's doing what it's supposed to, seems a little off!

Hey jason,
Just plug the output of your texture VOP into a pow function, then plug a constant into the pow function and give it a value of 1/0.454545 (i.e. 2.2). That will de-gamma your textures correctly.

Exposure control for HDR images, how long have we known about high dynamic range?

I think if you created a spare parameter and used pow(2.0, ch("exposure")) you could get your stops.
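That stop trick amounts to scaling linear-light values by a power of two; a tiny illustration in plain Python (the function name is mine):

```python
def expose(value, stops):
    """Scale a linear-light value by 2**stops, i.e. photographic
    stops: +1 stop doubles the light, -1 stop halves it."""
    return value * 2.0 ** stops
```

So a supervisor's “bring it down a stop” is just `expose(value, -1)` applied to the linear image, which is why the trick only behaves correctly in a linear workflow.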

Mplay could certainly be better; recently I have submitted a raft of RFEs that would make it useful for lighting / plate matching, as in its current form it's somewhat crippling for lighting.

r
Gone fishing
User Avatar
Member
665 posts
Joined: July 2005
Offline
Out of curiosity, what is a good viewer for lighting?
User Avatar
Member
12468 posts
Joined: July 2005
Offline
Linear colorspace workflow is common for large facilities and this is the basic setup that I've used at two companies:

- All plates and textures are saved out in linear color. This makes the entire process less confusing and causes fewer redundant corrections. Also, you then don't need a separate background correction in Mplay.
- Textures in photoshop are painted in linear space, but using an ICC profile which applies a viewing LUT.
- All HDRs *must* be in linear colorspace – applying a gamma to an HDR doesn't make sense because there are values above 1 which would be destroyed.
- All textures are applied as-is (in linear space) so no on-the-fly correction is required.
- All renders are rendered in linear space (which is the default)
- All viewers (MPlay, Nuke, in-house compositors and viewers) apply the film color-cube (or reapply sRGB in the case of non-film projects like commercials) in the viewer. This is important! Also, most viewers (except mplay, but a wrapper might kinda solve this in an ugly way) will detect floating-point precision and assume the input is linear. If the information is 8-bit (like a .jpg), the image will be de-sRGB'd (i.e. linearized) on the fly upon load; then the color-cube/LUT gets applied by the viewer as usual.
- The compositor bakes in a color cube and reapplies the gamma when saving .dpx files for film-out.

The point here is to linearize everything, render in linear, composite in linear, and at the very last second, export in the space that the output demands.
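The load-time heuristic described above could be sketched like this in Python (the function name and the `is_float` flag are illustrative stand-ins for however a real viewer detects precision, and the 2.2 power is the usual approximation of the sRGB curve):

```python
def linearize_on_load(pixel, is_float):
    """Viewers assume floating-point data (e.g. EXR) is already
    linear; 8-bit data (e.g. a .jpg, values 0-255) is treated as
    sRGB and linearized on the fly when loaded."""
    if is_float:
        return pixel                  # already linear, pass through
    return (pixel / 255.0) ** 2.2     # undo sRGB (gamma-2.2 approx)
```

After this step everything in the viewer is linear, and the film color-cube or sRGB view transform is applied uniformly on top, which is what keeps mixed 8-bit and float sources consistent.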

I do agree that some features (like exposure control) are still outstanding. Supervisors like to talk in “stops”. And there are other features that lighters could use.


More later,
Jason
Jason Iversen, Technology Supervisor & FX Pipeline/R+D Lead @ Weta FX
also, http://www.odforce.net [www.odforce.net]
User Avatar
Member
1533 posts
Joined: March 2020
Offline
hey

yeah, i guess painting textures in linear is a good idea, but getting the texture artists to understand this might be an interesting task, especially when they use ref pics from digital photos or the net…
at the moment we treat everything as linear from the lighting/shading stage through to comp, then output finals in the correct color space.

jason
HOD fx and lighting @ blackginger
https://vimeo.com/jasonslabber [vimeo.com]
User Avatar
Member
7721 posts
Joined: July 2005
Offline
A very important thing about a linear color workflow in Houdini is to go to the main menu and choose Edit > Color Settings > Color Correction and type in a Gamma value (or supply a LUT) that is appropriate for your display.
User Avatar
Member
575 posts
Joined: Nov. 2005
Offline
good to know. but this does not prevent the color-picking issue, unfortunately

does anybody have a LUT for sRGB to share?

Martin
User Avatar
Member
575 posts
Joined: Nov. 2005
Offline
i still have a basic question about linear workflow.
for all renders, especially with global illumination, gamma 2.2 looks much more natural. at gamma 1 they look much too dark.

now i tried to view a simple linear gradient. viewed at gamma 1 it looks better than viewed at gamma 2.2; it just doesn't look even. i tried it in mplay and nuke (here i used sRGB). but with 2.2 gamma a linear gradient does not look correct. what do i misunderstand here?

Martin