Combining shaders

Member
2199 posts
Joined: July 2005
Offline
Having just explained to someone for the 11 millionth time how to combine two or more shaders together in VOPs, and having to acknowledge how complicated and inflexible it seems, I'm beginning to wonder why there is no way to do this at the shader level. OK, it's not going to be useful for full-on feature work, where you probably want complete control over every shader and shader combination, but we and other small companies would get a lot of mileage out of having something like a blend SHOP that could take the outputs from multiple shaders and comp them together without having to turn the whole thing into yet another separate shader.

From the outside it doesn't seem like it would be too difficult: this comp shader would just need to evaluate Cf from each input shader and then combine them using a mask and a choice of how to combine the values (multiply, add, etc.).
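To put it another way, something like this rough VEX sketch is all I'm imagining (the names are made up, and I'm hand-waving over how mantra would actually pull a finished Cf out of each input shader):

    // purely hypothetical: pretend cf_a and cf_b are the finished Cf values
    // from the two input shaders, and mask is a 0-1 attribute on the geometry
    surface comp_two(vector cf_a = 0; vector cf_b = 0;
                     float  mask = 0; int combine = 0)
    {
        if (combine == 0)
            Cf = lerp(cf_a, cf_b, mask);                 // blend
        else if (combine == 1)
            Cf = cf_a + cf_b * mask;                     // add
        else
            Cf = cf_a * lerp({1, 1, 1}, cf_b, mask);     // multiply
    }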

Would this sort of thing be possible? It would mean we could write much more modular shaders. Or would it take mantra too far from Renderman?
The trick is finding just the right hammer for every screw
Member
710 posts
Joined: July 2005
Online
Not a bad idea. The problem that I have currently is that I want multiple materials in one VOP net, and these different shaders are supposed to have different specular properties. But I can't use more than one Lighting Model VOP in the whole thing (at least it seems to me that way) because it screws up the lighting (certain parts of the shader are much more sensitive to light changes). A VOP for specular only or something like that would be neat, but of course if we could blend at the shop level that wouldn't be necessary (but still nice to have).
Hope I'm making sense. :oops:
Member
4140 posts
Joined: July 2005
Offline
Simon - I really hate to do the old “why don't you build it yourself” response, but…

With the power of HDAs, why not build it yourself? To be more specific: set up a point attribute structure that you like, so you can paint (or whatever) generic shader attributes in a SOP HDA. Let's say you call the attributes Generic1, Generic2, etc., or better yet let the user name them. Then mimic the structure in a VOP HDA that accepts multiple inputs and blends between them based on those named attributes. You're right, doing it all from scratch once every 6 months *can* be complicated or easy to forget for many animators, but if you do the prebuild once you can make it easy to use.

DaJuice - AFAIK there's nothing stopping you from using multiple Lighting Models in a given shader - just be sure to assign the blending properly so you're only using what you want. You're right, though - the ultimate solution is to unwire specular from diffuse, which is probably why it was one of the first things Mario here did when he dove into VOPs. That would be a good first step, SESI - and really easy to do: in addition to the Lighting Model, add Diffuse and Specular Model VOPs so that those not quite into the heavy coding phase can quickly build their own.
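Just to show what I mean by unwiring them, here's a quick and dirty VEX sketch (parameter names invented on the spot) where the diffuse term and two different specular terms are kept separate and only combined at the end:

    surface split_spec(vector baseclr = 0.5;
                       float  rough_a = 0.05;
                       float  rough_b = 0.4;
                       float  spec_blend = 0.5)
    {
        vector nn = normalize(frontface(N, I));   // shading normal
        vector V  = normalize(-I);                // direction back to the eye

        vector dif = baseclr * diffuse(nn);
        vector spa = specular(nn, V, rough_a);    // tight highlight
        vector spb = specular(nn, V, rough_b);    // broad highlight

        Cf = dif + lerp(spa, spb, spec_blend);
    }

Wrapping each of those terms up as its own little VOP is basically all I mean by Diffuse and Specular Model VOPs.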

Cheers,

J.C.
John Coldrick
Member
2199 posts
Joined: July 2005
Offline
Hi JC,
“Why don't I do it myself?” Good question. Two answers: one, I'd like it to be available to everyone so we can stop answering the question “how do you do it in VOPs?” every couple of weeks :wink:

Secondly, and more importantly, I'm an artist not a programmer and can't use the HDK :oops: If someone wants to write some really good tutorials on how to use it, and make 42 hrs in a day instead of the current maximum of 24, then I might have a chance of learning it. I just thought it would be quicker and better for SESI to do it.


Also, I think you may be slightly misunderstanding what I'm after. I know how to do it in VOPs, but every time you combine shaders in VOPs you end up with a completely new combined shader, which means loads of redundancy.

For example: I have VOPs (VEX code actually, but it amounts to the same thing) for vessels, spots, pimples, muscle, ligament, fascia, etc. - all things human and organic.

Then to combine them I either need to make an uber shader with every single element in it, which is massively difficult to manage and impossible to get other artists to understand, or I have to make shaders for every possible combination - for only 12 basic VOPs that comes to 479,001,600 individual shaders, which is a lot (OK, I'm slightly exaggerating to make a point, but you see what I mean). It just doesn't seem like a very Houdini way of doing things - not procedural enough.

What I'd like to do is make each element a fully fledged shader in its own right and then let the artists combine them on the objects by making a blended shader with just the elements they need; they wouldn't need to go anywhere near VOPs and I'd only need the 20 or so basic shader types. The Shader SOP would need to be extended so that, as well as taking groups, it would accept an attribute to use as a mask, and everything else would be set up in the blend shader. Nice and simple. At least to me it seems simple - I don't know what impact this would have on the Houdini architecture.
The trick is finding just the right hammer for every screw
Member
4140 posts
Joined: July 2005
Offline
Fair enough. Just to be clear, though, I wasn't suggesting the HDK in *any* way, shape or form. Of course that's only for TDs and those who happen to program. However, I understand your point about not having to use VOPs. IMHO VOPs *should* be usable by everyone - but if you really want to shield the end users that much then yes, having an all-in-SOPs GUI built to blend shaders would be nice. Of course, it would have to be based around a different technology than shader strings (what's being used now), since you can't assign “23% of a string”. Certainly doable, though, with some sort of attribute approach.

Cheers,

J.C.
John Coldrick
Member
2199 posts
Joined: July 2005
Offline
It's not entirely about shielding users, it's much more about efficiency: it's much easier, quicker and more intuitive to wire 3 shaders into a blend OP than it is to save three new VOPs, combine them together, write out a new shader, read the new shader back in, and then set the values back to what you want to use, IMHO.

Also, I'm not advocating combining shader strings at all. I just want something that evaluates Cf for each shader and then comps the results together based on attributes found on the geometry. Easy peasy. Of course, doing things in that order might break Renderman compliance so perhaps it's not possible, but in my mind it just seems neater. Probably just me though, as usual.
The trick is finding just the right hammer for every screw
Member
4140 posts
Joined: July 2005
Offline
It would break Renderman compliance - and mantra compliance, for that matter - so certainly nothing like that would be implemented directly. However, there's no question another layer of software abstraction could be put on top to allow this… just a question of when/if. Really, I think it comes down to demand, and how one answers the question “how can you guys make this easier?”. That could take so many forms.

Cheers,

J.C.
John Coldrick
Member
648 posts
Joined: July 2005
Offline
perhaps something like a ‘vop merge’?
compiling with dependencies might be difficult,
maybe just shove everything into one shop. also
a pre-render ‘compile-all-and-use-temp-shops’
option might be good for test-rendering before the
vops are finished. uh, unless it does that already…

-cpb
Member
941 posts
Joined: July 2005
Offline
Hey Simon,

Well… I've been thinking about something like that since… oh, forever. And so has everyone else… remember “Looks”? (PRMan's early attempt at modular shading). Tricky stuff, that – here are a couple of tiny issues for example:

1) At what point does a block of shading code become a “shader”?… Is a BRDF a building block, or is a lighting model “as small as you will go” in terms of what constitutes a module – IOW: what is an atomic shading unit?
2) Assuming you answered question #1; how does one go about parameterizing the little monster? How is consistency in parameter name/label/type/ui enforced? (surely we don't want to see “Kd” in one shader, and “Diffuse Amplitude” in another – talk about confusing the user!)

Let's say we had a bunch'o these modular shaders that we used all the time, and let's say our target shader made use of 10 of these premade units. Now assume each unit requires an average of 8 parameters, for a total of 80 parameters. It's not unreasonable to think that for that number of “units” you'll probably be looking at more than 10% redundancy in your parameter set – i.e.: 8 params will need special attention so they collapse into a lesser number (through indirect referencing, etc.). Of the remaining number (~70) you must choose which ones are truly important for this particular application so you may trim the rest… I'll stop since I'm certain you can see where I'm going.

I'd love it if someone could come up with a nice clean solution… but I haven't seen one yet. Or at least, nothing that would be any cleaner than what we can already do with VOPs (there isn't much difference between blending 20 “shaders” and blending 20 VOPs) – unless you impose the unimaginable restriction that all these shading units are constant (no params)… but I'm sure no one wants that.

Not too long ago I was reading a bunch of articles on a C++ design technique called “Mixin Layers” – it's part of generic/generative programming, and concerns itself with solving a similar problem: defining an object through multiple layers of functionality and refinements, etc. And I'd say one of the biggest problems even there is proper handling of parameters, and initialization of the various collaborative bits and pieces. Oops… going off topic; sorry. But check it out if you feel like it – interesting ideas.

I dunno…. we chose to modularize the functionality at the VOP level, and it seems to be working out OK. In some cases we'll screw up by over/under-estimating the level of refinement for a building block, but we let actual production usage determine what the granularity should be. Nothing's perfect, but VOPs have helped *tremendously* toward shop-wide reuse of shading components.

LOL; sorry to go on like that – it's an interesting topic, and one where the devil is very much in the details. I'd say “get ready for the 11-millionth-and-one explanation”!


Cheers!
Mario Marengo
Senior Developer at Folks VFX [folksvfx.com] in Toronto, Canada.
Member
2199 posts
Joined: July 2005
Offline
I don't know if it's me being dumb or everyone else missing my point… :oops:

1) At what point does a block of shading code become a “shader”?… Is a BRDF a building block, or is a lighting model “as small as you will go” in terms of what constitutes a module – IOW: what is an atomic shading unit?

At the point you set Cf, no?
All surface shaders eventually eval to Cf.

So yes, a lighting model could be, as long as the output from it is contained in Cf and not broken out into separate specular, diffuse and ambient components.
A BRDF building block could be too. Just as you might render a pass that is just specular, so a shader could be just a specular component. But the output from the BRDF would be wired into Cf, so it wouldn't matter what it was called internally to the shader.
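Something as trivial as this made-up sketch would count as a “shader” in my book, because it resolves to Cf:

    // hypothetical specular-only shader - the BRDF result goes straight into Cf
    surface spec_only(vector specclr = 1; float rough = 0.05)
    {
        vector nn = normalize(frontface(N, I));
        Cf = specclr * specular(nn, normalize(-I), rough);
    }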

Personally I was thinking of the building blocks being much more like complete shaders as the term means now - like wood or brick or stone. Each shader comes with a lighting model and a full complement of parameters that define it as a real-world surface.


2) Assuming you answered question #1; how does one go about parameterizing the little monster? How is consistency in parameter name/label/type/ui enforced? (surely we don't want to see “Kd” in one shader, and “Diffuse Amplitude” in another – talk about confusing the user!)

Well, the point here is you just do what you do now: since each shader will evaluate to Cf, you can just set up each shader individually, and once they all work in their own right you move on to compositing them.



For example

1. Shader1 = brick
2. Shader2 = concrete

Shader comp Cf = brick Cf * brick mask + concrete Cf * (1 - brick mask)

Brick can have its diffuse as Kd or foo or whatever; on the interface it's still called diffuse, and by the time it gets to the shader comp all the parameters are already combined into Cf, so who cares what they used to be called?
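Written out as a hypothetical VEX blend shader it would be something like this (everything here is invented for the example, and it cheats by pretending the finished Cf of each input shader could just be handed in):

    surface brick_concrete_comp(vector brick_Cf    = 0;  // imagined output of the brick shader
                                vector concrete_Cf = 0;  // imagined output of the concrete shader
                                float  brick_mask  = 0)  // 0-1 mask painted on the geometry
    {
        // a straight A-over-B comp driven by the mask
        Cf = brick_Cf * brick_mask + concrete_Cf * (1 - brick_mask);
    }

And as far as I remember, mantra will override a shader parameter like brick_mask with a point attribute of the same name, which is exactly the behaviour I'd want for the mask.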

I suspect what I'm actually asking for is for the shader comp to auto-create an uber shader in the background, append the name of the input shader to all the parameters to keep them unique, and then compile the whole lot into a temp shader for use only during the current render.

Or better, that mantra changes to allow multiple shader calls to be made for each micropolygon, with the results stored, comped and finally output to the next bit of the render pipe.

As to whether it works OK in VOPs, I'll just repeat myself: 20 VOPs can be combined in infinite ways and I don't want to manage infinite shaders.
However, potentially having infinite possibilities that can be applied to an object is much more desirable.
Perhaps the thing here is that you think it's OK to take a shader that already exists, convert it into a VOP, build a VOP network, wire it up with all the other VOPs that are needed, and then use that to make your final shader. I think I'd rather have a library of SHOPs and then just combine those. Personal preference, maybe. It's a bit like saying why have the BlendShapes SOP when you can just make a VOP to combine geometry…
The trick is finding just the right hammer for every screw
Member
941 posts
Joined: July 2005
Offline
Simon
I don't know if it's me being dumb or everyone else missing my point… :oops:

I would never characterize your posts as anywhere even close to “dumb”, and I might have missed your point somewhat (or rather, gone off into a bit of a tangent), so let me see if I can explain myself a little better here.

Simon
At the point you set Cf, no?
All surface shaders eventually eval to Cf.

Yes; and all VOPs evaluate to some output as well. Whether you call it “Cf” or “out1” is just a matter of semantics. In your model, the final assignment to Cf is done by a hypothetical “Shader Blender” OP, and in the VOP world, you'd use the “Mix VOP” (the lerp() function).
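In VEX terms that Mix is nothing more exotic than a lerp (names are placeholders, obviously):

    surface mix_example(vector layer_a = 0; vector layer_b = 1; float bias = 0.5)
    {
        Cf = lerp(layer_a, layer_b, bias);   // all a Mix VOP boils down to
    }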

Wouldn't you agree that within our current toolset, “shaders” are nothing more than high level VOPs? If we agree on that, then we can just view both shaders and VOPs as parameterized blocks of shading functionality. With the important distinction that the VOP variety *can* be combined (because it represents non-compiled source code), whereas the shader type is, by definition, final – it is “the thing that gets assigned to the geometry” and so can't be refined any further. (also remember that the thing passed to the renderer is op-code, not VEX, that's why it needs to be final – so “shaders” as we know them are really compiled code, not VEX).

Simon
Well, the point here is you just do what you do now: since each shader will evaluate to Cf, you can just set up each shader individually, and once they all work in their own right you move on to compositing them.

Yes; but at the end of the day, if I replaced “shader” with “VOP” and “Cf” with “out1” in that sentence, you'd have the functionality you're looking for, using the current toolset.


Simon
As to whether it works OK in VOPs, I'll just repeat myself: 20 VOPs can be combined in infinite ways and I don't want to manage infinite shaders.

And one of those infinite possibilities is the linear combination you describe. Meaning that if you wanted to, you could just enforce the “blend” method (as a “Simon's Combiner” VOP), and thereby collapse the other infinity-minus-one possibilities.

Simon
However, potentially having infinite possibilities that can be applied to an object is much more desirable.

More than “desirable”, I would call it “inescapable”, since you still need to be able to define exactly what *kind* of brick, and what *kind* of concrete … and do we want the pattern to be based on parametric coordinates, or rest positions… and is the dirt on the bricks controlled with *this* kind of noise, or *that* kind… and on, and on…
So those 80 parameters would still have to be visited whether you chose to go with a model such as the one you suggest, or just used VOPs directly.

Simon
Perhaps the thing here is that you think it's OK to take a shader that already exists, convert it into a VOP, build a VOP network, wire it up with all the other VOPs that are needed, and then use that to make your final shader.

Yes. But more importantly, I'm also suggesting that the idea of a “Shader Blender” OP (and many similar ones I've had of my own), wouldn't really simplify the process of shading beyond what an approach such as VOPs/ShadeTree/Slim already provide – where your blend OP turns into a “Mix” VOP, and where you choose what parameters get exposed to the user.

Simon
I think I'd rather have a library of SHOPs and then just combine those. Personal preference, maybe. It's a bit like saying why have the BlendShapes SOP when you can just make a VOP to combine geometry…

I think you're thinking of shaders as constant, static things (like geometry), when they are really not that at all. If you hard-wired a brick shader's parameters in order to make it static, and then wanted the bricks to be slightly redder for the next job, you'd need to create a second brick shader – and this would soon explode into an unmanageable mess of shaders, with close to zero reusability, no? But maybe that's not what you mean.

Try having a library of VOPs and combine those instead. That's the way we ended up setting things up over here and I gotta tell you, the difference between doing that, and tweaking a bunch'o'shaders which you then “comp”, is practically nonexistent.

But yeah; I guess it *is* a matter of taste (or habit) in the end. I know it took me some time to start thinking in terms of VOPs and stop the inertia of 10+ years of thinking in terms of “shaders”.



Cheers!
Mario Marengo
Senior Developer at Folks VFX [folksvfx.com] in Toronto, Canada.
Member
2199 posts
Joined: July 2005
Offline
Mario Marengo
Try having a library of VOPs and combine those instead. That's the way we ended up setting things up over here and I gotta tell you, the difference between doing that, and tweaking a bunch'o'shaders which you then “comp”, is practically nonexistent.

But yeah; I guess it *is* a matter of taste (or habit) in the end. I know it took me some time to start thinking in terms of VOPs and stop the inertia of 10+ years of thinking in terms of “shaders”.

Ok, now I'm with you, and yes I could do that, but it would still be a solution that only works because I go to the trouble of changing all shaders into VOPs, and I think you are missing an important reason for not doing that. Shaders provide the interface to the code or the VOP network, and this is lost when you make them into VOPs. I try to write shaders that only expose the parameters that are actually needed to make the shader work, whereas VOPs expose every parameter. It's like saying why have HDAs when you can just have the networks with everything in them. You put a front end on the network so that artists can concentrate on tweaking parameters that mean something until they get the visual result they want. They don't need to know how it's all wired up.

Do you see what I mean? It would be really nice to take shaders that are all compiled up, with all the relevant parameters carefully set up, perhaps even with nice presets already made, and just stick them together straight away with next to no extra work. I'm making the assumption that the parameters on the shader already do the job of giving you the kind of control you speak of, so taking them back into VOP land just isn't needed.

More than “desirable”, I would call it “inescapable”, since you still need to be able to define exactly what *kind* of brick, and what *kind* of concrete … and do we want the pattern to be based on parametric coordinates, or rest positions… and is the dirt on the bricks controlled with *this* kind of noise, or *that* kind… and on, and on…
So those 80 parameters would still have to be visited whether you chose to go with a model such as the one you suggest, or just used VOPs directly.

True, but I would need to have one big old shader with those 80 plus another 80 for cheese and another 80 for grass, just in case someone wanted to combine concrete and cheese or grass. However, combining them at shader level means they could load up a cheese shader and just add it into the brick/concrete comp if and when they wanted it. Also, a brick VOP may have 80 parameters but only 40 of them ever need to be changed.

I think you're thinking of shaders as constant, static things (like geometry), when they are really not that at all. If you hard-wired a brick shader's parameters in order to make it static, and then wanted the bricks to be slightly redder for the next job, you'd need to create a second brick shader

Certainly not - what makes you think I'm suggesting hardwiring all the shader parms? The shaders I'm talking about would look and work exactly as they do now. The brick shader would have a colour parm that would allow you to set it to any colour you wanted, no different to how they work now - that's the whole point. I see shaders as a filter: they filter out all the complication of VOPs and give you a nice interface with proper notation, help, named parameters, presets… all of which you don't get in VOPs.

Think of it from an artist's point of view: you write a nice brick shader that he learns to use, tweaking parameters to get all the different types of brick he wants. Then you give him a concrete shader and he gets his head round that. Then he says he wants both on this object. You say, here's a brick_concrete shader that does both. Great, he says, now add cheese, jam, grass, hair… When do you stop adding tabs to your shader and say, sorry, you can't have that shader, it's too big? Or do you just say, sorry, you can't use shaders any more, convert them all into VOPs? OK, that's a good idea, but I'd call it a workaround - suddenly you're saying shaders don't work and you have to do it in VOPs. What I'm saying is: add one extra type of shader, a compositor, and you totally extend the flexibility and usefulness of shaders without having to resort to VOPs.
The trick is finding just the right hammer for every screw
Member
941 posts
Joined: July 2005
Offline
Hey Simon,

You know, it's pretty funny to find myself advocating for VOPs, since I don't really use them to *write* the shaders – or at least not the complicated bits – but rather to combine higher-level bits that I write directly in VEX (not unlike what you're talking about). And part of my insistence comes from how successful they have been for us at that level… (I can hear Cristin Barghiel having a nice chuckle in the background).

Anyway; I *do* understand what you're saying, and am all for any method that simplifies the definition and assignment of material properties to an object (staying away from the word “shader” ‘cause it’s such a loaded term). And I also now understand that having a large set of legacy shaders that would be a pain in the butt to convert to VOPs is perhaps a big motivator – a problem we've never dealt with since Mantra/VEX is pretty much brand new to us (converting from a different language is a different kind of problem). So even though I'm not disagreeing with you, I just can't see it working out quite as neatly as you do (which means absolutely nothing since I'm in the habit of misjudging things on an hourly basis ).

It would be nice to hear an opinion which, unlike my own, is not based on pure speculation… Yup; I think it's time for some brave soul from SESI to join in the discussion… (come ooooon… you *know* you wanna )


Cheers!
Mario Marengo
Senior Developer at Folks VFX [folksvfx.com] in Toronto, Canada.
Member
648 posts
Joined: July 2005
Offline
perhaps a different shop network:
http://www.snotbubble.com/cpb/shopnet.gif [snotbubble.com]

-cpb
Member
12454 posts
Joined: July 2005
Offline
I don't have time to write too much here, but one thing to consider is that the architecture of mantra - and in particular the concept of importing variables using dimport() - might be hard to deal with when there are multiple displacement shaders potentially writing to the same exported variable.
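To make that concrete with a contrived sketch (shader names are made up, each would live in its own file in practice, and I'm going from memory on the dimport() call): if two displacement shaders both export the same variable, whose value is the surface shader supposed to import?

    // two hypothetical displacement shaders that both export "height"
    displacement dents(export float height = 0; float amp = 0.05)
    {
        height = amp;                  // trivial displacement, just for the example
        P += height * normalize(N);
        N = computenormal(P);
    }

    displacement scratches(export float height = 0; float amp = 0.01)
    {
        height = amp;
        P += height * normalize(N);
        N = computenormal(P);
    }

    // a surface shader that wants the displaced height back
    surface dirt_by_height(vector dirtclr = {0.3, 0.25, 0.2})
    {
        float h = 0;
        dimport("height", h);          // whose "height" - dents' or scratches'?
        Cf = dirtclr * clamp(h * 20, 0, 1);
    }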

Even saying this, I'm a moderate fan of the idea, but the technical limitations and questions it prompts make me a little uncomfortable. Before VOPs, I was impressed by mental ray's Phenomena.

Take care,
Jason
Jason Iversen, Technology Supervisor & FX Pipeline/R+D Lead @ Weta FX
also, http://www.odforce.net [www.odforce.net]
Member
2199 posts
Joined: July 2005
Offline
I think Jeff hinted that perhaps a time would soon come when you could simply assign VOPs to geometry directly, thus forgoing the whole need for shaders. I think this would solve the problem in one fell swoop. The only outstanding problem is converting old code to VOPs. I've tried several just by using the automatic convert and none of the complex ones have worked. This means quite a bit of time is required to manually convert them to VOPs, though I'm sure it is all fixable. I sent one off to SESI for testing; I don't know if they've looked at it at all.
The trick is finding just the right hammer for every screw
Staff
2540 posts
Joined: July 2005
Offline
Yep, I looked at that, Simon, and it is in the bug database, assigned to me… For all to know: if I take the supermat shader and convert it to a VOP type, it doesn't work.

If all the SHOPs were properly converted to VOPs (probably the real reason this entire thread was started), then one could build a really simple VOP network mixing two or more VOP shaders together and then quickly MMB on the three VOPs and choose “create input parameters” to build the interface. Currently, using “create input parameters” loses many of the finer details one can put into a VOP HDA, such as disable strings, ranges and more. It is an existing issue.

The nice thing is once you build a SHOP from this network, the SHOP is completely defined by the VOP network. You assign the SHOP to the surface and now the SHOP just becomes a parameter interface on top of your VOP network. I think I already have what I want.

I can create as many copies of that SHOP as I need and change parameters as needed.

When I render, though, the VOP network(s) will always compile, and this can be time-consuming with complicated VOP networks. If you haven't noticed, VOPs get parsed with an optimizer as they are compiled into .vex shader code. It is highly effective, many times optimizing the VOP code more efficiently than hand-written .vfl code! The con is that it takes time. Most users won't notice, but if you have many different complicated shaders, it adds up.

Convenience features have been added to help. H7 has the uv quickshade VOP for those instances where I just want to slap on a texture map with simple specular shading so I don't have to build VOP nets just to place textures.
There's at least one school like the old school!
Member
2199 posts
Joined: July 2005
Offline
jeff
If you haven't noticed, VOPs get parsed with an optimizer as they are compiled into .vex shader code. It is highly effective, many times optimizing the VOP code more efficiently than hand-written .vfl code!

I hadn't noticed that. Wow, how is that possible? So it's actually better to build with VOPs than to write code! I always figured it would be the other way round, just because VOPs add so many extra variables into the code all the time.

Hmmmm…. interesting stuff.
The trick is finding just the right hammer for every screw
Member
2199 posts
Joined: July 2005
Offline
jeff
Yep, I looked at that, Simon, and it is in the bug database, assigned to me… For all to know: if I take the supermat shader and convert it to a VOP type, it doesn't work.


I couldn't even get a VOP network for a shader to compile into a new VOP type. It works for simple networks but nothing complex. If that doesn't work then it really isn't going to happen, because every shader would have to be hand-coded straight into a VOP, which kinda defeats the object of having VOPs in the first place. :?
The trick is finding just the right hammer for every screw