mtucker
USD is designed to allow a "Material" to encapsulate many renderer-specific "shaders". A shader consists of one or more connected UsdShade primitives. All UsdShade primitives have the same USD primitive type (UsdShadeShader), but each primitive also has a "source" which indicates the true nature of that specific primitive. UsdShadeShader "sources" are discovered at runtime through ndr and sdr plugins. The USD spec includes a definition for the UsdPreviewSurface shaders, and renderers are generally expected to understand and be able to interpret shader graphs authored using UsdShadeShader prims of these "types". But most renderers also provide ndr and sdr plugins to describe their own UsdShadeShader sources. I don't know if your renderer has its own shader language or if you just want to use the UsdPreviewSurface specification. This decision will determine whether or not you need to write ndr/sdr plugins to describe your renderer's UsdShadeShader sources to USD.
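To make that structure concrete, here is a minimal, hypothetical .usda fragment (the prim names are made up for illustration) showing a Material encapsulating a single UsdPreviewSurface shader, where the "source" is identified by the info:id attribute:

```usda
def Material "myMaterial"
{
    token outputs:surface.connect = </myMaterial/previewShader.outputs:surface>

    def Shader "previewShader"
    {
        uniform token info:id = "UsdPreviewSurface"
        color3f inputs:diffuseColor = (0.8, 0.1, 0.1)
        float inputs:roughness = 0.4
        token outputs:surface
    }
}
```

A renderer-specific shader would look structurally identical, but with an info:id (or an info:sourceAsset/info:sourceCode variant) that the renderer's ndr/sdr plugins know how to resolve.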
To author materials in Houdini, most renderers have a one-to-one correspondence between their ndr/sdr-discovered shader sources and VOP node types. Houdini ships with VOP nodes that correspond to the UsdPreviewSurface shader nodes. The Material Library LOP takes these VOP nodes and translates them into UsdShadeShader prims. Other renderers (RenderMan, Redshift, Arnold) also provide their own VOP node types, which get translated into UsdShadeShader prims to describe a material as a network of connected primitives.
Karma is the odd one out here, in that it takes your whole VOP network, uses it to generate VEX code, then embeds that VEX code into a single UsdShadeShader primitive. This happens as part of a custom translation process. If you are using a principled shader, the translation process also authors a UsdPreviewSurface shader that attempts to approximately match the VEX/karma shader. So when you put down a principled shader and say that your renderer is looking at the resulting HdMaterialNetwork (which is the hydra translation of the UsdShadeShader network), it's not clear to me whether your renderer is looking at the auto-generated UsdPreviewSurface network or the actual karma shader network (with just one shader node). I'm assuming you're looking at the UsdPreviewSurface network? Again, if this is how you want to describe your materials going forward, you shouldn't need an ndr/sdr plugin, or to write your own VOP nodes.
As for the animated parameters not working on shaders, make sure you have turned on the toggle in the Material Library LOP that allows animated parameters in the VOP network. Generally doing this is a bad idea for performance reasons, and instead you should author your materials once at frame one, expose any parameters you want to vary with time, then use an Edit Properties node after the material library to animate the promoted parameters. It will run much faster than re-generating the material on each frame.
Hi, thank you for the explanation; it makes things much clearer to me.
The renderer I am working on plans to support two kinds of materials: the principled shader and a shader written in the renderer's own shading language. Currently, for the principled shader I get the network map via sceneDelegate->GetMaterialResource(GetId()) and fetch the surface shader using the HdMaterialTerminalTokens->surface token. I believe this way I can only get the actual karma shader, since the "identifier" will be its complete VEX code (or simply "opdef:/Vop/principledshader::2.0?SurfaceVexCode" if no node is connected), and I don't know how to get the UsdPreviewSurface network. I would really like to be able to, though.
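For illustration, the distinction being described can be sketched like this. The real HdMaterialNetworkMap and HdMaterialNode are C++ structs; here they are modeled as plain Python dicts (an assumption made purely to keep the sketch self-contained), with the classification heuristic based on the identifiers mentioned above:

```python
# Sketch: telling a UsdPreviewSurface network apart from Karma's single-node
# VEX shader, using the node identifiers described in this thread. The dict
# shapes below are stand-ins for the C++ HdMaterialNetworkMap/HdMaterialNode.

def classify_surface_network(network_map):
    """Return 'preview', 'vex', or 'unknown' for the surface terminal."""
    surface = network_map.get("surface", {"nodes": []})
    for node in surface["nodes"]:
        ident = node["identifier"]
        if ident == "UsdPreviewSurface":
            return "preview"
        # Karma embeds its generated VEX behind an opdef: identifier.
        if ident.startswith("opdef:"):
            return "vex"
    return "unknown"

# Example data shaped like the two cases discussed above.
karma_map = {"surface": {"nodes": [
    {"path": "/mat/kma",
     "identifier": "opdef:/Vop/principledshader::2.0?SurfaceVexCode"}]}}
preview_map = {"surface": {"nodes": [
    {"path": "/mat/usd", "identifier": "UsdPreviewSurface"}]}}

print(classify_surface_network(karma_map))    # vex
print(classify_surface_network(preview_map))  # preview
```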
For the custom shading-language shader, the way I am creating it is probably odd too: it uses all of Houdini's VEX features (inline code, math nodes, etc.) to generate VEX code, which is authored into the material's identifier slot. The Hydra plugin then converts this VEX code into the renderer's own language and uses it for rendering. I did some tricks to make sure variables (textures, for example) can be passed through correctly this way, but it is probably a very odd approach. Still, it lets me avoid creating an ndr plugin (I still don't know how to write an ndr plugin, plug it into Houdini, or how that works), which is quite convenient imo, thanks to Houdini's VEX system.
I think my current main question becomes how to get the real network (with relationships, so that I can construct the whole graph in my Hydra plugin) instead of the one-node karma shader. My material code is quite similar to RPR's (https://github.com/GPUOpen-LibrariesAndSDKs/RadeonProRenderUSD/blob/master/pxr/imaging/plugin/hdRpr/material.cpp). I tried to read data from other renderers' graphs (Redshift and RenderMan in particular), but no useful data could be fetched. That's probably due to a mistake in my code, though. Could you give some guidance on this? Thanks.
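As a sketch of the graph reconstruction being asked about: an HdMaterialNetwork is a flat list of nodes plus a list of relationships, where each relationship names an upstream node/output and a downstream node/input. The dict-based modeling below is illustrative only (the real types are C++), and the inputId-is-upstream convention shown in the comments should be verified against the Hydra headers:

```python
# Sketch: rebuilding a shader graph from a flat node/relationship list,
# mirroring the shape of HdMaterialNetwork. Dicts stand in for C++ structs.

def build_graph(network):
    """Return (nodes by path, map of downstream (node, input) -> upstream (node, output))."""
    nodes = {n["path"]: n for n in network["nodes"]}
    connections = {}
    for rel in network["relationships"]:
        # Assumed convention: inputId/inputName = upstream node and its
        # output; outputId/outputName = downstream node and its input.
        connections[(rel["outputId"], rel["outputName"])] = (
            rel["inputId"], rel["inputName"])
    return nodes, connections

# Example: a texture node feeding the diffuseColor input of a surface node.
network = {
    "nodes": [
        {"path": "/mat/tex", "identifier": "UsdUVTexture"},
        {"path": "/mat/surf", "identifier": "UsdPreviewSurface"},
    ],
    "relationships": [
        {"inputId": "/mat/tex", "inputName": "rgb",
         "outputId": "/mat/surf", "outputName": "diffuseColor"},
    ],
}

nodes, connections = build_graph(network)
print(connections[("/mat/surf", "diffuseColor")])  # ('/mat/tex', 'rgb')
```

Starting from the terminal node and repeatedly looking up its inputs in the connections map walks the whole graph, which is essentially what the RPR material code linked above does in C++.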
Also, thanks for the hint about animated parameters. The main purpose of this is to display flip-book textures on different surfaces (sometimes with loops), so it's necessary to be able to parse paths containing $F and arithmetic operators. I tried toggling the parameter animation option and can see it change, but in Hydra I cannot access it properly for now. I need to take some more time to understand how a Hydra plugin should parse the materials.
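A minimal sketch of the path parsing mentioned here, handling only Houdini's $F (current frame) and $F&lt;n&gt; (frame zero-padded to n digits) forms; arithmetic like backtick expressions would need a real expression evaluator and is not attempted:

```python
import re

# Sketch: expanding Houdini-style frame variables in a flip-book texture
# path on the plugin side. Supports $F and $F<n> (zero-padded) only.

def expand_frame(path, frame):
    def repl(match):
        pad = match.group(1)
        return str(frame).zfill(int(pad)) if pad else str(frame)
    return re.sub(r"\$F(\d+)?", repl, path)

print(expand_frame("flipbook/fire.$F4.png", 7))   # flipbook/fire.0007.png
print(expand_frame("flipbook/fire.$F.png", 12))   # flipbook/fire.12.png
```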