Solaris and Karma » MTLX - Autobump workflow?
- jsmack
- 7770 posts
- Offline
It's a render feature because you shouldn't have to do anything for the first one to look like the last one if the renderer had the feature. Even without that feature, I'm not sure why the displacement-only one looks so flat for you. Karma should at least be dicing it to the pixel level by default. It's not as sharp as computing normals per sample, but it should be much sharper than what you show.
Solaris and Karma » MTLX - Autobump workflow?
- jsmack
MR SMITH
Not sure if there is a better technical term, but redshift calls it autobump. (https://help.maxon.net/r3d/cinema/en-us/Content/html/Tessellation+And+Displacement.html#TessellationAndDisplacement-EnableAutoBumpMapping)
What is the proper/efficient way to do this in MTLX ?
This is a renderer feature and wouldn't be something MaterialX has to outright support. Mantra has 'Add bump to ray-traced displacements'; I'm not sure if Karma does yet, though. I imagine other renderers that support MaterialX and have this feature will still honor it even when using MaterialX shaders.
Houdini Indie and Apprentice » texture to color questions with attributefrommap
- jsmack
mgbaker
But I'm not sure about the general case, because I'm not sure I understand the bit about finding the texture path on one of the material node parameters. The material network node doesn't seem to have any parameters at all. And there is no material node -- unless I should add one? But I don't need to add more materials. I see the shop_materialpath on the primitive attribute for the principledshader, but that's not a filename. So I'm still confused about where Houdini actually keeps the file names of the textures it has unpacked.
The material node exists inside the gltf hierarchy subnet in the materials matnet. The texture path is right on the parameters of the principled shader.
Houdini Indie and Apprentice » texture to color questions with attributefrommap
- jsmack
mgbaker
Oops -- I spoke too soon. The gltf node now imports the part with the red texture. But I'm still not referring to the texture correctly in the following attributefrommap. On the primitives in the gltf node I see the shop_materialpath "/obj/gltf_hierarchy1/materials/Single_Color_Texture" But that's the material, not the texture. How do I locate the name of the texture itself? The exported part merely looked reddish because that's some kind of default that is pinkish-red.
Thanks again for your help -- I hope I'm not driving you batty...
Mary
The gltf hierarchy node is an object-level tool you can use to import gltf/glb files. It's used at scene level, not geometry level. Pressing the build scene button will extract the textures in the glb file and save them next to it, or in a custom location if set. Houdini cannot read the textures directly from the glb file, which is probably why you see missing-texture pink. If you go to the material node, you should see the texture path in one of its parameters.
Solaris and Karma » Per-face animated visibility in LOPs
- jsmack
GeomSubsets cannot have visibility set on them; only prims can have animated visibility. Change the path attribute of the faces that need to be hidden so they generate a new prim, then set the visibility there.
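The face-splitting step can be sketched language-agnostically; the path values and the hide flag here are purely illustrative, not part of any Houdini API:

```python
# Faces flagged as hidden get a different path attribute value, so they
# become a separate prim whose visibility can then be animated on its own.
def assign_face_paths(hide_flags,
                      visible_path="/geo/mesh",
                      hidden_path="/geo/mesh_hidden"):
    """Return one path string per face, splitting hidden faces out."""
    return [hidden_path if hide else visible_path for hide in hide_flags]
```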
Houdini Indie and Apprentice » Lighting according to normals setting?
- jsmack
This is kind of impossible with the way renderers work. Most assume that light leaves the light source in every direction, so that rays only have to be traced from the camera. You would need a bidirectional renderer to properly handle a light source that is perfectly directional like that.
If you don't mind very slow renders or possibly blurry/splotchy results, a point-cloud-based setup could be used: project the rays from the light source in SOPs first, then look up the illumination from the point cloud. The kernel would have to be an additive one, and I'm not sure how to properly account for energy conservation. A similar approach called 'photon mapping' is used by some renderers to approximate caustics.
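A minimal sketch of the lookup side of such a setup, assuming the SOP projection step has already produced a list of deposited photon positions and energies (all names here are illustrative). As noted above, the additive kernel just sums energy, so energy conservation is not handled:

```python
def gather_illumination(shade_p, photons, radius):
    """Additive kernel: sum the energy of every photon within `radius`
    of the shading position. `photons` is a list of (position, energy)."""
    r2 = radius * radius
    total = 0.0
    for p, energy in photons:
        d2 = sum((a - b) ** 2 for a, b in zip(shade_p, p))
        if d2 <= r2:
            total += energy
    return total
```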
Houdini Indie and Apprentice » texture to color questions with attributefrommap
- jsmack
mgbaker
Thank you, Enivob! For the pighead I now see that I can view the texture names by going to the Extra Files tab of the Type Properties, since it's an asset, and the opdef reference makes sense. Thank you for helping with that.
However, when I do the attributefrommap using that texture, I still don't seem to map reasonable colors onto the points. I get one solid color, regardless of which of the pighead textures I try to use. Also regardless of whether I try to use a color texture channel or not. So I think I'm still failing to set this up correctly. Perhaps you know the secret?
As you kindly suggested, I also include a simple glb object with a single red texture that imports into things like 3D Builder, but not with the gltf node in Houdini. It should sit in the same folder as the .hip file. With glb I think the texture is embedded.
Any suggestions you might have about why I'm failing to bring in the texture would be most welcome. I am using Houdini version 19.5.435 if that makes a difference.
Thank you again,
Mary
The glb loads for me with a red solid color. I used the gltf hierarchy node to unpack the embedded textures and load the scene graph within.
The attribfrommap scene with the pighead has an invalid image path for the texture map: (opdef:..?lowres.jpg). This is a relative path that is only valid within the pighead asset. When I change it to an absolute path, (opdef:Sop/testgeometry_pighead?lowres.jpg), the colors are correctly loaded from the map.
Solaris and Karma » Camera Defaults in Solaris
- jsmack
The values are lifted directly from the interactive camera. It might be a bug that the unit conversion doesn't take place for the aperture and focal length.
Technical Discussion » Render from ACES color space to srgb directly?
- jsmack
litote
I discovered that the option for Color Space: Bake to OpenColorIO Display/View is not available in my version of Houdini (18.5). It only has a LUT option. What version of Houdini are you using for this version of MPlay?
Any idea how I could use the LUT option instead? I don't know what LUT to use or where to get it. Some came with my ACES download.
This was added in Houdini 20.0
litote
Otherwise, presumably my only option is to try to convert it to .jpg in Nuke from the ACES .exr?
With 18.5 the options for export are pretty limited. There is no built-in way to apply a display/view transform prior to 20. A config where the view is enumerated as a colorspace could be used with the colorspace transform offered in VEX/VOPs, applying the color transform in a composite network. Third-party tools would be the only other way to convert without using legacy methods.
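As an illustration of the kind of per-channel transform such a composite network would apply, here is the standard linear-to-sRGB encoding curve (the piecewise IEC 61966-2-1 function). This is only the basic sRGB encode, not a full ACES display/view transform, which also includes tone mapping:

```python
def linear_to_srgb(x):
    """Piecewise sRGB encode applied to one linear-light channel in [0, 1]."""
    if x <= 0.0031308:
        return 12.92 * x  # linear toe segment
    return 1.055 * (x ** (1.0 / 2.4)) - 0.055  # gamma segment
```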
Solaris and Karma » Karma transfer attributes
- jsmack
If you don't have access to Houdini 20, then you'll have to bake the point cloud color to a texture map first. Houdini 20 added the Karma pointcloud read node, which would make this simple.
For baking, use your preferred method: mantra, COPs, or even a 2D volume (heightmap), and save it as an image. All of these would allow you to use pcfilter in a VOP network to filter the point cloud attribute in texture space.
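Conceptually, the pcfilter step amounts to a normalized, distance-weighted average of the point cloud attribute around the lookup position. A rough sketch of that idea, with an illustrative falloff kernel (not Houdini's actual filter):

```python
def pc_filter(lookup_p, points, values, radius):
    """Weighted average of `values` for the points within `radius` of
    `lookup_p` -- conceptually what a pcopen/pcfilter pair returns."""
    r2 = radius * radius
    wsum = 0.0
    vsum = 0.0
    for p, v in zip(points, values):
        d2 = sum((a - b) ** 2 for a, b in zip(lookup_p, p))
        if d2 < r2:
            w = 1.0 - (d2 / r2)  # simple distance falloff (illustrative)
            wsum += w
            vsum += w * v
    return vsum / wsum if wsum > 0.0 else 0.0
```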
Solaris and Karma » "Render All Frames with a Single Process"
- jsmack
mtucker
2. problems can occur when objects move from far away to up close - redicing may be required, but may not happen
Is there an option to force redicing every frame?
Solaris and Karma » How to denoise Ambientocclusion in materialX?
- jsmack
Houdini Lounge » Simple Rendering Question
- jsmack
tomtm
But why is it rendering out the USD file again, if I simply want to render at double the resolution while all other parameters in the scene remain the same?
That information is stored in the file, so it must be written again. Also, the previously used USD file was already deleted if the path wasn't changed to a non-temp one.
Houdini Indie and Apprentice » Dispersion look on clouds
- jsmack
A hacky way would be to render slightly different volumes with color filters on the lighting and then recombine them in post. Aside from some scientific spectral rendering, I'm not sure if there's another way.
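The recombination step in post is simple: render once per primary with a pure red, green, or blue light filter, then keep the matching channel from each pass. A sketch, with passes represented as lists of RGB pixels (illustrative only):

```python
def recombine_rgb_passes(red_pass, green_pass, blue_pass):
    """Each pass is a list of (r, g, b) pixels rendered with a pure red,
    green, or blue light filter; take the matching channel from each."""
    return [(r[0], g[1], b[2])
            for r, g, b in zip(red_pass, green_pass, blue_pass)]
```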
Technical Discussion » Copy-To-Point : Normals are not affected [{SOLVED}]
- jsmack
olivierth
I've never heard of "tuples" before. But I can see a difference between my point normal and a normal sop. When I middle click my custom normals, I see N3flt. When I middle click the normal sop, I see N3flt(Nml). I'm guessing that's the "tuples" you talked about.
A tuple just means a set of multiple values. They're all tuples, but the one with (Nml) is qualified as being a 'normal'. I'm using the word tuple to distinguish a set of three numbers from a 'vector', which is also a qualified type.
A normal sop set to not calculate normals can add this qualifier to the attributes, but another way using just the wrangle is to use the 'setattribtypeinfo()' function to explicitly declare the type as normal or point or whatever.
The qualifiers affect transforms:
- point: translate, rotate, and scale
- vector: rotate and scale
- normal: rotate
There are also some others, color and texture coordinate, but neither of them gets transformed, so they behave the same as an unqualified 3-float in this context.
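The transform rules above can be sketched as follows, for a transform split into a 3x3 rotate/scale matrix and a translate vector (plain illustrative code, not Houdini's implementation):

```python
def mat_vec(m, v):
    """Apply a 3x3 matrix (list of rows) to a 3-tuple."""
    return tuple(sum(m[r][c] * v[c] for c in range(3)) for r in range(3))

def transform_point(m, t, p):
    """Point qualifier: translate, rotate, and scale."""
    x = mat_vec(m, p)
    return tuple(a + b for a, b in zip(x, t))

def transform_vector(m, t, v):
    """Vector qualifier: rotate and scale only -- the translate is ignored."""
    return mat_vec(m, v)

# An unqualified 3-float tuple (like a wrangle-created "N" without
# setattribtypeinfo) would be left untouched by either of these.
```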
Technical Discussion » Copy-To-Point : Normals are not affected [{SOLVED}]
- jsmack
olivierth
I'd still like to understand why a copy-to-point behaves differently with custom wrangle-created normals than those created by a normal sop...
When you create them with a wrangle, are they generic tuples named "N", or do they get set as type 'normal'? Generic tuple attributes don't get transformed; normal, point, and vector type attributes do. You can see the type by middle-mouse inspection.
Houdini Indie and Apprentice » Karma render error
- jsmack
Was the default expression language set to Python in the preferences, or maybe in that node's language setting? Those nodes have HScript expressions on them, but it looks like the language has been changed to Python, which would result in syntax errors.
Technical Discussion » Project image to UV space texture via a camera?
- jsmack
raincoleCYTE
In the UV Texture SOP you can choose "Perspective from camera" as a texture type.
Cheers
CYTE
Sorry, I didn't make my question clear.
The mesh has UV already. I'd like to project an image onto it, then bake it to a texture file according to its existing UV. Basically the opposite of rendering.
The UV Texture SOP generates new uv coordinates, so I'm not sure how helpful it is in this case.
I suppose I need COP? But I'm not familiar with COP enough to figure out the whole thing myself
I usually use COPs for this, but you can use mantra baking too. Use the vopcop2generator with a snippet VOP and write a shader that goes from the target uv space to world space: uvdist finds the intersection point, and primuv reads the value of P there.
If the image already has the camera matrix baked into it, such as an EXR produced from a 3D render, that matrix can be used to transform P into the camera's uv space. This camera uv space then feeds a texture VOP to read the color, and that final color is the output of the generator.
If you have a camera object, the toNDC function can maybe be used instead, but I'm not sure if it works in the COP context. If not, there is a perspective function for building a camera matrix from the parameters of the camera.
If you have Labs installed there is the labs maps baker, it uses some of these techniques internally. You can use it as a starting point to take apart and compute your own values.
Mantra baking should be a bit simpler since it does the uv unwrapping for you. You just have to get the camera coordinates into the texture.
Karma is another option, it even supports getting the coordinates from a camera using coordsys, unlike mantra where you would have to reconstruct the camera matrix yourself.
If your mesh is sufficiently dense with points, you could also try what CYTE suggested and just add a second uv set using the UV Texture SOP, but it will always have a bit of distortion when using vertex interpolation unless the geometry is perfectly flat and parallel to the camera plane.
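Whichever renderer you use, the core of the camera-space step is mapping a world-space P into the camera's 0-to-1 uv space. Below is a minimal pinhole-projection sketch of what toNDC or a hand-built camera matrix gives you; the function names, the looking-down-negative-z convention, and the focal/aperture zoom factor are assumptions for illustration, not Houdini's exact API:

```python
def world_to_camera_uv(p_world, world_to_cam, focal, aperture, aspect=1.0):
    """Project a world-space point into a pinhole camera's 0..1 uv space.
    `world_to_cam` is a callable applying the world-to-camera transform;
    `focal` and `aperture` must share units (e.g. mm)."""
    x, y, z = world_to_cam(p_world)
    if z >= 0.0:
        return None  # behind the camera (camera looks down -z here)
    zoom = focal / aperture
    u = 0.5 + zoom * (x / -z)
    v = 0.5 + zoom * (y / -z) * aspect
    return (u, v)
```

The returned uv would then be the lookup coordinate for a texture read of the projected image.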
Solaris and Karma » Karma OCIO file rules in XPU and differences between h19 and
- jsmack
BrianHanke
For me Karma XPU ignores all rules in the config in all situations. CPU works fine.
Weird, it works fine for me except for certain cases: an image that is Linear (rec709) is loaded as Linear (rec709) even though the file rule says to load it as ACEScg. All my tests with ACEScc and sRGB worked as expected.
Solaris and Karma » What to expect going forward with Solaris?
- jsmack
robp_sidefx
Jonathan de Blok
btw: the Karma ROP doesn't like /obj level path nodes
Thanks for flagging this. You can try using the attached as the starting point of a translator (put it in $HOME/houdiniX.Y/husdplugins/objtranslators). Please note it's definitely not complete in terms of all the flags/parms it should check in terms of "should this end up in the render or not"
Should a path node end up in a render? I thought it was more like a rig for creating a path that would get used in some other SOP geometry object.
Also, path node, in 2024?