Found 4484 posts.
Search results
Houdini Indie and Apprentice » Dispersion look on clouds
- jsmack
- 7741 posts
- Offline
A hacky way would be to render slightly different volumes with color filters on the lighting and then recombine them in post. Aside from some scientific spectral rendering, I'm not sure if there's another way.
Technical Discussion » Copy-To-Point : Normals are not affected [{SOLVED}]
- jsmack
- 7741 posts
- Offline
olivierth
I've never heard of "tuples" before. But I can see a difference between my point normal and a normal sop. When I middle click my custom normals, I see N3flt. When I middle click the normal sop, I see N3flt(Nml). I'm guessing that's the "tuples" you talked about.
A tuple just means a set of multiple values. They're all tuples, but the one with (Nml) is qualified as being a 'normal'. I'm using the word tuple to distinguish a plain set of three numbers from a 'vector', which is also a qualified type.
A Normal SOP set to not compute normals can add this qualifier to existing attributes, but another way, using just the wrangle, is to call the 'setattribtypeinfo()' function to explicitly declare the type as normal, point, vector, or whatever.
The qualifiers affect transforms:
point: translate, rotate, and scale
vector: rotate and scale
normal: rotate
There are also some others, color and texture coordinate, but neither of those is transformed, so they behave the same as an unqualified 3-float tuple in this context.
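A minimal point-wrangle sketch of the setattribtypeinfo() approach (the custom direction below is purely illustrative):

```vex
// Point wrangle: create a custom normal, then qualify the attribute
// so transforming nodes (and Copy to Points) treat it as a normal.
// Without the qualifier, N would show up as a plain N3flt tuple.
v@N = normalize(@P);  // some custom direction, illustration only
setattribtypeinfo(0, "point", "N", "normal");
```

After this, a middle-mouse inspection should show N3flt(Nml) instead of a bare N3flt.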
Technical Discussion » Copy-To-Point : Normals are not affected [{SOLVED}]
- jsmack
- 7741 posts
- Offline
olivierth
I'd still like to understand why a copy-to-point behaves differently with custom wrangle-created normals than those created by a normal sop...
When you create them with a wrangle, are they generic tuples named "N", or do they get set as type 'normal'? Generic tuple attributes don't get transformed; normal-, point-, and vector-typed attributes do. You can see the type with a middle-mouse inspection.
Houdini Indie and Apprentice » Karma render error
- jsmack
- 7741 posts
- Offline
Was the default expression language set to Python in the preferences, or maybe in that node's language setting? Those nodes have HScript expressions on them, but it looks like the language has been changed to Python, which would result in syntax errors.
Technical Discussion » Project image to UV space texture via a camera?
- jsmack
- 7741 posts
- Offline
raincoleCYTE
In the UV Texture SOP you can choose "Perspective from camera" as a texture type.
Cheers
CYTE
Sorry, I didn't make my question clear.
The mesh has UV already. I'd like to project an image onto it, then bake it to a texture file according to its existing UV. Basically the opposite of rendering.
The UV Texture SOP generates new uv coordinates, so I'm not sure how helpful it is in this case.
I suppose I need COPs? But I'm not familiar enough with COPs to figure out the whole thing myself.
I usually use COPs for this, but you can use mantra baking too. Use a vopcop2generator with a Snippet VOP and write a shader that goes from the target uv space back to world space: uvdist locates the uv position, and primuv fetches the value of P at the intersection point. If the image already has the camera matrix baked into it, such as an EXR produced from a 3D render, that matrix can be used to transform P into the camera's uv space. This camera uv space then feeds a Texture VOP to read the color, and that final color is the output of the generator. If you have a camera object, the toNDC function can maybe be used instead, but I'm not sure it works in the COP context; if not, there is a perspective function for building a camera matrix from the camera's parameters.
If you have Labs installed, there is the Labs Maps Baker, which uses some of these techniques internally. You can use it as a starting point to take apart and compute your own values.
Mantra baking should be a bit simpler, since it does the uv unwrapping for you; you just have to get the camera coordinates into the texture.
Karma is another option; it even supports getting the coordinates from a camera using coordsys, unlike mantra, where you would have to reconstruct the camera matrix yourself.
If your mesh is sufficiently dense with points, you could also try what CYTE suggested and just add a second uv set using the UV Texture SOP, but it will always have a bit of distortion from vertex interpolation unless the geometry is perfectly flat and parallel to the camera plane.
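A rough sketch of the snippet approach inside a vopcop2generator (all node paths and file names are placeholder assumptions, and as noted, toNDC may not work in the COP context):

```vex
// COP snippet sketch: for the texel at (X, Y), find the matching
// location on the mesh's existing uv layout, project that surface
// point through the camera, and sample the projected image there.
string geo = "op:/obj/geo1/OUT";   // assumed SOP path with a 'uv' attribute
int prim;
vector primuv_pos;
// locate this texel's (u, v) on the geometry's uv attribute
uvdist(geo, "uv", set(X, Y, 0), prim, primuv_pos);
// fetch world-space P at that parametric location
vector world_p = primuv(geo, "P", prim, primuv_pos);
// project into the camera's NDC space (may not work in COPs)
vector ndc = toNDC("/obj/cam1", world_p);
// read the image being projected at the camera-space coordinate
vector clr = colormap("$HIP/projection.exr", ndc.x, ndc.y);
assign(R, G, B, clr);
```

The output of the generator is then the projected color laid out in the mesh's own uv space, ready to write to a texture file.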
Solaris and Karma » Karma OCIO file rules in XPU and differences between h19 and
- jsmack
- 7741 posts
- Offline
BrianHanke
For me Karma XPU ignores all rules in the config in all situations. CPU works fine.
Weird, it works fine for me except for certain cases where an image that is Linear (Rec709) is loaded as Linear (Rec709) (correct for the image itself) even though the file rule says to load it as ACEScg. All my tests with ACEScc and sRGB worked as expected.
Solaris and Karma » What to expect going forward with Solaris?
- jsmack
- 7741 posts
- Offline
robp_sidefx
Jonathan de Blok
btw: the Karma ROP doesn't like /obj level path nodes
Thanks for flagging this. You can try using the attached as the starting point of a translator (put it in $HOME/houdiniX.Y/husdplugins/objtranslators). Please note it's definitely not complete in terms of all the flags/parms it should check in terms of "should this end up in the render or not"
Should a path node end up in a render? I thought it was more like a rig to create a path that would get used in some other sop geometry object.
Also, path node, in 2024?
Solaris and Karma » Karma OCIO file rules in XPU and differences between h19 and
- jsmack
- 7741 posts
- Offline
pawel-fin
Personally, I think that the metadata should not be used when the file rules are in place. With both active it will be very hard to predict the outcome and debug problems. I expect the rule that says "all EXRs are ACEScg" to take over everything else.
Additionally, controlling metadata that is coming from a variety of DCC apps is not easy.
If the file rules take over metadata like you described then I think this is the correct behaviour.
Ideally, metadata would be the main way to control color, but the lack of a standard, and the fact that the required metadata would only be meaningful for the specific OCIO config in use, make this infeasible. Metadata is the only way to correctly read an image with multiple image planes that are each in different color spaces; a single file rule or file-name token cannot handle such a case. In mplay, for example, metadata does take precedence, because it must display multi-part EXRs whose planes can be rendered to different colorspaces by husk.
The main problem with how metadata is currently handled stems from the OpenImageIO library creating a default value for the colorspace metadata even when no value exists in the file. This makes it impossible for the image consumer to know whether the metadata is bogus. Maybe additional metadata could be used, such as a tag added by karma when rendering multi-part images to custom color spaces, something like "karma:ocio:usemeta", to distinguish the fake "oiio:Linear" tag that oiio adds to every file from a real "oiio:Linear (Rec709)" tag added by husk/karma.
Solaris and Karma » Karma OCIO file rules in XPU and differences between h19 and
- jsmack
- 7741 posts
- Offline
BrianHanke
I have a bit of progress to report: setting HOUDINI_OCIO_FILENAME_COLORSPACE = 3 enables the use of file name tokens in both CPU and XPU. All works as expected. However, this setting seems to ignore file rules in the config, so it's not perfect. I haven't found any way yet to get XPU to recognize those rules.
Leaving it at the default also works as expected: file rules in the config are followed for both CPU and XPU. I have seen one instance of the extension file rule being ignored by XPU when tagging an EXR as ACEScg where metadata is not present in the image; CPU behaved correctly in every case. I'm trying to narrow down exactly when XPU doesn't follow the rules.
Technical Discussion » Houdini functionality
- jsmack
- 7741 posts
- Offline
skadbone1
Hi, I have a PC about a year old: two NVIDIA RTX A4000 video cards, 64 GB of RAM. After loading a cached file of about 2 million points, it seems to load slowly; after a couple of minutes I'm still waiting to view the geo in Solaris. My objective is to render an 8-10 second animation with about 8 million points, which I would layer in AE. Is my wait time of several minutes unusually long? Is there any way to speed this up? Thank you, Craig
By 2 million points, do you mean 2 million instances (which become 2 million prims when imported to Solaris), just a mesh with 2 million points, or some point cloud/particle system? I wouldn't expect the latter two to take very long, but 2 million unique prims will take a very, very long time to build a scene graph. Using a point instancer as the import mode rather than native instances would avoid creating a graph of such complexity.
Solaris and Karma » How to get the world position of a light in Karma material
- jsmack
- 7741 posts
- Offline
freshbaked
I need the transformed world position of a distant light in my karma material, not just the TX/TY/TZ
Can't you do this with a coordsys to get the position of the shading point in light space, then transform to world space?
You can certainly pass information about the light to the material, but there is no mechanism for the material to evaluate the light shader directly. You would have to re-implement everything about the light's shading in your material, and even then you wouldn't be able to find out how much light is actually arriving, because you can't calculate shadowing. That said, for the original question about a distant light, calculating the amount of light hitting an object while ignoring shadowing is easy: it's always 100% of the light intensity.
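As a heavily hedged sketch of the coordsys idea (the coordinate-system name "lightspace", and whether such a binding is available to the material in Karma, are assumptions, not a confirmed workflow):

```vex
// Karma material VEX sketch: if a coordinate system named "lightspace"
// has been bound for the light (an assumption), the light's world
// position is its local origin transformed into world space.
vector light_pos = ptransform("space:lightspace", "space:world", {0, 0, 0});
// For a distant light only orientation matters; its direction would be
// the local -Z axis transformed as a direction (no translation).
vector light_dir = normalize(
    vtransform("space:lightspace", "space:world", {0, 0, -1}));
```

Note this only recovers the light's transform; as described above, it cannot tell you how much light actually reaches the shading point.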
Edited by jsmack - Feb. 27, 2024 12:24:09
Solaris and Karma » How to get the world position of a light in Karma material
- jsmack
- 7741 posts
- Offline
Materials can't get any information about lights, because they are evaluated before lighting happens.
Solaris and Karma » Karma OCIO file rules in XPU and differences between h19 and
- jsmack
- 7741 posts
- Offline
BrianHanke
Wow, so it does. That is sneaky!
You have to relaunch Houdini for any changes to OCIO to take effect.
Technical Discussion » strange geo look in viewport
- jsmack
- 7741 posts
- Offline
skadbone1
OK, thank you. Should this impact my caching or rendering? Can you suggest what I can do to avoid this, or should it not be an issue? Thanks, Craig
It should only affect display. You can increase the limit, if you have the hardware to handle it, in the display settings.
Technical Discussion » strange geo look in viewport
- jsmack
- 7741 posts
- Offline
Probably hit the polygon limit for instances, so it's drawing bounding boxes for the rest. If you zoom in, you can see that they're boxes.
Technical Discussion » HDA: use cops result in uvQuickShade (relative path?) {[SOLVED]}
- jsmack
- 7741 posts
- Offline
olivierth
...but How can I use a relative path? I tried this but it doesn't work:
op:/CopTriplanar/OUT_Texture
That's an absolute path. A relative path begins without a '/', usually with '../'. Relative paths might not work with op: references, though; in that case use
op:`opfullpath("relativepath")`
You can check whether it works by middle-mouse-clicking the parameter to see the expanded path.
Solaris and Karma » Karma OCIO file rules in XPU and differences between h19 and
- jsmack
- 7741 posts
- Offline
BrianHanke
Changing rules has no effect on texture appearance for me. Never has. Make a grid with a JPG texture attached, render in Karma CPU or XPU, looks fine. Change the rule to ACEScct (for example), nothing changes. No amount of resetting Karma or updating textures has any effect. I thought maybe Houdini needs a restart for the changes to take effect, but all the changes to rules are reverted after restarting. It's all very confusing.
The metadata in the image is used before any rules, so if your image has incorrect metadata, that can throw it off. Metadata kind of has to take precedence, since it's the only way multiple color spaces in the same file can be supported.
Solaris and Karma » Karma XPU noise
- jsmack
- 7741 posts
- Offline
jumax
In the attached image I have 512 path-traced samples and 6 light samples; even after 9 minutes there is no resolution. In the previous examples I let it run for 30 minutes and there was not enough improvement.
512 samples isn't very many. Some paths can take tens to hundreds of thousands of samples to clean up.
Technical Discussion » Issues with multiple UV maps
- jsmack
- 7741 posts
- Offline
toonafish
And when I use the UV Layout SOP with an imported FBX that has 2 UV sets and several parts (like body and clothing), I lose the UVs of the clothing after using UV Layout.
I can confirm that this does not happen for me
Technical Discussion » A question for karma
- jsmack
- 7741 posts
- Offline
This page shows a detailed list of Karma XPU capabilities and limitations:
https://www.sidefx.com/docs/houdini/solaris/karma_xpu.html#features [www.sidefx.com]