Found 61 posts.
Search results
PDG/TOPs » TOPS get selected LOP nodes?
- AndreasWeidman
- 127 posts
- Offline
Any chance one could pass a bunch of selected LOP nodes to TOPs, either by Python or by some node, or does it need to be specifically connected/fetched? Just getting my feet wet with TOPs, so excuse the noob questions.
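One lightweight option, assuming you drive things from Python: grab the current selection with `hou.selectedNodes()` and filter by network category. A minimal sketch; the helper name and the wiring into TOPs are my own illustration, not an official workflow:

```python
# A hedged sketch for the Houdini Python shell (or a TOP Python node). `hou`
# is Houdini's built-in module and is only referenced in the comment below,
# so the helper itself is plain Python.

def lop_paths(nodes):
    """Return the paths of the LOP nodes in `nodes`, skipping everything else."""
    return [n.path() for n in nodes
            if n.type().category().name() == "Lop"]

# Inside Houdini you would call, for example:
#   paths = lop_paths(hou.selectedNodes())
# and feed `paths` to the TOP network, e.g. via a string parameter that a
# Wedge or Python Processor node reads; the exact hookup depends on your setup.
```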
PDG/TOPs » Slow USD ROP on TOP network
Did you figure this out? I'm about to start doing the same thing, so I'm very curious.
Solaris and Karma » Time offset for instanced "references" not working (well)
Interesting, thank you.
However, I just discovered that this is a bug in 18.5. "Instanceable reference" works like a charm in 19, without the buggy behavior. It doesn't solve my problem, though, as the show is using 18.5, so I'll look into it.
Solaris and Karma » Time offset for instanced "references" not working (well)
I'm using the instancer node with the "reference" option to instance a bunch of animated references (not the "instanceable reference" option, due to buggy variants and disappearing geo), and I need to offset the animations randomly. I cannot figure out a way to do it, though.
- The "retime instances" node does not work if "reference" was selected in the instancer.
- Using a for-each loop with a "time shift" node works, but it is extremely slow to use as it evaluates every frame.
- I'm also not able to use "edit property" to change the prototype "frame offset" in the reference node, as there is none in the USD schema.
I feel I am missing something obvious here. Is a wrangle the way to go, perhaps?
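For what it's worth, at the USD level the usual way to retime a reference is a layer offset authored alongside the reference itself, which reduces per-instance random offsets to generating a deterministic offset per prototype. A hedged sketch, not a confirmed fix for the instancer's "reference" mode: the helper is mine, and while the pxr calls in the trailing comment exist in the USD Python API, the overall wiring is my assumption:

```python
import random

# Deterministic pseudo-random per-instance frame offset: the same seed
# (e.g. an instance index) always yields the same offset, so the result is
# stable across cooks. The 0-24 frame range is an arbitrary example.

def frame_offset(seed, max_offset=24):
    """Return a deterministic random integer frame offset in [0, max_offset]."""
    return random.Random(seed).randint(0, max_offset)

# In a Python LOP you might then author each reference with a layer offset,
# something like (illustrative, untested):
#   prim.GetReferences().AddReference(
#       asset_path, layerOffset=Sdf.LayerOffset(frame_offset(i)))
```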
Edited by AndreasWeidman - Feb. 3, 2022 05:32:36
Solaris and Karma » MaterialX limited nodes & roadmap
jsmack, quoting AndreasWeidman:
I assume you are basically using different MaterialX nodes within a subnet? While great for defining useful presets and combining things, I have yet to create something useful. I've tried creating things like an alligator pattern by manipulating the UV/Place2D scale/rotation etc., but in essence, without using some kind of image, it seems pretty impossible. There seems to be no way to generate 2D shapes.
You can create nearly any possible shape with min, max, add, mult, div, and the trig, pow, and exp/log functions.
MaterialX can ONLY define nodes using subnets containing these basic building-block nodes. There's no reason you couldn't have a library of nodedefs for higher-level convenience nodes; the AMD material library contains some of these.
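To make the "shapes from basic math" claim concrete, here is a plain-Python illustration (not MaterialX syntax) of building a radial ramp and a circle mask from UVs using only subtract, multiply, pow, min, and max. The function names are mine, purely for illustration:

```python
# 2D shapes from nothing but arithmetic, the same operations the MaterialX
# standard library exposes as nodes (subtract, multiply, power, min, max).

def radial_ramp(u, v, cx=0.5, cy=0.5):
    """Distance from (cx, cy): subtract, multiply, add, then pow(0.5)."""
    return ((u - cx) ** 2 + (v - cy) ** 2) ** 0.5

def circle_mask(u, v, radius=0.25):
    """A hard-edged disc built from the ramp plus a min/max clamp."""
    d = radial_ramp(u, v)
    # A huge multiplier plus clamp approximates step(d < radius).
    return max(0.0, min(1.0, (radius - d) * 1e6))
```

Each line maps one-to-one onto a stdlib node, so the same graph is buildable with nodes, just far more tediously, which is rather the point of the thread.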
One of MaterialX's render targets is GLSL, so I would hope that viewport-to-render parity is closer than ever before.
That is good news, I hope you’re right.
Perhaps it would make sense to have a Houdini "open source" section to somehow share these nodedefs as a community; I think we all pretty much want the same kinds of noises, ramps, and other nodes. While I'm decent with code, creating shapes by doing complex math with nodes makes me want to do the dishes and clean the garage. I could be wrong, but I imagine it being extremely slow and frustrating.
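Sharing nodedefs would not require anything Houdini-specific: a nodedef plus its implementation nodegraph is just a small MaterialX document, so a community library could be a folder of files like the sketch below. This assumes the standard-library `texcoord`-style default UVs, `subtract`, and `magnitude` nodes; all names here are mine, not an official definition:

```xml
<?xml version="1.0"?>
<materialx version="1.38">
  <!-- Illustrative shareable nodedef: a radial (circular) ramp. -->
  <nodedef name="ND_radialramp_float" node="radialramp">
    <input name="texcoord" type="vector2" defaultgeomprop="UV0" />
    <input name="center" type="vector2" value="0.5, 0.5" />
    <output name="out" type="float" />
  </nodedef>
  <nodegraph name="NG_radialramp_float" nodedef="ND_radialramp_float">
    <subtract name="delta" type="vector2">
      <input name="in1" type="vector2" interfacename="texcoord" />
      <input name="in2" type="vector2" interfacename="center" />
    </subtract>
    <magnitude name="dist" type="float">
      <input name="in" type="vector2" nodename="delta" />
    </magnitude>
    <output name="out" type="float" nodename="dist" />
  </nodegraph>
</materialx>
```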
Solaris and Karma » MaterialX limited nodes & roadmap
BrianHanke
I agree that the current implementation is a bit annoying since MaterialX only gives you the most basic of building blocks. I asked on the MatX GitHub about integrating SideFX's ramps and they said that's coming in the next release. In the meantime, I messed around with making my own set of more advanced ramps and noises using basic MatX nodes. Worked out ok, although the ramps are not portable to other MatX apps since they use SideFX's Houdini-only ramps. The noise one I made is portable though. I tried bringing it into Clarisse 5.5 and it worked fine.
Ramp demo here: https://twitter.com/brianhanke/status/1471123861127237639
Cool experiments, Brian, and it's good that the SideFX ramps will be official in the future.
I assume you are basically using different MaterialX nodes within a subnet? While great for defining useful presets and combining things, I have yet to create something useful. I've tried creating things like an alligator pattern by manipulating the UV/Place2D scale/rotation etc., but in essence, without using some kind of image, it seems pretty impossible. There seems to be no way to generate 2D shapes.
I could be wrong, and I don't want to sound too negative, but it feels like the MaterialX approach is pretty much to use textures for every channel and create nothing procedurally. What's worse is that even with such a limited node selection, we still can't view anything but plain colors/textures in the viewport. It feels like we are going backwards. The dream of having Blender Eevee or Unreal Lumen performance in the Houdini viewport seems further away than ever, to be honest.
Solaris and Karma » MaterialX limited nodes & roadmap
jason_iversen
Minor quibble: SideFX say Karma CPU will continue to support VEX along with MtlX for the foreseeable future. MtlX is the exclusive path only for XPU.
True; I should probably have stated that we are Redshift-based, so GPU/XPU is where our interests are. One could argue that the idea of MaterialX being universal, as in usable with Karma CPU/XPU, Arnold, Substance, and so on, is what makes it attractive, and I assume any VEX nodes would not translate universally.
I hope you are correct that SideFX may help change things, but seeing as MaterialX has been around for a while now, I'm hesitant to believe the specification will change in the next decade, unless it is combined with perhaps UsdShade or something. Of course, this is mainly my disappointment talking, as I had so much hope for Karma XPU.
Solaris and Karma » MaterialX limited nodes & roadmap
Having experimented with MaterialX a bit, it seems very limited: no radial ramp, facing ratio, wire shader, slope/curve/edges, and no real choice of noises, among other things.
My question becomes: since these cannot be used with VEX nodes, what is the idea going forward with MaterialX? The official documents warn against writing custom C++ shaders and suggest anything built should be made with native MaterialX nodes. That seems not only extremely difficult (if not impossible?), but also a huge time-waster if every studio has to build its own basic nodes.
As it seems MaterialX will be the standard in Houdini's Karma going forward, this would, at least for us, affect whether we choose to use Karma at all, and it would be nice to know what the roadmap is.
Solaris and Karma » H19 undocumented region render
mtucker
This is a parameter on the LOP Network node. The default value does have $HIPNAME as part of the database file path, but that value is supposed to get "baked" into a $HIPNAME-independent value as soon as you add an image to the gallery.
I can see that this is, as you say, a file path on the LOP Network node. However, and not sure if it matters, we are not using LOP networks but rather the default stage. There doesn't seem to be a way to access that same info on the regular stage.
Solaris and Karma » How to extract lights to a single usd file?
If you want only the lights, and absolutely nothing else, you can also use the Graft Branches node with only the right-side input, select the lights group, and write that out.
Another way, in 19, is to use Restructure Scenegraph to get what you want.
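If the node-based routes feel fiddly, a Python LOP can do the same isolation by prim type. A minimal sketch: the helper is mine and works on (path, typeName) pairs so it is testable outside Houdini, and the stage hookup appears only in the comments:

```python
# Common UsdLux light prim type names; extend as needed (e.g. GeometryLight).
LIGHT_TYPES = {"DistantLight", "DomeLight", "RectLight",
               "SphereLight", "DiskLight", "CylinderLight"}

def light_paths(prim_infos):
    """prim_infos: iterable of (path, typeName) pairs; return paths of lights."""
    return [path for path, type_name in prim_infos
            if type_name in LIGHT_TYPES]

# In a Python LOP you would gather the pairs from the stage, e.g.:
#   node = hou.pwd()
#   infos = [(str(p.GetPath()), str(p.GetTypeName()))
#            for p in node.editableStage().Traverse()]
#   lights = light_paths(infos)
```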
Solaris and Karma » H19 undocumented region render
mtucker, quoting AndreasWeidman:
Sadly, the render gallery images are only saved per file, so versioning up in our pipe means losing all previous images, making it unusable for anything for us.
This should not be the case... The same set of render gallery images can be shared across many hip files as long as they all have the same parameter value pointing to the render gallery database. This is a parameter on the LOP Network node. The default value does have $HIPNAME as part of the database file path, but that value is supposed to get "baked" into a $HIPNAME-independent value as soon as you add an image to the gallery. If this is not the behavior you're seeing, please submit a bug report. Thanks!
Will do, that is not what we are experiencing. Thanks for pointing that out!
Solaris and Karma » H19 undocumented region render
Not sure which render view you are referring to: the Solaris Hydra render, or the /obj render?
Perhaps we need to see Hydra and the render gallery as two separate renderers, just like the old viewport IPR and the Mantra render view.
The render gallery would be the place for A/B comparison and background rendering, and should of course have a good region render and focus render. Sadly, the render gallery images are only saved per file, so versioning up in our pipe means losing all previous images, making it unusable for us.
The Hydra render view is great as is. Sure, some extra toggles would be nice, but we have our own custom viewport render settings anyway. All we really need is for the existing pan & scan region render there to work without needing to switch to pan & scan first, and for the rest of the viewport not to get cleared.
Solaris and Karma » H19 undocumented region render
While click-to-focus is great for the renderer when not in region-render mode (although that would be a Hydra decision, not Houdini-specific, I think), one big purpose of the region render is to have your reference in the viewer (or another render to compare) and work your lookdev/lighting to match the reference by keeping the render region in a specific spot. You wouldn't want it to move or expand.
The other use case is of course when it's a heavy render and you just need to dial something small in.
Solaris and Karma » H19 undocumented region render
jason_iversen, quoting AndreasWeidman:
I see what you mean, but I’m thinking there’s nothing stopping you from going into pan&scan anyways if the region render was outside of it.
How would you see pan & scan working in 3D? It's conceivable to create a 2D zoom effect in 3D with windows and offsets, etc., but that would cause a re-render, so a region zoomed that way would still render every pixel rather than just being a 2D scaling of the resulting render. 2D scaling of renders is useful to keep you from squinting at your screen when working at your target resolution, and I don't see how you could formulate a consistent UX to make that happen in the 3D navigation mode.
I'm thinking that a region-render icon outside of pan & scan would behave the same as in pan & scan mode; only if you want to adjust the pan & scan do you switch to that mode. The region render would render whatever is on the screen, zoomed or not, like it always does. The only difference is that you don't have to jump through pan & scan hoops for a quick region render.
Solaris and Karma » H19 undocumented region render
I see what you mean, but I’m thinking there’s nothing stopping you from going into pan&scan anyways if the region render was outside of it.
I think the two biggest issues are that it takes multiple secret clicks to get there, so it's not very intuitive, and that it doesn't reset with the usual reset button. Frankly, I ended up screwing up my viewport multiple times trying it out, with only a restart to fix it. Also, you can't see the geo outside the region; even Clarisse solved that with a wire screen grab.
Somehow I missed this gem during the beta, but I'd say this feature is more in an alpha state as is.
I love Houdini, but dear lord is there some half-baked stuff going on with these things.
The image view (number 5), for viewing your image, is still a mess and not usable for anything. The render gallery is a completely different way of viewing and rendering the image, and the viewport is yet another.
And the red handles around the image when selecting the camera still don't do anything (they would have been great as a region render... hint hint). And lastly, there's the mess of trying to get functional overscan, with the data window settings and the expand/crop settings being weird a.f.
Why not remove these things until they are actually usable? Because once (if) they become functional in, say, H28, most people won't notice; it's not like it's a headline feature, hehe.
I'd be happy with a release with zero big features and just some much-needed polish.
Edited by AndreasWeidman - Dec. 3, 2021 19:19:22
Solaris and Karma » H19 undocumented region render
raincole
There is already a pretty good render region UI in /obj... I really don't understand what's wrong with it. Why not make the /stage render region work like that? Just the exact same UI, except the dropdown menu chooses from delegates instead of ROPs.
I'd be happy if they just added that icon from /obj for the render region they've just built. No need for the /obj toolbar or options. Seems like the fastest solution.
Solaris and Karma » H19 undocumented region render
It also doesn't say so in the viewport HUD instructions.
While I want to throw massive amounts of digital hugs at the developer doing this, I would strongly suggest not making it part of the 2D pan & scan. It is not user-friendly or easy to keep having to jump between camera tumble and pan & scan, and it doesn't respect the reset function.
A better way would be giving it its own icon so it can be used quickly: just an on/off toggle, like the appreciated pause button that was just added. Under the magnifier/loupe icon in the left-side toolbar would make sense to me.
Why it doesn't fully support Redshift is also a problem. To me, a render region is basically a temporary camera crop, so I find it hard to understand why the implementation would not work across all renderers.
Last thing: when using the render region, it is important to be able to see the Houdini GL viewport for anything outside of the region; it gets very disorienting trying to find the region you want otherwise.
Solaris and Karma » Parent constrain with handle. Getting usd transform
Hey, I'm working on an HDA where I'm parent-constraining an object (A) to another (B).
In the HDA I want to add transform handles that match the constrained position, but when adding the handles from A, I realize only the object is constrained while the handles stay at the origin.
Since B is a USD animation being loaded, I have no way of using it as a basis for the handles. And my brain exploded when trying to get the transform values from the A/B USD objects with Python, since all I get is a matrix, with no native node to apply the matrix with.
I can probably work out how to apply the matrix to an xform with Python or VEX and use that as handles, but I'm thinking there might be an easier way to get translate values?
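On the matrix question: you may not need a node to apply it just to read translates. A hedged sketch, relying on the row-vector convention that both Houdini and USD matrices use (the helper is mine; the API names in the comments are the standard ones):

```python
# In a 4x4 world transform stored in row-vector convention (hou.Matrix4,
# Gf.Matrix4d), the translation is simply the first three entries of the
# fourth row.

def extract_translate(m4):
    """m4: 4x4 matrix as nested sequences, row-vector convention -> (tx, ty, tz)."""
    return tuple(m4[3][:3])

# With the native APIs you can skip the helper: hou.Matrix4 has
# extractTranslates(), and a Gf.Matrix4d (e.g. from
# UsdGeom.XformCache(time).GetLocalToWorldTransform(prim)) has
# ExtractTranslation(). Either result could drive the HDA's handle parms.
```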
Solaris and Karma » H19 undocumented region render
At the top of many people's wish lists is a region render for Hydra. Technically you can kind of do it by switching to the image view instead of the perspective view, but that view is not really functional as far as I can tell.
However, while playing around with the camera pan & scan, I noticed that shift + left mouse button will let you draw a region for rendering. It works well in Karma, and sometimes even in Redshift. Buggy, a bit dangerous to use, and not super user-friendly, but it is there. Does this mean there is hope in the near future of getting a region render for the viewport? That some wonderful SideFX dev is testing the waters here? (Please say yes.)