Hi Mark,
Thanks for your reply.
It seems that writing out the layer USD files is necessary to keep the structure intact.
Is there a way to override the layer save path syntax for everything in a hip file? It seems to default to $HIP/primPath/OUT.USD
Solaris and Karma » Layer save path / flatten layers
- jacobvfx
- 15 posts
- Offline
Hi,
We occasionally run into issues where Solaris decides to write out parts of the graph as their own USD files. It could be a subnet containing multiple SOP Create nodes or a range of light nodes.
Is there a way to always flatten the layers created in the graph and stop Solaris from writing out temp USD files for different parts of it? And what is the logic that decides when everything is included in the main USD we write out and when the layers get split out?
We have the main USD ROP node set to "Flatten Implicit Layers" and we don't have any Layer Save Paths defined anywhere in the graph.
Solaris and Karma » Write mute state of layers to USD output
- jacobvfx
- 15 posts
- Offline
So there's no non-destructive (non-flattening) way to disable/mute a layer in the Solaris graph and have that propagate to the resulting USD file that gets passed to HUSK?
I guess a workflow could be to save the state of each USD layer as metadata within the Stage and use a pre-render script within HUSK to mute layers based on that.
Solaris and Karma » Write mute state of layers to USD output
- jacobvfx
- 15 posts
- Offline
Hi,
Is there a way to keep the mute state of a USD layer when writing out a stage to file?
Let's say you sublayer in two USD files (fileA.usd and fileB.usd).
In the graph, you mute fileB.usd using the Configure Stage lop node.
Then you write out this composed stage to a new file myStage.usd.
When loading myStage.usd, both fileA.usd and fileB.usd are active, so the muting of fileB.usd doesn't seem to have been preserved.
Solaris and Karma » Select primpaths in the Scene Graph Tree with Python
- jacobvfx
- 15 posts
- Offline
Hi,
Is it possible to select and expand prim paths in the Scene Graph Tree with Python?
Something like:
selectPrimPaths(['/world/animation/ship', '/world/fx/blah'])
Edited by jacobvfx - July 1, 2022 06:28:50
Solaris and Karma » HUSK and custom ArResolver
- jacobvfx
- 15 posts
- Offline
Yes, that's the behavior I'm getting when opening the file with the USD API or in Solaris.
Solaris and Karma » HUSK and custom ArResolver
- jacobvfx
- 15 posts
- Offline
Hi,
We are running into issues with rendering through HUSK and using our custom ArResolver plugin.
It seems like HUSK doesn't use the resolver and the CreateDefaultContextForAsset() method doesn't get called at the render step.
The strange thing is that if we add something like this to the preRenderScript:
# Re-open stage to trigger USDResolver
stage: Usd.Stage  # set by husk before this python file is run
stageFilePath = stage.GetRootLayer().realPath
stage = Usd.Stage.Open(stageFilePath)
stage.Reload()
We copy the env from the Houdini session where everything is working correctly.
This is the command line we are using:
hfs19.0.498/bin/husk -f 1048 --renderer Karma --usd-input ./file.usd -o ./file.exr --make-output-path
Are there any additional arguments that need to be passed to HUSK?
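Roughly what that copied env needs to contain for USD to discover the resolver (the plugin path below is a placeholder; USD finds ArResolver plugins through its standard plugin search path rather than through a husk argument, so the variables have to survive into the shell that launches husk):

```shell
# Make the resolver plugin discoverable: point USD's plugin system at the
# directory containing the resolver's plugInfo.json (placeholder path).
export PXR_PLUGINPATH_NAME=/path/to/ourResolver/resources:$PXR_PLUGINPATH_NAME

# Optional: log resolver discovery so we can see whether the resolver
# is actually found at render time.
export TF_DEBUG='AR_RESOLVER_INIT'

hfs19.0.498/bin/husk -f 1048 --renderer Karma --usd-input ./file.usd -o ./file.exr --make-output-path
```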
Edited by jacobvfx - June 2, 2022 07:09:17
Solaris and Karma » Flush USD StageCache
- jacobvfx
- 15 posts
- Offline
Let's continue the conversation here.
The issue I'm running into is when I'm switching between different shots that are sharing the same underlying layers.
Let's say I have shotA.usda that's sublayering "sharedLayout_v001.usda". The resolve context for shotA tells it to resolve that layer up to v003.
Then I switch to shotB.usda that's also sublayering that same "sharedLayout_v001.usda" file. In the resolve context for shotB it should resolve the layer to v002. But since the layer is already loaded it doesn't get re-resolved.
What I'm looking for is a way to flush/clear the StageCache for all loaded layers when I'm switching context/shot to make sure I get the correct result.
Edited by jacobvfx - May 17, 2022 15:58:58
Solaris and Karma » Flush USD StageCache
- jacobvfx
- 15 posts
- Offline
I tried the UsdUtils.StageCache API, but that doesn't seem connected to the layers loaded in Solaris.
Solaris and Karma » Flush USD StageCache
- jacobvfx
- 15 posts
- Offline
Hi,
We are running into issues when switching between different shot contexts that share USD layers, where we want to resolve to a different version of a layer in each context.
Is there a way to release/flush the USD stage cache in Solaris through Python? Similar to how Maya flushes all caches when closing a scene.
Edited by jacobvfx - May 16, 2022 11:42:59
Solaris and Karma » HUSK: Rendering with Houdini GL
- jacobvfx
- 15 posts
- Offline
Okay, thanks for the clarification.
The main reason I want to use Houdini GL is that I have a hard time getting HUSK with Karma to render fast enough for quick previews of our stages.
A simple 100-frame 640x480 render of a rotating rubber toy takes over 3 minutes on my workstation, even with the lowest-quality render settings I could come up with.
USD's usdrecord takes 4 seconds for the same sequence.
Am I doing anything wrong or is there just a big overhead in rendering through HUSK?
I have attached the hip file:
https://drive.google.com/file/d/1lr1X7z_GImt62DQvLCccUuuqjRomIoI2/view?usp=sharing
Edited by jacobvfx - Nov. 16, 2021 05:25:31
Solaris and Karma » HUSK: Rendering with Houdini GL
- jacobvfx
- 15 posts
- Offline
Hi,
I'm trying to render out an image sequence from a USD file using HUSK from a shell.
I can get it to work using the Karma delegate, but not GL or Houdini GL.
I'm using this cmd:
./hfs18.5.696/bin/husk --renderer "Houdini GL" --usd-input ./animTest.usd -f 1 -n 10 -o /tmp/husk.%06d.exr -r 1920 1080
Which gives me this error back:
[14:35:18] Hydra plugin HD_HoudiniRendererPlugin is not supported
[14:35:18] Unable to load render plugin: Houdini GL
I also don't see Houdini GL in the list of renderers on the USDRender lop node in Solaris.
Solaris and Karma » Reference filepath - linked channel
- jacobvfx
- 15 posts
- Offline
Hi,
We are working on developing a multi-shot workflow in Solaris and have run into an issue.
We have created a HDA that wraps a reference lop node to switch between files, based on which context the artist is in.
The context is defined using the Context Options Variables.
The reference node inside the subnet has the filepath attribute linked to an attribute on the HDA node.
The setup works, but feels flaky. Switching between different contexts sometimes doesn't update correctly in the viewport. Sometimes the camera selection in the top right corner breaks, and has to be selected again.
Is there a more robust way of handling filepath switching in Solaris? Would it be better to handle it with an ArResolver?
Best,
Jacob
Edited by jacobvfx - Oct. 9, 2020 12:50:43