We've encountered an issue: if user A has referenced or sublayered a USD file in Solaris from our file server, user B can't overwrite that USD file because it's locked. As you can imagine, this makes a collaborative USD-based pipeline problematic. If someone doing lighting is referencing a layout USD, which in turn references a bunch of model assets, the only way to update an asset is for everyone in the whole pipeline to close Houdini so that asset file can be unlocked and overwritten.
From my understanding, the problem is that the Pixar code keeps referenced files open, and SMB security doesn't allow open files to be altered in any way. There is another thread where this problem came up, and it sounds like some people may have worked around it by switching their file sharing from SMB to NFS. I tried this and it did seem to eliminate the file-locking issue, but it introduced a host of other problems. I saw slower performance and lags of several seconds when accessing NFS shares, and I was getting a lot of crashing in Houdini, which I suspect was caused by that lag, although I can't be sure. The crashing did go away when I switched back to SMB, though.
I have two questions for the forum:
1 - What is the current thinking on this problem? Do we know if Pixar is aware of it and working on a solution?
2 - Has anyone configured their pipeline to work around this problem? Maybe by referencing copies of USD files, so as not to lock the originals, or something more clever?
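To illustrate what I mean by referencing copies: here is a minimal sketch of a copy-before-reference helper that copies each referenced USD into a local cache and hands back the cached path, so the original on the SMB share is never held open by a running session. Everything here (the cache location, the `cached_copy` name) is hypothetical, not an existing pipeline tool:

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

# Hypothetical per-user cache location; adjust for your pipeline.
CACHE_ROOT = Path(tempfile.gettempdir()) / "usd_ref_cache"

def cached_copy(source_path):
    """Copy a USD file to a local cache and return the cached path.

    Referencing the cached copy instead of the original means the
    original on the file server is never held open (and so never
    locked) by the session doing the referencing.
    """
    source = Path(source_path)
    # Hash the full source path so files with the same name from
    # different folders don't collide in the cache.
    tag = hashlib.md5(str(source).encode("utf-8")).hexdigest()[:8]
    dest = CACHE_ROOT / tag / source.name
    dest.parent.mkdir(parents=True, exist_ok=True)
    # Re-copy only when the original is newer than the cached copy.
    if not dest.exists() or source.stat().st_mtime > dest.stat().st_mtime:
        shutil.copy2(source, dest)
    return dest
```

The obvious downside is staleness: artists see the version they cached, not necessarily the latest on the server, so you'd need a refresh step (like the mtime check above) whenever a stage is reloaded.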
Here is another thread where people discuss this issue, among others. That thread was getting confusing, so I wanted to start a clean one focused just on the locking of referenced files on SMB shares.
https://www.sidefx.com/forum/topic/70960/?page=1
Thanks,
Brad T.
Solaris and Karma » USD pipelines and file locking
- BradThompson
- 64 posts
- Offline
Technical Discussion » Add VFX to real life camera footage?
This is a question with no single answer. Can it be done entirely in Houdini? Yes, but probably no one would do it that way. The first question is: is your camera static? If so, things are a bit easier. If not, you'll need to track the motion of the camera and get that data into Houdini. The magic projectile effect could be created in Houdini; that kind of thing is its specialty. The effect would be rendered out from the tracked camera, then brought into a compositing program like Nuke, Fusion, or After Effects to be integrated into the film footage. Those are the basics, but depending on the specifics of the original footage and how realistic you want the effect to be, it could be vastly more complex than I've described.
Brad T.
Technical Discussion » Error saving USD to network folder
I was pretty far down the path of moving our studio toward USD when I ran into this problem. It seems to be a complete showstopper for using USD in a collaborative pipeline. As a test, I tried switching our shares over to NFS, as suggested in this thread. That sort of fixed this specific issue but created lots of others: the NFS connections exhibited performance and stability problems, and I think that led to instability and crashing in Houdini as well. I'm not a networking expert, but NFS security seems like it might be lacking too. I'll be switching back to SMB today.
Has there been any further progress or thinking on this?
Brad T.
Solaris and Karma » Husk and Hqueue - Status reporting bug?
I wrote a tool to submit USD/husk render jobs to HQueue. It's basic but mostly functional. One major problem: if a render node experiences an error, the task is still marked as "successful" by HQueue. Does HQueue not understand husk's error reporting, or is there something I need to do in my job submission tool? Attached is a screenshot from HQueue showing a "successful" failure.
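In the meantime, one workaround I'm considering is wrapping husk in a small launcher that propagates the renderer's exit code, on the assumption that HQueue (like most farm managers) marks a task failed based on a nonzero exit status. This is only a sketch under that assumption; `run_render` is a hypothetical helper, not part of HQueue:

```python
import subprocess
import sys

def run_render(cmd):
    """Run a render command and return its exit code.

    If the command (e.g. husk) exits nonzero, pass that code
    through so the farm manager can mark the task as failed
    instead of silently reporting success.
    """
    result = subprocess.run(cmd)
    if result.returncode != 0:
        print(f"Render command failed with exit code {result.returncode}",
              file=sys.stderr)
    return result.returncode

if __name__ == "__main__" and len(sys.argv) > 1:
    # The real job command (the husk invocation) is passed as arguments.
    sys.exit(run_render(sys.argv[1:]))
```

The job submission tool would then point the task at this wrapper instead of calling husk directly.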
Thanks!
Brad T.
Solaris and Karma » Is component builder just for static geometry?
Thanks Goldleaf,
It does make sense and I'll give it a try. Is this the preferred/suggested way of exporting animated assets for USD?
Solaris and Karma » Solaris and hqueue - out of the box
I'm new to USD and Solaris, so I could be wrong, but I think that if you are fetching the USD Render ROP, then the first job is generating USDs. It's very similar to generating IFDs, so it can take a long time.
I made an HDA for sending husk jobs to hqueue. It's super bare-bones. Someone else has probably done the same more elegantly. You still have to generate the USD somehow. Then my tool sends husk commands to hqueue, telling it to render that USD. So far, I've only used it to run a very basic test and I have no idea whether it will work with more complicated scenes or outside of my environment. You are welcome to give it a try. I make no guarantees that it will work.
Solaris and Karma » Is component builder just for static geometry?
I'm still trying to wrap my head around the USD workflow in Houdini. I have an animated L-system tree. It's got changing topology, animated attributes, the works. I'm trying to understand the best way to get this onto disk, then into a USD stage.
Component builder only seems to export a static USD file, even if I enable "Topology Attributes: Animated" and "Author Time Samples if SOP is time dependent".
USD Render ROP seems to work, but is that the suggested way to do it? It seems less elegant than the component builder.
Thanks!
Brad T.
Solaris and Karma » Husk can't find USD file
Solved.
On each render node that was failing, I had to change the login credentials for the HQueueClient service (in Windows Services) to a user with permission to access the network shares. It was working on some nodes and not others because the credentials had been changed in the past on the older nodes but not on the newer ones.
Brad T.
Solaris and Karma » Husk can't find USD file
I'm trying to render a USD file through HQueue with husk. It's working fine on some render nodes, but others are returning this error:
Unable to load USD file '//core2/dfsprojects/Hqueue/projects/KarmaTesting/KarmaTesting/usd/test.usd'
Welcome to the Houdini 19.0.455 command line tools.
The Houdini environment has been initialized.
Starting command prompt...
More confusing: if I launch HMCD.exe on the render node and paste the husk command into it, it renders fine. There is no trouble browsing to the location of the USD file from the render node either.
This is an example of the command:
husk -f 3 //core2/dfsprojects/Hqueue/projects/KarmaTesting/KarmaTesting/usd/test.usd -o //core2/dfsprojects/Hqueue/projects/KarmaTesting/KarmaTesting/render/usdAnim_0003.exr -V a
I understand this isn't much to go on, but does anyone have suggestions on how to troubleshoot this issue?
Thanks!
Solaris and Karma » Solaris material gallery?
What's the process for building a library or gallery of Solaris/Karma/USD materials? For Mantra, we have the material gallery. Saving a MaterialX subnet to the gallery doesn't seem to work for me (even when adding keywords). Is there a Solaris-specific gallery, or a way to use the existing gallery?
Thanks!
Brad T.
Technical Discussion » Houdini 18 Karma Custom Fisheye Lens Shader Revisited
Hi @jlapre,
It's hard to say. When I used vcc to compile the HDA, it would only show up in the /mat context. I'm not sure why; I suspect it's because I don't know how to use vcc. Either way, it didn't work that way.
What worked for me was adding the original OTL that Stuart Levy from NCSA wrote to $HOME/otls/. I don't know how he created that OTL or whether he compiled it. Once the OTL is in that folder, it shows up in /shop after a restart. Then you can use a camera override to point to it in /shop. That worked for me. A few caveats:
- I'm on Windows; I don't have any Macs to test on.
- I haven't tested network rendering yet. It works in mplay, which uses husk, so in theory network rendering should work.
Technical Discussion » Render farm and karma
In H19, I've gotten it to work using the technique in this video: https://www.youtube.com/watch?v=kNrW9SYuxkk . In summary: drop a Fetch node in the /OUT context, point the fetch at the render ROP inside the /STAGE Karma render node, then connect the Fetch node to an HQueue Render node.
This works on machines with full Houdini licenses, but not on the rest of the farm that only has renderer licenses. I believe it's because this setup calls hython to generate the USD file for Husk and that consumes a license. I'm assuming that what we really need is something similar to IFD workflow, where we use a USD render rop on a licensed machine to generate the USD files, then ask husk/hqueue to render the USD directly.
Am I on the right track? Is there a way to do this without having to roll my own submission tool?
Technical Discussion » Houdini 18 Karma Custom Fisheye Lens Shader Revisited
If I put the original OTL file into $HOME/houdini19.0/otls, create that asset in /SHOP and point my usd camera to it, it does appear to be working in mplay and husk now. Thanks for that.
The instructions in the H19 manual that I linked earlier don't seem to work for me, though. If I put the shader code into a text file, compile it with vcc, and put the resulting HDA into $HOME/houdini19.0/otls/lens_shader.hda, the asset can only be created in the /MAT context. Putting that into a /STAGE material library and pointing my USD camera at it doesn't work. I don't know if I'm doing something wrong (likely) or the docs are wrong.
Technical Discussion » Houdini 18 Karma Custom Fisheye Lens Shader Revisited
- BradThompson
- 64 posts
- Offline
I see there are instructions for creating lens shaders for Karma in the H19 docs now: https://www.sidefx.com/docs/houdini/solaris/karma_lens_shader.html
Even following those instructions and compiling the CVEX code with vcc, I get the same errors Max mentioned above when rendering to mplay or with the Karma ROP:
Error loading lens shader: Unable to load shader 'opdef:/Vop/domelens?CVexVflCode'
Error loading lens shader: must have P and I as outputs.
It works in the viewport though. Has anyone gotten dome or VR rendering working with H19 Karma yet?
Houdini Lounge » Setting up a multi-facility, multi-software render farm?
We render too much for cloud rendering to make economic sense. It would be nice to have as an emergency backup though.
Houdini Lounge » Setting up a multi-facility, multi-software render farm?
Has anyone done this? Is it even a good idea? The studio I work at, along with two others, was recently acquired by a larger company. I've been asked to look into whether it makes sense to combine or connect each location's render farms. We primarily use Houdini/Mantra and Fusion. The other studios use 3ds Max, Blender, and Unreal. We have around 20 nodes in our studio; the others are about the same, give or take.
I guess I'm wondering: how complicated is this? Some sort of asset path conversion would have to happen, which probably means customizing submission scripts. Maybe there is a farm manager that makes this easy? Maybe it's better to run multiple managers but allow cross-farm submission? I have so many questions, and I'm not sure where to start investigating.
All recommendations or links to info are appreciated.
Thanks.
Edited by BradThompson - June 2, 2021 08:17:00
Houdini Lounge » Easiest ACES-CG to SRGB Jpeg?
Awesome! Thanks. This is exactly what I was hoping for. The screen snip tip is a good one too.
Thanks again!
Houdini Lounge » Easiest ACES-CG to SRGB Jpeg?
Thanks Ifree, that's what I was hoping for, but can you elaborate? Which LUT converts ACEScg to sRGB 2.2?
Houdini Lounge » Easiest ACES-CG to SRGB Jpeg?
Thanks all,
I'm familiar with the standalone utilities and the OCIO nodes in COPs and MAT. I was hoping for something that doesn't involve extra steps. For example, I can't think of a case where it makes sense to save a .jpg from the render view or mplay in ACES color; JPEGs, being only 8-bit, can't even properly support ACES. I was hoping maybe there was an option or preference I was missing, or perhaps an ACEScg-to-sRGB-2.2 LUT that could be applied on export. If not, maybe this should be an RFE?
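For the curious, the final encoding step of that conversion is just the standard sRGB transfer function. A full ACEScg-to-sRGB conversion also needs the AP1-to-Rec.709 gamut transform and a view transform, which I'd leave to OCIO, but the curve itself is simple. This is only the IEC 61966-2-1 formula as a sketch, not Houdini code:

```python
def linear_to_srgb(x):
    """Encode a linear [0, 1] value with the sRGB transfer function
    (IEC 61966-2-1). Note this is only the final encoding step; a
    full ACEScg -> sRGB conversion also needs a gamut transform and
    tone mapping, which an OCIO config handles."""
    if x <= 0.0031308:
        # Linear segment near black avoids an infinite slope at 0.
        return 12.92 * x
    # Power-law segment for the rest of the range.
    return 1.055 * (x ** (1.0 / 2.4)) - 0.055
```

This is why an 8-bit JPEG saved without the curve applied looks dark and washed out: the linear values get displayed as if they were already encoded.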
Houdini Lounge » Easiest ACES-CG to SRGB Jpeg?
Is there a simple, straightforward way to save an image rendered in Houdini with ACES color as a standard sRGB JPEG, without going through another program like Nuke, Fusion, Affinity, etc.?
I'm trying to find a way for my coworkers to easily share preview stills with each other over Teams or e-mail, or save WIP stills to document progress. Mplay has the option to export an image with a LUT applied, so maybe that's an option if there is an ACES-CG to Output-SRGB or Output-rec709 LUT. I'm just trying to minimize the steps they need to go through to share quick previews.
Thanks!