PDG Work Items generating locally, not remotely

Member
51 posts
Joined: April 2011
I am trying to build a pipeline tool for use on a remote farm, but am having some issues. I know the code I am using has generally worked well both locally and on the farm in the past, but now the work items aren't generated remotely.

Here is the code which is used to process the remote scene. It is in the preframe script of a ROP. It has worked flawlessly in the past.
import hou
import pdg

node = hou.node('/obj/ropnet1/topnet1/Partition')
print(node)

# Dirty every node in the graph so work items regenerate from scratch.
context = node.getPDGGraphContext()
for graph_node in context.graph.nodes():
    graph_node.dirty(True)

# Generate work items without cooking them.
node.cookWorkItems(block=True, generate_only=True)
print([n.workItems for n in context.graph.nodes()])

pdgNode = node.getPDGNode()
print(pdgNode.workItems)
print('Starting Renders')

# Cook only the work items whose index matches the current frame.
pdgNode.context.cookItems(
    True,
    [wi.id for wi in pdgNode.workItems if wi.index == hou.frame()],
    pdgNode.name,
)
print('PDG Network Cooked')

And here is the expected output. The last few lines are from a Python node inside an HDA built for TOPs.
12:20:48: localscheduler: Local Scheduler: Max Slots=7, Working Dir=D:/workingDir
[[<GenericData name='Convert_genericgenerator1_850', id=850 at 0x0000000007b86b00>], [<GenericData name='Convert_switch1_852', id=852 at 0x0000000007b86080>], [<GenericData name='Partition_854', id=854 at 0x0000000007b87580>], [<GenericData name='Convert_attributecreate1_851', id=851 at 0x0000000007b84b80>], [<GenericData name='Convert_pythonprocessor1_853', id=853 at 0x0000000007b84100>], []]
[<GenericData name='Partition_854', id=854 at 0x0000000007b87580>]
Starting Renders
12:20:58: localscheduler: Local Scheduler: Max Slots=7, Working Dir=D:/workingDir
C:/PROGRA~1/SIDEEF~1/HOUDIN~1.561/bin/iconvert.exe
D:/workingDir/test00001.rat
Converting D:/workingDir/test00001.png to .rat format
PDG Network Cooked

And here is what I get on the remote server:
16:33:48: localscheduler: Local Scheduler: Max Slots=15, Working Dir=/data/input/workingDir/
[[], [], [], [], []]
[]
Starting Renders
16:33:58: localscheduler: Local Scheduler: Max Slots=15, Working Dir=/data/input/workingDir
PDG Network Cooked
Edited by Adam F - Aug. 9, 2022 12:49:36
Staff
585 posts
Joined: May 2014
If you're using the same graph and work items aren't being generated, it sounds like one of the nodes in your graph has errors. If you're cooking via a script in a headless session, the easiest way to check for that is to set HOUDINI_PDG_NODE_DEBUG=2 in your process environment, which enables printouts for any node errors/warnings, as well as node cook status messages. For example, in a simple graph I created with an expression error in the first node:

[13:07:12] PDG: STATUS NODE ERROR (genericgenerator1)
Unable to evaluate expression (Expression stack error (/obj/topnet1/genericgenerator1/itemcount)).

[13:07:12] PDG: STATUS NODE GENERATED (genericgenerator1)
[13:07:12] PDG: STATUS NODE COOKED (genericgenerator1)


Without seeing your .hip file, it's hard to provide any additional suggestions.
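From a Python-based farm submitter, the variable can be injected into the child process's environment before launch. The sketch below is an assumption about how such a wrapper might look; only the HOUDINI_PDG_NODE_DEBUG variable itself comes from the post above, and the `hython` command and script name in the comment are placeholders:

```python
import os
import subprocess

def build_pdg_debug_env():
    """Return a copy of the current environment with PDG node
    debugging enabled. The variable must be present before the
    Houdini process starts, so it has to go into the child's
    environment rather than be set inside the session."""
    env = dict(os.environ)
    env["HOUDINI_PDG_NODE_DEBUG"] = "2"  # print node errors/warnings + cook status
    return env

env = build_pdg_debug_env()
print(env["HOUDINI_PDG_NODE_DEBUG"])  # "2"

# Hypothetical farm launch -- command and script name are placeholders:
# subprocess.run(["hython", "cook_tops.py"], env=env)
```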
Edited by tpetrick - Aug. 9, 2022 13:10:53
Member
51 posts
Joined: April 2011
tpetrick
If you're using the same graph and work items aren't being generated, it sounds like one of the nodes in your graph has errors. If you're cooking via a script in a headless session, the easiest way to check for that is to set HOUDINI_PDG_NODE_DEBUG=2 in your process environment, which enables printouts for any node errors/warnings, as well as node cook status messages. For example, in a simple graph I created with an expression error in the first node:

[13:07:12] PDG: STATUS NODE ERROR (genericgenerator1)
Unable to evaluate expression (Expression stack error (/obj/topnet1/genericgenerator1/itemcount)).

[13:07:12] PDG: STATUS NODE GENERATED (genericgenerator1)
[13:07:12] PDG: STATUS NODE COOKED (genericgenerator1)


Without seeing your .hip file, it's hard to provide any additional suggestions.

Honestly, this is exactly what I have been trying to figure out how to do. Is this documented anywhere? I had figured out months ago how to get an actual text file of all of the processes to dump out, but haven't been able to find the docs since.

As for the file, unfortunately it is under NDA, so I can only share sanitized messages and code.
Staff
585 posts
Joined: May 2014
Yep, it's documented in the list of env vars: https://www.sidefx.com/docs/houdini/ref/env#houdini_pdg_node_debug

There are a number of PDG-specific debug switches that can be enabled -- they should all be described on that page as well.
Member
51 posts
Joined: April 2011
tpetrick
Yep, it's documented in the list of env vars: https://www.sidefx.com/docs/houdini/ref/env#houdini_pdg_node_debug

There are a number of PDG-specific debug switches that can be enabled -- they should all be described on that page as well.

Ok, so I have turned on several of the debug flags in my environment locally and cannot figure out where the logs are supposed to be going. I should be getting inundated with values somewhere, but I am not getting any log files and my python console is not getting anything else. I have Verbose Logging checked on the scheduler, but nothing seems to be getting output.

Attachments:
Screenshot 2022-08-09 141236.png (4.4 KB)

Staff
585 posts
Joined: May 2014
Those logs are printed to the standard output of the shell that launched the Houdini process, not to a file on disk. They also need to be set in the environment when the Houdini process starts up.
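Since the messages go to standard output, a farm wrapper can still get them onto disk by redirecting the child process's stdout/stderr into a file. This is a sketch under assumptions: the helper and log-file names are hypothetical, and it is demonstrated with a plain Python child process standing in for the hython/Houdini invocation:

```python
import os
import subprocess
import sys
from pathlib import Path

def run_and_log(cmd, env, log_path):
    """Run a command with the given environment and capture its
    combined stdout/stderr into a log file on disk."""
    with open(log_path, "w") as log:
        return subprocess.run(cmd, env=env, stdout=log, stderr=subprocess.STDOUT)

# Enable PDG node debugging in the child's environment, then run a
# stand-in child process; on the farm the command would be the
# hython/Houdini invocation instead.
env = dict(os.environ)
env["HOUDINI_PDG_NODE_DEBUG"] = "2"
log_file = Path("pdg_debug.log")
run_and_log([sys.executable, "-c", "print('hello from child')"], env, log_file)
print(log_file.read_text().strip())  # hello from child
```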
Edited by tpetrick - Aug. 9, 2022 14:34:00
Member
51 posts
Joined: April 2011
tpetrick
Those logs are printed to the standard output of the shell that launched the Houdini process, not to a file on disk. They also need to be set in the environment when the Houdini process starts up.
Thanks. Those logs showed up locally, but still did not show up on the farm. I know I found a way to dump logs to a file in the past and had it work. Definitely one of those settings that I found once and haven't been able to find since.