Trigger Python SOPs

Member
385 posts
Joined: July 2018
I would like to trigger a file cache and some Python SOPs in a set order, independent of the timeline.

The file cache, I guess, is easy? A pressButton() call inside another Python SOP can press the "Save to Disk" button. But the Python SOPs have to wait until the file cache is done.

After that, the first Python SOP needs to be evaluated every frame.

The second Python SOP needs to calculate once, after the first one is done.

How can I automate this so that I just press one button and everything cooks in the correct order?

The way I am doing it now is manual:

1) saving the cache,

2) then enabling the display flag on the first Python node and pressing play on the timeline,

3) when that is finished, enabling the display flag on the second Python node.

Not practical at all.

Any tips to automate this would be greatly appreciated, avoiding TOPs if possible.

Attachments:
trigger_pythonSOPs.hiplc (267.0 KB)

Member
253 posts
Joined: July 2013
The quick and dirty solution is to add a comment line in the Python source and use backticks to inject the current frame into it as an expression. That will trigger a recook every frame, because Houdini thinks the source has changed.

Same concept as this, where transforming the camera triggers a Python recook: https://www.sidefx.com/forum/topic/90339/?page=1#post-392629 [www.sidefx.com]
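For example, a line like this at the top of the Python SOP's source (the backticks are evaluated as an expression before the code runs, so the source text changes every frame and forces a recook):

```
# force a recook every frame: `$F`
```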
More code, less clicks.
Member
8554 posts
Joined: July 2007
You should not use a Python SOP at all if you don't intend to modify the current geometry at the current frame. If you just want to execute unrelated, generic Python code, it's not worth tying that execution to node cooking,

especially because you currently seem to rely on manually playing back the timeline in the correct order.
Just use a normal Python script from a shelf tool, an HDA module, or even exec from a string parm to execute all your steps: the cache pressButton(), looping over frames, exporting files, etc.

Why are you trying to avoid TOPs? It can do a lot of heavy lifting for you.
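A minimal sketch of that shelf-script idea, with the Houdini-specific calls stubbed out (the node path in the comment is hypothetical, and eval_at_frame stands in for sampling with the ...AtFrame methods):

```python
# One-button driver sketch. Inside Houdini you would first trigger the
# cache synchronously, e.g.:
#   hou.parm('/obj/geo1/filecache1/execute').pressButton()  # hypothetical path
# pressButton() cooks the ROP before returning, so the loop below only
# starts once the cache is on disk.

def export_per_frame(path, frames, eval_at_frame):
    """Write one comma-separated line per frame, in frame order.

    eval_at_frame(frame) stands in for sampling parms/geometry at a
    given frame, e.g. someParm.evalAsFloatAtFrame(frame) in a session.
    """
    with open(path, 'w') as f:
        for frame in frames:
            values = eval_at_frame(frame)
            f.write(','.join(str(v) for v in values) + '\n')
```

Because the whole sequence lives in one script, nothing depends on display flags or on pressing play on the timeline.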
Edited by tamte - Feb. 6, 2024 00:24:41
Tomas Slancik
FX Supervisor
Method Studios, NY
Member
253 posts
Joined: July 2013
Yeah, what Tomas says. If you just want to run a series of actions, the node graph is not the place for that.

Besides the mentioned shelf tools and HDAs, it's also possible to make your own Python panels, where you can use the power of Qt for fancy UIs if required.

And another thing: if you need to sample values at certain times, it's better to use the "...AtFrame" version of the relevant methods, for example someParm.evalAsFloatAtFrame(frame).
More code, less clicks.
Member
385 posts
Joined: July 2018
tamte
why are you trying to avoid TOPs? it can do a lot of heavy lifting for you
I was just trying to keep everything in one place visually, but I'm also not sure how to transfer my code to a Python Script TOP. The current script appends a new line of position data to a text file every frame. If I used TOPs, this would be calculated in parallel, so how do I ensure the order of the frames (lines) is correct? That is my first question. And then, accessing SOP attributes from TOPs has always been a tricky thing in my mind; didn't I have to read the geo from disk to do that?
If I used TOPs, which to be honest does seem like the reasonable thing to do, then I suppose I won't have to loop over frames in my Python code, which is a plus.
Do you have an example of how to read geo attributes in a Python Script TOP?
Member
385 posts
Joined: July 2018
Jonathan de Blok
And another thing, if you need to sample values at certain times it's better to use the "..AtFrame" version of certain methods, for example the: "someParm.evalAsFloatAtFrame(frame)"

So using "someParm.evalAsFloatAtFrame(frame)", is it possible to loop over frames without actually having to hit play on the timeline?

for frame in range(0, 240):
    value = parm.evalAsFloatAtFrame(frame)
Member
385 posts
Joined: July 2018
So using TOPs I get the problem I mentioned:
"If I used TOPs this would be calculated in parallel, so how do I ensure the order of the frames (lines) is correct?"

As you can see if you run the PDG network, it exports a txt file, and the txt file with the pos data is then read back in SOPs. The Python Script did not add the pos data in the correct order. How can I force it to go frame by frame?
Edited by papsphilip - Feb. 6, 2024 14:07:54

Attachments:
trigger_pythonSOPs.hiplc (268.8 KB)

Member
8554 posts
Joined: July 2007
papsphilip
as you can see if you run the pdg which will export a txt file and then read back the txt file with the pos data in SOPs. python script did not add the pos data in the correct order. How can i force it to go frame by frame?
Can't check right now, but:
- you should be able to specify a Batch Size on the node so that all the work items within a batch are executed on a single machine, in order
- or use Wait For All to create a single partition with all work items, and then use your code to go over the work items within the partition and write out your file
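A sketch of the second option with PDG itself stubbed out: the single downstream script sees every upstream item (thanks to Wait For All) and sorts by a frame attribute before writing, so the parallel cook order no longer matters. In a real Python Script TOP the items would come from work_item.inputFiles or the PDG work item API; the dicts here are stand-ins:

```python
# Each dict stands in for a PDG work item carrying a 'frame' number and
# the per-frame 'positions' data gathered upstream.

def write_partition(path, items):
    """Write one line per upstream item, ordered by frame."""
    with open(path, 'w') as f:
        # sorting by frame makes the (parallel) cook order irrelevant
        for item in sorted(items, key=lambda it: it['frame']):
            f.write(','.join(str(p) for p in item['positions']) + '\n')
```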
Tomas Slancik
FX Supervisor
Method Studios, NY
Member
385 posts
Joined: July 2018
tamte
- you should be able to specify a Batch Size on the node so that all the work items within a batch are executed on a single machine, in order
- or use Wait For All to create a single partition with all work items, and then use your code to go over the work items within the partition and write out your file

The first option did not work. I set up a custom scheduler with one slot, and also set the job parms scheduling to single with 1 slot. It still writes my data to the text file in the wrong order, but I can see the attribute array has the correct order, and if I cook individual work items from the Python Script they are correct. Something happens when I try to cook the whole thing.

The second option I am not sure how to do. I tried Wait For All, but I'm not sure how to handle my attribute array, since the point count will be different on every frame. It also seems to slow down a lot.

This is my PDG network so far:

Attachments:
1.JPG (53.2 KB)
2.JPG (50.7 KB)
3.JPG (47.0 KB)
4.JPG (88.4 KB)
5.JPG (70.5 KB)
6.JPG (70.4 KB)

Member
8554 posts
Joined: July 2007
papsphilip
The first option did not work. I set up a custom scheduler with one slot, and also set the job parms scheduling to single with 1 slot. It still writes my data to the text file in the wrong order, but I can see the attribute array has the correct order, and if I cook individual work items from the Python Script they are correct. Something happens when I try to cook the whole thing.
I just meant the Batch parameters on the node, nothing like custom schedulers etc.

papsphilip
The second option I am not sure how to do. I tried Wait For All, but I'm not sure how to handle my attribute array, since the point count will be different on every frame. It also seems to slow down a lot.
Wait For All just bundles all the work items into one array,
so you can then iterate over them in a loop and run your per-work-item code.
It should be no different from the frame iteration you were doing in SOPs; I also can't imagine it'd be slower.

EDIT: ah, I see you are creating a dynamic graph per file and importing each point as a work item; that sounds really inefficient.
I'd just have a Wait For All right after the ROP Fetch and let your code get the points per geo and do all the stuff you were doing in SOPs.
Edited by tamte - Feb. 7, 2024 10:12:17
Tomas Slancik
FX Supervisor
Method Studios, NY
Member
8554 posts
Joined: July 2007
Here is your file with the modified TOP net, but essentially it is:

ROP Fetch
Wait For All
Python Script

import hou

# evaluate this node's 'filepath' parameter for the current work item
filepath = self['filepath'].evaluateString(work_item)

with open(filepath, 'w') as f:
    # one input geometry file per upstream work item (one per frame)
    geo_files = work_item.inputFiles
    for geo_file in geo_files:
        geo = hou.Geometry()
        geo.loadFromFile(geo_file.path)

        # flat list of P components: [x0, y0, z0, x1, y1, z1, ...]
        positions = geo.pointFloatAttribValues("P")
        line = ",".join(str(p) for p in positions)
        f.write(line)
        f.write('\n')
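One caveat with the snippet above: it assumes work_item.inputFiles arrives in frame order. If that ever isn't guaranteed, sorting the paths by the frame number embedded in the cache filename is cheap insurance. A sketch of such a key function (the filename pattern is an assumption; adjust the regex to your cache naming):

```python
import re

def frame_key(path):
    """Extract the last run of digits before the extension,
    e.g. 'points.0042.geo' -> 42, so paths sort in frame order."""
    m = re.search(r'(\d+)\.\w+$', path)
    return int(m.group(1)) if m else -1

# usage sketch, inside the loop above:
#   for geo_file in sorted(geo_files, key=lambda gf: frame_key(gf.path)):
#       ...
```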

Attachments:
trigger_pythonSOPs_fix.hipnc (365.7 KB)

Tomas Slancik
FX Supervisor
Method Studios, NY
Member
253 posts
Joined: July 2013
btw, for simple one-after-the-other execution you're best off using the In-Process Scheduler. It just runs things in the current Houdini session, even on the main thread if you want, and it's the safest option when you're using TOPs as a glorified for-loop.
More code, less clicks.
Member
8554 posts
Joined: July 2007
Jonathan de Blok
btw, for simple one-after-the-other execution you're best off using the In-Process Scheduler. It just runs things in the current Houdini session, even on the main thread if you want, and it's the safest option when you're using TOPs as a glorified for-loop.
It may be more flexible to do this per node, since I assume you still want the file cache to run in parallel if you have a farm; but then yes, the Python Scripts can be set to Cook (In-Process) like in the example.
Edited by tamte - Feb. 7, 2024 15:36:28
Tomas Slancik
FX Supervisor
Method Studios, NY
Member
385 posts
Joined: July 2018
I finally had some time to go through this problem again. Following what you said, I had to create a separate In-Process Scheduler for the Python node, while having the Python node itself set to Cook (In-Process).
This is working now, and the file cache has a different scheduler and remains parallel. I will share my example file; let me know if this is the recommended workflow for this type of thing.
Edited by papsphilip - March 8, 2024 10:50:23

Attachments:
trigger_pythonSOPs_fix.hipnc (254.1 KB)

Member
385 posts
Joined: July 2018
Trying to continue my example case: the next step would be, after I save out my txt file, to read it back, do something in SOPs, and then export a new txt using Python again.
Do I have to cache the geometry from SOPs like the first time?
Here is the example, attached.

Attachments:
trigger_pythonSOPs_fix2.hipnc (357.3 KB)
Capture.JPG (116.1 KB)

Member
385 posts
Joined: July 2018
tamte
here is your file with modified TOP net
but essentially it is

ROP Fetch
Wait For All
Python Script

import hou

filepath = self['filepath'].evaluateString(work_item)

with open(filepath, 'w') as f:
    geo_files = work_item.inputFiles
    for geo_file in geo_files:
        geo = hou.Geometry()
        geo.loadFromFile( geo_file.path )
        
        positions = geo.pointFloatAttribValues("P")
        line = ",".join( [ str(p) for p in positions ] )
        f.write( line )
        f.write( '\n' )

Your code has helped me a lot in understanding how to access point attribs. How would you go about grabbing a detail attrib as well in this example? The detail attrib does not change per work item, though, so the for loop is probably not needed?
Member
8554 posts
Joined: July 2007
papsphilip
How would you go about grabbing a detail attrib as well in this example? The detail attrib does not change per work item, though, so the for loop is probably not needed?

myDetailAttrib = geo.attribValue("myDetailAttrib")

Or there are also some more specific ones: https://www.sidefx.com/docs/houdini/hom/hou/Geometry.html [www.sidefx.com]
The only for loop that's not needed is the list comprehension, unless the detail attrib value is an array or dict and you want to process it further. You still need the for loop over geo_files.

You can also probably just use a Geometry Import TOP if the only thing you are loading is detail attribs per file; then you will get them as work item attributes.
Edited by tamte - April 2, 2024 15:57:21
Tomas Slancik
FX Supervisor
Method Studios, NY