Found 65 posts.
PDG/TOPs » Something like object merge for TOPs
- gadfly16
- 65 posts
- Offline
Hey,
how can I merge work items from other TOP networks? I'm aware of TOP Fetch, but that seems to go through a scheduler, losing the generate/cook duality.
Thanks in advance,
Mate
Technical Discussion » Global, scene-wide time override
I solved the problem, but I realized that the question wasn't very accurate. My problem was that I wanted to render a scene file with a global time warp. In the end I did it in PDG: I generated a file range with the desired number of frames, then with a `generic generator` remapped the `Frame` attribute onto the warp curve via `chf`, and finally rendered the remapped work items with a `rop fetch`. Of course, `$F` was changed to `@pdg_index+1` in the output file name.
This works like a charm if you have subframe interpolation, which I had anyway because of motion blur. (Volume caches need extra love.) The only thing worth mentioning is that you shouldn't generate a `range` attrib: it defaults the ROP Fetch to work from the `Frame` attrib, but results in one-frame batches, which can be less than ideal if you want to render the sequence locally.
Cheers,
Mate
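The remapping idea described above can be sketched in plain Python. The `warp` function below is a hypothetical stand-in for the `chf`-sampled curve in the actual scene; the point is that the `Frame` attribute is pushed through the curve while the output index stays linear, mirroring the `@pdg_index+1` naming:

```python
# Sketch of the frame-remap idea: each work item's Frame attribute is
# pushed through a warp curve, while the output index stays linear.
def warp(frame):
    # Hypothetical warp: play the first 60 frames at double speed.
    return frame * 2.0 if frame <= 60 else 60.0 + (frame - 60)

def remap_frames(num_items, start=1):
    # Returns (pdg_index, warped Frame) pairs.
    return [(i, warp(start + i)) for i in range(num_items)]

for index, frame in remap_frames(5):
    print(f"item {index}: render subframe {frame}, write file frame {index + 1}")
```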
Technical Discussion » Removing collision from RBD Bullet Solver
Technical Discussion » alembic file
Hi,
You need to convert the packed Alembics to polygons with the `Convert` node. An `Unpack` before that can be useful if you want to transfer the `path` attribute to the polys.
Cheers,
Mate
Edited by gadfly16 - March 1, 2022 08:48:34
PDG/TOPs » Group work items into batches
Hi everyone!
Is there a generic way to group work items into batches so they are processed as one job? I'm aware of the ROP Fetch node's ability to do this, but I'm curious about the easiest way to generalize the idea.
Thanks in advance,
Mate
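The generic grouping being asked about can be sketched in plain Python: given a flat list of work items, chunk it into fixed-size batches so each batch could be submitted as a single job. This is only an illustration of the idea, not how a TOP node would implement it:

```python
# Generic sketch: group an incoming work-item list into fixed-size
# batches so each batch can be submitted as one job.
def batch(items, size):
    return [items[i:i + size] for i in range(0, len(items), size)]

jobs = batch(list(range(10)), 4)
# jobs -> [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```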
PDG/TOPs » Custom PDG Node in Python
Thanks! It works. I thought the Generic Generator was just for generating work items, not for processing them.
PDG/TOPs » Custom PDG Node in Python
Hi Everyone,
I would like to process some images with `oiiotool`, and I decided to give it a try in TOPs.
My obviously naive approach was to:
- Make work items with the `filepattern` node
- Filter these with a custom `pythonprocessor`
- Run `oiiotool` with Python's subprocess module from `pythonscript`
Surprisingly enough it kind of works, but it's obvious that I'm not doing it the right way. If I run the subprocesses in-process, they are not governed by the scheduler: Houdini just spawns all the work items at once. If I try to run it out-of-process, it crashes with "AttributeError: 'WorkItem' object has no attribute 'stringAttribute'", which was fine in-process.
So it feels like I don't know what I'm doing. My question is: is there any documentation on integrating custom command-line tools into TOPs?
Thanks in advance!
Mate
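For the subprocess step described above, here is a hedged sketch of building and running an `oiiotool` command per work item. The file names are hypothetical stand-ins; `--resize` and `-o` are standard `oiiotool` options. The AttributeError suggests the in-process and out-of-process contexts expose different WorkItem attribute APIs, so only the command construction and launch are shown:

```python
import subprocess

def make_cmd(src, dst):
    # oiiotool invocation: resize the input to half size, write output.
    return ["oiiotool", src, "--resize", "50%", "-o", dst]

def process(src, dst, dry_run=True):
    cmd = make_cmd(src, dst)
    if dry_run:
        # Return the command without running it (useful for inspection).
        return cmd
    # check=True raises CalledProcessError if oiiotool exits non-zero.
    subprocess.run(cmd, check=True)
    return cmd
```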
Technical Discussion » Setting Current values as "default value" for all parameters in OTL or digital asset
Also, by right-clicking on the parameter name you can update the default values selectively.
Technical Discussion » Event handler for HDA unlocking
Dear Houdinistas!
Is it possible to assign an event handler to the event of unlocking an asset? Like when someone selects the "Allow Editing of Contents" menu item on an HDA.
Thanks in advance,
Mate
Technical Discussion » Referencing HDA's location
Thanks Mkps,
This is exactly what I was looking for. The final content of the top level parameter on the asset that can be referenced from the inside is:
'/'.join(hou.pwd().type().definition().libraryFilePath().split('/')[:-1])
I used the awkward join/split combo because `os.path` expects backslashes on Windows, while Houdini returns forward slashes.
Cheers,
Mate
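Since `libraryFilePath()` returns forward-slash paths on every platform, the join/split combo can likely be replaced by `posixpath.dirname`, which ignores the Windows path convention entirely. A small sketch, with a stand-in path instead of the real `hou.pwd().type().definition().libraryFilePath()` call:

```python
import posixpath

# libraryFilePath() returns forward-slash paths on every platform, so
# the POSIX flavour of dirname yields the HDA's directory directly.
hda_path = "/proj/assets/my_asset.hda"  # stand-in for libraryFilePath()
hda_dir = posixpath.dirname(hda_path)
# Equivalent to the join/split combo from the post:
assert hda_dir == "/".join(hda_path.split("/")[:-1])
```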
Technical Discussion » Referencing HDA's location
Dear all!
I'm trying to archive an asset for a client that has external file references (mainly textures). While the asset lived in our project structure, absolute file paths were sufficient, but now I would like to archive it into a self-contained directory structure. The most elegant way would be to reference these files relative to the HDA file that defines the asset. Is that possible?
Thanks in advance,
Mate
Technical Discussion » Importing `houdini.env` into the shell
Thanks for the insightful answer!
Unfortunately, by “export” I mean exporting, not printing: Exporting Variables [bash.cyberciti.biz].
To be more exact, I would like a bash shell in which the environment variables defined in `houdini.env` are set. In my understanding this is only achievable by setting these variables and starting a sub-shell, since a sub-process cannot export to its parent. But those who know the answer will be aware of this detail.
I hope this further clarifies my question.
Thanks again,
Mate
Edited by gadfly16 - Aug. 12, 2020 03:53:50
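One way to approach this is a small parser that turns `KEY = value` lines into bash `export` statements, whose output can then be `eval`-ed in the shell. This is a minimal sketch under strong assumptions: a real `houdini.env` can also contain comments, quoting, and `$VAR` expansion, which this ignores:

```python
# Minimal sketch: turn simple `KEY = value` lines from a houdini.env
# file into bash `export` statements. Ignores quoting subtleties and
# $VAR expansion that a real houdini.env may contain.
def env_to_exports(text):
    exports = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        exports.append(f'export {key.strip()}="{value.strip()}"')
    return "\n".join(exports)

sample = "# comment\nHOUDINI_PATH = /opt/tools;&\n"
print(env_to_exports(sample))
```

In bash one could then run something like `eval "$(python parse_env.py)"` so the exports land in the current shell rather than a throwaway sub-process.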
Technical Discussion » Importing `houdini.env` into the shell
Hi,
is there a tool that parses the variables defined in the `houdini.env` file and exports them into a shell? (Bash, in my case.)
Thanks,
Mate
PDG/TOPs » More than one frames per batch problem
Thanks!
Unfortunately I'm on 17.5, and it seems it does not have that node. One more reason to bump. In the meantime (and out of curiosity), is there a way I can manually set the range (and PDG attributes in general)?
Thanks again,
Mate
PDG/TOPs » More than one frames per batch problem
Hi,
if I set the `ROP Fetch` node's `Frames per Batch` parameter to anything other than 1, the node generates the work items based on the range attribute instead of the incoming work items. (Or at least, this is what I think it's doing.)
Here is what I'm trying to achieve:
- Write out a simulation with preroll frames. (I need the preroll frames for debugging the sim.)
- Filter out the preroll frames by `Filter by Expression`. (`@pdg_frames<$STARTFRAME` works just fine)
- Render an OpenGL preview without the preroll frames, but in batches larger than 1 to save on the startup time. (It's an OpenGL render, where it's not negligible.)
What happens is:
- `ROP Geometry Out` produces 240 work items.
- `Filter by Expression` filters out 48, leaving 192.
- And here comes my problem: if the batch size is 1, `ROP Fetch` stays with the 192, but if it's more than 1, it goes back to 240.
Of course the last ROP Fetch is set to ‘single-frame’ mode.
Is it the intended behavior? Is there a workaround?
I can put a `Wait for All` before the last `ROP Fetch` and use ‘frame-range’ mode, but in that case the network waits for all the frames in vain.
Hope it makes sense and thanks in advance,
Mate
Edited by gadfly16 - July 30, 2020 13:21:32
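The intended behavior (as opposed to what the node does) can be illustrated in plain Python: drop the preroll frames first, then batch the *surviving* work items rather than re-deriving the batches from the original range. Frame numbers below are hypothetical; `START_FRAME` stands in for `$STARTFRAME`:

```python
# Sketch of the intended pipeline: drop preroll frames, then batch the
# surviving work items (not the original range) for the OpenGL render.
START_FRAME = 1001  # stand-in for $STARTFRAME

def filter_preroll(frames, start=START_FRAME):
    # Mirrors the Filter by Expression rule: keep frames >= $STARTFRAME.
    return [f for f in frames if f >= start]

def make_batches(frames, size):
    return [frames[i:i + size] for i in range(0, len(frames), size)]

frames = list(range(START_FRAME - 48, START_FRAME + 192))  # 48 preroll + 192
kept = filter_preroll(frames)
batches = make_batches(kept, 16)
print(len(kept), len(batches))  # 192 work items in 12 batches
```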
PDG/TOPs » How PDG decides if a file on disk is dirty or valid
Thanks a lot!
I was missing the Cache Mode parameter. Getting there..
Best regards,
Mate
I was missing the Cache Mode parameter. Getting there..
Best regards,
Mate
PDG/TOPs » How PDG decides if a file on disk is dirty or valid
Hi,
I'm wondering if there is a place in the documentation that describes the exact logic of how PDG/TOPs decides whether a file on disk is a valid finished work item or a dirty leftover from a previous run?
With my (pretty limited) hands-on experience I couldn't come up with a working mental model for this. It seems it's neither a primitive `delete to dirty` approach nor a checksum-based `see if upstream deps changed` one.
What am I missing here?
Thanks in advance,
Mate