Houdini 20.0 Nodes TOP nodes

Service Block Send TOP node

Sends code to a service block to execute.

Since 17.5

You need to use this node inside a service block like the Python Service Block, Houdini Service Block, Maya Service Block, or Nuke Service Block.

This node sends a command or custom code to the block, depending on what the server accepts. For example, the Houdini Service Block runs Python code that uses the HOM API, and the Maya Service Block accepts MEL or Python.

When executing Python code, a work_item object is automatically made available. You can use this to read or write data from PDG. This work_item object supports a subset of the pdg.WorkItem API.

# Read a string attribute from the current work item
myattr_value = work_item.stringAttribValue('myattr')
# Write out a file based on the attribute value
file_path = save_my_file(myattr_value)
# Record the file as a result on the work item
work_item.addResultData(file_path)

By default, the command script is evaluated in a context that’s shared between all work items that run in the same service block. This means that module imports and variables declared in the script are accessible to all subsequent script evaluations. If you want to avoid modifying the global context, you can set the Evaluation Context parameter to Shared, Discard Changes instead. Any imports or declared variables will then only be valid for the duration of the script, but the script will still have access to variables set by tasks in other nodes. It’s also possible to run the script in its own completely standalone context by setting the Evaluation Context parameter to Standalone.
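The three modes can be sketched with plain Python namespaces. This is an illustration of the semantics only, not the node’s actual implementation; the function names here are invented for the sketch.

```python
# Illustrative sketch of the three Evaluation Context modes, modeled as
# running each script against different namespace dictionaries.

def run_shared_keep(scripts):
    """Shared, Keep Changes: one namespace persists across all scripts."""
    ns = {}
    for s in scripts:
        exec(s, ns)          # changes accumulate in the shared namespace
    return ns

def run_shared_discard(scripts):
    """Shared, Discard Changes: each script can read the shared state,
    but its own additions are thrown away afterwards."""
    ns = {}
    for s in scripts:
        exec(s, dict(ns))    # run against a copy; changes are discarded
    return ns

def run_standalone(scripts):
    """Standalone: every script starts in its own fresh namespace."""
    for s in scripts:
        exec(s, {})
```

With Shared, Keep Changes, a variable declared by one work item’s script is visible to the next (`x` defined by the first script can be read by the second); with Shared, Discard Changes, the shared namespace is never modified.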

Note

When using this node with a Maya Service Block, the command script is assumed to be Python code unless //mel is the first line of the script, in which case the script text is treated as MEL code. The last line of the MEL script determines the value that is added to the Service Block Send work_item.

Parameters

Command

Generate When

Determines when this node will generate work items. You should generally leave this set to “Automatic” unless you know the node requires a specific generation mode, or that the work items need to be generated dynamically.

All Upstream Items are Generated

This node will generate work items once all of the input nodes have generated their work items.

All Upstream Items are Cooked

This node will generate work items once all of the input nodes have cooked their work items.

Each Upstream Item is Cooked

This node will generate work items each time a work item in an input node is cooked.

Automatic

The generation mode is selected based on the generation mode of the input nodes. If any of the input nodes are generating work items when their inputs cook, this node will be set to Each Upstream Item is Cooked. Otherwise, it will be set to All Upstream Items are Generated.

Cache Mode

Determines how the processor node handles work items that report expected file results.

Automatic

If the expected result file exists on disk, the work item is marked as cooked without being scheduled. If the file does not exist on disk, the work item is scheduled as normal. If upstream work item dependencies write out new files during a cook, the cache files on work items in this node will also be marked as out-of-date.

Automatic (Ignore Upstream)

The same as Automatic, except upstream file writes do not invalidate cache files on work items in this node and this node will only check output files for its own work items.

Read Files

If the expected result file exists on disk, the work item is marked as cooked without being scheduled. Otherwise the work item is marked as failed.

Write Files

Work items are always scheduled and the expected result file is ignored even if it exists on disk.
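The cache modes above amount to a small decision table over whether the expected result file already exists. The following is a hypothetical sketch of that table; the mode strings and function name are illustrative, not part of the PDG API (and the upstream-invalidation behavior of Automatic is omitted):

```python
# Sketch of the Cache Mode decision table described above.
# 'exists' is whether the expected result file is already on disk.

def cache_action(mode, exists):
    if mode in ("automatic", "automatic_ignore_upstream"):
        # Use the cached file if present, otherwise cook normally
        return "mark_cooked" if exists else "schedule"
    if mode == "read":
        # Cache-only: a missing file is an error
        return "mark_cooked" if exists else "mark_failed"
    if mode == "write":
        # Always cook; expected files are ignored even if present
        return "schedule"
    raise ValueError("unknown cache mode: %s" % mode)
```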

Evaluation Context

This parameter determines which context/namespace the script code should run under. By default the script runs in a shared namespace, so any declared variables or imported modules will be available to subsequent Service Block Send work items. It’s also possible to run the script in its own local context, which is isolated to the work item.

Shared, Keep Changes

The work item’s script is run in a shared, global context. Any changes made, such as declaring variables or importing modules, are available to all subsequent work items.

Shared, Discard Changes

The work item’s script is run in a shared, global context. Variable declarations and imports are discarded once the work item finishes, and are not visible to subsequent work items.

Standalone

The work item’s script runs in its own local context. It has access to standard modules like pdgjson, pdgcmd, and hou, but otherwise runs independently of any previous or subsequent work items.

Copy Spare Parms to Attributes

When enabled, all spare parameters are copied to generated work items as attributes. To prevent a parameter from being copied, apply the pdg::nocopy tag to it.

Remote Script

Specifies the code to send to the shared server.

If the server is a Houdini Service Block, this script should contain Python code, and you can make use of the HOM API in it. If the server is a Maya Service Block, this script should contain Python code for the Maya API, or MEL if the first line is //mel.

Output

Expected Output From

Determines how the expected output file paths are added.

You can supply the expected output file paths to the node if you want to make use of caching and/or access these paths downstream.

None

No expected outputs are added.

Attribute

The given attribute is evaluated and its values are added as expected outputs. The attribute can be a file or string array attribute.

File List

The given output file paths are added.
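The three source modes above can be summarized as a small selection function. This is an illustrative sketch with invented names, not the node’s internals; it assumes the attribute has already been evaluated to a list of strings:

```python
# Sketch of how "Expected Output From" resolves to a list of paths.

def expected_outputs(mode, attrib_values=None, file_list=None):
    if mode == "none":
        return []                       # no expected outputs are added
    if mode == "attribute":
        # values of a file or string array attribute, already evaluated
        return list(attrib_values or [])
    if mode == "file_list":
        return list(file_list or [])    # paths supplied on the node
    raise ValueError("unknown mode: %s" % mode)
```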

See also

TOP nodes