ROP Fetch: Distribution options as a separate node

Member
209 posts
Joined: Nov. 2010
Clear enough.

Let's focus on your example, since it's actually not a bad solution.
Your example is better because the port/IP is set automatically and we don't need to worry about shutting the tracker down.

But how do I make it work?
- It doesn't work in my case because the “Attribute Create” node creates an empty ‘trackaddress’ attribute before the tracker is started.
- “Dynamic” generation helps create the attributes at the right time, but then the ROP Fetch reports: “When not generating static items, batch mode can only be used when the ”Full Range for Upstream Items“ is selected”.
- The “Python Script” node still does this after the ‘onGenerate’ callback.
Staff
585 posts
Joined: May 2014
I've recreated/fixed the example. You're right that the attrib create doesn't work - when I was testing that I must have had something cached so it appeared to work, or I uploaded the wrong file.

The working solution isn't very clean, though. It relies on the fact that the rop fetch work items have access to the input lists, and extracts the tracker information using pdginput(..) at cook time. You can see that on the /obj/AutoDopNetwork/DISTRIBUTE_fliptank_CONTROLS node.
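For readers unfamiliar with that expression: pdginput lets a parameter read a file result from an upstream work item's input list at cook time. A hypothetical use on the tracker address/port parameters might look like the following, where the input index and the tag strings are illustrative guesses, not the actual tags used in the attached example file:

```
pdginput(0, "socket/ip", 0)
pdginput(0, "socket/port", 0)
```

The three arguments are, roughly, the input index, the file tag to match, and whether to localize the returned path.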

The way the built-in distributed sim handles this is it copies the tracker information onto the slice work items immediately before cooking, so the correct tracker port/ip are available and passed to the sim jobs. It would be a simple change on our end to add an option to do that for externally created trackers as well. For example, a check box on the rop fetch/rop geometry to tell the rop fetch to use a manually created, upstream tracker.

Attachments:
distributed_flip.hip (3.5 MB)

Member
209 posts
Joined: Nov. 2010
That works!
But how did you set the “ResultData” for the “Generic Generator” from the command execution? How can I do it for some other command?
Member
603 posts
Joined: Sept. 2016
The example executes sharedserver.py with the genericgenerator. This uses the result data API, which lives in pdgcmd.py; the function is called reportResultData. Jobs generally report result data back to PDG using that API, which uses XMLRPC under the hood.

In theory any job could do that, as long as it can formulate the correct XMLRPC client call (from C++ or whatever).
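To make the XMLRPC mechanics concrete, here is a minimal, self-contained sketch that uses Python's standard xmlrpc.client to marshal such a call. The method name "reportResultData" and the argument layout are assumptions for illustration only; the actual protocol is whatever pdgcmd.py implements, and real jobs should call pdgcmd's reportResultData rather than hand-rolling the RPC.

```python
import xmlrpc.client


def build_report_call(workitem_id, result_path, result_tag):
    """Marshal a hypothetical 'reportResultData' XML-RPC request body.

    The method name and argument order here are illustrative guesses;
    the real call is implemented in pdgcmd.py.
    """
    return xmlrpc.client.dumps(
        (workitem_id, result_path, result_tag),
        methodname="reportResultData",
    )


if __name__ == "__main__":
    body = build_report_call(42, "/tmp/sim.0001.bgeo.sc", "file/geo")
    # A real job would POST this body to the PDG callback server,
    # e.g. via xmlrpc.client.ServerProxy("http://<host>:<port>").
    print(body)
```

The point is only that any language able to produce this XML payload and send it over HTTP to the callback server can report results back to PDG.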
Edited by chrisgreb - June 14, 2019 11:02:09
Member
209 posts
Joined: Nov. 2010
Clear. I thought the ResultData info was taken from the command's output (somehow).
Thank you for the explanation.
Member
209 posts
Joined: Nov. 2010
I found another problem regarding multiple distributed simulations.

When I use multiple PDG shared servers at the same time for the tracker (the same setup as your example, but with a second simulation added), the second simulation produces incorrect data (FLIP noise).

It doesn't look like an issue with the trackers themselves, because when I use two predefined trackers everything is fine.
Edited by Ostap - June 19, 2019 07:08:35
Member
209 posts
Joined: Nov. 2010
I'm wondering how it's even possible for two shared servers to interfere with each other?
Member
603 posts
Joined: Sept. 2016
Could you attach a repro .hip for this?
Member
209 posts
Joined: Nov. 2010
Yes, of course.
If you turn off “Distribute Pressure Solver”, everything goes back to normal (no FLIP noise/interference).
Edited by Ostap - June 21, 2019 02:18:42

Attachments:
pdg_dist_for_SideFX_v01.001.hip (3.3 MB)

Member
209 posts
Joined: Nov. 2010
Can you please take a look at this file (above)?
Staff
585 posts
Joined: May 2014
This is a TOPs bug with batch item inputs, which is fixed in tomorrow's build. If you want to continue using the current build, you can set the “Cook Batch When” parameter on the rop fetch to “All Frames are Ready” instead of “First Frame is Ready”.
Member
209 posts
Joined: Nov. 2010
Thank you for the info.
Is 17.5.303 tomorrow's build? I just can't find this fix in the “Changelog”.
Staff
585 posts
Joined: May 2014
Yep, 303 should have the fix.