joshua_krause
Oct. 14, 2019 04:51:35
I am running a very high quantity of tasks and want to split them up to avoid overloading my Deadline farm. For example, instead of submitting 5,000 tasks, I want to feed my farm 100 tasks and wait until they finish before submitting 100 more.
I've had some luck using a feedback loop that queries my output files to weed out completed tasks. I suspect there's a smarter way to do this, probably using a Partition by Range, but I haven't quite figured it out.
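To be concrete about what I mean by feeding the farm in batches, here's a rough plain-Python sketch (no actual Deadline/PDG calls; submit_batch() and wait_until_done() are hypothetical stand-ins for whatever the scheduler exposes):

```python
def chunked(tasks, batch_size):
    """Yield successive slices of at most batch_size tasks."""
    for start in range(0, len(tasks), batch_size):
        yield tasks[start:start + batch_size]

# 5,000 tasks fed to the farm 100 at a time. submit_batch() and
# wait_until_done() are placeholders for real scheduler calls.
for batch in chunked(list(range(5000)), 100):
    pass  # submit_batch(batch); wait_until_done(batch)
```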
Any suggestions on best practices to achieve this?
Thanks!
shadesoforange
Oct. 14, 2019 05:38:57
For frame-based stuff you can use the "Frames per Batch" option. I don't think partitioning will help here, since it only modifies work items after they have already been generated.
You basically have to look into the job itself executing multiple tasks as one work item. Depending on what your job is, that probably just takes a small for loop.
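Something like this, as a sketch: one work item handles a whole list of files instead of a single one. run_script_on_scene() below is a dummy stand-in for whatever per-file work your job actually does.

```python
def run_script_on_scene(path):
    # Stand-in for opening the file in 3ds Max and running your script.
    return "processed " + path

def process_batch(scene_files):
    # One work item covering many files: the "small for loop".
    return [run_script_on_scene(p) for p in scene_files]
```

That way 5,000 files submitted in batches of 100 becomes 50 tasks on the farm instead of 5,000.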
Good luck.
joshua_krause
Oct. 14, 2019 07:13:33
Thanks for the tips! Unfortunately I'm not doing anything with rendering or frames. The job is basically opening several thousand 3ds Max files and running scripts on them.
I've noticed that if I output too many tasks at a time, Deadline and PDG stop communicating due to a timeout. Firing off short bursts is my current approach, but maybe I just need to set my ports manually.
shadesoforange
Oct. 14, 2019 08:09:56
Maybe the generic server, adapted to 3ds Max, could be an option for you?
Seems like a rough one though. Good luck.
seelan
Oct. 15, 2019 09:23:32
joshua_krause
Was there a timeout error message that you can post here? We can probably fix it on the scheduler side of things if it is a timeout causing this.