grant adam

pooverfish

About Me

Principal Software Developer, Framestore Melbourne
EXPERTISE
Developer
INDUSTRY
Film/TV

LOCATION
Melbourne, Australia

Houdini Skills

Availability

Not Specified

Recent Forum Posts

HARS.exe still growing after cooking HDAs sequentially (Nov. 6, 2019 10:03)

seelan wrote:
Could both of you describe the nodes and any tools you are using within your HDAs? Perhaps there is a particular node/asset that is leaking memory somewhere, and getting a list from both of you can help narrow it down.

Also, does the same tool cause Houdini memory to grow as well?

I set up a simple test to demonstrate: file_read -> pack -> unpack -> attribWrangle -> null.

The file read loads a bgeo, the wrangle adds a width attribute, and the pack/unpack pair is only there because I usually unpack (though this particular bgeo isn't packed).

Cooking that null in a loop and extracting some data from the geometry does not leak memory; repeated cooks do not increase memory usage beyond what my use of the extracted data accounts for.

If I split that wrangle out to 10 nulls (each connected to the wrangle) and cook each one, it then uses additional memory that doesn't seem to be released until I close the session (I'm printing the used memory before/after to stdout).
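Roughly, in hython the test looks something like this (a sketch only, not my actual script: the container and node names, the bgeo path, and the psutil memory printout are placeholders, and when this is driven through HAPI the growth shows up in HARS.exe rather than in the current process):

    import psutil  # assumed available; any process-memory reporter would do
    import hou

    def rss_mb():
        # Resident memory of this process, in MB.
        return psutil.Process().memory_info().rss / (1024.0 * 1024.0)

    geo = hou.node("/obj").createNode("geo", "memtest")

    # file_read -> pack -> unpack -> attribWrangle chain
    file_sop = geo.createNode("file", "file_read")
    file_sop.parm("file").set("/path/to/input.bgeo")  # placeholder path
    pack = geo.createNode("pack")
    pack.setFirstInput(file_sop)
    unpack = geo.createNode("unpack")
    unpack.setFirstInput(pack)
    wrangle = geo.createNode("attribwrangle", "add_width")
    wrangle.setFirstInput(unpack)
    wrangle.parm("snippet").set("f@width = 0.01;")

    # Ten nulls, each connected to the wrangle, cooked one after another.
    nulls = [geo.createNode("null", "out_%d" % i) for i in range(10)]
    for null in nulls:
        null.setFirstInput(wrangle)

    print("before cooks: %.1f MB" % rss_mb())
    for null in nulls:
        null.cook(force=True)
        npts = len(null.geometry().points())  # extract something from the cooked geo
    print("after cooks:  %.1f MB" % rss_mb())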


Exploring this a little further while looking for a way to release the memory, I tried turning off the 'copyinput' param of each cooked null once I was finished with it and cooking it again before moving on. That cooks to nothing and released the memory: the before/after session-close memory reports were very similar, and I was about 5 GB down from my previous peak in this test.

This seems like it might be something of a workaround for the issue in the meantime. It is improving my memory use in a far more complex setup, and while there is still some memory that isn't released until the session is closed, the total is significantly lower.
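In code it amounts to something like this, continuing the sketch above (the 'copyinput' toggle on the Null SOP is the only detail here taken from the actual setup):

    for null in nulls:
        null.cook(force=True)
        npts = len(null.geometry().points())  # use the cooked data here

        # Once finished with this null, turn off Copy Input and cook again
        # so the null cooks to nothing and its cached geometry can be dropped.
        null.parm("copyinput").set(0)
        null.cook(force=True)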

I don't believe this is causing any issues in a UI session of Houdini.


thanks,
grant

HARS.exe still growing after cooking HDAs sequentially (Nov. 1, 2019 11:19)

tpastyrik2k wrote:
Hi dpernuit,

so I have tried the following methods, none of which show an increasing amount of data in my session:

Tom

I have a similar problem when using HAPI: memory is not being released. I can see the memory being used when calling each cook, but I cannot find a way to easily release it. The only way I have found to release it so far is to close the session and reopen a new one, but as Tom says this is costlier than I would like. It's not a small amount of memory: with my current dataset it's over 20 GB not being released on each session run, and it's enough of a spike to limit what I can do.

I have tried clearing the SOP cache (and all the caches) on the fly, and deleting all created nodes, with no success. Clearing the SOP cache did make some difference, but not a significant one.
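A rough hou-side sketch of that clean-up (the /obj/memtest container path is a placeholder, and 'sopcache -c' is assumed to be the hscript command that clears the SOP cache; in the HAPI case the memory in question lives in HARS.exe rather than in this process):

    import hou

    # Delete everything that was created for the job; /obj/memtest stands in
    # for whatever container holds the instantiated nodes.
    container = hou.node("/obj/memtest")
    if container is not None:
        container.destroy()

    # Flush the SOP cache (assumed hscript command).
    hou.hscript("sopcache -c")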

thanks,
grant