seelan
Could both of you describe the nodes and any tools you are using within your HDAs? Perhaps there is a particular node/asset that is leaking memory somewhere, and getting a list from both of you can help narrow it down.
Also, does the same tool cause Houdini's memory to grow as well?
I set up a simple test to demonstrate: file_read -> pack -> unpack -> attribWrangle -> null.
The file read loads a bgeo, the wrangle adds a width attribute, and the pack/unpack is just there because I usually unpack (though this particular bgeo isn't packed).
Cooking that null in a loop and extracting some data from the geometry does not leak memory; repeated cooks don't increase usage beyond what my extracted data accounts for.
If I instead split that wrangle out to 10 nulls (each connected to the wrangle) and cook each one, additional memory is used that doesn't seem to be released until I close the session (I'm printing used memory before/after to stdout).
Exploring this a little further while searching for a way to release the memory, I tried turning off the 'copyinput' param of each cooked null once I was finished with it and cooking it again before moving on. That cooks down to empty geometry and released the memory: the before/after session-close memory reports were very similar, and I was about 5 GB down from my previous peak in this test.
That seems like it might be a reasonable workaround in the meantime. It's improving my memory use in a far more complex setup, and while some memory is still not released until the session closes, it's significantly reduced.
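For anyone who wants to try the same workaround, here's a rough sketch of what my cook loop looks like. This assumes a hython session; the node paths are made up for the example, and I'm using Python's `resource` module to report resident memory (on Linux `ru_maxrss` is in KB; on macOS it's bytes, so adjust accordingly):

```python
import resource

def rss_mb():
    # Peak resident set size of this process in MB (Linux: ru_maxrss is KB).
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024.0

def cook_and_release(null_paths):
    # Cook each null, pull out whatever data is needed, then cook it down
    # to empty geometry via 'copyinput' so its cached geometry can be freed.
    import hou  # only available inside a hython / Houdini session

    for path in null_paths:
        node = hou.node(path)
        node.cook(force=True)
        # ... extract what you need from node.geometry() here ...

        # Workaround: disable copyinput and re-cook so the null holds nothing.
        node.parm("copyinput").set(0)
        node.cook(force=True)

        print(f"{path}: ~{rss_mb():.1f} MB resident after release")
```

In my test, cooking the nulls without the copyinput step kept accumulating memory across the loop, while adding it brought usage back down close to the starting point.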
I don't believe this causes any issues in a UI session of Houdini.
thanks,
grant