Memory leak bug?

Member
33 posts
Joined: Jan. 2015
Hello,

I've been working on photo mosaics. My HIP file below reads in a geometry file (made with Python in another HIP). Each prim references a texture file, so in a big loop I load each texture with an Attribute from Map SOP, promote Cd to a detail attribute using the 'average' mode, and write the result to disk; see the sketch below. So essentially each image on my disk now has an average color associated with it.
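Roughly, the loop boils down to something like this (a minimal Python sketch of the setup; the node paths, the "texture" attribute name, and the Attribute from Map parm name are assumptions, not the exact contents of the HIP):

    # Minimal sketch of the per-image loop. Node paths and the "file"
    # parm name are assumptions based on my description, not the real HIP.
    import hou

    geo = hou.node("/obj/mosaic/IN_prims").geometry()
    afm = hou.node("/obj/mosaic/attribfrommap1")   # Attribute from Map SOP
    rop = hou.node("/obj/mosaic/rop_geometry1")    # writes the result to disk

    for i, prim in enumerate(geo.prims()):
        tex = prim.attribValue("texture")          # per-prim texture path
        afm.parm("file").set(tex)                  # point the SOP at this image
        rop.parm("sopoutput").set("$HIP/cache/avg_%d.bgeo.sc" % i)
        rop.render()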
It works, only there seems to be a memory leak: my Houdini footprint grows from 500 MB all the way to 10 GB and it incrementally slows down. On a dataset of 6,000 images it just about works, but even after caching to disk the memory stays bloated until I restart, and then I can work from the cache no problem.
The trouble is I want to do this with 200,000 images, and Houdini just crashes after about 20,000 or so.

It seems to be holding every texture in memory, and then some, for each iteration of the loop until memory runs out.

I've run this on 18.5.462 and the latest build, 18.5.511. Any help would be greatly appreciated!

Attachments:
color process rebuild.hiplc (150.8 KB)

Member
33 posts
Joined: Jan. 2015
I managed to get warmer but still no fix:
Invoking the "Reload Texture" button on the Attribute from Map node clears out the texcache and the memory gets released immediately. That's consistent with Houdini caching every texture ever loaded into that node, forever.
I tried texcache -a off in the HScript Textport. It did something and reduced the load by 3 GB of RAM, but it's still creeping up to 7 GB, so something else is going on.

After turning off texcache, the Cache Manager window doesn't show anything being cached anywhere. I'm clueless as to where these extra GBs are getting allocated.
Staff
2591 posts
Joined: July 2005
If you're able to convert your images to either .rat or MIP Mapped .exr files (make sure the files are tiled and MIP Mapped, not just standard .exr files), you might find better performance if you're using VEX to read the textures.

You can use imaketx to convert the images.

Tiled, MIP Mapped .exr and .rat image files are built to handle large numbers of texture maps in a single session.
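For example, a batch conversion could look something like this (a minimal Python sketch; the folder and glob pattern are placeholders, and it assumes the imaketx binary from Houdini's bin directory is on your PATH):

    # Convert every .png in a folder to a tiled, MIP-mapped .rat file.
    # The source path is a placeholder; imaketx ships with Houdini ($HFS/bin).
    import glob, os, subprocess

    for src in glob.glob("/path/to/images/*.png"):
        dst = os.path.splitext(src)[0] + ".rat"
        subprocess.run(["imaketx", src, dst], check=True)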
Member
33 posts
Joined: Jan. 2015
Thank you, Mark. In the meantime I inserted a Python SOP that calls texcache -c and sopcache -c every loop cycle, which slows the process down considerably, but the memory is now holding steady (snippet below). 7 hours of cooking to go...
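For reference, the Python SOP body is essentially just this (hou.hscript() runs Textport commands from Python, so they fire on every cook of the loop):

    # Flush the texture and SOP caches on each cook of the loop iteration.
    import hou

    hou.hscript("texcache -c")   # clear the texture cache
    hou.hscript("sopcache -c")   # clear the SOP cache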
Member
900 posts
Joined: Feb. 2016
mark
If you're able to convert your images to either .rat or MIP Mapped .exr files (make sure the files are tiled and MIP Mapped, not just standard .exr files), you might find better performance if you're using VEX to read the textures.

You can use imaketx to convert the images.

Tiled, MIP Mapped .exr and .rat image files are built to handle large numbers of texture maps in a single session.


On a side note, would viewport performance benefit from .rat images instead of regular .png?
I need to display thousands of textured cards in the viewport, with alpha too, and I noticed low fps when they overlap a lot.
(I'm using the Principled Shader to load the textures.)
Member
33 posts
Joined: Jan. 2015
I finally rebuilt the system using TOPs to run through each image as a work item. The Attribute from Map SOP still causes serious CPU load, and it gets slower and slower the longer it runs. I'm having to do batches of 10,000 images at a time, restarting Houdini after each run to clear the memory (or processes?). It definitely feels like a bug, and there's no caching going on anywhere that I can see.