Memory Leak in H13 with FLIP, maybe?

Member
334 posts
Joined: July 2007
Trying out the latest build now and it works. My FLIP sim that previously would only sim in Linux now uses the same memory in Windows.

Woohooo
www.gimpville.no
Member
4189 posts
Joined: June 2012
edward
On Windows, Houdini 13.0.322 now uses the latest TBB version (4.2 update 2), which contains the improved memory allocator.


Out of interest, are there particular nodes in simulations that benefit more than others with this?
Member
7709 posts
Joined: July 2005
MartybNz
Out of interest, are there particular nodes in simulations that benefit more than others with this?

Sorry, I missed this one. Er, I'm not sure, to be honest. The patterns that the old TBB allocator had problems with were situations in which one thread would allocate memory that another thread would end up freeing.
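To make that pattern concrete, here is a minimal standalone C++ sketch (not Houdini or HDK code, just an illustration of the allocation pattern described above): one thread allocates blocks and a different thread frees them. With tbbmalloc loaded as the malloc replacement, older allocator versions tended to keep memory freed by the "wrong" thread cached in per-thread pools, which is roughly why memory appeared to leak under this producer/consumer pattern.

    // Standalone sketch of the cross-thread allocate/free pattern.
    // Thread A allocates, thread B frees. With tbbmalloc loaded as the
    // malloc replacement, these calls map to scalable_malloc/scalable_free.
    #include <cstdlib>
    #include <mutex>
    #include <queue>
    #include <thread>

    int main()
    {
        std::queue<void *> blocks;
        std::mutex         lock;
        bool               done = false;

        // Producer: allocates on thread A.
        std::thread producer([&] {
            for (int i = 0; i < 100000; ++i)
            {
                void *p = std::malloc(4096);
                std::lock_guard<std::mutex> g(lock);
                blocks.push(p);
            }
            std::lock_guard<std::mutex> g(lock);
            done = true;
        });

        // Consumer: frees the same blocks on thread B (the problematic case).
        std::thread consumer([&] {
            for (;;)
            {
                void *p = nullptr;
                {
                    std::lock_guard<std::mutex> g(lock);
                    if (!blocks.empty())
                    {
                        p = blocks.front();
                        blocks.pop();
                    }
                    else if (done)
                        break;
                }
                if (p)
                    std::free(p);
            }
        });

        producer.join();
        consumer.join();
        return 0;
    }

The improved allocator in TBB 4.2 handles blocks freed by a different thread much better, which matches the behaviour reported at the top of the thread.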
Member
5 posts
Joined: July 2013
Having the same issues on Windows as everybody else, and wanting to get into large-scale FLIP simulations, I decided to install Ubuntu 14.04 LTS alongside my Windows 7 installation. I thought this would be the key to solving all my problems. It took me a few days to get my head around it all, and finally I could start a simulation that I had already tested on Windows.

A simulation with a constant particle count of about 40 to 46 million starts out occupying about 20 GB of my 48 GB of RAM, fills the memory completely within about 100 frames, and crashes Houdini. (I did not set up a swap partition because I thought I would not need one for now.) On Windows the RAM also fills up, but to a far lesser extent than on Linux: with the exact same sim it started at 15 GB and went up to 35 GB by frame 160.

I also noticed that the .sim files written out on Linux are bigger than the ones written out on Windows, and the only thing I changed in the Houdini file is the location of the Explicit Cache.

I have already set the Cache Memory to 0 and enabled the Explicit Cache.
As I am totally new to Linux, I'm sure I'm missing some settings.

Help would be appreciated.

Cheers.
https://vimeo.com/130328238
Member
250 posts
Joined: Feb. 2013
Hi,

I'm sorry, I know this is an old subject… but it seems that I'm having this kind of issue with H14 on Windows 7, with the Pyro solver. I only have 16 GB of RAM, the cache size is set to 0, and I have turned on 'caching to disk'. My simulation works well, but once it reaches 16 GB of RAM it becomes too slow…

Is there a way to avoid that? Is Windows 10 a solution?

And the resolution is only 220…
https://vimeo.com/obreadytom
Member
1799 posts
Joined: Oct. 2010
I have noticed this behaviour in 14.0.346 as well, where:

- I run a pipeline once, which consumes about 7 GB when done (and Houdini holds on to those 7 GB after completion, I presume because the TBB allocator wants to keep the memory allocated)

- starting a new scene does NOT release the memory

- after sitting idle with a new scene for a while (I do not know exactly how long, but a long time), the memory is released

- closing Houdini releases the memory

- starting a new instance of the pipeline and running it does NOT release the memory

It's a bit of scary behaviour, as we are trying to run our pipelines on farm machines without a ton of RAM, so this is becoming quite limiting.

Does the 12.5 TBB DLL hack still work? Or is there a different way to address this issue (any way to force it to release the memory?)
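For what it's worth, newer tbbmalloc versions expose a call that asks the allocator to hand its cached free buffers back to the OS: scalable_allocation_command() with TBBMALLOC_CLEAN_ALL_BUFFERS, declared in tbb/scalable_allocator.h. Houdini does not expose this as a setting, so the sketch below is only an illustration of what "forcing a release" would mean at the allocator level; it would have to run inside the Houdini process (e.g. from an HDK plugin), which is an assumption here.

    // Illustrative only: ask tbbmalloc to return its cached, currently unused
    // buffers to the operating system. This must run in the process that owns
    // the allocator; it is not a Houdini parameter or environment setting.
    #include <tbb/scalable_allocator.h>

    void releaseCachedTbbBuffers()
    {
        scalable_allocation_command(TBBMALLOC_CLEAN_ALL_BUFFERS, NULL);
    }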
-G
Member
250 posts
Joined: Feb. 2013
Houdini returns an error at startup when replacing the 12.5 file…
https://vimeo.com/obreadytom
Member
250 posts
Joined: Feb. 2013
I also notice that using the up-res Pyro seems to use less memory when doing a batch.
https://vimeo.com/obreadytom