How to avoid memory error in COPs?

Member
252 posts
Joined: July 2005
Okay, I am getting tired of constantly crashing Houdini when running COPs. Sure, I can keep “Clearing Cache”, but when I forget to do that in the excitement of getting work done, I get the dreaded crash, especially when I am caching animated stuff in COPs.

The weird thing is I seem to run out of memory when my machine has a whole bunch of RAM left.

Is it my Compositing Settings? What should they be (and why aren't they set more sensibly like Shake or Fusion that don't crash all the time)? Can COPs be made to be smart and actually use what my machine has?!?

Thanks,
Craig
Member
4140 posts
Joined: July 2005
What are your memory settings (both the Prefs/Compositing/Cache values and the “d” display options over the comp viewport), how much memory are you running, and which OS?

I do agree that it handles memory far less gracefully hands-off than apps like Shake (which never needed tweaking once in my experience - it simply never crashed over memory at all) - although more often than not it's because of overzealous settings in the prefs. To be fair, it doesn't help that Houdini is a multi-tiered app and not a dedicated compositor, which is typically run alone due to its hungry memory footprint.

What I would prefer is a default method that is really dumbed down - something like “aggressive” and “sharing”. Auto-sensing how much memory your system has, whether it's 32 or 64 bit, is dead simple to code, and it should just deal with it. The end user really has no idea how to approach setting these values. By default, “sharing” would assume you are using other contexts of Houdini and/or other apps, and “aggressive” would mean you're just running it like a dedicated compositor.

Also, the issue of memory for compositing ops and memory for flipbooking in the comp needs to be managed together, not as separate functions. Nuke deals with it by moving all flipbooks to disk only.

Cheers,

J.C.
John Coldrick
Staff
5156 posts
Joined: July 2005
Auto-sensing how much memory your system has, whether it's 32 or 64 bit, is dead simple to code, and it should just deal with it. The end user really has no idea how to approach setting these values.

This is pretty much what happens. The cook cache uses 1/2 of your memory as a max usage cap, with a “share” of 1/32nd (when you're not using COPs, it reduces to this amount to return memory to the rest of Houdini). For 32-bit OSes, these are capped at a maximum of 1GB to avoid virtual memory address space problems (which seem to be a lot more prevalent with WinXP than Linux). For 64-bit OSes, there is no cap.

So, if you have 2GB, these values will be set at 1GB and 64MB, respectively. You can try reducing the max cap to 256MB if you're experiencing problems, though to be honest, I'm not sure why you'd be running out of memory even with it set to 1GB if you have 2+GB installed.
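In rough pseudo-Python, the heuristic amounts to something like this (an illustrative sketch of the behaviour described above, not the actual implementation):

    # Illustrative sketch of the cook-cache defaults described above.
    # Not Houdini source code; the constants come from this post.

    def cook_cache_defaults(physical_ram_mb, is_64bit):
        """Return (max_cache_mb, idle_share_mb) for the COP cook cache."""
        max_cache = physical_ram_mb // 2        # cap at half of physical RAM
        if not is_64bit:
            max_cache = min(max_cache, 1024)    # 32-bit: hard cap of 1GB
        idle_share = physical_ram_mb // 32      # shrink to 1/32nd when COPs are idle
        return max_cache, idle_share

    print(cook_cache_defaults(2048, is_64bit=False))   # 2GB machine -> (1024, 64)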

The display cache is set to 10MB, which is useful for a lot of small swatches and 1 or 2 main displayed images.
Member
345 posts
craiglhoffman
Sure I can keep “Clearing Cache” but when I forget to do that in the excitement of getting work done I get the dreaded crash


You can put this in the COPs post-render script:

compfree

which gets rid of all of your cached data.
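If you ever want to trigger the same thing from Python, something like this should work (just a sketch - it assumes a session where the hou module is available, and it simply calls the same hscript command):

    # Sketch: free the COP cook cache by running the hscript command above.
    # Assumes a Python-enabled session where the hou module is available.
    import hou

    def free_comp_cache():
        # hou.hscript() runs an hscript command and returns (output, errors).
        out, err = hou.hscript("compfree")
        if err:
            raise RuntimeError("compfree failed: " + err)

    free_comp_cache()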

hope that helps
kuba
Member
4140 posts
Joined: July 2005
though to be honest, I'm not sure why you'd be running out of memory even with it set to 1GB if you have 2+GB installed.

Me neither, but even here with 4GB I seem to have constant problems with memory being gobbled up. Quit the Houdini session, and the memory all comes back.

Still, when Houdini ships there are specific values in the prefs, right? Do they automatically change depending on the system you run it on? I'm unclear how it's hands-off.

Cheers,

J.C.
John Coldrick
Staff
5156 posts
Joined: July 2005
If no hcomposite.pref file is found, which is the usual case when Houdini is first installed, it autodetects the memory. Once you edit your prefs, these are saved to the preference files. Unfortunately, we currently have no way to save only the prefs the user changed, so these values get locked in.

This should only become a problem if your memory changes after installing Houdini (and really, only if it goes down). I suppose if you copy your ~/houdiniX.X dir to another machine you could have issues if that machine has less mem.
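If you want the autodetection back, removing the saved file before your next session should do it (a sketch; it assumes hcomposite.pref sits directly under ~/houdiniX.X and that Houdini isn't running):

    # Sketch: delete any saved hcomposite.pref so the next session
    # autodetects memory again. Assumes the file lives directly in
    # ~/houdiniX.X and that no Houdini session is currently running.
    import glob
    import os

    for pref in glob.glob(os.path.expanduser("~/houdini*/hcomposite.pref")):
        print("removing " + pref)
        os.remove(pref)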

Regardless of how much physical memory you have, 32-bit apps will always get bitten by the virtual address space problem. The usable address space is usually between 1.3 and 1.6GB (theoretically 2GB), but this varies with your OS and kernel settings. 32-bit apps running on 64-bit OSes can sometimes utilize up to 4GB if compiled to do so.
Member
4140 posts
Joined: July 2005
Obviously you see the support end more than I do, but it sure seems to me there's a post every few months from someone complaining about memory problems with comping. I wonder if it's worth revisiting and making it less user-destructible?

Good to know about the prefs - I'll delete mine immediately.

Houdini sure seems to snag all my memory if given the chance. Not sure how much of that is the “compiled to do it” you mentioned or just confusion between the comp and the viewport memory.

Cheers,

J.C.
John Coldrick
Staff
5156 posts
Joined: July 2005
One thing I could probably do is add an 'Automatically Set Memory Limits' pref. Then, if this gets saved, I can ignore the other mem prefs and just recompute them each session. One way to get around the issue. I could also reduce the max 32-bit cap to something even more conservative, say 512MB.

Certainly it'd be nice if all the cached data played together well in Houdini (POP cache, Object transform cache, SOP cache, COP cache, Texture Cache, etc). Currently, they are independent entities. I've always wanted them to ‘slice up the memory pie’ and be a little more considerate of one another in terms of balancing the load. Unfortunately, it's not at all an easy thing to do right now, but the Cache Manager is a start.
Member
252 posts
Joined: July 2005
Sorry I was out yesterday…

Anyway, I have tried with settings like this:

                                   Default      Doubled
Cache Size                         768 MB       1500 MB
Reduce Cache size when inactive    512 MB       1024 MB
Tile Size                          200x200      400x400
Resolution Limit                   100000 x 100000

My PC is running Windows XP and is a dual-processor 3.6 GHz Xeon with 3 GB of RAM.

Houdini crashes when it is quite far from filling up my RAM.

As for the post-render script thing, it isn't a problem when rendering; it's a problem while interacting with COPs.

Thanks for any help.
Craig
Member
252 posts
Joined: July 2005
I didn't know about hitting “d” over the COPs window. (Sorry, I have only been using PRISMs/Houdini for about 14 years…)

Anyway, I see my Image Cache size is set to 10 MB. What?!? So I am going to bump that WAY up and also set some of those Video Card settings (that I have never seen before!) to take advantage of my NVidia 7800!

Thanks for all the help! I will report back if this stops my crashing!

Cheers,
Craig
Staff
5156 posts
Joined: July 2005
The Image Cache in the viewport isn't nearly as important as the cook cache, which is why it's set to such a low value. It is very important to MPlay, but the popup flipbook is a separate process anyway (mplay!) with its own image cache settings.

Doubling the cache settings definitely won't fix your problems - it'll make them worse. I'd try halving them instead (e.g. 384 MB / 256 MB rather than the 768 / 512 defaults). Leave the tile size and resolution limit as they are. The 200x200 tile size was determined to be the optimal size for performance in almost all applications - a good balance between tile processing overhead and dicing. The res limit is just a safeguard to prevent massive images from typos in scales or resolutions, like one too many zeroes. Any image larger than the limit will cause an error.
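To put rough numbers on why the doubled settings hurt on 32-bit XP, here's a back-of-the-envelope sketch using the address-space figures mentioned earlier in the thread (the exact limit varies by OS and kernel settings):

    # Back-of-the-envelope check using figures from this thread; the usable
    # 32-bit address space is assumed to be ~1.4GB (it varies between
    # roughly 1.3 and 1.6GB depending on OS and kernel settings).
    usable_32bit_address_space_mb = 1400
    doubled_cache_mb = 1500        # the "doubled" Cache Size above
    halved_cache_mb = 768 // 2     # halving the 768 MB default

    # The doubled cook cache alone can exceed the whole 32-bit address space,
    # so Houdini can die with plenty of physical RAM still free.
    print(doubled_cache_mb > usable_32bit_address_space_mb)   # True

    # Halving instead leaves generous headroom for the rest of Houdini.
    print(halved_cache_mb)                                     # 384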