memory leaks with COPs?

Member
55 posts
Joined: July 2005
Hi guys,
I am rendering a few COP networks, and Houdini very often crashes with a message like “memory allocation problem” (I don't remember the exact message). It sounds like a memory leak, right?
Has anyone had a similar problem?
Member
4140 posts
Joined: July 2005
I haven't put the latest through its paces yet (hopefully today), but in 6.0 and in quick tests with 6.1, I might accuse COPs of a few things, but not “memory leaks” per se. I think there may have been a couple of specific squashed bugs where many repeated ops of the same thing had a small “leak” (the term tends to be used improperly), but again nothing that I've noticed.

Memory management was problematic, but again preliminary tests in 6.1 seem to show it behaving much better.

I think you'll need to do a little more investigating…OS? Types of operations? Houdini version number? Memory? Sequence lengths and sizes?

A beta tester's job is never done…

I'm going to try to bust things today or tomorrow and I'll post here if I'm able to reproduce anything remotely like a mem leak…

Cheers,

J.C.
John Coldrick
Member
55 posts
Joined: July 2005
Buuaaaa, it happened again. The message was “a memory allocation error probably due to insufficient memory”. But I have 1 GB!
Okay, I am on Windows 2000, and my sequence is 750 frames long. That may be a problem.
Most of the operations I am doing are zcomps, the Houdini version is 6.1.101, and the frame size is 640*480 (non-commercial version).

And is the problem you described a known bug, or more of a mystery that no one really knows the cause of?
Member
4140 posts
Joined: July 2005
Depending on what you're doing, it's not surprising that you would gobble up a gig with 750 frames @ vid res. Just for the record, a memory leak is when there are some pretty serious errors in the software that don't release memory they've allocated. It's rare nowadays since that's normally all handled by the OS - you ask the OS for memory and it's reclaimed when the process is done. A lot of coders don't really like hearing that term.
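
Just to put rough numbers on it (back-of-the-envelope, assuming plain uncompressed RGBA frames - I'm only guessing at what Houdini actually keeps in its cache):

#include <cstdio>

// Rough sequence footprint for 750 frames at 640x480, RGBA.
int main()
{
    const double width = 640, height = 480, channels = 4;  // RGBA
    const double frames = 750;

    const double mb8   = width * height * channels * 1 * frames / (1024 * 1024); // 8-bit
    const double mb32f = width * height * channels * 4 * frames / (1024 * 1024); // float

    std::printf("8-bit sequence: ~%.0f MB\n", mb8);   // ~879 MB
    std::printf("float sequence: ~%.0f MB\n", mb32f); // ~3516 MB
    return 0;
}

So even at 8-bit you're close to a gig for the whole sequence, and at float you're way past it.
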
So anyway, you're not out of luck - you need to configure Houdini to use your memory as efficiently as possible. This is definitely something I need to test for myself, but a quick look showed that it's better behaved than 6.0. What I don't like is that there are *two* caches, somewhat manually managed from different locations, which is a hassle. Anyway, check your Prefs/Compositing/Cache and fiddle with those values until they make more sense for your configuration. Personally I'm running just a gig too and I have it set to 750 MB…but again I need to test that…

Cheers,

J.C.
John Coldrick
Member
55 posts
Joined: July 2005
OK, I will test the comp preferences, thanks for the advice.
But about memory leaks being completely managed by the OS, are you absolutely sure about that? I am a computer science engineer and have developed a few applications myself, mainly in C++ under Linux or Windows, and I have never heard of that. I was always told that memory leaks are caused by programming errors, when class instances are not deleted. This is done automatically in some languages such as Java, which has a garbage collector, but I have never heard of such a thing in C++ (and I assume Houdini is mainly written in C++).
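
Just so we are talking about the same thing, here is the kind of minimal C++ leak I mean (purely an illustration, nothing to do with Houdini's actual code):

// Minimal C++ leak: the instance is allocated but never deleted, so the
// memory stays claimed for as long as the process runs. The OS only gets
// it back when the process exits.
struct Image
{
    float* pixels;
    Image()  : pixels(new float[640 * 480 * 4]) {}
    ~Image() { delete[] pixels; }
};

void cookFrame()
{
    Image* img = new Image;   // allocated on every call...
    // ... do some work with img ...
    // missing: delete img;   // ...and never freed -> leaks while running
}
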
Member
4140 posts
Joined: July 2005
Yes, of course they're caused by programming errors, but I meant that leaks are not as common as they once were, that's all, thanks to the OS. And “memory leak” is also the first thing people say when they run out of memory.

Cheers,

J.C.
John Coldrick
Member
55 posts
Joined: July 2005
OK, then we agree.
I have set my cache to 750 MB as you suggested, but how did you decide on this value? Is it always 3/4 of your RAM? On my second computer with 500 MB, should I set it to 375 MB?
Member
4140 posts
Joined: July 2005
Actually, I need to test that setting more rigorously with this release. I found it simply intolerable in 6.0 no matter where I set it - either I was running out of memory and swapping, or feedback was ridiculously slow. However, quick tests show that 6.1 respects that number quite nicely. I'm not professing that this is necessarily the best value, but it's a good start. I think a little experimenting is in order.

As an RFE, I've already said I think this whole business should be invisible to the end user. It's trivial for Houdini to query the system for your available memory, and at most it should boil down to an “aggressive” or “loose” memory config.
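
Something along these lines is all it would take on the Windows side - the 3/4 factor is just the rule of thumb we've been kicking around in this thread, not anything Houdini actually does:

#define _WIN32_WINNT 0x0500   // GlobalMemoryStatusEx needs Win2000 or later
#include <windows.h>
#include <cstdio>

// Sketch: query physical RAM and derive a cache size from it.
int main()
{
    MEMORYSTATUSEX status;
    status.dwLength = sizeof(status);

    if (GlobalMemoryStatusEx(&status))
    {
        const double totalMB = status.ullTotalPhys / (1024.0 * 1024.0);
        std::printf("Physical RAM:        %.0f MB\n", totalMB);
        std::printf("Suggested COP cache: %.0f MB\n", totalMB * 0.75);
    }
    return 0;
}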

Maybe 6.2…?

Cheers

J.C.
John Coldrick
Member
196 posts
Joined: July 2005
I was having problems like this compositing hundreds of images into one frame. The memory was getting swallowed up until the “memory allocation” error popped up.

Using XP, 6.1.208, and nice expensive RAM.

Go to Control Panel -> System Properties -> Advanced tab -> Performance Options -> Advanced tab, and change the memory usage setting from “programs” to “system cache”.

Now, when the memory gets dangerously close to running out, it starts paging it out.

Lucky, considering the range of options regarding this in Windows.
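
If I remember right, that radio button just flips the LargeSystemCache value under Session Manager\Memory Management in the registry, so you can also check it from code - rough sketch, I haven't verified that this covers every Windows version:

#include <windows.h>
#include <cstdio>

// Reads the LargeSystemCache flag, which (as far as I know) backs the
// "system cache vs. programs" radio button in the performance options.
int main()
{
    HKEY key;
    const char* path =
        "SYSTEM\\CurrentControlSet\\Control\\Session Manager\\Memory Management";

    if (RegOpenKeyExA(HKEY_LOCAL_MACHINE, path, 0, KEY_READ, &key) == ERROR_SUCCESS)
    {
        DWORD value = 0, size = sizeof(value);
        if (RegQueryValueExA(key, "LargeSystemCache", 0, 0,
                             (LPBYTE)&value, &size) == ERROR_SUCCESS)
        {
            std::printf("LargeSystemCache = %lu (%s)\n", (unsigned long)value,
                        value ? "system cache" : "programs");
        }
        RegCloseKey(key);
    }
    return 0;
}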
Henster