"Insufficient Memory" error

User Avatar
Member
325 posts
Joined: July 2005
Offline
Hi,

I am getting an “Insufficient Memory Allocation” error when memory usage reaches 1 GB. I ran a couple of tests: subdividing a polygonal sphere/cube so that the polygon count exceeds 1 million, and birthing particles from a volume (I used a cube SOP as the source for a POP Source). Under Windows x64 I was able to produce about 16 000 000 particles (with memory usage around 1 GB), and about 6 000 000 under Windows XP (32-bit), also with around 1 GB of RAM in use. At that point I was getting the “Insufficient memory” error.
I wish I could test this behaviour under 64-bit Linux, but I haven't been able to yet.
Is such behaviour normal, or is it an issue with Houdini builds <= 8.0.518?

Houdini 8.0.518
Windows XP / XP x64 Professional
AMD Athlon 64 X2 4600+
4 GB RAM (Corsair DDR 3500 LL PRO)
nVidia 7800 GTX 512 (latest nVidia drivers)

Thanks.
I liked the Mustang
User Avatar
Member
4140 posts
Joined: July 2005
Offline
I'm running a 32-bit compile of Houdini on a 64-bit Linux system with 4 GB of memory, and I've never had Houdini crash while maxing things out, unless I've literally used up all my physical memory and the swap. The system survives. I know there have been posts from the SESI staff here on occasion about how Houdini is only able to address yea-much memory, but I've personally never experienced it on this system. For example, right now I'm running your subdivided sphere example and I've got 4.6 million polys gobbling up pretty well the whole works, but Houdini responds fine (albeit a little slowly).

I always thought you needed to run the 64-bit version specifically to be able to max out our sorts of systems, but I'm definitely running the 32-bit compile. What you're experiencing seems to be some sort of Houdini/Windows issue, however. It's one of the reasons we don't run Windows in production here, but still - win64 should allow 64-bit Houdini to address more than 1 GB. I'm curious about all this, too.

Cheers,

J.C.
John Coldrick
User Avatar
Member
268 posts
Joined: July 2005
Offline
I've experienced similar problems on Windows a while back. On Linux it was fine, though (both were 32-bit versions).
User Avatar
Member
509 posts
Joined: July 2005
Offline
AdamJ
I've experienced similar problems on Windows a while back. On Linux it was fine, though (both were 32-bit versions).

I was also impressed by this. I “tried” to “bend Houdini's back” by allocating a ridiculous amount of memory, running it on Linux Core 4 on my laptop (2 GiB RAM). I allocated around 3 GiB of RAM; the system was unusable (the disk was slow and swapping far too much), but Houdini never died.

Pretty impressive. I'll stick with Linux for sure. I don't know whether that was a coincidence, but it was enough to keep me on Linux for now.

cheers.
JcN
VisualCortexLab Ltd :: www.visualcortexlab.com
User Avatar
Member
325 posts
Joined: July 2005
Offline
All right, all right.

I do understand that Windows sucks by default (and normally I use Linux too - I'm just having some problems with my perfectionist-masochistic approach and the desire to install the latest unstable release of Debian 64, which I have failed to install so far).
What I wanted to find out with this post is whether there is some cure for this issue (other than wiping out the OS and installing Linux)?

Cheers.
I liked the Mustang
User Avatar
Member
4140 posts
Joined: July 2005
Offline
I doubt there's anything *you* can do about it - I was more hoping for a SESI post to answer some of the questions. I'm curious too, even though I'm aware of the differences in memory management.

If it's a critical thing, I'd go the tech support route. I'm fairly sure there's not some little setting in Windows that would affect this.

Cheers,

J.C.
John Coldrick
User Avatar
Member
31 posts
Joined:
Offline
A few notes based on my personal experience…

Windows 2000 is hopeless.
In WinXP you need to add the /3GB (or /4GB) and /PAE switches to the boot.ini file - /PAE enables Physical Address Extension (turned off by default), which lets a 32-bit Windows OS address up to 64 GB of RAM (see the sample boot.ini entry below).
Windows Server 2003 - this is an amazing OS, no question: rock-solid memory management, better than any Windows/Linux I have ever tried.
Linux32 - the biggest difference from Windows (after tweaking its boot.ini) is the performance of the virtual memory. Recovering back to an operational state after a heavy memory push is simply faster here.
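
For reference, here is roughly what such a boot.ini entry can look like - this is just a sketch, the partition path and OS description will differ on your machine, and it's wise to keep a fallback entry without the switches in case the machine won't boot with them:

    [boot loader]
    timeout=30
    default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
    [operating systems]
    multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /3GB /PAE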

I've never had a chance to play with OS X or the other UNIXes, but I will… one day.
User Avatar
Member
2199 posts
Joined: July 2005
Online
That's very interesting. Does it require Houdini to specifically support this extended memory, and if so, does it? Have you actually tried Houdini with these settings and found that it can work above 1 GB on Windows?
I'm hitting these limits at the moment and will probably go dual-boot, but this sounds like it might provide another possible solution.
The trick is finding just the right hammer for every screw
User Avatar
Member
4140 posts
Joined: July 2005
Offline
This gives a clear indication of the PAE thing:

http://www.microsoft.com/whdc/system/platform/server/PAE/PAEmem.mspx [microsoft.com]


Cheers,

J.C.
John Coldrick
User Avatar
Member
2199 posts
Joined: July 2005
Online
I saw that, but I was confused by this bit

Executables that can use the 3-GB address space are required to have the bit IMAGE_FILE_LARGE_ADDRESS_AWARE set in their image header. If you are the developer of the executable, you can specify a linker flag (/LARGEADDRESSAWARE).

So I'm wondering if Houdini has this set, and if it doesn't, whether turning it on would make any difference. My next step is simply to try it, I guess…
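
For what it's worth, rather than guessing you can check the binary directly. Here's a minimal sketch in C that just reads the PE file header of whatever .exe you point it at and reports the IMAGE_FILE_LARGE_ADDRESS_AWARE bit (only basic error handling; Visual Studio's dumpbin /headers should report the same thing as “Application can handle large (>2GB) addresses”):

    /* check_laa.c - minimal sketch: report whether a Windows executable has the
     * IMAGE_FILE_LARGE_ADDRESS_AWARE bit set in its PE file header.
     * Build with: cl check_laa.c      Usage: check_laa some.exe
     */
    #include <stdio.h>
    #include <windows.h>

    int main(int argc, char **argv)
    {
        IMAGE_DOS_HEADER dos;
        IMAGE_NT_HEADERS32 nt;
        FILE *f;

        if (argc < 2) {
            fprintf(stderr, "usage: check_laa <file.exe>\n");
            return 1;
        }
        f = fopen(argv[1], "rb");
        if (!f) {
            perror(argv[1]);
            return 1;
        }

        /* the DOS header tells us where the PE header starts (e_lfanew) */
        fread(&dos, sizeof dos, 1, f);
        fseek(f, dos.e_lfanew, SEEK_SET);
        /* PE signature + COFF file header (+ optional header, unused here) */
        fread(&nt, sizeof nt, 1, f);
        fclose(f);

        if (dos.e_magic != IMAGE_DOS_SIGNATURE || nt.Signature != IMAGE_NT_SIGNATURE) {
            fprintf(stderr, "%s doesn't look like a PE executable\n", argv[1]);
            return 1;
        }

        printf("%s is %slarge-address aware\n", argv[1],
               (nt.FileHeader.Characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE) ? "" : "NOT ");
        return 0;
    }

Point it at the Houdini executable in your install and it should tell you one way or the other.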
The trick is finding just the right hammer for every screw
User Avatar
Member
4140 posts
Joined: July 2005
Offline
I've seen masses of anecdotal reports on this - people who tried it and it didn't seem to do anything, lots of talk about BIOS settings, etc., etc. I'd be skeptical of it doing a lot on an XP workstation (note this is intended more for a server). I'd be curious whether it works - please post the results, thanks…

Since those with pure 64-bit Windows running 64-bit Houdini seem to be having trouble with large address spaces, this might indicate why, however.

Cheers,

J.C.
John Coldrick
User Avatar
Member
31 posts
Joined:
Offline
It does work.

The last time I tried it, I created 16 000 000 real polys in Maya.
About 25 000 000 real polys in XSI (XSI has better memory efficiency than Maya).
And I believe 12 000 000 (or maybe 18 000 000) in Houdini.
I can't remember the exact numbers anymore.

I have only 2 GB of RAM in my home PC and I'm using the free versions of the applications - maybe the full versions require a little more RAM.

I don't think a 32-bit compiled program can allocate more than 4 GB of RAM - I'm not an expert in this area, maybe somebody else can shed some light here. But I think that if you have 64 GB of RAM available you can open multiple sessions of Houdini, Maya, etc. without killing your PC (or maybe not).
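
Back-of-the-envelope, for reference (nothing application-specific here):

    2^32 bytes = 4 GB of virtual address space per 32-bit process
    default 32-bit Windows split:  2 GB for the process + 2 GB for the kernel
    with /3GB in boot.ini:         3 GB for the process + 1 GB for the kernel

So /PAE lets the OS as a whole use more physical RAM (and many processes together can add up to well past 4 GB), but any single 32-bit process is still capped at 2-3 GB - it only gets the full 4 GB when a large-address-aware 32-bit app runs on 64-bit Windows, where the kernel doesn't live inside its address space.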

What I noticed is that the moment Houdini reaches the limits of the physical RAM, the application immediately crashes.
Maya under Windows continues dumping stuff into virtual memory.
Maya under Linux reports something like “memory exception thrown” and refuses to execute the operation (adding more polys to the scene), but the application is still alive and you can save your butt by saving the scene, or even continue your work.
XSI under Windows just continues dumping stuff into virtual memory. What is really good about XSI (5+) is that you can actually render whatever you have in the scene - if you can see it in the viewport, you can render it. Maya and Houdini just can't do that.

After some point all the apps I tried became useless - waiting too long for the next iteration of copy/pasting polygons.
I never reached the limits of the virtual memory. Curious to see what happens then. Maybe one day I'll give it a try.

Sounds a little nerdish - brute-force entertainment - but it was interesting to see how far I could go with the software available these days.
User Avatar
Member
325 posts
Joined: July 2005
Offline
JColdrick
Since those with pure 64-bit Windows running 64-bit Houdini seem to be having trouble with large address spaces, this might indicate why, however.

Cheers,

J.C.

64-bit Houdini for 64-bit Windows, huh? Is it something accessible only to the chosen ones, or am I searching in the wrong place? Because I think there is only a 32-bit Windows version of Houdini available, or am I wrong?

And, by the way, as I stated above, I still couldn't go beyond 1 GB of RAM even under Windows x64, though the number of particles was 16 000 000 (much more than under Windows XP 32-bit). And yes, my system does see all my available memory (which is 4 GB + 4 GB of cache).
I liked the Mustang
User Avatar
Member
4140 posts
Joined: July 2005
Offline
Doh. Can you tell I don't run Houdini on Windows?

You're quite right - there's no Windows 64-bit distribution available yet, and all this discussion is meaningless, because if you're running a 32-bit app on a 64-bit platform, it's still a 32-bit app.

Peship, you say “it works”, and then follow up by saying you're running a 2 GB machine. I'm not sure we're talking about the same thing.

Cheers,

J.C.
John Coldrick
User Avatar
Member
325 posts
Joined: July 2005
Offline
By the way, maybe I have missed something - but regarding the Houdini build for 64-bit Debian: is Houdini itself a 32-bit or a 64-bit compile?
I liked the Mustang
User Avatar
Member
4140 posts
Joined: July 2005
Offline
The 64-bit version is a 64-bit compile, yes.

Cheers,

J.C.
John Coldrick
User Avatar
Member
325 posts
Joined: July 2005
Offline
OK, thanks.

A 64-bit Houdini compile is cool, but why does it exist only for Debian 64 (at least in what is available for public download)? Does SESI plan to release 64-bit compiles for, say, Fedora Core amd64 as well? Or is it because of some development issues that cannot be easily resolved?

What do you think?
I liked the Mustang
User Avatar
Member
31 posts
Joined:
Offline
Yes, that's right, I have only 2 GB of RAM at home.

As you read in the article you linked to in your previous post, there is a flag needed to make the software large-address aware. If the application is compiled this way, then you only need to tweak the boot.ini of Windows and you are ready to go.
If the software is not compiled this way, then it doesn't matter how many flags you put in boot.ini.
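
For what it's worth, the flag from that article is just a linker switch, and Microsoft's editbin tool can flip the same bit on an already-built executable - do that on a copy and at your own risk, since an app that wasn't written with large addresses in mind can misbehave. A rough sketch, assuming a Visual C++ toolchain (the file names here are just placeholders):

    rem enable it when linking your own code
    link /LARGEADDRESSAWARE myapp.obj

    rem or toggle the bit on an existing binary (work on a copy)
    editbin /LARGEADDRESSAWARE someapp.exe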

This was exactly the case with Maya.
All the versions prior to Maya 6 for win32 just die the moment they reach 1.5 GB of RAM.
Maya 6 was compiled with the large-address option included, and now it just works. I tried it again (a few minutes ago) and it still works - I stopped the experiment at a fully occupied 2 GB of RAM plus a filled-up 2 GB of virtual memory on top of that.
If you try the same with Maya 5, for example, it will die at 1.3-1.5 GB of RAM without even touching the virtual memory.

But wait - I can't reproduce the same with my XP and Houdini 8 anymore.
Houdini dies at 1.5 GB.
Now that's strange.
User Avatar
Member
4140 posts
Joined: July 2005
Offline
MAD - I think the different release builds are based on the particular platforms SESI has in their development cycle. Hopefully, of course, the builds they produce run on far more Linux distros (and they do) - it's a pretty good cross-section. I'm running SUSE 32-bit and 64-bit, and I can get workable releases for both of those platforms with the options they offer. Give it a try. Plus, as 64-bit becomes more the standard (and more useful, for that matter), I'm sure you'll see the 64-bit compiles increasing, such as a Windows version. Most software companies are in the same boat - it's a process, not a switch.

Cheers,

J.C.
John Coldrick
User Avatar
Member
31 posts
Joined:
Offline
For now I'm locked to 32-bit hardware (both at the studio and at home), so I can't comment on 64-bit distros at all.