Memory leak on Mac ML with latest stable build

Member
379 posts
Joined: Dec. 2006
I'm not sure why this is happening, but one scene where I imported two OBJ models (approx. 25 MB each) eats a lot of my RAM, around 8 GB, while the same scene on Windows takes no more than 1.5 GB. When I import a model everything looks OK, but as soon as I start working on it (selecting, creating groups), Houdini slowly starts eating RAM. So a scene that used 1.5 GB of RAM at the beginning uses 8 GB at the end. Clearing the cache does not help at all.

How can I check whether this is a memory leak or some other bug?

It might be an OpenGL problem; I noticed that just selecting polygons eats 1 GB of RAM. I have a custom card, an Nvidia GTX 570, in this machine.
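One way to tell unbounded growth from normal caching is to sample Houdini's resident memory over time from a separate terminal while working in the scene. Here is a minimal sketch (not Houdini-specific; it assumes you can find the Houdini process ID yourself, e.g. with `pgrep -f houdini`, where the process name is an assumption):

```python
import subprocess
import time

def rss_kb(pid):
    """Resident set size of a process in KB, via ps (macOS and Linux)."""
    out = subprocess.check_output(["ps", "-o", "rss=", "-p", str(pid)])
    return int(out.strip())

def sample(pid, interval=5.0, count=12):
    """Print RSS every `interval` seconds and return all samples.
    A steady climb while the scene is otherwise idle points at a leak;
    a plateau after some work suggests ordinary caching."""
    samples = []
    for _ in range(count):
        kb = rss_kb(pid)
        samples.append(kb)
        print(f"pid {pid}: {kb / 1024:.1f} MB resident")
        time.sleep(interval)
    return samples
```

Run it against the Houdini PID while selecting polygons: if the numbers only ever go up and never come back down after clearing the cache, that is good evidence for a leak worth attaching to a bug report.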
Member
379 posts
Joined: Dec. 2006
The same is happening on the daily builds, and it is related to the GL2.1 viewport. When I switch to H11 the problems disappear.

Am I the only one experiencing this?
Staff
5185 posts
Joined: Jul. 2005
Which OSX version are you running? And which Houdini version?
Member
379 posts
Joined: Dec. 2006
Mountain Lion, 10.8.2, with both the latest daily build and the latest stable build.
Staff
5185 posts
Joined: Jul. 2005
How did you get an Nvidia 570 running on a Mac? It's not an officially supported card by Apple, and Nvidia doesn't have any OSX drivers for that card. If you look in Houdini's Help > About Houdini dialog and click Show Details, is it showing the renderer as “Nvidia 570”, or “Software renderer”?
Member
379 posts
Joined: Dec. 2006
It works under OSX without problems. Houdini recognizes it, as does every other 3D app I have; even CUDA works perfectly.

OpenGL Renderer: NVIDIA GeForce GTX 570 OpenGL Engine
OpenGL version: 2.1 NVIDIA-8.0.61