Houdini 12 and AMD 7970 - disappointment

User Avatar
Member
5 posts
Joined: June 2012
Offline
Since I read great things about Houdini 12 and the improved viewport, I installed both the latest build and the stable production one. Both crash on my system (Fatal error: Segmentation Fault).

Doing some research, it seems Houdini 12 does not play nicely with the latest drivers, and vice versa. I tried driver versions 12.4 and 12.6. Downgrading to an older driver is not an option for me.

What is it these days with all the driver/OpenGL issues and 3d software? Are we forced to buy overpriced Quadros, which in my experience do not improve viewport performance at all unless special drivers are written (such as for Max and Maya)?

I downloaded Houdini with great expectations, and I have been downloading trials/free educational versions of all commercial 3d applications to decide which one to purchase for a complex project. Houdini is the *only* one that does not work. Please fix this, Side Effects - you are losing customers.
User Avatar
Member
102 posts
Joined: March 2012
Offline
I changed my 7970 to a GTX 680 4GB a month ago because of this crash.
It seems nothing has changed with the 7970.
User Avatar
Member
7024 posts
Joined: July 2005
Offline
I'd recommend not blaming Houdini. I know that SESI have done a lot of work to make Houdini play better with ATI (AMD) graphics. The reality is: professionals use Nvidia, and Houdini is a professional application. The reason is that Nvidia's drivers are designed for production use, while ATI (AMD) have consistently, for years and years, issued bad drivers intended for home use.

I'm sorry you're having trouble. Are you able to return the card and get an Nvidia card? You'll be a lot happier.

Cheers,

Peter B
User Avatar
Staff
5156 posts
Joined: July 2005
Offline
You can also try switching to the H11 renderer, which uses far older OpenGL calls that may work on the current AMD driver. Edit the $HOME/houdini12.0/houdini.pref file and change:
glrenderer.val := "GL3"
to
glrenderer.val := "H11"
If houdini.pref does not exist, simply create the file in a text editor containing only the line above.
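
If you would rather script the change, here is a minimal sketch in Python - untested, and the preferences path is just the default location taken from the instructions above:

import os

# Default Houdini 12.0 user preferences file (path assumed from the post above).
pref = os.path.expanduser("~/houdini12.0/houdini.pref")

# Keep any existing settings, but drop the old glrenderer line if present.
lines = []
if os.path.exists(pref):
    with open(pref) as f:
        lines = [line for line in f if not line.startswith("glrenderer.val")]

# Append the override that forces the older H11 viewport renderer.
lines.append('glrenderer.val := "H11"\n')

with open(pref, "w") as f:
    f.writelines(lines)

The change takes effect the next time Houdini starts.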

We are working on getting Houdini 12 running correctly on AMD cards. It's been a bit of a 2 steps forward, 1 step back process, unfortunately. AMD has announced that they are slowing their monthly driver release down to “when needed” releases, so my hope is that the quality will improve.
User Avatar
Member
5 posts
Joined: June 2012
Offline
So, explain this to me:

In Softimage I get 160fps in shaded mode with the benchmark scene script found here (~4 million verts/polys):
http://www.xsibase.com/forum/index.php?board=9;action=display;threadid=45465;start=240 [xsibase.com]

The Quadro 4000 and GTX 580 achieve a paltry 60fps. Softimage runs rock-solid on my card. Actually, all the 7xxx cards run wonderfully fast in Softimage.

Blender: again, great performance. I can go up to 34 million faces and still have a workable viewport. On consumer Nvidia cards (4xx up to 6xx), double-sided lighting is no longer hardware-accelerated (thanks, Nvidia), so unless you turn off double-sided lighting for all meshes, performance is on par with, or worse than, an ancient 8800 GT. And even with that turned off, consumer Nvidia cards cannot compete with ATI cards in solid and wireframe mode (though textured mode is faster on Nvidia cards in Blender).

At my work I tested Quadro 2000/4000 cards in Blender, and their performance was lackluster (granted, rather “low-end” models), though they functioned well in both Maya and Max (special drivers).

Cinebench: I get 70fps, which is faster than a Quadro 6000.
http://www.cgchannel.com/2011/10/review-professional-gpus-nvidia-vs-amd-2011/ [cgchannel.com]
Actually, Cinema4d has always favoured ATI/AMD cards in regards to viewport performance, so C4D users recommend those over Nvidia.

Lightwave 11: good performance, and no issues at all.
Modo: same.

I have been working with Cinema4d, Blender and Lightwave, as well as other 3d apps, for the last five years, and have found no issues with ATI/AMD cards at all (except for a selection lag in Blender, which was quickly resolved by a patch: selection of objects is now fastest in Blender, even compared to Maya, Max, Modo and C4D).

Anyway, I am ranting here. I have been working professionally with most of these apps for some time, and found no major issues with AMD cards: quite the contrary, the current “consumer” 7970 outperforms all the graphics cards I have worked with so far - including several Quadros at work.

Unless special drivers exist (such as the ones for Max and Maya), Quadros may not be the best answer.

Which brings me back to my original question: I tried the tip to downgrade the viewport OpenGL mode, and for a couple of minutes it seemed to work. However, I experience ATI driver crashes all the time, so it is not really a solution.

Hopefully this will be resolved when the next drivers come out (in a couple of months?). I do understand Houdini is using more modern OpenGL functions, so it might be worth the wait. It is just a shame that one of the most powerful graphics cards on the market (with OpenCL performance blowing Nvidia cards out of the water as well) cannot be used by Houdini at this point.

pbowmar
I'd recommend not blaming Houdini. I know that SESI have done a lot of work to make Houdini play better with ATI (AMD) graphics. The reality is: professionals use Nvidia, and Houdini is a professional application. The reason is that Nvidia's drivers are designed for production use, while ATI (AMD) have consistently, for years and years, issued bad drivers intended for home use.

I'm sorry you're having trouble. Are you able to return the card and get an Nvidia card? You'll be a lot happier.

Cheers,

Peter B
User Avatar
Staff
5156 posts
Joined: July 2005
Offline
Simply put, the issue with AMD has never been their hardware – it's always been good. Subjectively, though, they've had more regressions in their OpenGL drivers than Nvidia has, though about two years ago they started putting serious effort into stabilizing them.

The thing with driver bugs is that they are usually very specific in nature, so it's possible for many applications to avoid them by the simple fact that they never happen to run into that codepath. This doesn't make them any less serious, though. Houdini's entire UI is drawn in OpenGL, so I can only speculate that this puts more strain on the OpenGL system than most.

However, I do understand your frustration. Tracking down driver issues and patching them with workarounds is no fun on this end, either.
User Avatar
Member
648 posts
Joined: July 2005
Offline
Maybe you can build a tool that automatically tests these code-paths against Houdini functions… and call this tool the wailer.
User Avatar
Member
523 posts
Joined: July 2005
Offline
Herbert123
Since I read great things about Houdini 12 and the improved viewport, I installed both the latest build and the stable production one. Both crash on my system (Fatal error: Segmentation Fault).

Doing some research, it seems Houdini 12 does not play nicely with the latest drivers, and vice versa. I tried driver versions 12.4 and 12.6. Downgrading to an older driver is not an option for me.

What is it these days with all the driver/OpenGL issues and 3d software? Are we forced to buy overpriced Quadros, which in my experience do not improve viewport performance at all unless special drivers are written (such as for Max and Maya)?

I downloaded Houdini with great expectations, and I have been downloading trials/free educational versions of all commercial 3d applications to decide which one to purchase for a complex project. Houdini is the *only* one that does not work. Please fix this, Side Effects - you are losing customers.

Hello,

Does modifying the houdini.pref file help?

thanks,

bern
User Avatar
Member
599 posts
Joined: May 2011
Offline
Herbert123
In Softimage I get 160fps in shaded mode with the benchmark scene script found here (~4 million verts/polys):
http://www.xsibase.com/forum/index.php?board=9;action=display;threadid=45465;start=240 [xsibase.com]

The Quadro 4000 and GTX 580 achieve a paltry 60fps. Softimage runs rock-solid on my card. Actually, all the 7xxx cards run wonderfully fast in Softimage.

Both Nvidia cards getting 60fps is very likely due to v-sync being enabled, either by Softimage or through the Nvidia control panel – and it being disabled on the ATI card. Try turning it off in either the Nvidia control panel or in the Softimage viewport preferences.

When I was at Softimage, we spent an insane amount of time working around horrifying, spec-breaking OpenGL bugs in the ATI drivers, which ATI only very reluctantly fixed. An incredible number of good man-hours wasted on very poor technology - hours that could've been better spent on more productive things.

I can't un-recommend ATI cards enough for production work.
Halfdan Ingvarsson
Senior Developer
Side Effects Software Inc
User Avatar
Member
5 posts
Joined: June 2012
Offline
No, v-sync was turned off in those tests.

Anyway, as far as I am concerned, AMD's consumer line of cards currently outperforms Nvidia's in most 3d apps with OpenGL-based viewports - unless we take the Quadro 6000 with app-specific drivers into account. This, at least, is my conclusion after heavy testing.

I do agree that the situation was very different 4~5 years ago - but now, with Nvidia disabling hardware acceleration for certain OpenGL features in their consumer cards (4xx-6xx), such as double-sided lighting for polygons, and the current AMD drivers for the 5xxx-7xxx series seemingly non-problematic in most OpenGL viewports, the boundaries are much blurrier. Things are not made more transparent by most applications still using deprecated OpenGL functions or, as in Houdini's case, actually using more modern features. My experience is that mileage may vary quite a bit depending on the 3d app used. It's a veritable minefield, with one app working better with AMD, and the next with Nvidia. And let's not talk about Mac OS X and its OpenGL.

And with GPU rendering, things are currently more geared towards CUDA than OpenCL (AMD drivers still ‘suck’ in that regard) - but again, depending on the render engine that may not be the case (for example, Luxrender works better with AMD's OpenCL).

Of course, you could argue that for production work one should buy a Quadro - but not all of us have that option financially.
User Avatar
Member
5 posts
Joined: June 2012
Offline
Good news: the latest version works and does not crash.

Bad news: testing with a semi-complex object downloaded from
http://graphics.cs.williams.edu/data/meshes.xml [graphics.cs.williams.edu]
(furry ball object, 1.4 million vertices, 2.8 million faces)

I get about 2-3 fps in the viewport in object mode. That can't be right: LW11, Cinema4d, Blender and Softimage all have no problem at all with this object (Blender: 100fps). Softimage is almost silly: with 22 of these it still runs a smooth viewport. Even Blender's viewport runs at 10fps with 12 of these.

Also, selection is extremely laggy: in object mode it takes up to 7~8 seconds before the furry ball gets selected. In all the other apps this is not an issue either.

And to make matters worse: trying to enter edit mode and select a vertex causes a pause of a minute or two, after which Houdini causes the AMD drivers to crash, which crashes Houdini as well (obviously).

So, what can I do to improve the viewport performance? I tend to deal with complex objects, and I wonder if there is another setting I am missing. I looked for a VBO (vertex buffer object) setting, and tried all the different view modes, but all of them run at the same 2-3 fps - this must be CPU-limited, not GPU-limited.

Or perhaps Houdini and AMD GPUs just don't play well together at all? Next week I will check the performance of Houdini on the Xeon workstations at work (Nvidia Quadro 2000).

Too bad - but at least I am able to test some of the functionality of Houdini, which is promising.
User Avatar
Staff
5156 posts
Joined: July 2005
Offline
I get about 2-3 fps in the viewport in object mode. That can't be right: LW11, Cinema4d, Blender and Softimage all have no problem at all with this object (Blender: 100fps). Softimage is almost silly: with 22 of these it still runs a smooth viewport. Even Blender's viewport runs at 10fps with 12 of these.

That does seem like an AMD/Houdini-specific problem. A GeForce 670 gets ~55fps on my system with 12 instances (object-level shaded), while 22 gets around 30fps. My guess is that the GLSL being executed is not very well optimized by the AMD driver, since the GCN architecture is still very new.

And to make matters worse: trying to enter edit mode and select a vertex causes a pause of a minute or two, after which Houdini causes the AMD drivers to crash, which crashes Houdini as well (obviously).

We submitted this issue to AMD this past week. It only seems to affect the new 7000 series Radeons.
User Avatar
Member
5 posts
Joined: June 2012
Offline
Yes, I thought so myself - thank you very much for looking into this.

AMD has its work cut out for itself, since CUDA is also better supported in most applications.
User Avatar
Staff
5156 posts
Joined: July 2005
Offline
Try build 12.0.713 or 12.1.37. This bug also affected the new FirePro W series boards, and I've fixed the problem on that hardware, so I believe the Radeon 7000 series should be fixed as well.
User Avatar
Member
383 posts
Joined: June 2010
Offline
Are there just viewport problems, or does OpenCL also cause trouble?
I am thinking about buying the 6 GB version ($490) just for OpenCL acceleration, while the viewport is rendered by my good old slow Quadro FX 3800 .. do you think this will work satisfactorily?
www.woogieworks.at
User Avatar
Member
258 posts
Joined: July 2006
Online
Can anyone confirm that the AMD 7950 or equivalent works properly and fast in H12? I am a few days from buying it. I don't want Nvidia, as nowadays it has abysmal performance in Maya. Thanks in advance.
Head of CG @ MPC
CG Supervisor/ Sr. FX TD /
https://gumroad.com/timvfx [gumroad.com]
www.timucinozger.com
User Avatar
Member
30 posts
Joined: July 2005
Offline
Are there any studios that use AMD cards?
User Avatar
Member
383 posts
Joined: June 2010
Offline
vtwinracersp2
Are there any studios that use AMD cards?
For OpenCL, Nvidia is no real option ..
For viewport, AMD is no real option ..

I guess it depends .. we plan to buy AMD cards just as second cards for OpenCL acceleration ..
www.woogieworks.at
User Avatar
Member
102 posts
Joined: March 2013
Offline
I guess it depends .. we plan to buy AMD cards just as second cards for OpenCL acceleration ..

Excuse me, may I ask how to choose which card is used for OpenCL acceleration?
Thank you!
User Avatar
Staff
5156 posts
Joined: July 2005
Offline
Excuse me, may I ask how to choose which card is used for OpenCL acceleration?
Thank you!

If you have two cards, set the environment variable HOUDINI_OCL_DEVICENUMBER to 0 to use the primary card (the one doing the rendering) or 1 for the secondary card. You can check which one is being used in the Help > About Houdini, Show Details dialog (after the GL extensions section).
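
As a concrete example, here is a minimal launcher sketch in Python. Only the HOUDINI_OCL_DEVICENUMBER variable itself is confirmed above; the "houdini" executable name on the PATH is an assumption, and you can of course simply export the variable in your shell before launching instead:

import os
import subprocess

# Copy the current environment and route Houdini's OpenCL work to the
# secondary GPU (device 1); the primary card (device 0) keeps doing the
# viewport rendering.
env = dict(os.environ)
env["HOUDINI_OCL_DEVICENUMBER"] = "1"

# Assumes a "houdini" executable is on the PATH (an assumption, not from the thread).
subprocess.call(["houdini"], env=env)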