Consumer nVidia GPU configuration (up to 7xxx) for Houdini

nutman (Member, 33 posts)
I will put up a configuration guide for the 8xxx series in a little while or maybe tomorrow (whereupon this line will be replaced with a link to that thread)

OK,
Many of you may, for one reason or another, have opted for a consumer GPU instead of a Quadro.
The following will require about a dozen system restarts, roughly one after every step, to ensure everything is registered properly within Windows.

————————————————————————————–
Though virtually nothing in this guide can cause permanent damage to your hardware or software, I should state the following:
Before you begin, know that you are responsible for anything that may go wrong with your hardware or software, and I am in no way responsible for anything you do while following the instructions in this guide.
————————————————————————————–

It is known for a fact that current nVidia Forceware drivers don't “play well” with OpenGL apps like Houdini.
You either get annoying lag when you navigate your viewport, a garbled screen, or refresh issues with any interactive part of the UI (example [3dspark.com]).
If you have any card up to and including the 7xxx series, all you have to do is obtain RivaTuner 2.0 (or higher) [guru3d.com] as well as the nVidia Forceware graphics driver [nvidia.com] suited to your system.
Install the driver for your card, but DO NOT DELETE the temp folder (usually x:\NVIDIA\Winxxx\driver.version) as you will need it later.
After the obligatory system restart install RivaTuner (typical installation) and restart again.


Run RivaTuner and open its driver settings (the original post illustrated this step with screenshots that are no longer available).
Click on the NVStrap driver tab, then click Install, click OK, and restart your system as prompted.
After a successful restart, return to the NVStrap tab, select “Custom” from the Graphics adapter identification drop-down menu, and click Apply.


A new option box will appear; use the up and down arrow buttons to select the device ID corresponding to the Quadro equivalent of your GeForce graphics card.
At this point it won't hurt your system to experiment until something works (select a device ID, then restart the system), but here are a few that I know from experience work:
“GeForce 7900 GTX” can become a “Quadro FX 4500” with ID 009D
“GeForce 7900 GX2” can become a “Quadro FX 4500 X2” with ID 029F
“GeForce 7950 GX2” can become an “NVIDIA Quadro FX 5500” with ID 029C
After you select the desired Device ID you need to restart your system as prompted.
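
To confirm the spoof took after the restart, you can check the device ID Windows now reports. This is a minimal sketch assuming Python and the stock wmic tool are available; DEV_009D is the Quadro FX 4500 ID from the list above.

```python
import subprocess

# List the PnP device IDs Windows reports for the installed display adapters.
# After a successful spoof, the PCI\VEN_10DE&DEV_xxxx portion should show the
# Quadro device ID you selected (e.g. DEV_009D for a Quadro FX 4500).
output = subprocess.check_output(
    ["wmic", "path", "Win32_VideoController", "get", "Name,PNPDeviceID"],
    text=True,
)
print(output)
```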

Upon a successful system restart, right click on My Computer, click on “Properties”, select the “Hardware” tab and click on Device Manager.
In the Display Adapters section, select your graphics card (it will most probably have an exclamation mark next to it), right click and select “Update Driver…”
A new Hardware Update Wizard window will come up presenting you with a few options. Select the radio button marked “No, not this time” and click Next.
Next you will be asked what you want the wizard to do, select the radio button marked “Install from a list or specific location (Advanced)” and click Next.
On the next step choose the installation option radio button marked “Don’t search, I will choose the driver to install” and click Next.
You are now prompted to select the device driver you want to install. Uncheck the box marked “Show compatible hardware” and select NVIDIA from the Manufacturer list. On the right hand side you will be presented with a (rather long) list of Models. Scroll through it to locate the Quadro you want your GeForce to pretend to be. Click on Next and follow the prompts to restart your system.
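
For what it's worth, the same forced update can be scripted instead of using the wizard. This is only a hedged sketch assuming Microsoft's devcon utility is installed (it ships with the Windows DDK, not with Windows itself); the INF path is hypothetical and should point into the temp folder you kept earlier.

```python
import subprocess

# Force the driver update from the command line instead of the wizard.
# Both arguments are illustrative: point the INF at the temp folder you kept
# during installation, and use the device ID you selected in RivaTuner.
subprocess.check_call([
    "devcon", "update",
    r"C:\NVIDIA\Win2KXP\93.71\nv4_disp.inf",  # hypothetical temp-folder path
    r"PCI\VEN_10DE&DEV_009D",                 # Quadro FX 4500 ID from above
])
```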

You should now be able to run Houdini without any GPU/display driver related issues.
Simon (Member, 2199 posts)
Interesting stuff.

Are you saying this is a better way to run a consumer nVidia card or just a way to fix any driver issues?
nutman (Member, 33 posts)
It is intended MOSTLY as a fix.
There is a possibility you might get better OpenGL performance, but your system's gaming performance will drop, as your hardware is now treated as a Quadro, which is not meant for gaming.
Simon (Member, 2199 posts)
OK. I've run all manner of nVidia cards and never had any problems, as long as you don't use the latest drivers. 61.82 seems to work fine in most situations, though I haven't tried it on the latest cards; mine were mostly 5xxx series.
digitallysane (Member, 1192 posts)
Simon
61.82 seems to work fine in most situations.
The problem is that you cannot run new-generation cards (like the 7xxx or 8xxx series) with 61.82 drivers…
I hope Houdini becomes more flexible regarding the graphics card (instead of blaming nVidia or ATi).

Dragos
edward (Member, 7715 posts)
digitallysane
I hope Houdini becomes more flexible regarding the graphics card (instead of blaming nVidia or ATi).

I think the fact that things suddenly work fine when you configure the drivers from GeForce to Quadro should tell you something.
digitallysane (Member, 1192 posts)
Of course it tells me something: that Houdini doesn't play nice with consumer graphics cards. It should.

Dragos
edward (Member, 7715 posts)
But you're running the same hardware. It's just different drivers. Why are the drivers different? Who stands to benefit from that? Certainly not Houdini.
digitallysane (Member, 1192 posts)
It's true, but there are cases (like right now) when the consumer edition (the 8800 GTS, for example) is available but the Quadro version based on the same chipset (G80) is not yet.
So one can order a very, very powerful card (like I just did :-) ) and then find it works worse than a much older Quadro.
I know Quadros are optimised for professional 3D applications, but I think 3D software should run in an _acceptable_ manner on a consumer card (and, of course, better on the Quadro version). For example, Softimage XSI runs flawlessly on a consumer GeForce 7950 with the latest drivers.

Dragos
edward (Member, 7715 posts)
Are you reading nutman's post? He's running the *same* video card (a GeForce). He's NOT running a Quadro video card. This has nothing to do with GeForce hardware vs. Quadro hardware. Nutman has a GeForce video card, and all he's done is change his drivers. I still use a GeForce 2 GTS at home and I've seen fewer problems. This isn't a hardware issue; it has everything to do with OpenGL drivers. I keep my hardware the same and my Houdini version the same, and all I've changed is the drivers.
digitallysane (Member, 1192 posts)
Yep, I read it very well because it's a topic I'm highly interested in.
And no, he didn't _only_ change his drivers. He also used RivaTuner to install a very low-level “driver” which tricks Windows into thinking the GeForce card _is_ actually a Quadro. That, in turn, made it possible to install Quadro drivers on the same GeForce. I'm curious to see the solution he promised regarding the GeForce 8800, because these have no Quadro equivalent yet, so there are no Quadro drivers available to install.

Dragos
edward (Member, 7715 posts)
It's not tricking Windows into thinking that the GeForce is a Quadro, it's tricking the nVidia OpenGL driver to think that it is running on a Quadro.
nutman (Member, 33 posts)
edward
It's not tricking Windows into thinking that the GeForce is a Quadro, it's tricking the nVidia OpenGL driver to think that it is running on a Quadro.
Exactly.

Truth is, Quadros have a different hardware ID, and that's what distinguishes them from consumer cards.
Now, there are some manufacturers, like EVGA, that hand-pick the G7x or G8x chips for their cards and install only the best memory in their implementations, because they expect the consumer to overclock the heck out of that board.
They even stress-test the card to an extent similar to that of Quadros.
The end result is a graphics board that is virtually indistinguishable from a Quadro save for the hardware device ID (which is what we faked using RivaTuner).

The driver “reads” the device ID and loads the respective libraries (i.e. OpenGL).
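
Conceptually, and only as an illustration (this is not nVidia's actual code), the dispatch looks something like this:

```python
# Illustrative sketch only (not nVidia's actual code): how a unified driver
# might pick its OpenGL code path from the PCI device ID read off the board.
QUADRO_IDS = {0x009D, 0x029F, 0x029C}  # IDs from the guide above

def select_ogl_libraries(device_id: int) -> str:
    # Quadro IDs get the workstation-optimized libraries; anything else
    # falls through to the consumer (gaming) path.
    if device_id in QUADRO_IDS:
        return "workstation OpenGL libraries"
    return "consumer OpenGL libraries"

print(select_ogl_libraries(0x009D))  # -> workstation OpenGL libraries
```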

As for the above guide, everything described is from memory only, since I moved to the 8800GTX on half of my workstations in early November (the other half use Quadro GPUs).

I will put up a guide for doing pretty much the same for the 8800GTX / GTS (a temporary measure until nVidia releases a Quadro implementation of the G8x chip), just to get Houdini working smoothly without setting the environment variable for software OpenGL, which causes a lot of lag when navigating the viewport.
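
For reference, here is a minimal sketch of launching with that software-OpenGL variable set; the variable name (HOUDINI_OGL_SOFTWARE) and the install path are assumptions, so check your own install for the exact values.

```python
import os
import subprocess

# Hedged sketch: start Houdini with software OpenGL forced via an environment
# variable. The variable name (HOUDINI_OGL_SOFTWARE) and the executable path
# are assumptions for illustration only.
env = dict(os.environ, HOUDINI_OGL_SOFTWARE="1")
subprocess.Popen(
    [r"C:\Program Files\Side Effects Software\Houdini\bin\houdini.exe"],
    env=env,
)
```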

In fact, if you can (somehow) become a Registered Developer with nVidia, you can get access to a list of hardware IDs for all nVidia GPUs. Fortunately or not, I cannot publish such a list without getting the heck sued out of me, but suffice it to say nVidia is playing some funny games with us… http://www.gpureview.com/show_cards.php?card1=432&card2=395 [gpureview.com]
digitallysane (Member, 1192 posts)
nutman
As for the above guide, everything described is from memory only, since I moved to the 8800GTX on half of my workstations in early November (the other half use Quadro GPUs).
I will put up a guide for doing pretty much the same for the 8800GTX / GTS (a temporary measure until nVidia releases a Quadro implementation of the G8x chip), just to get Houdini working smoothly without setting the environment variable for software OpenGL, which causes a lot of lag when navigating the viewport.
So you mean you make the drivers think the 8800 GTX is actually a Quadro 4500 or 5500 (a previous generation chip)?
On the other hand, I'd like to know (if you have experience with that) how other OpenGL apps (like XSI, Maya, After Effects etc.) run on the “plain” 8800GTX (that is, without the “Quadro hack”).

Dragos
twod (Staff, 5158 posts)
While it certainly seems like a shifty practice on the surface, optimizing your driver for a given task set is actually a pretty good idea. Consumer cards are optimized for 3D games - fullscreen, single window apps running at high frame rates. Professional cards are optimized for 3D apps - windowed, multi-viewport apps running at excellent quality at a small expense to these frame rates.

Nvidia is unlikely to fix professional app bugs in their consumer cards, and I can see their point - if you're running professional apps, buy a professional card. The fact that pro cards are 4-5x more $$ is a little shifty IMO though.

We are developing better methods for OpenGL rendering that should work better for both Quadro and GeForce cards, especially eliminating artifacts. However, we can't change how nVidia optimizes their drivers, so the performance scales will probably always tip in favour of the system with the Quadro, as developing a fullscreen, single-window Houdini is not in the cards.
nutman (Member, 33 posts)
digitallysane
So you mean you make the drivers think the 8800 GTX is actually a Quadro 4500 or 5500 (a previous generation chip)?
Precisely
That is because there are no current-generation (G8x) Quadro cards yet.
It's the safest bet if you want to get the job done.


twod
We are developing better methods for OpenGL rendering that should work better for both Quadro and GeForce cards, especially eliminating artifacts. However, we can't change how nVidia optimizes their drivers, so the performance scales will probably always tip in favour of the system with the Quadro, as developing a fullscreen, single-window Houdini is not in the cards.
I wouldn't expect you to cater to gamers' hardware.
Houdini is a professional app in a class of its own, and as a professional using Houdini I can only rely on purpose-built hardware (Quadro) to get the job done.
I have Apprentice loaded on a few other machines so I can try out new things, and a couple of them are based on (mostly) off-the-shelf hardware.
So the idea of using consumer graphics cards with Houdini applies mostly to people trying their hand at Apprentice.

Truth be told, if you can manage to have GeForce cards work well with Houdini without the need for any tricks, you could attract more users as future professionals who learn using Apprentice.

But you gave me an idea for a feature request for H9.
How about the ability to switch from the normal UI to a full-screen viewport, or to tear off a viewport and display it full screen on a second display? Some tasks can be accomplished with hotkeys only, and I know multi-display users (like me) would be eternally grateful for such a feature.
digitallysane (Member, 1192 posts)
nutman
I wouldn't expect you to cater to gamers' hardware.
Houdini is a professional app in a class of its own, and as a professional using Houdini I can only rely on purpose-built hardware (Quadro) to get the job done.
I do not necessarily agree. As twod said, Quadro cards are much too expensive, and I don't think they justify their price (maybe the SDI versions do, for a Piranha or Flame workstation). Also, gaming hardware is advancing very fast and offers more options depending on the job.
I just recently ordered a new workstation (waiting for it to arrive…) and after much thought I specified an 8800 GTS instead of a Quadro 4500. It seemed crazy to me to spend 1500 USD on a 4500 when 500 bucks can get me at least double the speed and a much more advanced architecture.
While I agree that a professional app should give a better experience on a Quadro, I think it's reasonable to expect it to perform _decently_ on a consumer card (that is, without artefacts etc.). Most other professional apps actually do.

Dragos
Simon (Member, 2199 posts)
nutman
But you gave me an idea for a feature request for H9.
How about the ability to switch from the normal UI to a full-screen viewport, or to tear off a viewport and display it full screen on a second display? Some tasks can be accomplished with hotkeys only, and I know multi-display users (like me) would be eternally grateful for such a feature.

You can already work full screen if you want; just hit Alt + ' in the viewport.
Or press the maximise button at top right, next to the close button.

The tear off is the next button along.
nutman (Member, 33 posts)
Funny thing is, I have a measly (for the task) VAIO UX280p with an on-board Intel GPU and 128MB of shared memory, and it runs Houdini like a dream (albeit at 1024x600).

And no, I do not agree with the price multiple of Quadro cards versus GeForce cards, especially since I chose a GeForce 8800GTX over a Quadro (in fact I have four, in my two HP dual-Opteron workstations).

The only card I have briefly owned that justified the price was the Realizm 800.
My heart weeps for 3D Labs

Noooo, I didn't mean the Alt + ' thing OR that kind of tear-off. What I meant was for JUST the OpenGL viewport to go full screen, taking up 100% of the screen real estate.
twod (Staff, 5158 posts)
digitallysane
While I agree that a professional app should give a better experience on a Quadro, I think it's reasonable to expect it to perform _decently_ on a consumer card (that is, without artefacts etc.). Most other professional apps actually do.

This is what our current development on OpenGL rendering addresses.