Cg fragment/vertex shaders in Houdini
xiondebra
Hi there,

Are there any plans to utilize Cg fragment and vertex shaders in Houdini? Don't know if I'm asking the question correctly but wouldn't that help with interactive texturing and shader development?


–Mark
digitallysane
I'm not too familiar with this, but as I understand it, Cg is just a development language (an interface) for DirectX and OpenGL shaders. So you write a Cg shader and compile it for different architectures. Maybe this could be achieved directly in VEX, by creating a new context (a Real Time Operator), or by extending the already existing OpenGL context.

Dragos
xiondebra
No, Cg is a language for the graphics hardware itself. You can read about it on Nvidia's web site. I took a short course on it at SIGGRAPH and it was pretty impressive, especially for an assembler programmer and someone who likes to dig around at the h/w level … :-)
JColdrick
How's that catching on, anyway? I've not been following it…I've heard it's The Cat's Meow, but if implementation on non-NVidia hardware is less than stellar then it won't fly… Personally, I've always been suspicious of a manufacturer coming up with a new “standard” without an agenda attached.

Well, OK, MIDI from Sequential Circuits was an exception.

Cheers,

J.C.
xiondebra
JColdrick
Well, OK, MIDI from Sequential Circuits was an exception.

It's way cool! I'd be doing more Cg stuff but when do I find the time …
:-(

But I know *exactly* what you mean about manufacturers coming up with the latest and greatest standard …

Can you believe MIDI has lasted so long? I've loved it since the day it was released and still do … a very nice, compact protocol … and it's grown so far beyond the original idea of “syncing” up 2 synths!

Another “success” story I would cite is the RIB spec …


–Mark
digitallysane
Hmm, it's getting pretty confusing for me. I browsed through the pages at
http://www.graphics3d.com/guides/cg_1_1/index.html and http://www.fusionindustries.com/content/lore/code/articles/fi-faq-cg.php3 and I read something like this:
The Cg compiler translates the Cg high-level shading language (which is extremely similar to the DirectX HLSL) to assembly code that can be loaded on graphics cards supporting DirectX….
So, yes, it's a hardware-oriented programming language, but it ultimately compiles into DirectX (or OGL) “assembly”. This made me think that (at least in theory) VEX could be extended for use in hardware programming.
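Just to make this concrete, here's roughly what that high-level source looks like (a sketch from memory, not something from those pages; the names diffuseMap and tint are made up purely for illustration):

```
// Sketch of a Cg fragment shader: a small C-like function with graphics types.
// The compiler turns this into a few texture/arithmetic instructions in the
// target's assembly (an ARB fragment program for OpenGL, ps_2_x for DirectX).
float4 main(float2 uv : TEXCOORD0,         // interpolated texture coordinate
            uniform sampler2D diffuseMap,  // texture bound by the application
            uniform float4    tint) : COLOR
{
    // One texture lookup modulated by a constant colour.
    return tex2D(diffuseMap, uv) * tint;
}
```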
How do you see the implementation of Cg shaders in Houdini? Would they be called from VEX shaders? Would Houdini have a Cg editor/compiler embedded? A translator from VEX to Cg?

Dragos
MichaelC
I think it might've been from last year: 3DLabs put out a paper on OpenGL 2.0's HLSL, and it had a quote from one of the Side FX developers talking about the potential of incorporating it into Houdini. The statement made it sound like something they were working on, but of course there hasn't been a peep since. I asked someone at SIGGRAPH about it, and they said it wasn't a priority right now for Houdini because their customers weren't asking for it.

Personally, I don't think Cg is the way to go, especially since it doesn't perform well on non-Nvidia hardware, and ATI is making huge inroads in many markets right now. DX HLSL also isn't the answer, since it's Windows/Xbox only. I kinda get the feeling, from the programmers I've talked to, that most people are really sitting on their hands waiting for OpenGL 2 and the hardware that supports it. OpenGL 2 is probably going to be the one worth supporting because it's cross-platform, and the word is that Sony's next PS console will again be insanely difficult to program; however, they are building a Linux-centric development environment using OpenGL 2, and they will support fragment and vertex shaders via OpenGL. If things continue the way they have been in game development, OpenGL 2 will be adopted very rapidly because PS3 will be the hard target, and all other systems will be ports.

Now if SideFX can incorporate OpenGL 2's HLSL via VOPs, it'll be an absolutely brilliant tool for future game development, especially if what people are saying about the PS3 is true, and probably within two more hardware cycles it'll be viable for feature work. It's something I hope SideFX is seriously thinking about.
MichaelC
I found the quote from SideFX; it's in 3Dlabs' 2002 OpenGL SIGGRAPH presentation, from Paul Salvini himself, CTO of Side FX.

http://www.3dlabs.com/support/developer/ogl2/presentations/siggraph/ogl2intro.pdf

"We're at the point where we can apply an OGL2 shader through our (Houdini) interface and (given an equivalent VEX shader) watch the software renderer (Mantra) draw the same thing but much, much slower :-). It's one of those jaw-dropping “wow” moments actually, so we thank you for making it happen! … It rocks. Having read the original white paper still did not prepare us to actually see it working. The ease with which we can now define and adjust OGL2 shaders is astonishing."
MichaelC
digitallysane
Hmm, it's getting pretty confusing for me.

OK then, I'll explain it. These fragment and vertex shaders are actually self-contained little programs, like the shaders you'd use in Houdini or PRMan. The code for them is assembly, and it varies slightly from card to card, because each hardware manufacturer obviously designs their own chips.

The assembly required to write a vertex shader is difficult for most programmers to deal with, and it's compounded by the fact that if a developer needs to support more than one piece of hardware, they'd have to rewrite these shaders for the other cards. The idea behind these high-level shading languages, then, is to give programmers a language or an API with a C-like syntax that's easy to learn and use and that can be compiled for any piece of hardware the developer needs to support. This way, the developer only has to write the code once and can port it anywhere, given the right compiler, and reuse the code.
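To make that concrete, here's roughly what one of these high-level shaders looks like (Cg syntax written from memory, so treat it as a sketch rather than gospel); the same transform would otherwise be hand-written vertex program assembly for each card:

```
// Sketch of a Cg vertex shader: transform a vertex into clip space and pass
// its colour through. The semantics (POSITION, COLOR0) tell the compiler which
// hardware registers to bind; modelViewProj is a constant the app supplies.
struct VertOut {
    float4 position : POSITION;
    float4 color    : COLOR0;
};

VertOut main(float4 position : POSITION,
             float4 color    : COLOR0,
             uniform float4x4 modelViewProj)
{
    VertOut o;
    o.position = mul(modelViewProj, position);  // clip-space transform
    o.color    = color;                         // straight pass-through
    return o;
}
```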

Right now there are 3 major competing standards: Cg from Nvidia, Microsoft's DirectX SL, and OpenGL's SL. The reason this is such a big deal is that it's the future of real-time graphics, and whichever one the developers flock to has the potential for the greatest success. Customers will follow the developers, and eventually this stuff is going to end up in set-top devices and cell phones. It's a big market, and the potential for licensing and revenue from software and hardware is huge.
xiondebra
Hmmm .. this is interesting. I started looking at Cg purely from a programmer's point of view; not being into gaming, I don't know if I'll ever really use it, though maybe I'm wrong after reading these replies. What I do like are some of the shortcuts in the language (which might be nice to have in VEX) and the access to the h/w registers and gfx pipeline.

I don't know a thing about DirectX, but I do understand a bit about how nVidia cards and Cg work with OGL. My understanding, though, was that Cg had a direct path into the h/w, without needing to call on OGL unless you want to.

Again, I'm just getting into this, so I may be all wet on that last point .. :roll:


-Mark
digitallysane
Right now there are 3 major competing standards: Cg from Nvidia, Microsoft's DirectX SL, and OpenGL's SL. The reason this is such a big deal is that it's the future of real-time graphics, and whichever one the developers flock to has the potential for the greatest success. Customers will follow the developers, and eventually this stuff is going to end up in set-top devices and cell phones. It's a big market, and the potential for licensing and revenue from software and hardware is huge.
OK, I knew most of this, but my understanding (maybe I'm wrong) was that Cg works on top of DX SL or OGL SL, to make it easier to write shaders for those and also to make shaders more portable. So you write a shader in Cg, then compile the same source for DX9, for OGL2, for OGL1.2, for DX8, etc. (and for anything that might appear in the future, if someone writes a Cg compiler for it). Exactly like you would write a program in C and then compile it on Windows, Mac, SGI, etc.
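As a rough sketch of that workflow (the cgc flags here are from memory and the file name is made up, so double-check against the Cg toolkit docs), the same source would just be fed to NVIDIA's command-line compiler with different profiles:

```
// tint.cg -- one hypothetical source file, compiled twice:
//
//   cgc -profile arbfp1 -entry tintFrag tint.cg   -> OpenGL ARB fragment assembly
//   cgc -profile ps_2_0 -entry tintFrag tint.cg   -> DirectX ps_2_0 assembly
//
float4 tintFrag(float2 uv : TEXCOORD0,
                uniform sampler2D tex,
                uniform float4 color) : COLOR
{
    // Identical high-level code; only the generated assembly differs per profile.
    return tex2D(tex, uv) * color;
}
```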

Dragos
craiglhoffman
From what I've read, ATI/3Dlabs are pretty far ahead with OpenGL 2.0 and its development tools, and it is gaining more acceptance than Cg, but Nvidia does have a huge market share… But in CAD and feature film work there are a lot of ATI and 3Dlabs cards out there, so I would think that most high-end animation packages are going to go for OpenGL 2.0, which should run on every card.

Anyway, I wrote to someone at Side Effects a couple of weeks ago on this topic, and he said they were waiting for card manufacturers to start releasing their OpenGL 2.0 drivers, and that it would then be feasible for them to release a version of Houdini where you can simply point to an OpenGL 2.0 shader and have it render on the hardware on your screen.

I own a Wildcat VP 990 Pro, and I pointed him to the new OpenGL 2.0 Beta drivers for it and he thanked me and passed it on to the developers.

Now, I don't know what it would take for Houdini to support OpenGL 2.0 with VOPs, but that would be a dream come true! Perhaps there could be standard VOPs that are “OGL 2.0 friendly” so you can make shaders that work in both hardware and software. I can't imagine that all the VOPs would work…

I played with some of the OpenGL 2.0 demos for my Wildcat, and they were pretty neat. The most impressive was the real-time bump mapping. That kicked butt! I was a little surprised that it seemed to tessellate the geometry into “facets” that were easily seen. I expected it to work at a pixel level, but perhaps that might be too much for the hardware to handle. I tried upping the hardware anti-aliasing, but this did nothing but totally screw up my Houdini session when I brought it back up. For some reason Houdini only likes the standard anti-aliasing on the Wildcat VP for now. (That goes along with the other quirks of using Houdini on the Wildcat and waiting for a driver update to fix the bugs….)

Anyway, I can't wait to use hardware rendering. Even if it is for test renders… I was thinking of selling my Wildcat VP to get an NVidia QuadroFX card, but with the Wildcat having early OpenGL 2.0 drivers, I am going to stick with it now.

Cheers,
Craig
MichaelC
xionmark
but my understanding was that Cg had a direct path into the h/w, without needing to call on OGL unless you want to.

This is my understanding as well, but I must admit I'm just a hobbyist programmer and I haven't touched Cg in a long time, so I could be off. Cg is an HLSL, but it's also trying to be a superset of OpenGL and DirectX, and it's trying to support all available hardware. The problem is that nobody except Nvidia has any interest in seeing Cg succeed, and a few companies (ATI, 3Dlabs and Sony in particular) might be better off if Cg went the way of the mammoth. As a result, Cg's support for non-Nvidia hardware will always lag, and eventually developer support for Cg will probably stagnate as developers gravitate towards DirectX or OpenGL.
craiglhoffman
This came out today:

http://www.3dlabs.com/whatsnew/pressreleases/pr02/02-10-22-acuity.htm

“Prototype OpenGL 2.0 Drivers Available for Wildcat VP560

As part of its ongoing commitment to developing and supporting open standards for high-level shading programmability, 3Dlabs is making its prototype OpenGL 2.0 drivers available to qualified ISVs (independent software vendors) on the Wildcat VP560 – making it one of the most cost-effective OpenGL 2.0 evaluation and development platforms. Available now for all Wildcat VP accelerators, OpenGL 2.0 enables software vendors to take advantage of the full programmability of the Wildcat VP family to develop applications that extend the boundaries of interactive realism.

If you are an ISV and are interested in exploring the possibilities of OpenGL 2.0, please contact 3Dlabs at ogl2@3dlabs.com

More detailed information on the 3Dlabs Acuity driver suite can be found at www.3dlabs.com/acuity. ”
xiondebra
Thanks Craig!

That's *extremely* cool!

I've got a gut feeling OGL 2.0 is the way to go …




–Mark
MichaelC
digitallysane
OK, I knew most of this, but my understanding (maybe I'm wrong) was that Cg works on top of DX SL or OGL SL

You could be right, because, like I said, I haven't touched it in a long time. I decided early on it wasn't something worth investing time in. I downloaded their development kit the day it was released. Basically, what I could do with it was write a shader using Cg, compile it with Nvidia's supplied compiler, and then call it from within an OpenGL or DX app. I'm nowhere near being a good programmer, but from my point of view Cg was an extra layer of abstraction that I think most programmers won't want to deal with. If you are writing something in DirectX or something in OpenGL, and these APIs make an SL available to you as a programmer, why would you want to leave that API to make shaders? It makes sense to me from a programming standpoint that you'd want to keep your environment homogeneous. I mean, as an artist I like using as few tools as possible. I don't want to leave my software to convert assets, and I don't want to go to other pieces of software for specific tasks like flagging or whatever. I'd think programmers would be big fans of simple pipelines.
digitallysane
MichaelC
If you are writing something in DirectX or something in OpenGL, and these APIs make an SL available to you as a programmer, why would you want to leave that API to make shaders?
I think the reason might be portability. If you write a shader in Cg, you only have to do it once, and then compile it for different APIs (DX, OGL), instead of re-writing it from scratch for each one. This might be interesting, I suppose, for game developers who have to use DX on Windows and OGL for Mac/Linux. It seems to me that the approach is identical to the one taken in “normal” programming: you write a program in C because you can compile it for different platforms, instead of writing assembler for each platform. Probably Cg is trying to become the standard “development language” for shaders.
Maybe I'm saying something stupid, but I think it wouldn't be impossible (though it would probably be useless) to write a Cg compiler that outputs compiled RenderMan (or Mantra) shaders.

Dragos
craiglhoffman
Anyone know anything about this new software renderer NVidia is developing that will use the hardware in conjunction with software to render? I believe it was called “Galileo” and that was changed to “Gelato” or something like that.

I think it is designed to take advantage of Cg in your OGL display, but to allow a higher-quality render through software, sped up using the graphics hardware. I think they are shooting for winning people over from RenderMan.

I wonder if VMantra can start taking advantage of graphics card hardware to speed up its rendering?… Anyone have any theories on this?

-Craig
digitallysane
craiglhoffman
…but sped up using the graphics hardware. I think they are shooting for winning people over from Renderman.
I wonder if VMantra can start taking advantage of graphics card hardware to speed up its rendering?
As far as I know, the latest release of MentalRay (3.2) already does this.
http://www.mentalimages.com/2_1_1_technical/index.html
and
http://www.softimage.com/products/xsi/v3/new_v35/

Dragos
JColdrick
Hmmm…how would that work in the real world with render farms? Obviously it's only useful for graphics workstations - but if I let it rip over the farm I wonder if I'd get identical images on the graphics/non-graphics machines?

Cheers,

J.C.
This is a "lo-fi" version of our main content. To view the full version with more information, formatting and images, please click here.
Powered by DjangoBB