Gelato?

Are there any plans to incorporate support for NVIDIA's Gelato into Houdini? Gelato already has tight integration with Maya, and NVIDIA has recently released plugins for hardware acceleration of RIB and the shading language.
Actually, now that I have an FX card I've poked around Gelato for a bit, and I must admit to being a little confused as to how it would be used. I can see how it uses the graphics hardware to make very fast rendering calculations, but how would that be related to what mantra does, for instance? Is the idea to implement calls within mantra that would utilize the Gelato API to speed up calculations? I've heard that in Maya it's treated literally as another renderer, primarily intended for display. I'm unclear how this would map across from a display gadget to the final render.

Certainly, I can get some snappy occlusion tests using their demo code…I'm just trying to figure out how this blends into the pipeline.

Cheers,

J.C.
John Coldrick
Ah, OK - I'm confused. It's basically just a renderer…so you're asking, I guess, for a Gelato ROP?

Cheers,

J.C.
John Coldrick
Apparently there is a RIB → Gelato converter, and also a PRMan Shading Language → Gelato shading language (whatever it's called) converter. Not written by NVIDIA, of course!

So theoretically you can use it with Houdini now, by converting the PRMan output.
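For context on why such a converter is plausible: Gelato's native scene description (.pyg) was itself a Python script driving the renderer, so a RIB converter is largely a statement-by-statement translation. The sketch below is purely illustrative — the mapping table and the Pyg-style call names it emits are assumptions for demonstration, not the actual third-party converter mentioned above:

```python
# Toy illustration of RIB -> Gelato-style (.pyg) translation.
# The RIB_TO_PYG mapping and the emitted call names are hypothetical;
# a real converter would cover the full RIB binding, parameter lists, etc.
RIB_TO_PYG = {
    "Translate": "Renderer.Translate",
    "Sphere": "Renderer.Sphere",
    "WorldBegin": "Renderer.World",
}

def translate_rib_line(line):
    """Translate one simple RIB statement into a Pyg-style call string."""
    tokens = line.split()
    if not tokens or tokens[0] not in RIB_TO_PYG:
        return None  # unsupported statement: skip it
    func = RIB_TO_PYG[tokens[0]]
    args = ", ".join(tokens[1:])
    return f"{func}({args})"

if __name__ == "__main__":
    for rib in ["Translate 0 0 5", "Sphere 1 -1 1 360"]:
        print(translate_rib_line(rib))
```

The shading-language side is the harder half in practice, since PRMan SL and Gelato's shaders differ in more than syntax.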

I talked with Larry Gritz at length at Siggraph about it, and it sounds interesting. I brought up the usual “who wants to put 500 NVIDIA cards in all their render farm machines?” and he said they don't feel this is a big problem. They're also looking long-term, likely taking a loss for a few years until it catches on…

Cheers,

Peter B
Gelato looks strangely similar to Entropy…hmmm…

I'd be curious what he meant by calling the need to buy graphics hardware for render farms “not a big problem”…to me it's a significant problem, unless they can show there's a tangible return on the dollar for it. Those FX cards can be pretty pricey, and I tested Gelato for a while and yup, it's snappy - but $3000 more snappy? And worth adopting an entirely new product? Hard to say…it needs some serious numbers.

I figured this was perceived as a long-term investment by NVIDIA. I have tremendous respect for Larry, but I'll admit the “we'll sell you software to embed more of our hardware” approach strikes me as very Apple/MS-like. But then I'm paranoid anyway…

Cheers, and thanks for the insight, Pete…

J.C.
John Coldrick