Gelato Integration into Houdini

User Avatar
Member
405 posts
Joined: July 2005
Offline
Hey Guys,

I was just checking out Gelato on the web, and it looks like Houdini won't have to do a thing to integrate with it, since Houdini can already write its own RIB files and RenderMan shaders. Is anyone considering using this in production, or has anyone tested it to see what kind of results they get with Houdini?

Cheers,
Nate Nesler
User Avatar
Member
19 posts
Joined: July 2005
Offline
To test it you need not only Houdini and Gelato, you also need about $1,500 for a good NVIDIA card.
Officially, Gelato is not RenderMan-compatible, but the differences were made mainly to avoid legal action from Pixar.
Even so, as I see it, it's very easy to change your workflow from RenderMan to Gelato. You need two programs that change some words in your shaders and in your RIBs, because they are so close to Gelato's.
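A rough sketch of what such a word-swapping converter might look like, in Python. The keyword mapping below is purely illustrative and is not the real RenderMan-to-Gelato conversion table:

```python
# Minimal sketch of a keyword-level RIB/shader translator.
# The mapping below is hypothetical -- the real RenderMan-to-Gelato
# conversion is more involved than simple word substitution.
import re

# Hypothetical keyword map (not the actual Gelato equivalents).
KEYWORD_MAP = {
    "Surface": "Shader",
    "LightSource": "Light",
    "WorldBegin": "World",
}

def translate_line(line: str) -> str:
    """Replace known RenderMan keywords with their mapped counterparts."""
    for rman, gelato in KEYWORD_MAP.items():
        line = re.sub(r"\b%s\b" % rman, gelato, line)
    return line

def translate(text: str) -> str:
    """Translate a whole RIB/shader file, line by line."""
    return "\n".join(translate_line(l) for l in text.splitlines())
```

In practice a real converter has to parse the files properly, but this conveys the "change some words" idea.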
User Avatar
Member
405 posts
Joined: July 2005
Offline
Hey Suvo,

You're right that you have to have an NVIDIA card, but it's any Quadro FX series card. Needing only the higher-end cards was true a while back; there have been updates since then, so it even works with an NVIDIA Quadro FX 500. They also have a converter for RIB files, and it converts RenderMan shaders to Gelato shaders as well. So create your scene, save it out as a RIB, create RenderMan shaders through VEX and save them out, then convert both with NVIDIA's RenderMan-to-Gelato converter. The converter is free.

Cheers,
Nate Nesler
User Avatar
Member
135 posts
Joined: July 2005
Offline
I was wondering the same thing: can Houdini use Gelato for anything? I mean, it has a direct connection to Maya as a plug-in. I was wondering how one can link Houdini to it. I have a Quadro FX 1100 card; would that be a good card for this type of thing? Any suggestions on how one could use Gelato from Houdini?




George
3D Mind body and Soul Great illusions are done by great artists…
User Avatar
Member
4140 posts
Joined: July 2005
Offline
AFAIK no, there's no way to directly plug into Gelato, but I'll confess ignorance as to the Maya plugin - what does it do? Is it simply an output driver to it, or is it more integrated into the viewport? If it's the former, that's not all that big a deal - you can write RIB from Houdini with the RenderMan extension (either with Master or the rman-only option). I can almost promise you there will be some RIB massaging needed for serious production, but again, with Houdini that tends to be pretty straightforward for a TD.

Yup, your card should show significant performance increases with Gelato, but you don't need a plugin for that - you should be able to test with the free trial (I think it plunks a watermark in your image). Don't forget there's a certain version of the graphics drivers you'll need to meet or exceed for that to work, though.

I'm still highly skeptical of the whole model - sure, it's great to have really fast renders on your workstation, but is it worth changing the entire production over to a renderer that not only has its own quirks (as do all), but *requires* every single system on the farm to have NVIDIA graphics hardware in it? That's the part I don't like. Don't add yet another restriction to my farm options, and up my cost!

Cheers,

J.C.
John Coldrick
User Avatar
Member
4262 posts
Joined: July 2005
Offline
JColdrick
I'm still highly skeptical of the whole model - sure, it's great to have really fast renders on your workstation, but is it worth changing the entire production over to a renderer that not only has its own quirks (as do all), but *requires* every single system on the farm to have NVIDIA graphics hardware in it? That's the part I don't like. Don't add yet another restriction to my farm options, and up my cost!

You have a farm of 250 NVIDIA machines. You have a month left in production. You start getting nasty artifacts in the render. After a couple of support calls back and forth, they give you a fixed version of the software, but it turns out it doesn't fix the problem. After some more tests, it turns out there is a small bug in the GPU design itself.

Now what?
if(coffees<2,round(float),float)
User Avatar
Member
405 posts
Joined: July 2005
Offline
Hey Guys N Gals,

Can I use it with Maya? Is there a Maya plug-in available?
Yes. Gelato comes with Mango™, a plug-in to the Alias Maya modeling and animation package that allows Maya users to render their scenes with Gelato. Mango permits artists to use the Maya GUI to select the attributes for Gelato shaders, and then converts the Maya outputs to a Python (pyg) file that Gelato can read. Mango is included with the Gelato software at no additional charge.

What about other DCC applications, like 3ds max or Softimage?
Frantic Films has created a 3ds max plug-in for Gelato called Amaretto. It should be available for sale soon; when it is, we'll have information available on this site. For our part, NVIDIA will focus its efforts on improving the Mango plug-in for Maya and will not be creating plug-ins for other modeling and animation applications, but we encourage third-party developers. To assist others in developing tools and plug-ins for Gelato, we have created a developer program. Details are at http://film.nvidia.com/page/gelato_developers.html. [film.nvidia.com]

That's more info on the Gelato Mango plugin, which is just a converter for Gelato. I don't know how much of a difference there is between the Mango converter and the RenderMan converter, other than the fact that Mango is just easier. Additional info on this:

http://film.nvidia.com/docs/CP/4825/Gelato_product_overview.pdf [film.nvidia.com]

Yeah George, your FX 1100 will work fine according to NVIDIA's papers. Also, you want to run it on Linux; I think you have to emulate a Linux shell on Windows or something like that in order to run it there. I have not tested it, so I won't say for certain. I can't test it myself because I have a Quadro 980 XGL card, which is not compatible with this solution.
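Since the FAQ above says a Gelato scene is just a Python (pyg) file, here's a sketch of what a converter's output might look like, emitted as text. The command names (Output, Camera, World, Shader, Render) are written from memory of Gelato's docs and may not match the real API exactly - treat it as illustrative only:

```python
# Sketch of emitting a Gelato-style .pyg scene description as plain text.
# The command names are recalled from Gelato's documentation and are not
# guaranteed to be exact; this only shows the flavor of a generated scene.

def make_scene(image: str, shader: str) -> str:
    """Build a minimal pyg-style scene as a string."""
    lines = [
        'Output ("%s", "tiff", "rgba", "camera")' % image,
        'Camera ("camera")',
        'World ()',
        'Shader ("surface", "%s")' % shader,
        'Sphere (1, -1, 1, 360)',
        'Render ("camera")',
    ]
    return "\n".join(lines)

print(make_scene("test.tif", "plastic"))
```

The point is just that the converter's target is an ordinary Python script, not a binary format, which is what makes third-party exporters feasible.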

Cheers,
Nate Nesler
User Avatar
Member
4140 posts
Joined: July 2005
Offline
After some more tests it turns out there is a small bug with the GPU designs themselves.

That's OK, you can always code around the bug in software…ummm…oh…I see….

Cheers,

J.C.
John Coldrick
User Avatar
Member
135 posts
Joined: July 2005
Offline
Yeah, I installed it in Maya, but there is still the question of whether to use the shaders from the Gelato library that comes with the beta version, which don't show on the model. The interface is sweet, but for some reason my computer stalls when I use it - a dual Opteron. In any case, the render looks pretty nice, but then again, anything looks better than the Maya render. Which is another reason to know Houdini, as the default render looks great…
But I have tried to get into RenderMan, and at school we don't have any licenses of it. I don't think they ever will, as they are pure Maya freaks at school. What I do like is the way Houdini can create both types of shaders, from mental ray to RenderMan, all in its package.


George

thanks for the advice guys …..
3D Mind body and Soul Great illusions are done by great artists…
User Avatar
Member
405 posts
Joined: July 2005
Offline
Hey Guys,

Well, normally you are rendering in passes anyhow, so whatever pass you are having problems with, render it with another renderer and then composite them together. I know it's a little more involved than that, since renders don't match up exactly, but still, it's not that bad, and you can fix it in the compositing stage.
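As a toy illustration of combining separately rendered passes, here's a naive additive per-pixel combine in Python. It deliberately glosses over the real matching problems (over compositing, holdouts, color differences between renderers):

```python
# Toy per-pixel additive combine of two render passes.
# Real pass compositing (over operations, holdouts, color management)
# is far more involved; this only illustrates the basic idea of
# rebuilding a beauty image from separately rendered passes.

def composite_add(pass_a, pass_b):
    """Add two passes pixel by pixel; each pass is a list of RGB tuples."""
    if len(pass_a) != len(pass_b):
        raise ValueError("pass resolutions must match")
    return [tuple(a + b for a, b in zip(pa, pb))
            for pa, pb in zip(pass_a, pass_b)]

# Two tiny 2-pixel "images": a diffuse pass and a specular pass.
diffuse  = [(0.5, 0.2, 0.1), (0.0, 0.0, 0.0)]
specular = [(0.1, 0.1, 0.1), (0.9, 0.9, 0.9)]
beauty = composite_add(diffuse, specular)
```

If one of those passes came from a different renderer, this is the stage where the mismatches Nate mentions would have to be fixed.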

Cheers,
Nate Nesler
User Avatar
Member
4140 posts
Joined: July 2005
Offline
LOL! You're starting to sound like clients… “oh, shoot it any way you want, they can fix it in post…”.

Unfortunately, there's more to it than that - every renderer has its own “quirks”, to put it politely, and you need to account for that in the pipeline. There's a lot more to a decision to have multiple full-time renderers in the primary pipeline. I'm not saying it's impossible, it's just more work. There needs to be a good reason to do it.

Cheers,

J.C.
John Coldrick
User Avatar
Member
405 posts
Joined: July 2005
Offline
Hey JC,

LOL, oh no, and that's why some things never line up. lol Yeah, I know it's more work. I am just thinking that if you get 2 to 6 times the rendering speed, you save that much money in the rendering process, and then you have that much more money available to fix the quirks between the two renderers. It may even be possible to set up a procedural system that solves some of these problems, although others would have to be done by hand. What I am thinking is that it might be more cost-effective to do the extra work if you save on the massive amount of computing power needed to complete the final render frames. I mean, take a 2 to 6 times render-time saving over 1,000 machines, and that's quite a bit of money involved there. Of course, it would have to be preplanned to eliminate the problems between the two final renderers. Just food for thought.
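The back-of-the-envelope math behind that argument is easy to sketch. All the numbers below are made-up assumptions for illustration, not real production figures:

```python
# Back-of-envelope estimate of farm time saved by a faster renderer.
# All inputs here are assumed, illustrative numbers.

def cpu_hours_saved(machines, hours_per_machine, speedup):
    """CPU-hours saved if renders run `speedup` times faster."""
    baseline = machines * hours_per_machine
    return baseline - baseline / speedup

# e.g. 1,000 machines, 10 render-hours each, a 4x speedup:
saved = cpu_hours_saved(1000, 10.0, 4.0)
print(saved)  # 7500.0 CPU-hours freed up
```

Whether those freed hours outweigh the cost of the GPU cards and the cross-renderer cleanup work is exactly the studio-specific math Nate describes.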

Cheers,
Nate Nesler
User Avatar
Member
4140 posts
Joined: July 2005
Offline
Absolutely, something to think about. I just think once you've done the math, added all that vendor-specific hardware (at a significant cost) to everything on the farm (because Gelato *forces* you to - it's not optional for added speed), and then taken into account the fact that all that GPU hardware is *only* usable for that one specific software package, well, that makes me question things big time. The obvious advantage is faster testing on workstations, but I would have given a lot more consideration to this product if they had offered a “software only” render mode. That way, you use the GPU advantage where and when you want, not with every purchase.

Just the concept of selling software to sell hardware pisses me off. Ask Discreet. I've been told the reason a software-only mode doesn't exist is technical. Yah.

Cheers,

J.C.
John Coldrick
User Avatar
Member
405 posts
Joined: July 2005
Offline
Hey JC,

Yeah, I don't believe them for a minute about not being able to do a software-only mode. They would just have to rewrite their modules to add a second, software-side path, which I suspect wouldn't be a big deal. Such a move just doesn't fit their focus with this product, which is to work through the problems standing between them and film-quality graphics with real-time interaction within 5 years. The extra flexibility would be nice, though. It might make sense that when a studio upgrades the render farm, they add in the cards and run a smaller farm: there is extra cost in the cards, but you would not need as many computers. The math will have to be done, and it will depend mostly on the studio and its pipeline. I can understand the frustration with the hardware integration, too. If you bought all these machines from a major distributor, they have all these hooks, and you have to call them up every time you need to upgrade or do anything at all, so that will definitely add to your costs. So I won't be surprised if studios hold off until they are ready to do a full upgrade of the render farm's compute nodes. You guys make a lot of good points. I think TV, video, and commercials in particular will have to do something like this, because turnaround is super fast in those mediums, and with HD looming within a year, and the talk of 2K and 4K frames, they pretty much don't have a choice if they still want to deliver within one- or two-week time frames at far smaller budgets.

Cheers,
Nate Nesler
User Avatar
Member
252 posts
Joined: July 2005
Offline
I am a big fan of using GPU power, especially OGL2 support for hardware shaders. GPUs are speeding up much faster than CPUs and it is the wave of the future for any type of visualization.

I know that the big studios will be slow to embrace this, but they have the financial resources to throw at big render farms and need the utmost in reliability and repeatability. Right now I think Gelato and OGL2/DirectX9 hardware rendering are more for smaller studios and those of us wanting to do things at home and cut down the most time-consuming part, which is rendering. (It is also for quick visualization when you want to show something to a client for rough approvals without having to spend the time on a full software render - fast communication always pays off.)

Sometimes you just want to tell a good story and faster rendering methods suffice. It would be cool to have our favorite Animation software (Houdini) be put on the same level playing field as Maya, Softimage, Max, Lightwave, etc. with regards to these faster solutions.

As far as necessitating NVIDIA hardware goes: NVIDIA makes Gelato and can't be sure other vendors have implemented the math routines in their hardware in exactly the same way. From what I have heard about OGL2 implementations on different vendors' cards, I think NVIDIA can only guarantee Gelato works properly if people use their hardware. (Can you imagine how people would cry foul if Gelato renders across machines with various cards didn't match?)
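That reproducibility worry is easy to demonstrate even on a single CPU: floating-point addition isn't associative, so merely reordering operations changes the answer, and two GPUs whose math units order work differently can legitimately disagree pixel for pixel:

```python
# Floating-point addition is not associative: the same sum computed in a
# different order gives a different result. This is a small-scale analogue
# of why differently implemented GPU math can produce mismatched renders.

big = 2.0 ** 53  # first double whose spacing (ulp) is 2, so +1.0 can vanish

absorbed  = (big + 1.0) + (-big)   # the 1.0 is lost inside the big sum
cancelled = (big + (-big)) + 1.0   # the big values cancel first

print(absorbed, cancelled)  # 0.0 1.0
```

Same inputs, same math, different grouping, different answers - which is exactly the kind of cross-vendor mismatch NVIDIA can't rule out on hardware it didn't design.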

We can make as many arguments for using Gelato and OGL2 hardware rendering as against. The fact is people are using this stuff in Maya, etc., and it is working for them. Why bury our heads in the sand and fight the surge of technology, acting like resisting it is a good thing?

If it works for you, use it. If it doesn't, don't. I would just like to have the choice.

-Craig
User Avatar
Member
405 posts
Joined: July 2005
Offline
Hey Craig,

Well, Houdini is getting fully supported, indirectly, because of its RenderMan architecture. Lightwave is not supported, XSI is not, Cinema 4D is not, etc. Only Maya is being directly supported by NVIDIA, and Max is having a special rendering plug-in done by a third-party studio that you have to pay for. Since Houdini lets you create RenderMan shaders directly in VEX and also export RIBs directly, you can have these converted with the RenderMan conversion tool. Even the Mango plugin for Maya is just a converter for Maya files, so there is no direct Gelato tool; everything goes through a converter, and the RenderMan converter is free with Gelato. The point is that Houdini is supported, even though it was never really meant to be directly. I don't really see a difference between Mango and the RenderMan converter, as far as Houdini vs. Maya goes, other than the fact that there is a pretty interface built directly into Maya because it's a plugin. I suspect the RenderMan converter has a pretty interface too. lol So there won't be a big difference in my book.

Cheers,
Nate Nesler
User Avatar
Member
252 posts
Joined: July 2005
Offline
Actually I meant that hardware shading (HLSL/GLSL) is supported in the other packages besides Houdini, not necessarily Gelato.

-Craig
User Avatar
Member
405 posts
Joined: July 2005
Offline
Hey Craig,

Sorry, I misunderstood you. Yeah, I know future releases of Houdini are supposed to support the new OpenGL 2.0 specification, so it will be interesting to see if Houdini 8.0 has OpenGL 2.0 implemented or if it will be the version after that. You're right, XSI is probably the best for OpenGL/NVIDIA shader hardware acceleration, with Maya second.

Cheers,
Nate Nesler
User Avatar
Member
4140 posts
Joined: July 2005
Offline
No arguments, Craig - utilizing GPUs for interactive work is a Good Thing, no matter how big your studio is. I'd love to see more of it in Houdini. I'm just not a fan of the Gelato model (selling hardware by selling absorbed software - the Apple model). The biggest problem is that this sort of thing tends to only work when you have a monopoly or an embraced open standard. We have neither, unfortunately, just a very popular hardware solution that forces you to use their hardware to get this working. OGL2 might help, but it seems to be lagging significantly behind Cg. We'll see!

Cheers,

J.C.
John Coldrick
User Avatar
Member
252 posts
Joined: July 2005
Offline
Actually, I think NVIDIA has pretty much abandoned Cg and moved toward Microsoft's DirectX-standard HLSL. At least that is what an NVIDIA guy told me a while ago, if I remember correctly. But that might be wrong, since I have heard rumors that the PlayStation 3's shaders will be Cg.

Anyway, they are almost identical for the most part. GLSL differs a little more, but we can't expect DirectX support in Houdini, since it is OpenGL-based, and not many developers are going to put in the time to do a DirectX plug-in for Houdini the way they do for Maya (NVIDIA, Ginza, etc.).

I wish someone would use RenderMonkey's SDK (from ATI) and do a plugin for Houdini, so we could use it for shader creation. It's much nicer than coding it all by hand in one big monolithic window or using things like NVIDIA's FX Composer. Plus, RenderMonkey supports OGL2 GLSL.

You guys should download the free Ginza 2 demo and play with it. It is node-based, like a simplified VOPs, with all the main functionality easily accessible, plus instant real-time rendering! Quite amazing. (Also, RenderMonkey is free; you can download it and run all the sample shaders. That is quite amazing too.)

To me this is the future of Houdini and VOPs, but I am sure it will take a while.

-Craig