Is there a full GPU renderer available for Houdini?
mandrake0
I just wanted to play with SOHO, but sadly it's only accessible with a full licence (it makes sense because of possible misuse, but an export limitation of, for example, 16k polys would be nicer :-).

So for the moment, no playing with SOHO :-/
eetu
from this thread [sidefx.com]
jeff
And don't forget to pad the budget a bit to pay for the next bit of kit to take advantage of the various renderers and their vastly improving architectures and capabilities. Take that for what you will.

Someone with an active imagination might get all sorts of ideas from that...
anon_user_37409885
I'm convinced that GPUs will become like the FPU: integrated into the CPU.

No one buys an FPU these days.
cgcris
Octane has just been announced:

http://render.otoy.com/newsblog/ [render.otoy.com]
jordibares
Such great news!!! Octane render is so good and fun to use!! :-D
juanjgon
Hello folks !!!

I am Juanjo Gonzalez, the developer of the Octane for Houdini plugin. I hope to have a fully featured Octane plugin in the shortest time possible. I am also the developer of the Lightwave plugin, so at least some of the research on how Octane works as an external render engine is already done.

Currently the plugin is only a prototype, a simple ROP node, but everything is working fine so far. I am working only with the C++ HDK API, not with the SOHO Python architecture, because Octane only has a C++ SDK, and because I am more comfortable working with direct access to the Houdini scene data and node graph from C++ code. I am not sure whether I will run into restrictions without SOHO, but at least the prototype works fine using the HDK, and the loading times are really fast.

I will try to have a beta version as soon as possible, perhaps later this year or early next year, but it is too early to have a fixed roadmap. The project is in its early stages, and I also need to learn a lot of things about Houdini to be sure that the plugin is fully featured and well integrated with it.

Next month I will open a new thread in this forum to post all the news about the plugin development.

Best regards,
-Juanjo
cgcris
Incredible, Juanjo!

If you need any help, please let us know.
houdiniWannabe
I have the Lightwave version, and Juanjo has done an incredible job. Looking forward to it running on Houdini!!
Mirko Jankovic
Accidentally ran into this topic and figured that it deserves an update.
Redshift is available for Houdini now as well:
www.redshift3d.com
fiddybux3d
I found this to be a very interesting read. I arrived here because I found myself asking why my GPU is not being utilised to render my pyro scene with Mantra in H17.5.

I wonder, over six years on since this discussion started, what is the state of play concerning GPU support in Mantra nowadays? Is CPU rendering still the most consistent and reliable option, albeit slower?

Please be gentle. I've only been using Houdini for a couple of weeks.
mestela
https://www.youtube.com/watch?v=emcT5qXdUsc&t=46m20s [www.youtube.com]

Karma is the Mantra replacement due in H18, GPU support is planned.
fiddybux3d
That's great. A good time to get on board!

I'll look forward to Karma when it lands.
Andr
mestela
https://www.youtube.com/watch?v=emcT5qXdUsc&t=46m20s [www.youtube.com]

Karma is the Mantra replacement due in H18, GPU support is planned.
I asked a few times already, but still haven't found out whether:
1) it's meant to be XPU: rendering the frame utilizing both CPU and GPU power at the same time.
2) multiple GPUs will be supported.


I'm planning to buy a new PC in the near future, and the answers to 1) and 2) would better guide my choice.
fiddybux3d
Good question. It would certainly influence my upgrade decisions as well.
Midphase
Andr
I asked a few times already, but still haven't found out whether:
1) it's meant to be XPU: rendering the frame utilizing both CPU and GPU power at the same time.
2) multiple GPUs will be supported.

I don't have time to go dig around, but this has been addressed during the SIGGRAPH presentation and on the forums. Watch the SIGGRAPH presentation and stick around for the Q&A afterwards.

But to offer answers based on what has been officially said:

1. CPU-only right now, but eventually (H18.5? H19?) they will start tapping into the GPU as well.

2. See answer #1, but I would think yes once that functionality arrives in the future.