If using a 32/64-core Threadripper, GPU or CPU renderer?

Member
40 posts
Joined: April 2020
I posted this in an old thread, but thought I'd start a new one for discussion here as Midphase recommended.

If one were to get a 32- or 64-core Threadripper with 128 GB of RAM and a fast SSD, would it be wise to use a CPU renderer instead of Redshift?

I ask because I've been using the Redshift demo, and it goes on sale at 30% off in a few days. I am planning on building the Threadripper desktop later this fall or winter (Nov/Dec), and I'll likely wait for the Zen 3 architecture. I'll also get whatever the new Nvidia Ampere Ti is called (I think they are calling the Ti the 3090 for now).

I'm not sure if, at that point, I should just use Mantra, Arnold, or some other CPU renderer, or still go with Redshift.

I'm most interested in achieving a sweet spot between realistic sims and reasonable render times.

Thanks to anyone who answers this. I apologize if I'm bad at responding; I have not been getting reliable email alerts when people respond to my posts, so I'll try to be more diligent about checking the actual website instead of relying on email notifications!
Edited by KZLCR - June 19, 2020 17:00:29
Member
833 posts
Joined: Jan. 2018
I feel that the answer is always – it depends. There are a number of factors involved in the choice, including what type of renders you're trying to create and, most importantly, who your clients are and what your production pipeline's needs are.

GPU renderers took the industry by storm a few years ago, with Octane being the first on the scene, followed by Redshift, ProRender, Corona, V-Ray Next and Arnold GPU (I'm not going to include so-called real-time engines that rely on GPUs, such as Unreal).

While at first glance GPU renderers appeared to be a godsend to CG artists, as more and more people started really diving into them, some of the limitations were revealed. For instance, GPU renderers aren't great at rendering VDBs and volumetrics, sometimes making volume-heavy scenes actually slower than on CPUs. Tons of lights can also affect render speeds. GPU renderers are also limited by the memory of the card, and while steps have been taken to optimize this, as well as to pool the memory of multiple GPUs, there are limitations. There are other instances where GPU renderers have a hard time accomplishing what CPU renderers can do fairly easily – caustics, SSS ray marching, complex refractions, etc.

In the meantime, one should also consider that when GPU renderers really took off, CPUs were limited to a relatively small number of cores, and the price of a 12-core Xeon was beyond the budget of most users. GPUs were also more budget-friendly, with the price of a high-end card such as a 980 Ti still well under $1,000.

Nowadays, these trends appear to have reversed – CPUs with 16, 32, even 64 cores have become faster and much more affordable, while GPU cards such as the 2080 Ti have well surpassed the $1,000 mark.

CPU rendering technology has also taken some leaps ahead: Karma, RenderMan and, more recently, 3Delight have improved render speeds considerably, while other traditional CPU renderers have introduced a hybrid approach that utilizes the best of both worlds.

I used Redshift for a number of years, and even gave an enthusiastic presentation on it last year at the Los Angeles HUG. While Redshift has its strengths, I always had render-envy toward other renderers such as Arnold, and even Octane, which IMHO yields the best-looking images of the GPU bunch. The amount of work needed to get Redshift to look realistic IMHO negates a lot of what it offers in terms of speed. I also feel that their business model locks a user into a maintenance plan where, after a couple of years, the cost starts adding up without a whole lot of improvements (or at least not particularly exciting ones). Since the acquisition by Maxon, Redshift development has slowed down considerably, and while on the surface they seem to be more communicative with the user base, in reality very little is revealed, and the information that is given is often incredibly ambiguous, with no word on a release timeline.

Having said that, I am glad that Redshift exists, and there are many users who still find it a vital tool. It is well integrated in Houdini, and considering their upcoming sale, which would bring the price down to about $350, it is a good option for many.

However, at least for me, the cost factor of a GPU renderer starts making less and less sense once one starts adding up the cost of the GPUs. Everything seems to indicate that the upcoming 30xx Nvidia GPUs will be more expensive than the current RTX line, not less. If a couple of 3080 Tis add up to over $3,000, then it starts getting into 64-core Threadripper territory, and personally I would lean toward putting my money into the CPU.

So we're back to the beginning – it depends.

If you're rendering pretty massive scenes with lots of textures and lots of polygons and volumes, and if you demand as much realism as modern renderers can squeeze, then you might want to stick to a CPU renderer.

On the other hand, if you're planning relatively low-geometry renders, and you're ok with somewhat biased imaging that is kinda sorta physically accurate, then a GPU renderer might be right for you.

Personally, I've switched to CPU and I'm not looking back. The render engine I'm using is 3Delight and for my needs it is as fast or even faster than Redshift was on my two 1080ti's. My renders also look considerably better and much more realistic with less effort. If in the future I decide to go back to GPU rendering, I am leaning much more toward Octane for several reasons, including the fact that it still looks better, has acquired more features and better integration in Houdini, and with fast-changing technology a subscription model makes much more financial sense.

Ok, I think I typed enough on the subject. As always these are just my personal opinions and observations based on what works for my personal needs and production pipeline. I am curious to hear what others have to say on the subject.
>>Kays
For my Houdini tutorials and more visit:
https://www.youtube.com/c/RightBrainedTutorials [www.youtube.com]
Member
40 posts
Joined: April 2020

Thanks for your thoughts Kays, I have read a lot of your posts and listened to you speak about renderers on a recent podcast, and I even started checking the 3Delight website fairly often to see when they get out of beta! One thing that does sort of concern me about 3Delight is its integration with the Substance suite, which I have not spent a ton of time with but plan to. I don't know if you can speak to this at all, but Substance does have good Redshift synergy. At least it did in C4D.

Midphase
If you're rendering pretty massive scenes with lots of textures and lots of polygons and volumes, and if you demand as much realism as modern renderers can squeeze, then you might want to stick to a CPU renderer.
So in regards to this comment: this is my goal. Lots of textures and volumes, large-scale water/liquid scenes. I was first thinking about getting a serious CPU for the sim times, but then figured I might as well see if CPU renderers would be a better option when I'm packing that type of heat, hence this post.

Midphase
Personally, I've switched to CPU and I'm not looking back. The render engine I'm using is 3Delight and for my needs it is as fast or even faster than Redshift was on my two 1080ti's. My renders also look considerably better and much more realistic with less effort. If in the future I decide to go back to GPU rendering, I am leaning much more toward Octane for several reasons, including the fact that it still looks better, has acquired more features and better integration in Houdini, and with fast-changing technology a subscription model makes much more financial sense.
I have ALWAYS thought that Octane looked significantly better than Redshift, with much less work. I hear, though, that studios were not keen on using it, as it bogs down in big projects. Also, I guess you can speak to this too: it used to have relatively poor Houdini integration, from what I read. I guess this has changed?

Midphase
Ok, I think I typed enough on the subject. As always these are just my personal opinions and observations based on what works for my personal needs and production pipeline. I am curious to hear what others have to say on the subject.
Me too! If anyone has an opinion on this I would be very interested to hear their thoughts. And in terms of CPU renderers, what is the “best” one for a TR setup?
Edited by KZLCR - June 19, 2020 19:16:55
Member
833 posts
Joined: Jan. 2018
Ok…let me answer based on what I know (which isn't much):

Regarding Substance integration, you can use the Substance plugin for Houdini to bring .sbsar files directly into Houdini COPs, and then use the op:… syntax to connect them to the appropriate texture nodes in 3DL. Having said that, I find it just as easy to do all my texturing and fine-tuning in Substance Painter and Substance Alchemist and just export the textures that I need (typically as EXR files). Since I'm using 3DL, I really don't need more than my Albedo, Roughness and Displacement. I don't use Normal or Bump since they don't buy me anything in 3DL, so connecting 3 texture files (4 if I'm also bringing in Metalness) isn't terribly time-consuming even if I have lots of materials.
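
For anyone who hasn't used the op: syntax before, here is a minimal Python sketch of the idea, run from the Houdini Python shell. The node paths and the "colorTexture" parameter name are placeholders I made up for illustration, so check the actual parameter names on your 3DL material:

import hou

# COP node created by the Substance plugin (hypothetical path)
sbsar_cop = hou.node("/img/comp1/substance1")

# Material whose texture parameter we want to drive from COPs (hypothetical path)
material = hou.node("/mat/dl_material1")

# The "op:" prefix tells Houdini to evaluate the COP node as the texture
# source instead of reading a file from disk.
material.parm("colorTexture").set("op:" + sbsar_cop.path())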

Regarding simulations, CPU is king. I mean, yeah, you can offload some computation to the GPU in Houdini, but for instance the new Sparse Volume Solver will only run on the CPU at the moment. Simulations burn through RAM and CPU, so the more of those two resources you can throw at them, the better.

Regarding Octane – the truth is that high-end studio work is being done on CPU renderers like Arnold. Talk to most of these guys from Method, WETA, etc. and they'll start laughing when you mention GPU rendering (and they'll mock you even harder if you dare mention Blender). Now, I do think that some of that is them not being quite familiar with renderers like Redshift or Octane, or even being a tad snobby; but if the whole point of switching to Houdini was to play with the same toys as the big boys, wouldn't that also apply to the render engine?

3Delight for Houdini is still in beta, but it's pretty stable and good enough for some production work. It doesn't have all the cool bells and whistles that Arnold does, but the developers are very responsive and open to suggestions on how to improve it. Look-wise it's as good as Arnold and Mantra to my eyes. The fact that the full-featured 12-core version is free is kinda nuts. I'm updating my machine to a 16-core Ryzen… or a 32-core TR if work picks back up soon, but even on a “lowly” 8-core 9900k I'm getting what I need out of 3DL within very reasonable render times.

One final point regarding render speed. People sometimes will confuse render speed with time-to-first-pixel speed. They will mention how fast a GPU renderer is based on how responsive the IPR feels. IPR responsiveness is nice, and the 3DL IPR is as responsive as Redshift, but what really matters is how quickly the final quality render is generated, and on that point GPU renderers get more credit than they deserve (while I don't think CPU renderers get enough).
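
To make the distinction concrete, here is a rough Python sketch of how one could time an actual final-quality frame from the Houdini Python shell, instead of judging speed by how quickly the IPR first updates. The ROP path is a made-up placeholder; point it at your own 3DL (or Mantra, Arnold, etc.) output node:

import time
import hou

rop = hou.node("/out/3Delight1")  # hypothetical ROP node path

start = time.time()
rop.render(frame_range=(1, 1))  # blocks until the frame is fully rendered
print("Final-quality frame took %.1f seconds" % (time.time() - start))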
>>Kays
For my Houdini tutorials and more visit:
https://www.youtube.com/c/RightBrainedTutorials [www.youtube.com]
Member
13 posts
Joined: March 2010
KZLCR
….
Thanks for your thoughts Kays, I have read a lot of your posts and listened to you speak about renderers on a recent podcast, and I even started checking the 3Delight website fairly often to see when they get out of beta!

If you wish, you can join us in the beta.
Edited by aghiles - June 20, 2020 13:08:37
//AK
The 3Delight Team
Member
40 posts
Joined: April 2020
aghiles
If you wish, you can join us in the beta.
Hey, thank you, I think I'm ready to try this. Do I just go over to the “free download” section to join the beta? Currently I'm just using 8 cores and 64 GB of DDR4. Also, how close is the first full non-beta release?
And are there tutorials I can reference?

Thanks aghiles!
Edited by KZLCR - July 20, 2020 21:42:47
Member
86 posts
Joined: April 2016
aghiles
If you wish, you can join us in the beta.

Curious about this as well…
Member
833 posts
Joined: Jan. 2018
I don't want to hijack the thread too much with 3Delight info, as this should probably be in the 3rd Party section; however, for anyone interested in trying 3Delight with Houdini, I would recommend joining the Discord group:

https://discord.gg/MGtJx4q [discord.gg]

Also don't forget to set up a free account on 3Delight's site:

https://www.3delight.com [www.3delight.com]


From there you will be able to download the fully featured free version (restricted to 12 threads).

PM me if you have other questions, or create a thread in the appropriate 3rd Party section of this forum and I'll try to chime in with more info.
>>Kays
For my Houdini tutorials and more visit:
https://www.youtube.com/c/RightBrainedTutorials [www.youtube.com]
Member
40 posts
Joined: April 2020
Midphase
I don't want to hijack the thread too much with 3Delight info, as this should probably be in the 3rd Party section; however, for anyone interested in trying 3Delight with Houdini, I would recommend joining the Discord group:

https://discord.gg/MGtJx4q [discord.gg]

Also don't forget to set up a free account on 3Delight's site:

https://www.3delight.com [www.3delight.com]


From there you will be able to download the fully featured free version (restricted to 12 threads).

PM me if you have other questions, or create a thread in the appropriate 3rd Party section of this forum and I'll try to chime in with more info.

Thanks Kays,

I REALLY need to pay more attention to my SideFX messages; I just don't get alerts on responses! I'll have to join that Discord!
Member
3 posts
Joined: March 2021
Midphase
One final point regarding render speed. People sometimes will confuse render speed with time-to-first-pixel speed. They will mention how fast a GPU renderer is based on how responsive the IPR feels. IPR responsiveness is nice, and the 3DL IPR is as responsive as Redshift, but what really matters is how quickly the final quality render is generated, and on that point GPU renderers get more credit than they deserve (while I don't think CPU renderers get enough).


Absolutely! I've experienced this on my last project (using Octane) with a massive (to me) grain sim and smoke sim (100M particles and 100M voxels). As far as actual render time, Octane renders the frame in under a minute, but the time to first pixel is close to 10 minutes! As a result, I'm trying to determine the best CPU renderer to learn. I'm leaning toward Arnold, but maybe RenderMan?