Is there any reason to switch to Karma? It's unbelievably slow.

Member
642 posts
Joined: Aug. 2013
Online
@Soothsayer

Do you mean setting Light samples mode to uniform and convergence mode to path traced? Best, Mark
Member
856 posts
Joined: Oct. 2008
Offline
Mark Wallman
@Soothsayer

Do you mean setting Light samples mode to uniform and convergence mode to path traced? Best, Mark

No, the Variance one below. The idea is to find out what max pixel samples we need to use without interference from other optimizations.

That makes me think: we adjust variance per image plane, but I wonder if per-component control (diffuse, reflection) would be useful too? Just thinking out loud in case SESI is listening in. I really think optimizing those pixel samples is key, and the more tools we get to modify or inspect this, the better. Otherwise Karma just keeps shooting rays at irrelevant parts of the image and we'll never know.
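The Variance convergence mode under discussion works roughly like this: keep adding samples to a pixel until the estimated variance of the mean drops below a noise threshold, or the max pixel sample cap is hit. A toy sketch of the idea (not Karma's actual implementation; `noisy_sample` and all thresholds here are made up for illustration):

```python
import random

def noisy_sample(true_value=0.5, noise=0.2):
    # Stand-in for tracing one camera ray: a noisy radiance estimate.
    return true_value + random.uniform(-noise, noise)

def adaptive_pixel(variance_threshold=1e-3, min_samples=9, max_samples=128):
    # Accumulate samples until the variance of the running mean falls
    # below the threshold, or the max pixel sample cap is reached.
    samples = []
    while len(samples) < max_samples:
        samples.append(noisy_sample())
        n = len(samples)
        if n >= min_samples:
            mean = sum(samples) / n
            var = sum((s - mean) ** 2 for s in samples) / (n - 1)
            if var / n < variance_threshold:  # variance of the mean
                break
    return sum(samples) / len(samples), len(samples)

value, used = adaptive_pixel()
# 'used' stays well below the cap in smooth regions; noisier regions
# (a bigger 'noise' value) push it toward max_samples.
```

Raising the max only matters for pixels that actually fail the variance test, which is why per-plane (or per-component) variance controls would help steer rays to where they're needed.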

Edit: Actually, I saw that just today they changed the pixel sample default from 128 to 9. That's much more in line with other renderers. Not that it changes Karma's inherent speed much, but at least newcomers aren't hit with this huge shock.
Edited by Soothsayer - Nov. 3, 2020 07:14:31
--
Jobless
Member
13 posts
Joined: March 2010
Offline
Daryl Dunlap
Buying 3Delight makes no sense: SideFX already has a production CPU engine in Mantra, and SideFX clearly accepts that CPU engines are a legacy product, as SideFX has literally stated they are not investing in their Mantra codebase anymore.

You don't even have to buy it: the first license is free for commercial use, and you get 1,000 minutes of free cloud rendering.

… GPU is the future and AI has just begun its disruption in the 3D/VFX industry.

No one can really predict the future, but I don't see any disruption coming from AI and GPU on the production side of things, in the medium term at least.
//AK
The 3Delight Team
Member
13 posts
Joined: Dec. 2018
Offline
aghiles
Daryl Dunlap
Buying 3Delight makes no sense: SideFX already has a production CPU engine in Mantra, and SideFX clearly accepts that CPU engines are a legacy product, as SideFX has literally stated they are not investing in their Mantra codebase anymore.

You don't even have to buy it: the first license is free for commercial use, and you get 1,000 minutes of free cloud rendering.

… GPU is the future and AI has just begun its disruption in the 3D/VFX industry.

No one can really predict the future, but I don't see any disruption coming from AI and GPU on the production side of things, in the medium term at least.

It's already starting; in the next five years we'll be surrounded by AI tools in 3D production. One example: https://graphics.pixar.com/library/SuperResolution/ [graphics.pixar.com]
Member
603 posts
Joined: July 2013
Offline
… GPU is the future and AI has just begun its disruption in the 3D/VFX industry.
aghiles
No one can really predict the future, but I don't see any disruption coming from AI and GPU on the production side of things, in the medium term at least.

Edited by TwinSnakes007 - Nov. 12, 2020 10:06:41
Houdini Indie
Karma/Redshift 3D
Member
236 posts
Joined: March 2013
Offline
What is your point? I see a great example, but it uses hard surfaces and approximate motion blur (which looks odd), and plays to the strengths of the GPU. I'm not sure why you are so against CPU offline rendering. Have you worked in studios that are able to push heavy production scenes through a GPU? Because I haven't seen one example of this. I'm for both CPU and GPU, but there is no way offline CPU is going anywhere. Not with the data sets we have to munch.

I did see Moana pushed through a GPU, at 5 hrs for a frame, at 1k. Though that might have been a more academic exercise, so I'm not sure if it's accurate. I tried to find a Redshift or Octane etc. example of the Moana set being rendered, but couldn't see anything. Do you have some heavy data set examples?

There is room and scope for both GPU and CPU.


L
I'm not lying, I'm writing fiction with my mouth.
Member
44 posts
Joined: Oct. 2014
Offline
There is no point when one doesn't know what to say. In such cases one prefers to repeat what someone else is saying, without really understanding the context and/or its implications. This modern-society problem goes beyond CG, by the way.
Member
642 posts
Joined: Aug. 2013
Online
Hi.

So I am testing a range of renderers in Solaris.

For Karma, however, there are a few things that are helping me:

- Add a Light Mixer just before your render, add all lights into a collection, and increase the exposure by 10 stops.
- In the Karma HDA LOP, increase the color limit to stop high-value pixels from clamping (my current limit is 10000 rather than the default 10).
- In Nuke/Fusion etc., take the exposure of your render back down by the same number of stops you increased it by.

This is getting rid of a lot of grain. I use this principle in a lot of render engines.
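The arithmetic behind the trick: one stop is a factor of two, so +10 stops multiplies radiance by 2^10 = 1024, and the color limit has to be raised by roughly the same factor (10 to ~10000) or bright samples get clamped before you pull the exposure back down in comp. A quick sketch in plain Python, just illustrating the numbers, not any renderer's API:

```python
def apply_stops(value, stops):
    # One photographic stop = a factor of two in exposure.
    return value * (2.0 ** stops)

def clamp(value, color_limit):
    # Simplified color limit: samples are capped at this maximum.
    return min(value, color_limit)

bright = 50.0                      # a hot sample, e.g. a specular highlight
boosted = apply_stops(bright, 10)  # lights raised by 10 stops -> 51200.0

# With the default color limit of 10, the boosted sample is crushed, and
# bringing the exposure back down in comp cannot recover it:
lost = apply_stops(clamp(boosted, 10.0), -10)     # 10 / 1024 ~= 0.0098
# Raising the limit to 10000 preserves most of the highlight:
kept = apply_stops(clamp(boosted, 10000.0), -10)  # 10000 / 1024 ~= 9.77
```

In other words, the exposure boost and the raised color limit have to move together, or the comp-side compensation just reveals crushed highlights.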

I am still doing a bunch of testing with different render options. If I find anything useful I will post it up.

Best
Member
603 posts
Joined: July 2013
Offline
tinyhawkus
Have you worked in studios that are able to push heavy production scenes through a GPU? Because I haven't seen one example of this.

Houdini Indie
Karma/Redshift 3D
Member
236 posts
Joined: March 2013
Offline
That's not what I mean. I'm talking heavy destruction, multi-scatter volumes, high speed, all with motion blur.

Also, it would be appreciated if you would respond with more than links to videos. This is a discussion around the topic after all.

L
I'm not lying, I'm writing fiction with my mouth.
Member
603 posts
Joined: July 2013
Offline
A slight tangent to the topic, but relevant nonetheless.

https://www.sidefx.com/community/epic-games-invests-in-sidefx/ [www.sidefx.com]

LiveLink from Houdini is gonna be even better. I'd imagine Studios adjusting Sets in real-time and seeing those results on those LED stages in real-time as well.
Houdini Indie
Karma/Redshift 3D
Member
603 posts
Joined: July 2013
Offline
Hmm…I'm thinking too small here…possibly:

* Unreal Engine as a Delegate/ROP?
* HoudiniEngine at runtime in Unreal?

…that's what I would do if I had access to both IPs.
Houdini Indie
Karma/Redshift 3D
Member
833 posts
Joined: Jan. 2018
Offline
It's an interesting development. My guess is that Epic actually tried to buy SideFX outright, and ultimately SideFX didn't want to sell and this is the best they would both agree to. If SideFX was a publicly traded company, I think it would have been a very different story.
>>Kays
For my Houdini tutorials and more visit:
https://www.youtube.com/c/RightBrainedTutorials [www.youtube.com]
Member
603 posts
Joined: July 2013
Offline
Yup, I was saying the same: Epic definitely has the cash, and some folks said Kim said he'd never sell.

Christopher just posted on FB that nothing “really” will change because of this investment. But one can dream of tighter integrations.
Edited by TwinSnakes007 - Nov. 13, 2020 18:08:14
Houdini Indie
Karma/Redshift 3D
Member
123 posts
Joined: Jan. 2015
Offline
tinyhawkus
What is your point? I see a great example, but it uses hard surfaces and approximate motion blur (which looks odd), and plays to the strengths of the GPU. I'm not sure why you are so against CPU offline rendering. Have you worked in studios that are able to push heavy production scenes through a GPU? Because I haven't seen one example of this. I'm for both CPU and GPU, but there is no way offline CPU is going anywhere. Not with the data sets we have to munch.

I did see Moana pushed through a GPU, at 5 hrs for a frame, at 1k. Though that might have been a more academic exercise, so I'm not sure if it's accurate. I tried to find a Redshift or Octane etc. example of the Moana set being rendered, but couldn't see anything. Do you have some heavy data set examples?

There is room and scope for both GPU and CPU.

L


The Moana scene has been converted to Redshift. Info about it here: https://www.redshift3d.com/forums/viewthread/32082/ [www.redshift3d.com]

Here is the render time from the post: “The above took about 46 minutes to render at 2048x857 on a single RTX 8000.”

But you should read the forum post; you'll learn about some of the problems with getting the render running. Interesting stuff.
Edited by Heileif - Nov. 13, 2020 21:12:06
Member
236 posts
Joined: March 2013
Offline
That article is indeed interesting, and proof of the pitfalls of GPU rendering. There is room for both, as I've said several times, and this illustrates the point. An AMD 3950X was able to render Moana in 13 minutes at 4k, using about 46 GB of RAM, with all SubD and displacement enabled.
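Back-of-envelope throughput from the two quoted renders (assuming “4k” means 3840x2160; this ignores the very different hardware costs, scene conversion effort, and feature sets, so treat it as a rough comparison only):

```python
# Pixel throughput of the two quoted Moana renders, in megapixels/minute.
cpu_pixels = 3840 * 2160   # assumed "4k" resolution of the 3950X render
cpu_minutes = 13.0
gpu_pixels = 2048 * 857    # RTX 8000 render from the Redshift forum post
gpu_minutes = 46.0

cpu_mp_per_min = cpu_pixels / 1e6 / cpu_minutes   # ~0.64 MP/min
gpu_mp_per_min = gpu_pixels / 1e6 / gpu_minutes   # ~0.038 MP/min
ratio = cpu_mp_per_min / gpu_mp_per_min           # ~16.7x in the CPU's favor
```

Even with generous error bars on the assumptions, the gap is large enough to support the point that heavy production scenes are not obviously GPU territory yet.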

Attachments:
shotCam_4k.jpg (2.3 MB)

I'm not lying, I'm writing fiction with my mouth.
Member
603 posts
Joined: July 2013
Offline
….and go!

80.lv: Gods of Mars Will Be Shot Entirely In Unreal Engine.
https://80.lv/articles/gods-of-mars-will-be-shot-entirely-in-unreal-engine/ [80.lv]

TLDR;

Gods of Mars is the first Hollywood production to use real-time rendering of game engine animation/images as finished Final Pixel shots. This is done by our cross-medium creative team that merges game programmers with traditional matte painters and miniature makers with Hollywood pedigrees. To push Unreal Engine to meet the high-quality standards equal to current VFX/animation software
Edited by TwinSnakes007 - Nov. 19, 2020 22:31:03
Houdini Indie
Karma/Redshift 3D
Member
433 posts
Joined: April 2018
Offline
Daryl Dunlap
80.lv: Gods of Mars Will Be Shot Entirely In Unreal Engine.

I'm interested to see how this develops. My main issue with Unreal/Unity filmmaking is that characters tend to look really jank compared to traditional CGI. Real time is great for hard surface and environments, but I've never seen convincing humans or animals. I'm thinking of Oats Studios ADAM (Unity) and the tech demo trailers that Unreal and Unity put out every once in a while. It's just not there yet. Not sure if it's a matter of having the right team of genius artists come along to make it work, or if the tech still needs a few years to mature…
Subscribe to my Patreon for the best CG tips, tricks and tutorials! https://patreon.com/bhgc [patreon.com]

Twitter: https://twitter.com/brianhanke [twitter.com]
Behance: https://www.behance.net/brianhanke/projects [www.behance.net]
Member
603 posts
Joined: July 2013
Offline
Me too, Brian. I'm excited about what the platform (GPU) represents. It's like a feeding frenzy among IP creatives: everyone is creating these…pieces of functionality…and the synergies between them are, like, exponential, inherently.

You create an asset and then there's this entire ecosystem that pushes that asset to its final form in record time.

What AI is doing with inference is sobering as well.
Houdini Indie
Karma/Redshift 3D
Member
31 posts
Joined: June 2016
Offline
Daryl Dunlap
….and go!

80.lv: Gods of Mars Will Be Shot Entirely In Unreal Engine.
https://80.lv/articles/gods-of-mars-will-be-shot-entirely-in-unreal-engine/ [80.lv]

TLDR;

Gods of Mars is the first Hollywood production to use real-time rendering of game engine animation/images as finished Final Pixel shots. This is done by our cross-medium creative team that merges game programmers with traditional matte painters and miniature makers with Hollywood pedigrees. To push Unreal Engine to meet the high-quality standards equal to current VFX/animation software

This reminds me a little of when they shot some episodes (or entire seasons) of shows on an iPhone: more to make the point “yes, it can be done” than because it's better (in quality, in the way of working for the crew, the DI, the …). Also it will probably, as you said, push the engine forward. But I think if you asked an unbiased filmmaker who just wants to produce the best stuff with the fewest problems, they would go for a “real” camera every day (or established offline rendering workflows in this case; not for previz, but for the final image. Virtual production is another beast here because it's new, there is nothing established yet).