How to render over 1 billion particles

User Avatar
Member
15 posts
Joined: Aug. 2013
Offline
I have a FLIP simulation cached out as points only, with around 13 million particles per frame. I want to bump up the particle count at render time with the point replicate procedural, to roughly 100 times more, so the total number of particles to be rendered will exceed 1 billion.

Unfortunately, Mantra can't handle that many particles, and Houdini crashes. By the way, I have 64 GB of RAM. So is it possible to render this many particles in 64 GB of RAM?

What is the maximum particle count Mantra can handle, given 64 GB of RAM?
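For context, a rough back-of-envelope estimate shows why 64 GB is tight. The bytes-per-point figure below is an assumption (actual Mantra overhead varies with attributes and acceleration structures), but the order of magnitude is the point:

```python
# Rough memory estimate for the replicated point cloud.
# bytes_per_point is an assumed figure: P (12 bytes) plus
# a few extra attributes and per-point overhead.
base_points = 13_000_000      # cached FLIP points per frame
replication = 100             # point replicate multiplier
bytes_per_point = 48          # assumption, varies per scene

total_points = base_points * replication
total_gb = total_points * bytes_per_point / 2**30
print(f"{total_points:,} points -> ~{total_gb:.0f} GB")
```

Even under these optimistic assumptions the raw point data alone lands near the 64 GB ceiling, before Mantra's own working memory is counted.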
User Avatar
Member
474 posts
Joined: June 2006
Offline
It should be possible to render huge numbers of particles, but I think you will struggle with your RAM.

Have you seen this video?
https://vimeo.com/142517418 [vimeo.com]

With 64 GB of RAM I don't know how many particles you can render… can't help there, sorry…
User Avatar
Member
701 posts
Joined:
Offline
Not sure how many particles you can render in 64 GB, but one trick we have used in the past to get around memory limitations is to use the camera's screen window settings and near/far clip planes to render smaller sections with higher point counts, then stitch them back together. This worked well for fire, where the stitching was additive. You would have to think a bit more if you need shadows across the tiles, but I would expect it to involve generating deep shadows with fewer, larger points to recreate the densities.
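The screen-window part of that trick is pure math: split the camera's normalized screen window into a grid and render each cell as a separate pass. A minimal sketch (the tuple layout here is illustrative, not an actual Mantra parameter format):

```python
# Split a normalized [0,1] x [0,1] screen window into n x n crop
# regions. Each region is rendered as its own pass with the full
# point count, and the passes are stitched afterwards (additively
# for emissive media like fire).
def screen_window_tiles(n):
    """Return (left, right, bottom, top) bounds for each tile."""
    tiles = []
    for row in range(n):
        for col in range(n):
            tiles.append((col / n, (col + 1) / n,
                          row / n, (row + 1) / n))
    return tiles

for tile in screen_window_tiles(2):
    print(tile)
```

With a 2x2 split each pass only has to shade a quarter of the image, so the peak working set per render drops accordingly.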
User Avatar
Member
272 posts
Joined: Dec. 2014
Offline
I recently worked on a set of shots advecting well over 800,000,000 particles with FLIP sims, and Mantra handled it extremely well; a delayed load render is the key. You'll need to cache the particles to disk (the files will be huge), read them back into a shot setup for lighting, and use a delayed load procedural so that they only load at render time.
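The principle behind a delayed-load procedural, sketched generically (this is plain Python illustrating bounded-memory streaming, not the actual Mantra procedural API): geometry is pulled in fixed-size chunks at render time, so only a slice of the cache is ever resident at once:

```python
# Generic illustration of why delayed/streamed loading keeps
# memory bounded: process the cache in fixed-size chunks instead
# of loading every point up front. Chunk size is an assumption.
def stream_points(point_count, chunk_size=1_000_000):
    """Yield (start, end) index ranges covering all points."""
    for start in range(0, point_count, chunk_size):
        yield start, min(start + chunk_size, point_count)

chunks = list(stream_points(2_500_000))
print(chunks)  # never more than 1M points resident at a time
```

The renderer's peak memory is then governed by the chunk size, not the total particle count, which is why caches far larger than RAM remain renderable.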
User Avatar
Member
11 posts
Joined: Sept. 2013
Offline
mandrake0
It should be possible to render huge numbers of particles, but I think you will struggle with your RAM.

Have you seen this video?
https://vimeo.com/142517418 [vimeo.com]

With 64 GB of RAM I don't know how many particles you can render… can't help there, sorry…

Are you working under Windows?
Senior Houdini Effects Technical Director @ Sony Pictures Imageworks
User Avatar
Member
15 posts
Joined: Aug. 2013
Offline
mandrake0
It should be possible to render huge numbers of particles, but I think you will struggle with your RAM.

Have you seen this video?
https://vimeo.com/142517418 [vimeo.com]

With 64 GB of RAM I don't know how many particles you can render… can't help there, sorry…

Yes, I saw it. But it says 4 machines with 200 GB of memory each, so each one takes a quarter, that is 250 million particles. Is that right?
User Avatar
Member
15 posts
Joined: Aug. 2013
Offline
moogtastic
I recently worked on a set of shots advecting well over 800,000,000 particles with FLIP sims, and Mantra handled it extremely well; a delayed load render is the key. You'll need to cache the particles to disk (the files will be huge), read them back into a shot setup for lighting, and use a delayed load procedural so that they only load at render time.

I understand that the delayed load procedural will stream the data from disk while rendering, so it can handle a huge amount of data, is that right?

But the thing is, I don't want to write out that many particles to disk; I just want to bump up the existing particles at render time.
User Avatar
Member
15 posts
Joined: Aug. 2013
Offline
Ivan Pulido Suarez
mandrake0
It should be possible to render huge numbers of particles, but I think you will struggle with your RAM.

Have you seen this video?
https://vimeo.com/142517418 [vimeo.com]

With 64 GB of RAM I don't know how many particles you can render… can't help there, sorry…

Are you working under Windows?

Yes, I am using Windows. So what is the difference?
User Avatar
Member
3925 posts
Joined: June 2012
Offline
zysnow
Yes, I am using Windows. So what is the difference?

The memory allocator is worst under Windows: Linux is the best, with OS X running second and Windows third.
User Avatar
Member
33 posts
Joined: March 2014
Offline
I recently ran into a memory problem, and this can help here too.

Decrease your bucket/tile size. Our 8-core computer didn't run out of RAM, but our 48-core computer did. After some investigation we saw that 48 concurrent buckets loaded more of the scene into RAM at a given time; changing the tile size from 16 to 8, or even 4, kept memory usage just fine.
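The arithmetic behind that observation, as a sketch (per-pixel working memory is whatever the scene demands; the point is only how the totals scale): peak memory grows with thread count times tile area, so halving the tile edge quarters each bucket's pixel count:

```python
# Peak concurrent shading work scales with the number of render
# threads times the pixels per bucket (tile_size squared).
def peak_bucket_pixels(threads, tile_size):
    return threads * tile_size * tile_size

print(peak_bucket_pixels(8, 16))   # 8-core box, 16px tiles
print(peak_bucket_pixels(48, 16))  # 48-core box, same tiles: 6x more
print(peak_bucket_pixels(48, 8))   # drop tiles to 8px: 4x less again
```

This is why the 48-core machine ran out of RAM first, and why shrinking the tile size brought it back under control.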
Director at polycat studio
User Avatar
Member
5 posts
Joined: Jan. 2013
Offline
Goddy McRoodt
I recently ran into a memory problem, and this can help here too.

Decrease your bucket/tile size. Our 8-core computer didn't run out of RAM, but our 48-core computer did. After some investigation we saw that 48 concurrent buckets loaded more of the scene into RAM at a given time; changing the tile size from 16 to 8, or even 4, kept memory usage just fine.

Good information, it helped me a lot, thanks!
https://vimeo.com/user2218930 [vimeo.com]
User Avatar
Member
1520 posts
Joined: Sept. 2015
Offline
I wasn't aware of this thread originally… glad I found it today.

Looking forward to trying the bucket size setting to help with memory.

Thanks for the comment, d4riel, that ended up bumping this thread so that I saw it.