Setting the particle separation to the maximum your hardware allows and dialing down the Velocity Smoothing to 0.001 got me the fine details in this project.
https://vimeo.com/130328238
Simple, but very effective.
Cheers,
Bernd
Found 5 posts.
Search results
Technical Discussion » Creating a slow river with FLIP
- Friday
- 5 posts
Technical Discussion » Splashtank Exploding
- Friday
- 5 posts
At the moment I'm stuck. I'm working on a hi-res Splashtank sim which will take several nights to simulate, and I am not able to resume the simulation: the fluid explodes on the resumed frame. I have been trying for days to figure out where my error lies and don't know which buttons to press anymore.
I have been working on hi-res sims before and never had any issue with the workflow, so I'm not sure if there are some Splashtank settings I'm missing.
This is my first project in Houdini 15. Maybe something changed?
In this example I resumed at frame 35.
Maybe someone could have a look at the file.
Thanks for helping,
cheers
Bernd
Houdini Indie and Apprentice » Whitewater solver and RAM usage
- Friday
- 5 posts
Hello johner,
I only saw your post today. I thought I had set up my account to automatically receive emails for replies to my posts; it seems I hadn't. Thanks for the tips, but sadly I had to delete the simulation data to free disk space. The solution I came up with for now was to cut the simulation into smaller pieces and make five shots of it. I wasn't very happy about this; it costs me a lot of extra nights. I'm just rendering the first shot and it looks crazy real, but simulation and render times are nuts. I will definitely post the result here, but it will probably take another two months.
Cheers
Houdini Indie and Apprentice » Whitewater solver and RAM usage
- Friday
- 5 posts
Hello,
I am on Linux with Houdini 14.0.247, trying to do a large-scale river simulation with some waterfalls. I just finished the FLIP simulation, with which I was very happy; it has about 105,000,000 particles. I did some testing with different methods of creating whitewater and found that the whitewater shelf tool gives you really beautiful results with little effort.
The main issue I have now is the extreme memory consumption of the whitewater solver. I turned off bubbles and mist and set the birth rate from 100 down to 20. On my 96 GB machine, where the base simulation uses about 50 GB of RAM (which I also think is quite high), the whitewater, which now has “only” 50,000,000 particles, ate all the remaining RAM plus 16 GB of swap and then crashed Houdini.
It crashes before I reach my initial-state frame. Before the crash, the .sim file of the base sim is 17 GB; the .sim file of the whitewater sim is only about a third of that, around 5 GB.
After dialing down the birth rate and not seeing any improvement, I set the birth rate to 1 for the sake of testing. With that it solves through for the first time: I get a bit more than 3 million particles, which use another ridiculous 32 GB of RAM on top of the 50 GB for the FLIP simulation.
I think I am doing something fundamentally wrong here. Maybe someone could have a look at this.
Attached is the Houdini file and a test render. I think it turned out really nice and would love to finish this.
Cheers
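A back-of-the-envelope check (plain Python, using the numbers from the post) shows why those figures look wrong: a whitewater particle carries only a handful of float attributes, so per-particle costs in the kilobyte range suggest the memory is going somewhere other than the particles themselves.

```python
def bytes_per_particle(total_bytes, particle_count):
    """Average memory cost per particle for a simulation."""
    return total_bytes / particle_count

GB = 1024 ** 3

# FLIP base sim: ~50 GB of RAM for ~105 million particles
flip = bytes_per_particle(50 * GB, 105_000_000)

# Whitewater test at birth rate 1: ~32 GB for ~3 million particles
ww = bytes_per_particle(32 * GB, 3_000_000)

print(f"FLIP:       {flip:,.0f} bytes/particle")  # roughly 511 bytes
print(f"whitewater: {ww:,.0f} bytes/particle")    # roughly 11,500 bytes
```

Around 500 bytes per FLIP particle is plausible for position, velocity, and a few extra attributes; 20× that for a whitewater particle is not, which is consistent with the suspicion that something fundamental is misconfigured in the setup.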
Technical Discussion » Memory Leak in H13 with Flips maybe ?
- Friday
- 5 posts
Having the same issues on Windows as everybody else, and wanting to get into large-scale FLIP simulations, I decided to install Ubuntu 14.04 LTS alongside my Windows 7 installation. I thought this would be the key to solving all my problems. It took me a few days to get my head around it all, and finally I could start a simulation which I had already tested on Windows.
A simulation with a constant particle count of about 40 to 46 million starts out occupying about 20 GB of my 48 GB of RAM, fills up the memory completely within about 100 frames, and crashes Houdini. (I did not activate a swap partition because I thought I would not need it for now.) On Windows the RAM also fills up, but to a far lesser extent than on Linux: with the exact same sim it started at 15 GB and went up to 35 GB at frame 160.
I also noticed that the .sim files written out on Linux are bigger than the ones written out on Windows, and really the only thing I changed in the Houdini file is the location of the explicit cache.
I already set the Cache Memory to 0 and enabled Explicit Cache.
As I am totally new to Linux, I'm sure I'm missing some settings.
Help would be appreciated.
Cheers.
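To tell whether memory really grows frame over frame (a leak) rather than just starting high, it helps to log the resident set size of the Houdini process while the sim runs. A minimal Linux-only sketch in plain Python, reading `/proc` directly (the PID would be whatever `ps` reports for the Houdini process; here it monitors itself just to demonstrate):

```python
import os

def rss_gb(pid):
    """Resident set size of a process in GB, read from /proc/<pid>/status."""
    with open(f"/proc/{pid}/status") as f:
        for line in f:
            if line.startswith("VmRSS:"):
                kb = int(line.split()[1])  # VmRSS is reported in kB
                return kb / (1024 ** 2)
    return 0.0

# Example: sample our own process; in practice, pass Houdini's PID and
# sample once per simulated frame to see whether RSS climbs steadily.
print(f"current RSS: {rss_gb(os.getpid()):.3f} GB")
```

Sampling this once per frame and plotting the values makes it easy to distinguish a genuine leak (steady climb) from a one-time allocation difference between the Windows and Linux builds.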