I'm working on the destruction of a very high poly count model (it will be around 1-2 million points at least).
I've found that the usual File cache bgeo.sc method is quick and easy to load back, with decent responsiveness.
This, of course, is no good for Maya. I've looked at FBX export, since UVs need to be maintained, and also at Alembic caching, which ends up with huge file sizes and usually brings Maya to a halt.
I've also looked at caching everything through the bgeo.sc method, then splitting out smaller groups of geometry from the simulation and caching those separately as Alembics.
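Not Houdini-specific, but the "splitting out smaller groups" step can be sketched as a simple greedy batching pass over the sim's pieces, so each Alembic stays under a point budget. The piece names and the budget below are made up purely for illustration:

```python
def batch_pieces(pieces, max_points):
    """Greedily pack (name, point_count) pairs into batches whose
    combined point count stays under max_points, so each batch can
    be written to its own smaller Alembic file."""
    batches, current, current_points = [], [], 0
    # Largest pieces first tends to pack more evenly.
    for name, npts in sorted(pieces, key=lambda p: -p[1]):
        if current and current_points + npts > max_points:
            batches.append(current)
            current, current_points = [], 0
        current.append(name)
        current_points += npts
    if current:
        batches.append(current)
    return batches

# Hypothetical debris groups, split under a 500k-point budget per file.
pieces = [("wall_chunks", 400_000), ("floor_debris", 300_000),
          ("dust_mesh", 150_000), ("rebar", 50_000)]
print(batch_pieces(pieces, 500_000))
# → [['wall_chunks'], ['floor_debris', 'dust_mesh', 'rebar']]
```

Each resulting batch could then drive one ROP Alembic Output node (e.g. via a group expression), keeping individual file sizes manageable for Maya.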
Just wondering if anyone can point me in the right direction on what the best practice is for this type of thing.
Thanks in advance
Technical Discussion » Best practice caching heavy geo in Houdini to use in Maya
- andyhowell77