Hi,
I have a workflow in which 616 blendshapes are generated on incoming characters and then ultimately exported as an FBX file.
Depending on the total point count of the incoming characters, the data can get huge.
For example, if the incoming character has a total point count of 278,000 points, then when 616 blendshapes are generated that's a total of 171,248,000 points at play when exporting as an FBX.
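For a rough sense of scale, here is a back-of-the-envelope sketch (my assumptions: float32 positions only, ignoring normals, topology and FBX overhead, so real usage is higher):

```python
# Back-of-the-envelope estimate of raw blendshape position data,
# assuming float32 positions only (actual FBX data also carries
# normals, indices and other attributes, so real usage is higher).
points_per_shape = 278_000
num_shapes = 616
bytes_per_point = 3 * 4  # x, y, z as 32-bit floats

total_points = points_per_shape * num_shapes
total_gib = total_points * bytes_per_point / 1024**3

print(total_points)          # 171248000 points
print(round(total_gib, 2))   # ~1.91 GiB of positions alone
```

Even the bare positions come to roughly 2 GB once every shape is unpacked, before any other attributes or FBX bookkeeping.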
Am I correct in assuming that each blendshape must always contain the same point count as the original mesh?
Is there a way, when generating blendshapes, to only include the points that have actually changed?
I understand that the blendshapes are packed for the most part, but when it comes time to export them as an FBX they presumably get unpacked, because memory usage goes through the roof when exporting via the ROP FBX Character Export node, often crashing Houdini.
Does anyone have any tips for optimising this workflow or is it just a case of requiring more RAM?
Thanks!
Any tips for optimising large numbers of blendshapes?
- mrpdean (Member, joined April 2019)
- tamte (Member, joined July 2007):
you can definitely optimize your blendshapes by keeping only the points that changed per blendshape and assigning an i@id attribute that stores the original @ptnum
the Character Blend Shapes Add SOP does this by default, so if FBX Skin Import doesn't import blendshapes in this form, I wonder whether that would be a good RFE or whether there is a reason for it
but regardless, I assume you should be able to process your blendshapes this way
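A sketch of that idea as a Point Wrangle (untested; the 1e-6 tolerance and the assumption that the rest/base mesh is wired into the second input are mine):

```vex
// Point Wrangle on the blendshape geometry; rest/base mesh in input 2.
// Delete points that haven't moved, and store the original point
// number on the ones that have, as suggested above.
vector rest_P = point(1, "P", @ptnum);
if (distance(@P, rest_P) < 1e-6) {
    removepoint(0, @ptnum);   // unchanged: drop it from this shape
} else {
    i@id = @ptnum;            // changed: remember the original @ptnum
}
```

Since most facial blendshapes only move a small region of the mesh, this typically shrinks each shape to a small fraction of the full point count.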
Tomas Slancik
FX Supervisor
Method Studios, NY