Any tips for optimising large numbers of blendshapes?

Member
66 posts
Joined: April 2019
Hi,

I have a workflow whereby some 616 blendshapes are generated on incoming characters and then ultimately exported as an FBX file.

Depending on the point count of the incoming characters, the total amount of data can get huge.

For example, if the incoming character has a total point count of 278,000 points, then when 616 blendshapes are generated that's a total of 171,248,000 points at play when exporting as an FBX.
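
(For scale: just the positions for 171,248,000 points come to roughly 171,248,000 × 12 bytes ≈ 1.9 GB, assuming 3 × 4-byte floats per point, before any other attributes or FBX overhead.)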

Am I correct in assuming that each blendshape must always contain the same point count as the original mesh?

Is there a way, when generating blendshapes, to only include the points that have actually changed?

I understand that the blendshapes are packed for the most part, but presumably they get unpacked when it comes time to export them as an FBX: memory usage goes through the roof when exporting via the ROP FBX Character Export node, often crashing Houdini.

Does anyone have any tips for optimising this workflow or is it just a case of requiring more RAM?

Thanks!
Member
8554 posts
Joined: July 2007
You can definitely optimize your blendshapes by keeping only the points that changed per blendshape and assigning an i@id attribute that stores the original @ptnum.
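
A minimal sketch of that idea in a Point Wrangle (blendshape geometry in input 0, base mesh wired into input 1; this assumes both share point numbering, and the 1e-6 threshold is an arbitrary choice):

    // Run over the blendshape points; the base mesh is on input 1.
    vector base = point(1, "P", @ptnum);
    if (length(@P - base) < 1e-6) {
        removepoint(0, @ptnum);  // drop points that didn't move
    } else {
        i@id = @ptnum;           // remember the original point number
    }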

Character Blend Shapes Add SOP does this by default, so if FBX Skin Import doesn't import blendshapes in this form, I wonder if that would be a good RFE or if there is a reason for it.

But regardless, I assume you should be able to process your blendshapes this way.
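
For reference, here is a rough sketch of how a sparse shape could be blended back onto the base by id in a Point Wrangle (base mesh in input 0, sparse shape in input 1; the "weight" channel is a made-up parameter, and Character Blend Shapes Add handles this for you natively):

    // Run over the base points; look up a matching sparse point by id.
    float weight = chf("weight");
    int src = findattribval(1, "point", "id", @ptnum);
    if (src >= 0) {
        vector target = point(1, "P", src);
        @P = lerp(@P, target, weight);  // blend toward the shape position
    }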
Tomas Slancik
FX Supervisor
Method Studios, NY
Member
66 posts
Joined: April 2019
Thank you so much Tomas, my saviour on more than a few occasions.

If this method works when exporting the FBX, it's going to be a game changer for my workflow.

I'm going to give it a try tonight.

Thanks again.