H21 MetaHuman for Houdini workflow Documentation

Member
1 post
Joined: Sept. 2025
Offline
Hi all, I've been experimenting with the MetaHuman for Houdini pipeline, getting my MetaHuman working with the APEX system in Houdini (via the licensed HDA from the FAB marketplace), but I'm running into issues with retargeting animations for the MetaHuman. I've been scouring the internet for documentation and information on the process but haven't found anything worthwhile.

I'd greatly appreciate either specific assistance with MetaHumans and retargeting in Houdini, or more general documentation on the Houdini MetaHuman system. Any form of documentation for the updated H21 system and nodes would be great.

Thanks heaps!
Member
8108 posts
Joined: July 2005
Offline
Please try again with the latest (21.0.475+) daily build, as a fix has been committed for using the APEX Control Extract SOP with the MetaHuman rig. If you're talking specifically about exporting MetaHuman animation from Unreal Engine to Houdini, then we also have a specific solution, but it's more involved.
Edited by edward - Sept. 16, 2025 19:30:00
Member
62 posts
Joined: July 2013
Online
Oh, I thought there was some official documentation, but it led here! Following on from this, it appears there is a MetaHuman .dna read now but no MetaHuman .dna write. Is that currently the case, or am I mistaken? I think that's the missing ingredient in a custom MetaHuman DCC turnaround.
https://tekano.artstation.com/ [tekano.artstation.com]
Member
8108 posts
Joined: July 2005
Offline
There is currently no DNA exporter from Houdini. From what I gather, this is something that Epic Games wants to add.
Member
62 posts
Joined: July 2013
Online
Indeed, .dna out from Houdini would complete the puzzle.
Currently, as of the latest daily 21 build, FBX Character Import and FBX Archive Import STILL alter the vertex count / UV seams / order (or something), which blocks the round trip from MetaHuman 5.6 > Houdini > Unreal 5.6 for editing things like face morph targets or conforming the template for a custom MetaHuman.

There are a few workarounds, like importing the FBX with the File SOP while being careful about vertex counts, but the new MetaHuman identity creator tool lets you export an .obj of the base mesh! Super handy: you can use this .obj as the in and out for the MetaHuman work, and conform / make morph targets etc. with it, since .obj respects the vertex count and order, unlike FBX.
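For anyone following along, here's a rough sanity check for that round trip: parse two Wavefront .obj exports and confirm they share the same vertex count and face indexing (which is exactly what a conform / morph-target workflow must preserve). This is my own throwaway helper, not part of any MetaHuman tooling, and it assumes plain .obj text.

```python
def read_obj(lines):
    """Collect vertex positions and face index tuples in file order."""
    verts, faces = [], []
    for line in lines:
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":
            verts.append(tuple(float(p) for p in parts[1:4]))
        elif parts[0] == "f":
            # keep only the position index of each corner (before any '/')
            faces.append(tuple(int(c.split("/")[0]) for c in parts[1:]))
    return verts, faces

def same_topology(lines_a, lines_b):
    """True if both meshes agree on vertex count and face indexing;
    positions are allowed to differ (that's the point of conforming)."""
    verts_a, faces_a = read_obj(lines_a)
    verts_b, faces_b = read_obj(lines_b)
    return len(verts_a) == len(verts_b) and faces_a == faces_b
```

If this returns False between your export and re-import, the DCC has reordered or changed the mesh and morph targets will break.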

So there are some workarounds currently for APEX / MetaHuman rigging. Have you seen this video (it needs translation)? There's literally no other info so far on how to do this:

https://www.youtube.com/watch?v=xSeAe3sVhJE [www.youtube.com]
Edited by Rob Chapman - Sept. 28, 2025 11:38:58
https://tekano.artstation.com/ [tekano.artstation.com]
Member
8108 posts
Joined: July 2005
Offline
Note that I've posted another example here: https://www.sidefx.com/forum/topic/102043/?page=1#post-450286 [www.sidefx.com]
Member
21 posts
Joined: March 2017
Offline
TL;DR - MetaHuman to APEX documentation is lacking, but thank you greatly for this example file, Edward!

Hi there, I've been looking into using MetaHumans for a project at the moment and find that I'm running into a lot of pitfalls and gotchas with the workflow, as the documentation feels limited.

However, I'd like to say that after stumbling across this post, the example you posted above, Edward, was EXTREMELY useful. It would be amazing to have this file documented and exported to the content library, as it fixed a number of issues I was having while trying to navigate the workflow on my own.

Firstly, the autorig builder for me was incredibly slow and almost unusable when I tried it on a newly generated MetaHuman, and after many attempts at getting it to work it still had visible errors on the output Base.shape. I tried the autorig component in your example file and it just worked, but I'm still not fully sure why this one did versus what I was doing. I can see a few different tick boxes enabled/disabled and a couple of different tags written in the advanced sections, though I wouldn't have been able to figure these out without the help of this example file.

Secondly, the body and face animation subnets in the example file were also very useful, and again a bit of documentation about the how and why would be great. I'd successfully got animation to work directly on the skeleton when NOT using a rig; however, when I added my own rig plus the animation, I ended up with lots of horrific skeleton mangling. Again, this little example fixed that and allowed me to have my animation as well as a working rig to override it.

So thank you for providing that file, and I hope this workflow gets a lot of love over the coming Houdini updates, as it sure feels like it has a huge amount of potential!
Member
217 posts
Joined: April 2009
Offline
I'm a bit late to the party but I've been running into some issues.

First, if I "DCC export" a MH from Unreal 5.7.2 and use the MetaHuman character rig from Epic to get the MH into Houdini, that works like a charm.
But then, if I use an autorig builder, mirroring is "mirrored": if you keep *_l in Left and *_r in Right, it errors, but if you put *_r in Left and *_l in Right, it works. A bit strange and hard to find, but OK.
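To make the swap concrete, here's a toy illustration of that workaround: the pattern that intuitively belongs to "Left" (*_l) has to go in the "Right" slot and vice versa. Joint names and slot names here are hypothetical examples, not the actual autorig builder parameters.

```python
import fnmatch

def split_sides(joint_names, left_slot="*_r", right_slot="*_l"):
    """Bucket joints by glob pattern; note the defaults deliberately put
    the *_r pattern in the Left slot, matching the workaround above."""
    left = [j for j in joint_names if fnmatch.fnmatch(j, left_slot)]
    right = [j for j in joint_names if fnmatch.fnmatch(j, right_slot)]
    return left, right

# Example: upperarm_r lands in the Left slot, upperarm_l in the Right slot.
left, right = split_sides(["upperarm_l", "upperarm_r", "spine_01"])
```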

Then, if I add a Scene Add Character node and a Scene Animate, I can use the controls for the face just fine; it works great. The only thing is that AutoKey doesn't work here, but I don't like to use that in production anyway.

However, for the body there is no control rig. There are a few dots that I can grab and rotate, FK style, but nothing else: no handles for IK, nothing.

Then I tried the same setup from Edward's example file to add the body animation. Hey, that works, more or less. The character moves but is about 25 cm lower than the FBX animation and the skeleton that comes out of the body_anim subnetwork. No clue what's happening there.

Next up, face animation; that would be cool. Alas, not working. So I dove into the face_anim network and output from after the Attribute Promote, and now the jaw and eyes are moving.

Next, in the example file there is an APEX Scene Animate node, and if I select that, all incoming animation is gone.

Pff, it's not easy. I understand this is still heavily in development, and it looks extremely promising, but there are still quite a few little annoyances that make it hard.

Good documentation would probably help, and a short tutorial on how all this is intended to work would be even better.

But for now, if somebody can help me to get the body rig working, I would already be a happy camper.

On a more general note: I understand SideFX is very keen on getting a bigger piece of the pie in character-animation land, and I can see that happening. But for this to take off with less tech-savvy people (animators, students, etc.), the whole process of getting your own custom character or creature rigged up and ready to animate needs to be easier. Right now you almost need a PhD in rigging to get started. That might fly for bigger studios with very talented and smart TDs, but in smaller studios that don't have those kinds of people, Houdini is not foolproof enough to really make a dent in the position Maya still holds in this field and that Blender is keen to take over.
But that is just my 0.02 Euro.

Cheers

Rudi
Member
217 posts
Joined: April 2009
Offline
Another day, another step, hopefully forward.
So now I've gotten the MH into LOPs, and there are some hurdles here as well.

In SOPs, after a Scene Invoke where you output Unpacked Geometry, you've got your MH, but still with loads of packed geos, for the blendshapes I guess. The material that comes in with the "metahuman character rig" SOP has an mh_anim_shader that takes care of the animated maps for the different blendshapes of the face. This seems to work in SOPs, but when you get it over to LOPs, not so much.
In LOPs, all the packed geos for the blendshapes come in as points as well, so a Blast of _3d_hidden_primitives can take care of that.
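A minimal sketch of that cleanup step, purely for illustration: drop any primitive path containing the hidden-blendshape marker, which is what a Blast with a `*_3d_hidden_primitives*` pattern does. The sample paths are made up; the marker name is the one from this thread.

```python
def drop_hidden(prim_paths, marker="_3d_hidden_primitives"):
    """Keep only prims whose path does not mention the hidden marker."""
    return [p for p in prim_paths if marker not in p]

# Hypothetical scene paths: the middle one would render as stray points.
paths = ["/mh/head", "/mh/_3d_hidden_primitives/shape_001", "/mh/body"]
visible = drop_hidden(paths)
```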

For now, the only way I've gotten anything decent is by reassigning /obj/geo1/metahuman_character_rig1/mh_base_shader/head_shader to the head, so bye-bye animated maps :-(
In UE, the shader for the face has a pretty nifty way of using some sort of SSS that adds to the realism. Here the shader does not have that. Of course you can add it yourself, but it takes a bit of guessing to find the optimal settings.

Anyway, here is what I've got so far:
Edited by RudiNieuwenhuis - Jan. 31, 2026 07:09:25

Attachments:
MH_test.mp4 (11.5 MB)

Member
8108 posts
Joined: July 2005
Offline
Some tips on importing into Solaris

- On APEX Scene Invoke:
* Make it Output Unpacked Geometry
* To be sure you only get the skin geometry, turn off "Character Shapes"; in Extra Outputs, use the direct path /something.char/Base.rig/output, and for the Key use Base.shp

- In Solaris, use the Scene Import node to import into LOPs
Edited by edward - Feb. 2, 2026 08:50:06
Member
217 posts
Joined: April 2009
Offline
edward
Some tips on importing into Solaris

* To be sure you only get the skin geometry, turn off "Character Shapes"

Then there is no geometry at all coming out:



Ah, wait. So, if I understand this correctly, you have to manually specify what you want to output in order to NOT output the blendshapes? But those blendshapes are useful, no? Or why else is there an mh_anim_shader?
Edited by RudiNieuwenhuis - Feb. 2, 2026 10:07:08

Attachments:
Emtpy.png (1.0 MB)

Member
8108 posts
Joined: July 2005
Offline
RudiNieuwenhuis
Ah, wait. So, if I understand this correctly, you have to manually specify what you want to output in order to NOT output the blendshapes? But those blendshapes are useful, no? Or why else is there an mh_anim_shader?

That's correct. The blendshapes are no longer useful because the APEX Invoke SOP will have already applied them; we're trying to render the final deformed character, right? This is what I've got:
Edited by edward - Feb. 2, 2026 12:53:36

Attachments:
APEX_Scene_Invoke_parms.png (50.3 KB)

Member
8108 posts
Joined: July 2005
Offline
(I edited my above post to have an even simpler way)
Member
217 posts
Joined: April 2009
Offline
edward
RudiNieuwenhuis
Ah, wait. So, if I understand this correctly, you have to manually specify what you want to output in order to NOT output the blendshapes? But those blendshapes are useful, no? Or why else is there an mh_anim_shader?

That's correct. The blendshapes are no longer useful because the APEX Invoke SOP will have already applied them; we're trying to render the final deformed character, right?

And here I was, thinking that for some blendshapes there would also be a blend between different maps, normal maps mostly I think, but I guess I'm wrong about that.

Thanks for clearing that up for me.

Now only the weird offset and the _L and _R inversion remain.
We'll get there eventually; it's not a sprint, it's a marathon... :-)
Member
217 posts
Joined: April 2009
Offline
edward
RudiNieuwenhuis
Ah, wait. So, if I understand this correctly, you have to manually specify what you want to output in order to NOT output the blendshapes? But those blendshapes are useful, no? Or why else is there an mh_anim_shader?

That's correct. The blendshapes are no longer useful because the APEX Invoke SOP will have already applied them; we're trying to render the final deformed character, right? This is what I've got:


This will actually not work because it brings in all the _3d_hidden_primitives as well and renders those as points.
Edited by RudiNieuwenhuis - Feb. 2, 2026 15:07:44
Member
8108 posts
Joined: July 2005
Offline
RudiNieuwenhuis
And here I was, thinking that for some blendshapes, there would also be a blend between different maps, normal maps mostly I think, but I guess I'm wrong on that.

It doesn't work like that. The mh::RigLogic APEX node creates the animated blend values that the material uses to blend between the various textures. Those maps have nothing to do with the blendshape geometries here, which are for the actual face deformation. For the animated wrinkles etc., the material just needs the blend weights.
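A back-of-the-envelope sketch of the idea: per-frame blend weights (like those a RigLogic-style evaluator emits) mix a base map with animated wrinkle maps. Pixel values are plain floats here purely for illustration; a real shader does this per texel on the GPU, and the function names are my own.

```python
def blend_maps(base, animated_maps, weights):
    """Lerp the base map toward each animated map by its weight, in order.
    `base` and each entry of `animated_maps` are flat lists of pixel
    values of equal length; `weights` holds one blend value per map."""
    out = list(base)
    for amap, w in zip(animated_maps, weights):
        out = [o + w * (a - o) for o, a in zip(out, amap)]
    return out
```

With weight 0 you get the neutral base texture; with weight 1 you get the full wrinkle map, with no blendshape geometry involved at all.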

RudiNieuwenhuis
This will actually not work because it brings in all the _3d_hidden_primitives as well and render those as points.
I don't see that on my end at all. However, my version of the plugin is from the initial release; I don't know if things have changed since. You can always delete them manually.
Member
217 posts
Joined: April 2009
Offline
edward
RudiNieuwenhuis
This will actually not work because it brings in all the _3d_hidden_primitives as well and render those as points.
I don't see that on my end at all. However, my version of the plugin is from the initial release; I don't know if things have changed since. You can always delete them manually.

The version of the HDA that I use is from 5.7.2, and the OTL seems to be version 1.0 (metahuman character rig), in a folder that has 0.6.4 in it.
There is no option or menu to choose whether you want a control rig for the body as well, and one does not get created.

There are now a MetaHuman body animation import and a MetaHuman face animation import that work right away, without the subnetwork from your example, but they do more or less the same thing: some remapping and an animation from skeleton.

I'd like to know whether the control rig for the body is supposed to be created automatically or not. I mean, the face has this great control panel that comes for free and works well, but do we have to make our own control rig for the MH body, or is this supposed to be created automatically as well?
Member
217 posts
Joined: April 2009
Offline
I did a bit of research, and I'm sorry to contradict you, @Edward, but my initial impressions about the animated maps were right.
In the MH folder there is a Maps folder, containing Head_Basecolor_Animated_CM1, 2 and 3 and Head_Normal_Animated_WM1, 2 and 3.
Combined with the SourceAssets TGA masks, they provide different basecolor and normal maps for parts of the face in certain facial expressions. You can see that clearly if you open them side by side or scroll through them: parts of the face become more wrinkled, and some parts become more reddish or whitish, depending on the amount of blood in that part of the face.
By completely ignoring this and just using the head base textures, you're missing out on quite a lot of realism, and that's a shame imho.
Now if only I could get this animated-maps shader working in LOPs...

One thing I noticed here is that the reference to chs("../../../mhfolder") does not translate well when you take this to LOPs.
If you make a copy of the mh_anim_shader in a material library and assign it to the head, you can get it to work if you add a parm for the folder on the shader subnetwork and change chs("../../../mhfolder") to chs("../mhfolder") (in a lot of places).
Now the only thing that needs to be done is driving the parms here, maybe with primvars?
Hmm, developers? :-D
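For anyone repeating that bulk edit, a throwaway helper along these lines can do the rewrite mechanically: collapse any chs("../../../mhfolder")-style reference (whatever its ../ depth) to point one level up, assuming the folder parm has been promoted onto the copied shader subnetwork. "mhfolder" is the parm name from the post; everything else is my own sketch.

```python
import re

# Match chs("../mhfolder"), chs("../../../mhfolder"), etc.
_CHS_REF = re.compile(r'chs\("(?:\.\./)+mhfolder"\)')

def retarget_chs(expr):
    """Collapse the relative channel reference to chs("../mhfolder")."""
    return _CHS_REF.sub('chs("../mhfolder")', expr)
```

Run it over each parameter expression (e.g. via a small Python shell loop over the copied shader's parms) instead of editing every field by hand.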
Edited by RudiNieuwenhuis - Feb. 3, 2026 05:39:53
Member
217 posts
Joined: April 2009
Offline
Now this works:

The only thing left to figure out is how to get the face rig to drive all these parms.

Attachments:
Wrinkles.png (2.0 MB)

Member
286 posts
Joined: Aug. 2015
Offline
Hey! I'm trying to get a MetaHuman-to-APEX workflow ready for a project here but keep running into walls.
First up this one... any idea what is going on here?

https://www.youtube.com/watch?v=7xXl9SPE-yk [www.youtube.com]

Appreciate any insight. Thanks