Constraining to actual world position of IK joints.

Member
436 posts
Joined: July 2005
Hello

I am running into a vexing, no pun intended, little problem. Using the Magnet SOP with splines and metaballs to emulate musculature works well for facial animation and holding animation, i.e. breathing: basically any kind of motion, primary or secondary, that can be accomplished independently of and before skeletal motion, via the Deform or Skeleton SOPs. So in a complete rig, the facial and holding animation SOPs are ‘upstream’ of the skeletal rigging SOPs. I hope I am not confusing anyone.

OK, my approach does not work very well when muscle-emulating deformers such as Lattice, Magnet, etc. need to work ‘downstream’ of the skeletal rigging SOPs. Specifically, I need to pass positional information from the actual bone positions (in the IK chain), via expressions, to the positions of the deform SOPs, and use it to drive other aspects of the deformation SOPs through CHOPs or expressions. Sounds simple, doesn't it? Just create a few expressions. Well, there's a hang-up.
The problem is that the positions of the bones (in their transform fields) remain at 0,0,0 during IK solving, while the chain goals and chain roots are being transformed.

Here is a simple example. BTW, this project file is available as a sample.
I have a simple arm IK chain with twist effector.
clavicle (chain_root) -> upper_arm (chain_bone1) -> lower_arm (chain_bone2), with wrist (chain_goal) and elbow (twist_goal).

To emulate muscles I need to create two splines, each with two points. The spline endpoints are the muscle attachment points. There is one spline object for each bone.
Each spline has two Transform SOPs, one for each end (attachment) point. Those Transform SOPs need to track the motion of the bones, or more precisely, the actual attachment position on the bone. For the sake of simplicity, the attachment point can be co-resident with the end or start points of the bones.
The 1st Transform SOP of the first spline, called bicep_spline, is constrained to the clavicle (chain_root) via channel expressions. The 2nd Transform SOP is constrained to the position of lower_arm (chain_bone2), as the position of a bone is dictated by its beginning point, on the fat end. Again, this uses channel expressions.
The 1st Transform SOP of lower_arm_spline is constrained to the position of lower_arm (chain_bone2), and the 2nd Transform SOP is constrained to the wrist (chain_goal). Likewise, the constraining is done with channel expressions to create driven channels.
The intent is to have the muscle splines constrained to the joint and end points of the IK chain. As the IK chain solves, the Transform SOPs alter the positions of the spline end points.
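To make the intended wiring concrete, here is a small illustrative sketch (plain Python, not Houdini code; the joint names are just stand-ins) of how the two two-point muscle splines are pinned to the joint world positions:

```python
import math

def dist(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def muscle_splines(shoulder, elbow, wrist):
    """Pin two-point muscle splines to the IK joint world positions:
    the bicep spline spans shoulder->elbow, the lower-arm spline
    spans elbow->wrist. Returns each spline's endpoints and length."""
    return {
        "bicep_spline": ((shoulder, elbow), dist(shoulder, elbow)),
        "lower_arm_spline": ((elbow, wrist), dist(elbow, wrist)),
    }
```

As the IK chain solves, re-evaluating this with the new joint positions moves the spline endpoints, which is what the channel expressions on the Transform SOPs are meant to do.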
In the larger scheme, the splines are used as paths for metaball Copy SOPs. As the splines contract, the metaballs get closer together, increasing their combined volume and driving deformation in the Magnet SOP. Other deformation SOPs get similar setups, adjusted for their differences. Typically I would use a combination of SOPs to deform the skin, emulating muscle-based deformation.
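For the metaball bulge, one simplifying assumption (my own here, not anything built into the Magnet SOP) is that muscle volume stays roughly constant, so the cross-section grows as the spline shortens. A minimal sketch, again plain Python rather than Houdini code:

```python
import math

def metaball_centers(length, count):
    """Evenly space metaball centers along a straight muscle spline."""
    if count == 1:
        return [length * 0.5]
    step = length / (count - 1)
    return [i * step for i in range(count)]

def bulge_radius(rest_radius, rest_length, current_length):
    """Volume-preserving radius: volume ~ length * radius^2, so the
    radius scales with sqrt(rest_length / current_length)."""
    return rest_radius * math.sqrt(rest_length / current_length)
```

Halving the spline length scales the metaball radius by sqrt(2), so the merged metaball field thickens in the middle of the muscle as it contracts.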
The problem is that while the chain_root and chain_goal work, the bone joints do not. As the IK chain is animating, the position fields of lower_arm (chain_bone2) remain constant, even though the bone is solving its position correctly. Since the position fields remain constant, so do the 2nd end point of the upper_arm_spline and the 1st end point of the lower_arm_spline. The result is that the muscle splines are incorrect at the elbow joint.
I hope I have explained the situation clearly and in required detail.
What I need is to be able to track the position of the elbow joint, which is the position of chain_bone2 as it actually is in world coordinates. How can I do that automatically?
I tried attaching a Null as a child to the second bone, but even though it is transformed correctly, its position fields, which are in object coordinates, remain constant.
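The Null behaves that way because its transform fields are local: the world position is the parent's transform applied to the (constant) local offset. A minimal 2D sketch of the relationship (plain Python, not Houdini code):

```python
import math

def world_position(local, parent_rotate_deg, parent_translate):
    """World position of a child object's origin: the parent's rotation
    applied to the constant local offset, plus the parent's translation
    (2D for brevity)."""
    a = math.radians(parent_rotate_deg)
    x, y = local
    return (x * math.cos(a) - y * math.sin(a) + parent_translate[0],
            x * math.sin(a) + y * math.cos(a) + parent_translate[1])
```

The Null's local offset never changes, so its position fields sit still; only the result of this multiplication (its world position) moves, which is why the world-space point has to be extracted rather than read from the Null's parameters.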

Thank you

David Rindner
Member
7714 posts
Joined: July 2005
Hi David,

Assuming you have a bone chain with bones named chain_bone1 and chain_bone2, try the following:

- Put down an Object Merge SOP. Set the following parameters:
Source1: chain_bone1
SOP: point1
Source2: chain_bone2
SOP: point1
Transform Object: This Object

- Append an Add SOP to the Object Merge SOP. On the Polygons tab of the Add SOP's parameters, put 0-1 in the first Polygon parameter.

Now you should have a line joining the two origins of the bone objects, regardless of how they are animated in world space. If you want to precisely control where the points are along the length of the bone, try appending a Transform SOP to each individual bone's point1 SOP. In its Z translate parameter, put this expression: -ch("../length")*0.5
What this expression says is to translate to 50% of the length of the bone. We need a negative sign because bones in Houdini point down the negative Z axis. You can change the 0.5 to adjust where you want the point along the bone. Finally, don't forget to change the SOP parameter on your Object Merge SOP to point at this new Transform SOP in the bone.
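For reference, the math behind that expression, as I understand it (illustrative Python, not Houdini code; the 3x3 rotation matrix and world position here stand in for the bone's world transform):

```python
def attachment_point(bone_world_rot, bone_world_pos, bone_length, t):
    """World-space point a fraction t along a bone. Houdini bones extend
    down their local -Z axis, hence the negative sign; t = 0.5 matches
    the -ch("../length")*0.5 expression above.
    bone_world_rot is a row-major 3x3 rotation matrix."""
    local = (0.0, 0.0, -t * bone_length)
    return tuple(
        sum(bone_world_rot[r][c] * local[c] for c in range(3)) + bone_world_pos[r]
        for r in range(3)
    )
```

With an identity rotation, a bone of length 2 places its t = 0.5 point at local (0, 0, -1); the bone's world transform then carries that point wherever the IK solve puts the bone.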

Unfortunately, I can't do attachments in this forum. I'll e-mail you a link to my yahoo briefcase instead. (bone_points.zip)

Cheers.
Member
436 posts
Joined: July 2005
Thank you, Edward. Good advice. No need to trouble yourself with the zip file. Your explanation is spot on, and it works. You filled in a missing piece of the puzzle and actually gave me some new ideas.
By adding arbitrarily positioned points with an Add SOP in the bone geometries themselves, then object-merging them per your explanation, I can define my own muscle attachment points. Good advice, thank you.

Dave Rindner
Member
212 posts
Joined: July 2005
Yes, you can. The Object Merge will allow you to grab other points as well, if you need to gather a location from more than one bone.

Just a word of caution. Object Merges can be a bit tricky to get transforming correctly based on the Transform Object setting, and they are more memory intensive than most SOPs. I don't think you will run into a problem, but if you start optimizing your file, it's one thing to keep in mind.
-k