Blending poses almost like we blend shapes (New Technique?)

I’m curious if anybody else has worked on this problem, and what solutions you’ve found or come up with. But I’m kinda hoping that my solution is novel :blush:

Background: In Simplex (like other blendshape systems I’ve seen), there are primaries, and combos. When certain groups of primaries are activated, that automatically activates certain combos. But it’s not just flipping switches. Every primary shape can vary between 0 and 100% and beyond, so the combos need to activate smoothly as well as handle extrapolation. This means they need to be controlled by smooth (or at least continuous) mathematical functions.

But I got a request to make it work with joint poses as well.

So here’s my problem: What function should I use to interpolate and extrapolate joint pose orientations? I want constant velocity (i.e., a 25% weight gives exactly 25% of the rotation), and I want the function to be order agnostic (i.e., blending poses A, B, and C produces the same output as blending poses C, A, and B).

Of course, a Radial Basis Function (RBF) immediately comes to mind. But that gives you weights to apply to the poses. I’m trying to find out how to apply those weights, so that doesn’t work.

The thing I WANT is to somehow sum SLERPing quaternions. But that doesn’t seem to work, because applying rotations is an inherently ordered operation. I could SLERP between pairs of SLERPs (kinda like bilinear interpolation), but that requires 2**N items. And I’d also have to decide how to pair things up, and different pairings give different outputs.

I could do NLERP (which is just normalizing a linear combination of quaternions). But that doesn’t do the constant velocity.
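For concreteness, here’s a minimal numpy sketch of NLERP as described above (assuming w-first unit quaternions; the hemisphere sign flip and the function name are my own choices):

```python
import numpy as np

def nlerp(quats, weights):
    """Blend quaternions by normalizing a weighted sum (NLERP).

    quats: sequence of unit quaternions (w, x, y, z)
    weights: one blend weight per quaternion

    Order agnostic, but NOT constant velocity: the rotation
    speeds up toward the middle of the blend.
    """
    quats = np.asarray(quats, dtype=float)
    weights = np.asarray(weights, dtype=float)
    # Flip signs so every quaternion lies in the same hemisphere
    # as the first one (q and -q encode the same rotation).
    signs = np.sign(quats @ quats[0])
    signs[signs == 0] = 1.0
    blended = (quats * (weights * signs)[:, None]).sum(axis=0)
    return blended / np.linalg.norm(blended)
```

For two inputs at 50/50 this lands exactly on the SLERP midpoint, which matters later in the thread.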

I just read the paper on dual-quaternion skinning, and it does NLERP. So there’s no help there.

I could also just do a linear interpolation of the matrix deltas. And this can work. However, there are scaling issues during the blends. It’s just like the candy wrapper effect from linear skinning. I could add intermediate shapes, and spline interpolation, but that feels ugly to me. I’d like to get rid of intermediates. (That said, this is still useful for interfacing with linear skinning algorithms)
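A tiny sketch of that matrix-delta lerp, mostly to show where the scaling issue comes from (function name is mine):

```python
import numpy as np

def blend_matrix_deltas(rest, poses, weights):
    """Linearly blend matrix deltas: rest + sum(w * (pose - rest)).

    Simple and order agnostic, but the blended matrix picks up
    scale/shear mid-blend -- the same collapse that causes the
    candy wrapper effect in linear skinning.
    """
    rest = np.asarray(rest, dtype=float)
    out = rest.copy()
    for pose, w in zip(poses, weights):
        out += w * (np.asarray(pose, dtype=float) - rest)
    return out
```

Blending halfway toward a 90-degree rotation shrinks the basis vectors to length sqrt(0.5), which is exactly the mid-blend thinning described above.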

I was banging my head against this problem for a couple days when I got an idea. I previously found a StackOverflow question on averaging quaternions, and there was a really neat idea in there based on a paper from NASA. The reasons why/how it works are way above my head, but I can sure call the numpy functions to implement it! Apparently the eigenvector corresponding to the largest eigenvalue of the sum of the matrices formed by taking the outer product of each quaternion with itself is the average quaternion :exploding_head: :exploding_head: :exploding_head:
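For anyone who wants to play with it, here’s a minimal numpy sketch of that eigenvector trick (I believe the NASA paper is Markley et al., “Averaging Quaternions”; the function name and w-first convention are my own):

```python
import numpy as np

def average_quaternions(quats, weights=None):
    """Average unit quaternions via the eigenvector method.

    The average is the eigenvector associated with the largest
    eigenvalue of the weighted sum of outer products q qT.
    Sign agnostic: q and -q contribute identically, since the
    outer product cancels the sign.
    """
    quats = np.asarray(quats, dtype=float)
    if weights is None:
        weights = np.ones(len(quats))
    # M = sum_i w_i * outer(q_i, q_i), a symmetric 4x4 matrix
    M = np.einsum('i,ij,ik->jk', weights, quats, quats)
    eigvals, eigvecs = np.linalg.eigh(M)  # ascending eigenvalues
    avg = eigvecs[:, -1]                  # column for the largest one
    return avg / np.linalg.norm(avg)
```

Note the result can come back negated (eigenvectors have no preferred sign), but q and -q are the same rotation anyway.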

Ignoring all that crazy stuff, what is the naive formula for averaging? Just ADD UP ALL THE VALUES, then divide by the count of values. So by averaging, I’m summing everything together, in a manner of speaking! Then I would just need to “multiply” by the count of values (done by raising the average quaternion to that power) to cancel out that division and get the sum.
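Raising a unit quaternion to a power just scales its rotation angle, so the “multiply back by the count” step could look something like this (a sketch assuming w-first unit quaternions; name is mine):

```python
import numpy as np

def quat_pow(q, t):
    """Raise a unit quaternion (w, x, y, z) to the power t by
    scaling its rotation angle by t."""
    q = np.asarray(q, dtype=float)
    half_angle = np.arccos(np.clip(q[0], -1.0, 1.0))
    axis_len = np.linalg.norm(q[1:])
    if axis_len < 1e-12:
        # Identity rotation: any power of it is still identity.
        return np.array([1.0, 0.0, 0.0, 0.0])
    axis = q[1:] / axis_len
    return np.concatenate(([np.cos(t * half_angle)],
                           np.sin(t * half_angle) * axis))
```

So the “sum” of N poses would be `quat_pow(average_quaternions(quats), N)` under this scheme.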

So here’s the “almost” in the title. I haven’t figured out how to reverse/invert this process in a useful way, which means I can’t get deltas the same way I would with blendshapes. But I can get a rotation offset relative to the output. So I just do the poses in layers: All the primary shapes, then all the 2-combos, then all the 3-combos and so on.

But with that little caveat … holy crap, this seems to work!

Here’s a quick python implementation to play around with

[EDIT]: And a python plugin for maya as well


Hey there, this looks quite interesting, so I was just testing your plugin node to get a bit more understanding of it, but I got this error message after connecting a couple of matrices to it (with an output) and flicking the inputTargetLevel to 1:

setAttr "blendPose1.inputTarget[0].inputTargetLevel" 1;
// Error: AttributeError: file D:/../blendPose.py line 93: 'OpenMaya.MDataHandle' object has no attribute 'set'

Eventually, could you provide an example use for this ? (even visual might help )

Ok, sorry, I managed to fix this, I think, by changing a few lines:

..set(

into

..setMMatrix(

I created a couple of joints, one for the original and one for the input target pose, and added a weight attribute to blend the pose.

All looks like it’s working now.

Ah, sorry. This was a prototype that I moved to a C++ plugin. .set is something you can do in C++, but not Python. I must’ve done something weird to put that back into the Python code. My bad. The gist is updated.


I have done something similar before, but created meshes from the joints and used them as drivers. Not sure if I completely understand the problem, so I will talk about the full pipeline I went through:

  • we had SSDR convert certain mesh cache animations to joint-based animations and skinning
  • for each shape we wanted, we basically got a new setup of joint positions and rotations
  • to be able to drive this, we wanted a blendshape approach, but ran into the problem of lerping/slerping the transforms
  • so, based on the joint index, we created a triangle for each joint: one point at the center, one point at the x axis of the matrix, and one at the y axis
  • we created a plugin that would generate a matrix for each triangle in a mesh, using these 3 points to construct the matrix with position and rotation from the given vertices
  • we piped all generated meshes into the default mesh with the blendshape node in Maya and used the output mesh to drive our custom plugin to move the joints
  • this made sure that the skinning was intact from the SSDR setup, and we had a way to manipulate and blend the joint positions and rotations with the same effect as a blendshape
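If it helps anyone follow along, the triangle-to-matrix reconstruction described above could look roughly like this (a sketch under my own assumptions: numpy points, column-vector matrix convention, function name mine):

```python
import numpy as np

def matrix_from_triangle(center, x_pt, y_pt):
    """Rebuild a joint matrix from three triangle vertices: one at
    the joint center, one along the matrix's x axis, one along its
    y axis. Orthonormalizes the basis, so any shear/scale the
    triangle picked up during blending is discarded."""
    center = np.asarray(center, dtype=float)
    x = np.asarray(x_pt, dtype=float) - center
    x /= np.linalg.norm(x)
    y = np.asarray(y_pt, dtype=float) - center
    y -= x * np.dot(x, y)        # remove the x component (Gram-Schmidt)
    y /= np.linalg.norm(y)
    z = np.cross(x, y)           # right-handed third axis
    m = np.eye(4)
    m[:3, 0], m[:3, 1], m[:3, 2], m[:3, 3] = x, y, z, center
    return m
```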

That’s basically our current solution (We use a uv pin instead of a custom plugin). And it’s certainly served us well. But I’m trying to find a better solution (for some definition of “better” :smirking_face: ).

It’s equivalent to the “linear interpolation of the matrix deltas”, but with an extra step of orthonormalizing (i.e., ignoring any shears and scales, for all you non-math speakers), which doesn’t have the “constant velocity” property I’m looking for.

SimonDev
Basically, the blendshape takes the triangle points along the straight line, and you normalize to the green vector. But notice how the green vector lags behind the purple and then snaps to the front after the 50% point? That’s what happens to the rotation, it “snaps” around halfway. I’m trying to get the purple vector.

If you’ve got any more ideas, keep 'em coming! Alternate perspectives are where insight comes from.


sigh Everything broke :frowning:
Apparently I wasn’t testing well enough and I just happened to choose random values that made me think things were working.
There’s an issue with skipping inputs: the weird “invert a quaternion average” trick doesn’t give the same answer when I leave out identity values. And unfortunately, that means the values pop in and out. On top of that, I’m finding wobbles.

That said, looking at what I wrote to peerke88, I remembered that slerp and nlerp match at exactly 50%. So what if I only do unweighted averages? Meaning, I slerp everything by its weight, then do a simple average and normalize. It gives me something close: the interpolation is smooth, and the delta values I calculate provide an approximation, and … well, I’m just gonna have to do a lot more experimentation and math
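A sketch of that slerp-then-unweighted-average idea, in case anyone wants to poke at it too (w-first quaternions, names mine):

```python
import numpy as np

def slerp(q0, q1, t):
    """Spherical linear interpolation between two unit quaternions."""
    q0 = np.asarray(q0, dtype=float)
    q1 = np.asarray(q1, dtype=float)
    d = np.dot(q0, q1)
    if d < 0.0:                  # take the short way around
        q1, d = -q1, -d
    theta = np.arccos(min(d, 1.0))
    if theta < 1e-9:
        return q0
    s = np.sin(theta)
    return (np.sin((1.0 - t) * theta) * q0 + np.sin(t * theta) * q1) / s

def blend_slerp_average(poses, weights):
    """Slerp each pose from identity by its own weight (constant
    velocity per input), then take a simple unweighted average of
    the results and normalize."""
    identity = np.array([1.0, 0.0, 0.0, 0.0])
    partial = [slerp(identity, q, w) for q, w in zip(poses, weights)]
    total = np.sum(partial, axis=0)
    return total / np.linalg.norm(total)
```

With a single input this is exact slerp, so a 25% weight really does give 25% of the rotation; with multiple inputs it’s the approximation described above.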


So I think I finally found a way to do this!
The trick is logarithms. Logarithms let you switch between addition and multiplication. Their defining property is log(A) + log(B) = log(A * B), and unit quaternions have a logarithm defined. And since summing quaternions is commutative, all I’ve gotta do is work in log space for everything! It doesn’t produce the exact same results as quaternion multiplication, but that doesn’t matter. It’s Good Enough™
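A minimal sketch of that log-space blend (w-first quaternions, names mine; note the log sum only approximates composed rotations, since quaternion multiplication doesn’t commute):

```python
import numpy as np

def quat_log(q):
    """Log of a unit quaternion (w, x, y, z): a pure 3-vector whose
    direction is the rotation axis and whose length is half the
    rotation angle."""
    q = np.asarray(q, dtype=float)
    v = q[1:]
    n = np.linalg.norm(v)
    if n < 1e-12:
        return np.zeros(3)
    return v / n * np.arccos(np.clip(q[0], -1.0, 1.0))

def quat_exp(v):
    """Inverse of quat_log: map a pure 3-vector back to a unit quaternion."""
    n = np.linalg.norm(v)
    if n < 1e-12:
        return np.array([1.0, 0.0, 0.0, 0.0])
    return np.concatenate(([np.cos(n)], np.sin(n) / n * v))

def blend_log(poses, weights):
    """Blend poses by a weighted sum of quaternion logs.
    Summing is commutative, so the result is order agnostic, and
    each input scales linearly with its weight. Breaks down near
    180deg, where q vs -q sign choices start to matter."""
    total = sum(w * quat_log(q) for q, w in zip(poses, weights))
    return quat_exp(total)
```

For rotations about a shared axis the logs add exactly; off-axis inputs only approximate the composed rotation, which is the Good Enough™ part.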

It doesn’t work in all cases though (it’s literally impossible to make it work in all cases). When you start getting near 180deg flips, things start getting wonky, and you have to deal with the fact that q and -q produce the same rotation, but not the same log. And I’m pretty sure there are some inputs that produce impossible situations. But if we’re restricted to about ±60deg on each axis, it produces clean interpolation throughout, and that range is good enough for a face system. Not for a full-pose deformation system, though. But for that, I’m preserving the option to use matrix NLERP.
