Perception Neuron Retargeting project for Unity

RBorys

I've been hard at work putting together a script that allows retargeting of live PN BVH data from Axis Neuron onto a humanoid-rigged character in Unity.

https://youtu.be/W2KWqbBhCNg

The script is simple: it takes the PN robot and retargets its joint rotations onto your humanoid model. It also handles translation of the root node, which can be assigned to move a parent object as well, so the character can move around with rigidbody and collision data. Because of this, you can toggle between PN animations and Animator animations to mix live mocap with pre-recorded animations.
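To give an idea of the core loop, here is a minimal sketch of that kind of retargeting (the class and field names are illustrative, not the actual ones in the script):

using UnityEngine;

// Minimal sketch: copy each source joint's world rotation onto the matching
// target joint, and move a parent object along with the source root so any
// rigidbody/colliders on that parent follow the mocap. Assumes both rigs
// start out in the same T-pose.
public class SimpleRetargeter : MonoBehaviour
{
    public Transform[] sourceJoints; // joints of the PN robot
    public Transform[] targetJoints; // matching joints of your humanoid
    public Transform sourceRoot;     // PN robot root/hips
    public Transform rootParent;     // optional parent carrying rigidbody and colliders

    void LateUpdate()
    {
        // Root translation goes to the parent so physics moves with the character.
        if (rootParent != null)
            rootParent.position = sourceRoot.position;

        // Copy rotations joint by joint.
        for (int i = 0; i < sourceJoints.Length; i++)
            targetJoints[i].rotation = sourceJoints[i].rotation;
    }
}

Disabling a component like this and re-enabling the Animator (or the reverse) is one simple way to implement the toggle between live mocap and pre-recorded animation.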

In the demo you can see the live data being retargeted to the standard Unity humanoid; you can also toggle control so the character runs around the level under Animator control, then toggle back to mocap retargeting.

Here is an example project I put together:
https://drive.google.com/file/d/0BzmWbTivN9f5dW5mem1PNXdpdkk/view?usp=sh...

Note: example project uses "PerceptionNeuronUnityIntegration_0.2.5" and was built with Unity 5.3.0.

Insomnia

Damn, this is awesome. Just what I need for my project. Thank you very much.

Khrystyna

Hi, RBorys!

Thanks for sharing your work!

The video looks nice, and I was doing a very similar thing. However, what I observed is that when I apply the very same data from the PN model's transforms to another identical model, their movements look very similar while the models are apart, but when I overlay them there are clear differences in the limb positions/rotations. Basically, the two models do not match one to one.

I have this issue in my solution, and now I see almost the same thing in yours. Just try overlaying two of the same PN robot so that one follows the other in the same spot.

If you use another colour for one of them, you'll see that the avatars do not match (or I'm doing something wrong and not aware of it). It also doesn't look like one avatar is a frame late or anything; it's just different for no obvious reason.

Now I suspect it is connected to the Mecanim bones used by the Animator, but I have no idea how to fix it, since I use the NetworkAnimator to distribute the data. Do you have any idea/suggestion what could be wrong?

Update: it turned out to be the scale of the avatar, which was bigger in Axis. It looked very strange and different for every avatar; I didn't expect the preset avatar size to be updated by the PN animator.
It took me a while to figure it out. Your project helped a lot to cross possible mistakes of my own off the list.

Thanks again for your work!

Khrystyna

RBorys

Not exactly sure I follow what the problem is... maybe a picture would help.

One thing to keep in mind is that the animations won't be translated perfectly. For that to happen, the model would need to be in the exact T-pose the PN Robot is in, which is very hard to do unless you are using the same rig as the PN Robot. Most likely you will have to eyeball the T-pose, and the animation error will depend on how far off those rotations are.

So the script isn't a perfect retargeting solution, but it is provided to help people out. If you want perfect animation, I recommend re-rigging the model to work with the PN skeletal rig as described in the PN Unity documentation. PN is itself not a perfect mocap tool either, so there will always be some error, but if things are configured well the error should be minimal and usually unnoticeable.
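To show what compensating for an eyeballed T-pose can look like, here is a rough sketch: it records, once at startup, how far each target joint's rest rotation differs from the source joint's, then re-applies that offset every frame. Names are illustrative, not the script's actual API:

using UnityEngine;

// Sketch of T-pose compensation. Both rigs should be posed as close to the
// same T-pose as possible when Start() runs; any residual difference becomes
// the retargeting error described above.
public class TPoseOffsetRetargeter : MonoBehaviour
{
    public Transform[] sourceJoints; // PN robot joints
    public Transform[] targetJoints; // matching humanoid joints
    private Quaternion[] offsets;

    void Start()
    {
        // Record how each target rest rotation differs from its source.
        offsets = new Quaternion[sourceJoints.Length];
        for (int i = 0; i < sourceJoints.Length; i++)
            offsets[i] = Quaternion.Inverse(sourceJoints[i].rotation) * targetJoints[i].rotation;
    }

    void LateUpdate()
    {
        // Re-apply the recorded offset on top of the streamed rotation.
        for (int i = 0; i < sourceJoints.Length; i++)
            targetJoints[i].rotation = sourceJoints[i].rotation * offsets[i];
    }
}

The closer the two rigs are to the same pose when the offsets are recorded, the smaller the residual error.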

Khrystyna

What I mean is that identical avatars should be animated identically and fully match if duplicated in the same spot. That doesn't happen, because the scale of the originally animated avatar gets synced with the Axis settings. It looks a bit strange, so it took me a while to figure it out.
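For anyone who hits the same thing, a simple workaround sketch (assuming the incoming scale is written onto the model's root transform) is to cache the scale you authored in the editor and re-assert it after the streaming update:

using UnityEngine;

// Guard against streamed data resizing the avatar: remember the scale set in
// the editor and restore it every frame. If the streaming script also writes
// in LateUpdate, you may need to adjust the Script Execution Order so this
// runs last.
public class LockAvatarScale : MonoBehaviour
{
    private Vector3 authoredScale;

    void Awake()
    {
        authoredScale = transform.localScale;
    }

    void LateUpdate()
    {
        transform.localScale = authoredScale;
    }
}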

flemaitre

Thanks for this script RBorys - super clever 

I was trying to play around with the code and stumbled upon:
leftLegQuaternionOffset.Add(Quaternion.Inverse(mainBodyRotationOffset) * LeftLegJoints[i].rotation);

What is the reason for multiplying each body joint's rotation by the inverse of the main body rotation? I tried removing this code and it seems to work fine.

Thanks!

RBorys

There was definitely a reason for it, but I'd have to dive back into the code to recall it. I think 'mainBodyRotationOffset' was there so that you can have the character face a different direction than what is being streamed from PN.

The script is trying to do several things at once and to accommodate different setups; as a result, some variables are compounded into lengthy expressions. Taking a variable out might not affect what you are doing, but it may affect other scenarios, e.g. whether the character rotates differently in-game or whether the character is moving around.
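To illustrate the facing-direction idea, here is a rough sketch of re-applying a streamed rotation under an extra yaw offset. It is a simplification of what the script does, with illustrative names:

using UnityEngine;

// Sketch of the idea behind 'mainBodyRotationOffset': factoring an extra body
// rotation into (or out of) the streamed rotations lets the character perform
// the same motion while facing a different direction than the PN stream.
public class FacingOffsetExample : MonoBehaviour
{
    public Transform streamedHips;      // root of the PN robot
    public Transform targetHips;        // root of your character
    public float extraYawDegrees = 90f; // how far to turn the character away from the stream

    void LateUpdate()
    {
        Quaternion facingOffset = Quaternion.Euler(0f, extraYawDegrees, 0f);

        // Pre-multiplying applies the yaw in world space, so the whole pose
        // is turned without distorting the motion itself.
        targetHips.rotation = facingOffset * streamedHips.rotation;
    }
}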

varun

Hi RBorys,
Thanks for sharing your hard work. I was wondering if this script will work with two characters streaming in from Axis?
Edit: got a chance to see the script. Sorry for the silly question :)

RBorys

It should work with multiple characters, but I haven't had a chance to test it. You'd just need two separate scripts attached to the models you are retargeting onto, plus two PN Robot characters somewhere in the scene for the scripts to link to.
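In sketch form, the wiring could look something like this, reusing the illustrative SimpleRetargeter component from the first post (the actual script's fields may differ):

using UnityEngine;

// Two-performer setup sketch: each retargeter instance points at its own PN
// robot, so the two streams never touch the same target skeleton. The joint
// arrays are assumed to be wired up in the inspector.
public class TwoCharacterSetup : MonoBehaviour
{
    public SimpleRetargeter retargeterA; // attached to character model A
    public SimpleRetargeter retargeterB; // attached to character model B
    public Transform robotARoot;         // PN robot streaming performer 1
    public Transform robotBRoot;         // PN robot streaming performer 2

    void Start()
    {
        retargeterA.sourceRoot = robotARoot;
        retargeterB.sourceRoot = robotBRoot;
    }
}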

ww6vh

Hi RBorys,
I am having trouble with the hands deforming when I use your script. The main PN model's hands move fine, and the models' hands move fine in the demo, but when I go to my own scene the hands become disfigured. The deformation occurs with both the Ethan model from the demo and a Mixamo model I downloaded. Have you run into this problem before, and if so, do you have any suggestions? Thank you.

RBorys

Hmm, I'm not quite sure. The main thing is to make sure the hands/fingers are in the same T-pose shape as the PN Robot model. I only have a 17-sensor suit, so I don't focus on the fingers much, but people who have used the script have told me they were able to get the fingers working. If your model's finger structure is different from the default, you may need to go into the script and make some code changes.
