How to fix PN hands etc. offset? | Perception Neuron by Noitom

Roman Navratil

Hello, I did some tests with PN and, like you, I noticed that even with good calibration there seems to be an offset, mainly (I guess) in the hands/palms. For example:

1) If I clasp my hands/palms together, the palms pass through each other (they intersect by around 15-20 cm).
2) If I try to touch my knees with my palms, the palms don't reach the knees but stop, say, 10-20 cm in front of them.
3) If I try to touch my nose with my index finger, there is again some offset: the finger doesn't touch my nose but stays around 15 cm in front of it, off to the left, for example.
4) Other similar "offset" issues with the hands, or whatever it should be called.

As far as I understand, this is to some degree a limitation of inertial mocap in general, not just PN? I have two questions:

1) Wouldn't it be possible to fix this with additional optional calibration poses? For example, touching my nose, touching my knees, etc., so that PN could correct the offset during the recording, or even afterwards, using those extra poses? Would that be possible, and is Noitom planning something like this?

2) If not, how can this offset be fixed afterwards in a 3D animation/modeling package with maximum efficiency? The easiest way, as far as I can tell, would be to use animation layers (I think they exist in MotionBuilder, Maya, Max, and other software): create an additional layer, move the hand/palm to where it should be in the problematic part of the mocap clip, then blend both animation layers together. I'm not an animator, so I'm asking someone with animation and mocap experience: is this the best/easiest/quickest way to fix the offset issue? If not, which one is?

Thank you for tips/tricks! :-)



Your best bet is to fix this, as you mentioned, in MotionBuilder, Maya, or Max. MotionBuilder is what I'm familiar with, so it's what I'd use: it has the most robust controls for editing mocap data. You would basically bring your mocap into MotionBuilder, create a control rig on top of it, add a new animation layer that fixes your offsets, and then plot the animation back to your rig and export.
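The idea behind that layer workflow can be sketched in plain code. The following is a minimal, illustrative Python model (not any real MotionBuilder or Maya API): a correction offset is blended additively over the base mocap curve with an ease-in/ease-out weight, so the fix only touches the problem frames and fades smoothly at both ends.

```python
# Illustrative sketch of additive animation-layer blending.
# All names and numbers here are hypothetical, not a real DCC API.

def ease_in_out(t: float) -> float:
    """Smoothstep weight: 0 at t=0, 1 at t=1, zero slope at both ends."""
    t = max(0.0, min(1.0, t))
    return t * t * (3.0 - 2.0 * t)

def layer_weight(frame, fix_start, fix_peak_start, fix_peak_end, fix_end):
    """Weight of the correction layer: ramps up, holds at 1, ramps down."""
    if frame <= fix_start or frame >= fix_end:
        return 0.0
    if frame < fix_peak_start:
        return ease_in_out((frame - fix_start) / (fix_peak_start - fix_start))
    if frame > fix_peak_end:
        return 1.0 - ease_in_out((frame - fix_peak_end) / (fix_end - fix_peak_end))
    return 1.0

def blend_layers(base_curve, offset, fix_start, fix_peak_start, fix_peak_end, fix_end):
    """Additively blend a constant positional offset over the base curve."""
    result = []
    for frame, pos in enumerate(base_curve):
        w = layer_weight(frame, fix_start, fix_peak_start, fix_peak_end, fix_end)
        result.append(tuple(p + w * o for p, o in zip(pos, offset)))
    return result

# Example: the hand's X position sits 0.15 m too far out between frames
# 40 and 80, so correct it there, at full strength from frame 50 to 70.
base = [(0.5, 1.0, 0.2)] * 100
fixed = blend_layers(base, (-0.15, 0.0, 0.0), 40, 50, 70, 80)
```

A real animation layer does the same thing per channel with keyed weights; the point is that frames outside the fix window are untouched, which is why re-plotting afterwards is safe.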

Maya has animation layers - but I have no idea how to use them.

Hopefully the Pro version of PN will have tools like this. It would be nice not to have to hop back and forth between different software packages to clean up the mocap data; MotionBuilder is also not the friendliest package out there, and it's pretty cryptic and confusing if you don't use it a lot.


Don't try to fix this in post!  Learning how to make edits in something like Maya is important, but you need to start with correctly calibrated data.  If you put your hands together and they're that far off in the view, you're getting bad data.  This isn't an inherent problem with inertial mocap (and PN sensors aren't purely inertial anyway: each one combines a gyroscope and accelerometer with a compass/magnetometer); the software just needs to know your body dimensions to interpret the sensors correctly.

Check the body size setting.  It's in a weird place: after you connect (and not before; for some reason it's hidden when not connected), there's a dropdown underneath the sensor display where you select your body size.  (The "body size manager" tab lets you edit these, but you can't activate them there.)  Select the nearest match and recalibrate.  If it's still off, go to the body size tab, make a copy of the size you're using (I had to export and re-import to duplicate a specific entry), and edit the dimensions there.  Be sure to select the edited entry off to the right and recalibrate after changing it.
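If none of the presets match and you need starting values for a custom entry, average human proportions can give a first guess. The sketch below uses the classic Drillis and Contini segment-length ratios (fractions of standing height); the dictionary keys are illustrative and won't match Axis field names exactly, and since these are population averages you should still fine-tune until the avatar's hands meet correctly.

```python
# First-guess body segment lengths from standing height, using the
# Drillis & Contini average proportions. Key names are illustrative;
# these are population averages, so measure and fine-tune afterwards.

SEGMENT_RATIOS = {
    "upper_arm":      0.186,
    "forearm":        0.146,
    "hand":           0.108,
    "thigh":          0.245,
    "shank":          0.246,
    "foot_length":    0.152,
    "shoulder_width": 0.259,
    "hip_width":      0.191,
}

def estimate_segments(height_cm: float) -> dict:
    """Return estimated segment lengths in cm for a given standing height."""
    return {name: round(ratio * height_cm, 1)
            for name, ratio in SEGMENT_RATIOS.items()}

# For a 180 cm performer this gives e.g. upper_arm 33.5 cm, forearm 26.3 cm.
estimates = estimate_segments(180.0)
```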

The results I'm getting aren't perfect (shoulder motion is weird, and sometimes it misdetects the ground plane, which has weird effects), but adjusting the body size did help a lot.


Hi Roman, JoeW, and Glenn. I've gotten the same kinds of issues and was told by support that this is a limitation of their software.  Glenn, how good is your data?  Even after correctly measuring all my body parts, I was told that the hands-and-knees problem does indeed have to be solved in post. I'd be very interested to hear what steps you took to solve it.
Sincerely, - JC

RABABA

I have the same problem; look at this video.

NeuroNaut

Yup, same problem here. I've never been able to calibrate the hands. For example, if you clap your hands, each hand is offset by at least 10-20 cm.

I also have problems with the fingers; mainly, the thumbs rotate backwards. I've tried everything and it still persists.

I'm so frustrated with the system that I'm close to selling it :( , plus the fact that Perception Neuron still hasn't updated the software to support the micro SD card recording feature.



@NeuroNaut, perhaps you could try fudging the upper and lower arm length dimensions until the skeleton's hand contact matches (relatively speaking) your own?  Worth a shot...


When testing your calibration/measurements, one thing you need to keep in mind with joints such as elbows and knees is that they have a significant amount of mass that is not being accounted for in the Axis software. The Axis avatar's elbows and knees are thin little tubes, nowhere close to the volume of actual human anatomy. What you are primarily concerned with when evaluating your calibration is the location of the points of rotation on the avatar.

If your calibration is good, you should be able to touch both hands together in front of the body, all the way from right in front of the chest (like a praying pose) to full extension of both arms. You should be able to touch the hands to the top of the head. The hands should reach the rotation point of the knees reasonably well, remembering that the knee geometry itself is inaccurate. For the feet, the performer should be able to stand with feet together and have it accurately reflected on the avatar with no crossing of the legs. The performer should also be able to touch the heel of one foot to the toe of the other and have it accurately reflected on the Axis avatar.
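Those check poses can also be scored numerically from exported joint positions (for example, from a BVH take). This is a minimal sketch with hypothetical joint names and made-up coordinates; substitute whatever your export actually contains, and allow a few centimetres of slack for the avatar's thin geometry.

```python
# Quick numeric check of calibration poses from exported joint positions.
# Joint names, tolerances, and coordinates are illustrative assumptions.
import math

def dist(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def check_pose(frame_joints, left, right, tolerance_m):
    """True if two joints are within tolerance of each other on this frame."""
    return dist(frame_joints[left], frame_joints[right]) <= tolerance_m

# One frame of a "praying pose" test capture (metres, illustrative numbers).
frame = {
    "LeftHand":  (-0.02, 1.30, 0.25),
    "RightHand": ( 0.03, 1.31, 0.24),
    "LeftFoot":  (-0.06, 0.08, 0.00),
    "RightFoot": ( 0.07, 0.08, 0.01),
}

# Hands touching: allow ~8 cm between wrist joints; feet together: ~20 cm
# between ankle joints, since the rotation points sit inside the body.
hands_ok = check_pose(frame, "LeftHand", "RightHand", tolerance_m=0.08)
feet_ok  = check_pose(frame, "LeftFoot", "RightFoot", tolerance_m=0.20)
```

If a pose fails by roughly the same margin on every take, that margin is a good estimate of how far a body-size dimension is off.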



It's March 2018; any news on this? I'm facing the same issue.

NeuroNaut

I'm still trying to get clean calibrations, and it's a constant battle.  Sometimes the feet are completely reversed.  It really is inaccurate.

We now use iClone 7 to edit the animation in post.  
