I've been hard at work putting together a script that retargets live Perception Neuron (PN) BVH data from Axis Neuron onto a humanoid-rigged character in Unity.
The script is simple: it takes the PN robot and retargets its joint rotations onto your humanoid model. It also handles translation of the root node, which can optionally be assigned to move a parent object as well, so the character can move around the scene with rigidbody and collision data. This lets you toggle between PN animation and Animator animation to mix live mocap with pre-recorded animations.
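A minimal sketch of how such a retargeter could look (this is an assumption of the approach, not the actual script: the class name `PNRetargeter`, the bone mapping, and the PN joint names like "Hips" are all illustrative):

```csharp
using UnityEngine;

// Hypothetical sketch: copies joint rotations from the PN robot hierarchy
// onto the matching bones of a humanoid Animator, and forwards the root
// translation to an optional parent object carrying rigidbody/colliders.
public class PNRetargeter : MonoBehaviour
{
    public Transform pnRoot;          // root of the PN robot driven by Axis Neuron
    public Animator targetAnimator;   // humanoid Animator on your character
    public Transform parentToMove;    // optional: receives the root translation
    public bool mocapActive = true;   // toggle: live mocap vs. Animator animation

    // Partial mapping for illustration; a full version would cover every bone.
    static readonly (string pnName, HumanBodyBones bone)[] boneMap =
    {
        ("Hips",       HumanBodyBones.Hips),
        ("Spine",      HumanBodyBones.Spine),
        ("LeftUpLeg",  HumanBodyBones.LeftUpperLeg),
        ("RightUpLeg", HumanBodyBones.RightUpperLeg),
        // ...
    };

    void LateUpdate()   // runs after the Animator, so mocap overrides its pose
    {
        if (!mocapActive) return;   // let the Animator drive the character instead

        foreach (var (pnName, bone) in boneMap)
        {
            Transform src = FindDeep(pnRoot, pnName);
            Transform dst = targetAnimator.GetBoneTransform(bone);
            if (src != null && dst != null)
                dst.localRotation = src.localRotation;   // copy the joint rotation
        }

        // Move the parent object with the performer so rigidbody and
        // collision data on it follow the mocap root.
        if (parentToMove != null)
            parentToMove.position = pnRoot.position;
    }

    // Recursive name search through the PN robot hierarchy.
    static Transform FindDeep(Transform t, string name)
    {
        if (t.name == name) return t;
        foreach (Transform child in t)
        {
            var found = FindDeep(child, name);
            if (found != null) return found;
        }
        return null;
    }
}
```

Note that copying `localRotation` directly like this assumes the PN rig and the target humanoid share compatible bind poses; a robust version would account for rest-pose offsets per bone.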
In the demo, you can see the live data being retargeted to the standard Unity humanoid; you can toggle control so the character runs around the level, then toggle back to mocap retargeting.
Here is an example project I put together:
Note: example project uses "PerceptionNeuronUnityIntegration_0.2.5" and was built with Unity 5.3.0.