Perception Neuron Vive Tracker Integration | Perception Neuron by Noitom

NE3D

Hi all,

I've been doing some experiments to get better positional tracking using the Perception Neuron suit and a Vive Tracker. Here are the results:

Perception Neuron Vive Tracker Integration - Part 2

Cheers,

Nicolas

st.app

Hello Nicolas,
I already had a similar idea and asked support whether there are plans to support the HTC Vive Tracker. Unfortunately, there are none yet. Too bad.
The correction would help me immensely, because I would like to capture two people at the same time, and it is incredibly difficult to bring them together. I want to capture a dance couple.
The trackers would have to be attached at the feet or on the head because of possible overlaps.
Can you describe how you did that? Did you program it in Unity? Or what did you use?
I would like to try that too. Unfortunately I don't have an HTC Vive yet. Would I still have to buy one?
Excuse the bad English. I hope you understand me.

Stephan

NE3D

Hello Stephan,
The idea is to get more precision while doing mocap with PN, so the Vive Tracker helps a lot by adding optical tracking.

In my current setup the Vive Tracker is placed on the pelvis, right below the belly button, but I'm also adding the option to mount it the other way around, at the same height but on your back, which might help with the dancing setup you need to do.
So the data from the Perception Neuron does 90% of the job, while the Vive Tracker drives the pelvis position.
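As a rough illustration of that split (Python pseudocode with made-up names and data structures, not Nicolas's actual UE4 code): the PN data keeps every joint rotation, while the tracker's world position, minus a calibrated tracker-to-pelvis offset, overrides the hips position each frame.

```python
# Sketch of the PN + Vive Tracker fusion. All names here are hypothetical;
# the real project is done in UE4 and its code is not shown in this thread.

def fuse_frame(pn_frame, tracker_world_pos, pelvis_offset):
    """Build the output pose for one frame.

    pn_frame         : dict joint -> (position, rotation) from Axis/PN
    tracker_world_pos: (x, y, z) of the Vive Tracker in world space
    pelvis_offset    : calibrated (tracker - pelvis) offset
    """
    pose = dict(pn_frame)                     # keep all PN joint rotations as-is
    pos, rot = pose["Hips"]
    # Replace the PN pelvis position with the optically tracked one.
    new_pos = tuple(t - o for t, o in zip(tracker_world_pos, pelvis_offset))
    pose["Hips"] = (new_pos, rot)             # children follow the hips root
    return pose
```

Since the hips is the root of the skeleton, correcting only that one joint's position is enough to pull the whole character to the right spot.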

I did the entire setup in Unreal Engine 4. It's currently a WIP, but if everything goes according to plan I'll release the whole project for PN users to use in their mocap projects as soon as it's done, and I'll be sure to share the scene here and on other forums.
Unity support is not included for now, but it will probably be added later.

Cheers,

Nicolas

st.app

Hello Nicolas,

that's great.

Question:
Is it possible to use two trackers per person, one on the front and one on the back?

What happens when the tracker is hidden? Does PN then carry on by itself?

The adaptation to Unity would be great.

Cheers,
Stephan

NE3D

Hi Stephan,
A single tracker is assigned to a single body part via IK, so having two trackers for one person won't make much sense, unless you want to create an entire IK setup (similar to IKinema Orion).
With the setup I created you can use two Vive Trackers, one for each person, both of them wearing PN suits and everything running on the same PC. That way you can use the position tracking from the Vive Trackers on both actors and record both of them in real time in UE4.
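The one-tracker-per-actor assignment could, for instance, be a simple lookup from tracker serial number to performer (the serials and names below are made up, just to show the shape of the mapping):

```python
# Hypothetical tracker-serial -> actor mapping for the two-person setup.
TRACKER_TO_ACTOR = {
    "LHR-AAAAAAAA": "dancer_1",   # made-up serial numbers
    "LHR-BBBBBBBB": "dancer_2",
}

def actor_for_tracker(serial):
    """Return the actor a tracker belongs to, or None if it's unknown."""
    return TRACKER_TO_ACTOR.get(serial)
```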

Unity will come later, and it'll be a refactor of the setup done in UE4, since prototyping in UE4 is very easy to do and modify/update.

BR,

Nicolas

st.app

Hello Nicolas,

I asked about the two trackers only because a tracker may be covered by the other person.
E.g. when a person turns, a tracker is not always in the visible range.
With two trackers, the second could step in for the first.

Cheers
Stephan
 

NE3D

Hi Stephan,

Yep, that makes sense, but I'm not 100% sure it's doable because of the way the entire setup works. The lack of any official docs makes it a bit harder to develop, and the API won't tell you which tracker is occluded, so an entire custom setup would be needed, which would take quite a while to develop. For now I'm going with the one-tracker-only solution.

Tracker occlusion happens sometimes even with a single person doing mocap, so with two people the chances of occlusion are higher.
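Since the API won't flag occlusion directly, any two-tracker fallback would have to be custom. A minimal sketch of the selection logic Stephan describes (hypothetical `pose_valid` flags; in practice something like a per-device pose-validity signal from the runtime would have to supply them):

```python
def pick_tracker_position(front, back, last_good):
    """Pick a pelvis position from two trackers worn front and back.

    `front` and `back` are (pose_valid, position) pairs; `last_good` is
    the previous frame's result, used when both trackers are occluded.
    """
    for valid, pos in (front, back):
        if valid:
            return pos          # prefer the front tracker when both are valid
    return last_good            # both occluded: hold the last known position
```

This only covers the selection step; handling the positional offset between the front-mounted and back-mounted trackers would need its own calibration.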

As soon as I get to the updates (I'm quite busy right now), I'll let you know.

BR,

Nicolas

NE3D

Update with a downloadable scene for testing purposes :)

https://www.facebook.com/EnterRealityVR/posts/1097933283703128

st.app

Hello Nicolas,

I'm trying to create something similar in Unity.
Can you please give me an example (ideally a piece of code) of how to transfer the position of the tracker to the avatar (hips)?
How do you determine the distance between the tracker and the hip?
Are you using a fixed value or do you have a calibration function?

Many thanks in advance
Stephan

NE3D

Hi Stephan,

Sorry for the late reply.

The distance from the hip is determined by the position of the hips from PN compared to the world position of the Vive Tracker.

In short, all the calibration is given by the data from Axis, so the only thing used is the delta value to drive the hips position, and since the "root" of the skeletal hierarchy is the hips, all the other joints follow the same delta.
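In other words (a sketch under my own naming, not the actual project code): the delta between tracker and hips is captured once, in the calibration pose, and then subtracted from the tracker position every frame; because the hips is the root, the whole skeleton follows.

```python
def calibrate(tracker_pos, pn_hips_pos):
    """Capture the (tracker - hips) delta once, in the calibration pose."""
    return tuple(t - h for t, h in zip(tracker_pos, pn_hips_pos))

def hips_world_position(tracker_pos, delta):
    """Each frame: tracker position minus the calibrated delta gives the
    hips position; every child joint follows because the hips is the root."""
    return tuple(t - d for t, d in zip(tracker_pos, delta))
```

At the calibration frame itself this returns exactly the PN hips position, so the correction only kicks in as PN's positional estimate drifts away from the optical one.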

As soon as the entire setup is done I'll publish the demo, so that you can see the full code (very probably in January 2019).

Cheers

Nicolas

About Perception Neuron

We are part of Noitom Ltd. and dedicated to the development of motion capture systems for entertainment, sports, science and medicine.

  • 278 NE 60 St
    Miami, Florida 33137
  • (305) 521-3124
  • +86 10 82055391 ext. 852
  • contact@neuronmocap.com
