JACKIE'S REVENGE - Unreal Metahumans | Faceware | PN3 Short Film | Perception Neuron Motion Capture

Description

Created by: Jason Cuadrado | Cinemonster Cinematics

"Jackie's fueled by rage and in too deep. Hellbent on revenge, he turns the tables on Paulie by making his worst fear come true. I had a couple of tech things I needed to test and thought it would be fun to do it with a short GTA-style cutscene. There's a lot of weird stuff in this test, but I wasn't going to obsess about making it perfect. The important thing was to start working out a new performance capture pipeline. 
1) PERCEPTION NEURON 3 My PN3 system arrived while I was finishing up "The Soot Man," so it's been sitting in its box for almost a month. I finally got to test it out and was really impressed by how smooth the capture was. I'll have to get used to Axis Studio and its tools, but so far so good. I also really like the fit of the new strap system (although the straps collect sweat a lot more easily than my PN Pro's). That said... gloves. Thank heavens for gloves! Next on my Noitom wishlist is simple prop capture for the PN system. I would be overjoyed if I could place markers on a small object or weapon. 
2) FACEWARE FOR iCLONE I decided to get Faceware for iClone for a couple of reasons. It lets me attach a lighter GoPro to my DIY helmet, and I don't have to stream the facial data to my computer at the same time I'm streaming from the PN3. Now I can record to the SD card and apply the animation in iClone later using a PNG sequence. I found it to work pretty well! You still need to tweak the data, but it's a great foundation. Note: I mask out my lips when I record the facial data in iClone because I like the control of AccuLips. 
3) METAHUMANS I used the first Metahumans in my "A Job to Die For" short when the feature was just a demo. Now that it has officially launched, what can I say? They look amazing, especially compared to the characters I've been using thus far. I can't wait for further innovations that make it easier to apply different clothing, and for the limited LOD situation to improve.
The animation was streamed to iClone via Motion Live and then applied to Unreal's Metahumans (using their dummy avatars). The connection between the animation and the Metahumans is still a little wonky: the fingers bend in strange ways, the body doesn't look quite right, and animations get distorted. Another iClone character is also required for the facial animation, which isn't ideal (and I think hurts the FPS). I'm sure it will all get streamlined in time, and I can't wait.
In terms of the test itself, I did all the animation and voices myself, as usual. I'll never stop being amazed at what you can accomplish by yourself using this technology. It took two weeks to put this together, start to finish.
The gameplay uses a FPS template from the marketplace - https://www.unrealengine.com/marketpl...
The car blueprint is Crypto - https://www.unrealengine.com/marketpl...
My single contribution to the music is the squealing electric guitar chord towards the end (viva '80s action movies).
I still have a lot more testing to do but I feel like I'm moving in the right direction. As always, thanks for watching!"

About Perception Neuron

We are part of Noitom Ltd. and dedicated to the development of motion capture systems for entertainment, sports, science and medicine.

  • 278 NE 60 St
    Miami, Florida 33137
  • (305) 521-3124
  • +86 10 82055391 ext. 852
  • contact@neuronmocap.com
