
Live Motion Capture in Virtual Production for Final Pixel

Updated: Feb 16

Specialising in end-to-end Virtual Production for film, TV and advertising, Final Pixel have honed their VP pipeline and developed a reliable workflow for shooting with high-end LED walls and cameras across multiple scenarios and studios in the US and UK.


Understanding that Virtual Production moves elements of a traditional VFX post-production pipeline into pre-production, Final Pixel knew that the next evolution of the technology was to look at the elements which had - until now - been considered too heavy or complex to move out of the post-production workflow.


“Creature work is a big area for this, and also extremely important for narrative storytelling. Having live interactions between real-life actors and creature or character animation cements the creative process on a stage and opens up a huge number of opportunities across the filmmaking spectrum,” Final Pixel CEO, Michael McKenna, explains. “We’ve been trying to incorporate the use of digital animated humans or creatures into our Virtual Production shoots for clients. For example, we have experimented with Epic’s wonderful MetaHumans and pre-recorded motion capture animations as background digital humans on various shoots. Ultimately they never made the final cut due to some limitations, which typically stemmed from two key factors. Firstly, the realistic and believable look of the character - dictated by the realism of the character model and rendering. Secondly, the realistic movement of the character - dictated by the animation.”


This October, Michael and the team at Final Pixel began a ‘proof of concept’ research project to incorporate live-action motion capture of a detailed creature animation into their VP workflow.


The project fitted perfectly with the ethos of the new Virtual Production Test Stage in Surrey, a space that encourages curiosity and experimentation in the field, and was strengthened by the motion capture experts from Target3D.


Marta Lunes was the motion capture artist who interpreted the werewolf monster for this project. She performed within a motion capture volume, rigged up in front of the Virtual Production Test Stage, which allowed the director and technicians full 360-degree access to the live and virtual action in real time. Marta explains, “I see a potential victim and in order to lure them into my claws I initiate a dance battle. The markers are very accurate, they really follow what you are doing - for example, this character is very hairy so you see the hair moving too, which is a lovely detail. Even with the facial capture it follows when I close my eyes, or if I smile the monster opens its mouth.”


The artist’s movements were tracked in Motive, and the data was then passed through MotionBuilder, where her skeleton was retargeted onto the character before being transferred to Unreal. The team utilised Disguise for its rendering power, producing a creature convincing enough to meet Final Pixel’s standards. Michael McKenna added, “The potential uses for this sort of technology are many, and could even be used to augment existing CG pipelines as a hybrid approach. We had a lot of fun doing this at the new Digital Catapult and Target3D Virtual Production Test Stage, an amazing resource for creative R&D in the UK.”
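The retargeting stage of a pipeline like this - mapping the performer's captured skeleton onto a creature rig with different bone names - can be sketched in simplified form. The bone names, the `retarget_frame` helper and the sample frame below are illustrative assumptions for this article, not Final Pixel's actual Motive/MotionBuilder setup:

```python
# Minimal sketch of skeleton retargeting: copy per-joint rotations from a
# performer's tracked skeleton onto a creature rig whose bones are named
# differently. All names and values here are hypothetical illustrations.

# Map from the performer's (source) joint names to the creature's (target) bones.
BONE_MAP = {
    "Hips": "creature_pelvis",
    "Spine": "creature_spine_01",
    "Head": "creature_skull",
    "LeftHand": "creature_claw_l",
    "RightHand": "creature_claw_r",
}

def retarget_frame(capture_frame: dict) -> dict:
    """Translate one frame of mocap joint rotations onto the creature rig.

    capture_frame maps a source joint name -> (x, y, z) Euler rotation
    in degrees. Joints with no mapping (e.g. prop markers) are dropped.
    """
    return {
        BONE_MAP[joint]: rotation
        for joint, rotation in capture_frame.items()
        if joint in BONE_MAP
    }

# One illustrative frame as it might stream in from the tracking system.
frame = {
    "Hips": (0.0, 12.5, 0.0),
    "Head": (5.0, -3.0, 0.0),
    "LeftHand": (40.0, 10.0, 2.0),
    "Unmapped_Prop": (0.0, 0.0, 0.0),  # ignored: not part of the creature rig
}

creature_pose = retarget_frame(frame)
print(creature_pose)
```

In a live setup this translation runs every frame, so the remapped pose can drive the character in the engine with no perceptible lag between the performer's movement and the creature on the wall.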



 

