You may associate motion capture with some of Hollywood’s biggest and most expensive films, like Avatar, Planet of the Apes, or Lord of the Rings. We’ve all seen videos of Andy Serkis with tracking dots all over his face, or actors wearing intricate arrangements of ping-pong-ball-like spheres on their bodies. But motion capture can also be a cost-efficient production tool for smaller or indie projects.
Motion capture technology is being used to create expressive and lifelike 3D animated characters across films, TV shows, games, commercials, social media, and more. As the capabilities of the technology grow, its usability and accessibility have grown as well, and it may be an invaluable tool for your next project.
Motion capture, or “mocap” for short and also known as performance capture, is a production tool and technique used to track and record the movement of people and convert it into digital data. This data can be applied to digital characters or objects to create lifelike or realistic animation.
Mocap systems are most commonly used to track body, face, or hand movements. They range from multi-camera tracking setups, such as OptiTrack and Vicon, to wearable systems with individual sensors, such as Rokoko and Xsens.
Setting up your animation pipeline starts with determining which motion capture system you plan to use.
A camera system like OptiTrack or Vicon uses high-speed cameras and reflective markers to capture the movement of objects or people in 3D space with high precision and accuracy. The markers are attached to the object or person being tracked, and the cameras track the markers in real-time. While this system outputs highly precise data, camera tracking is usually specific to one production location due to long set-up times.
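As a simplified illustration of how a multi-camera system recovers a marker's position: each camera defines a ray from its lens through the marker's image, and the marker sits where the rays (nearly) intersect. A minimal 2D sketch with made-up camera positions and ray directions (real systems solve this in 3D across many calibrated cameras):

```python
def triangulate_2d(o1, d1, o2, d2):
    """Intersect two rays o + t*d (2D) by solving a 2x2 linear system."""
    rx, ry = o2[0] - o1[0], o2[1] - o1[1]
    # [[d1x, -d2x], [d1y, -d2y]] * [t, s]^T = r, solved via Cramer's rule
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    if abs(det) < 1e-12:
        raise ValueError("rays are parallel; marker cannot be triangulated")
    t = (rx * (-d2[1]) - (-d2[0]) * ry) / det
    return (o1[0] + t * d1[0], o1[1] + t * d1[1])

# Two cameras at (0, 0) and (4, 0) each sight the marker along a ray;
# the rays meet at the marker's position.
marker = triangulate_2d((0, 0), (2, 3), (4, 0), (-2, 3))
print(marker)  # (2.0, 3.0)
```

In practice the rays rarely intersect exactly because of noise, so production systems find the least-squares closest point instead.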
On the other hand, a wearable mocap system uses inertial sensors to capture and track the movements of the body, such as with Rokoko and Xsens. The body suit is equipped with a number of small, lightweight sensors that contain accelerometers, gyroscopes, and magnetometers.
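One common way to turn those raw accelerometer and gyroscope readings into a stable joint angle is sensor fusion. A minimal sketch of a complementary filter, which trusts the gyroscope over short timescales and the accelerometer's gravity reading over long ones (actual suits use far more sophisticated fusion):

```python
def complementary_step(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One fusion step: integrate the gyro, then nudge toward the accel angle.

    alpha close to 1 trusts the smooth-but-drifting gyroscope; the remaining
    weight corrects drift using the noisy-but-unbiased accelerometer.
    """
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# The sensor is actually tilted 10 degrees; the gyro reports no rotation.
angle = 0.0
for _ in range(500):
    angle = complementary_step(angle, gyro_rate=0.0, accel_angle=10.0, dt=0.01)
print(round(angle, 2))  # converges toward 10.0
```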
Hand capture systems, such as Xsens Gloves by Manus, can also be paired with the suit to get precise finger and hand movements when needed. Unlike other systems that require a dedicated studio space with multiple cameras, Xsens is easily transported and used on location.
For capturing facial movements and expressions, Faceware is a wearable headset that captures the facial expression of an actor, such as movement of the eyebrows or the opening and closing of the mouth. The data collected from Faceware can be applied to a 3D rigged character in real-time or in post-production.
For a more cost-effective and accessible facial capture solution, some indie directors and artists use Apple’s ARKit face tracking on iPhone, which can track 52 distinct facial expressions as well as head movements.
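Under the hood, face-tracking systems like ARKit express each of those expressions as a blendshape: a set of per-vertex offsets from the neutral face, scaled by a weight between 0 and 1. A hedged sketch of how weights deform a mesh (the toy vertex data is made up; `jawOpen` is one of ARKit's blendshape names):

```python
def apply_blendshapes(neutral, deltas, weights):
    """Deform a mesh: vertex = neutral + sum(weight * per-shape offset)."""
    out = list(neutral)
    for shape, w in weights.items():
        for i, d in enumerate(deltas[shape]):
            out[i] += w * d
    return out

# Toy 1D "mesh" of two vertex coordinates; jawOpen moves the
# second vertex by 2 units when fully engaged.
neutral = [0.0, 0.0]
deltas = {"jawOpen": [0.0, 2.0]}
print(apply_blendshapes(neutral, deltas, {"jawOpen": 0.5}))  # [0.0, 1.0]
```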
The combination of body, facial, and hand motion capture can create a complete performance that accurately captures the full range of human movement and expression. This makes it a powerful tool for film and video game production, virtual reality experiences, and other applications that require realistic and immersive character animation.
Although it is most commonly associated with film, TV, and games, mocap is now a useful tool for music videos, short films, social media, VR/AR, and more. It can be used in any 3D animation pipeline, including fully animated projects or even VFX in live-action projects. It is often used when realistic, accurate human movement is required, or when it is important to capture subtle nuances in movement that are difficult to replicate with keyframe animation.
Traditionally, all 3D animation was done with keyframe animation. This means an animator must pose the character frame by frame, which can add up to hundreds of thousands of frames for a single project. It’s an incredibly complex and tedious process.
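To make concrete what keyframing involves: a keyframe stores a value (say, a joint angle) at a given frame, and the software fills in every frame in between by interpolation. A minimal linear-interpolation sketch (animation packages typically use spline curves rather than straight lines):

```python
def sample_keyframes(keys, frame):
    """Linearly interpolate a channel from (frame, value) keyframes."""
    keys = sorted(keys)
    if frame <= keys[0][0]:
        return keys[0][1]
    if frame >= keys[-1][0]:
        return keys[-1][1]
    for (f0, v0), (f1, v1) in zip(keys, keys[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)  # 0..1 position between the keys
            return v0 + t * (v1 - v0)

# A joint rotates from 0 to 45 degrees between frames 0 and 24.
print(sample_keyframes([(0, 0.0), (24, 45.0)], 12))  # 22.5
```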
Motion capture provides data that realistically portrays how the body naturally moves. The animator then reviews the raw data, fixes any errors, and tweaks the motions to match how they want the character to move. This process is called cleanup animation.
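Part of that cleanup is filtering jitter out of the raw capture. One simple illustrative approach is a centered moving average over each motion channel (real cleanup tools use more selective filters so deliberate motion isn't softened along with the noise):

```python
def smooth(samples, window=3):
    """Centered moving average; the window shrinks at the clip's edges."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo, hi = max(0, i - half), min(len(samples), i + half + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out

# A single-frame spike (a tracking glitch) gets spread out and damped.
print(smooth([0.0, 0.0, 9.0, 0.0, 0.0]))  # [0.0, 3.0, 3.0, 3.0, 0.0]
```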
Every production’s pipeline will be different, but a few key steps are common to successful ones.
First, pre-production planning is arguably the most important step. In these meetings, the production team typically uses the script to plan the shot list, assign responsibilities, and settle other production logistics. The director, producers, technical directors, motion capture actors, animators, compositors, and others may take part in this discussion.
On the shoot day, the production team tests all the hardware and preps the environment with props. As each shot is captured, the team can see a real-time visualization (often called previsualization, or previs) of the character in the environment. This is an invaluable tool for directors making on-set adjustments, and for actors, who can see how their performance will look.
After recording, the production team processes the data and cleans up any errors or glitches. It is then delivered to the animators who do cleanup animation, before handing it over to a compositor to incorporate the VFX and render the final version.
Motion capture technology has revolutionized how 3D character animation is created in the entertainment industry. As the technology continues to advance and become more widely available, more productions will gain access to its capabilities. Integrating artificial intelligence (AI), with products like Plask or DeepMotion, is becoming more popular as well.
While big-budget films will embed mocap ever deeper into their pipelines, niche and emerging fields like virtual influencers and VR/AR are using it for their 3D characters as well.