Over the span of just a couple of years, virtualization has transformed the way we connect and engage with one another. There’s been an explosion of new platforms and user behaviors—as well as a renewed urgency for brands to support them. Operating at the cusp of new opportunities in digital, the Labs.Monks play a crucial role in pushing technology to its outer limits while examining its practical uses for our wider team.
These insights take the form of newsletters, podcasts and reports—the most recent featuring motion capture and the ways it can plug into a variety of content production workflows. Given motion capture’s growing influence across digital experiences, the Labs.Monks’ research into the technology illustrates how the team connects with different categories and talent to unlock innovation.
Motion Capture in the Moment
Digital has gone from the place where you go to buy a concert ticket, to the place where you go for the concert. And as experiences in the metaverse continue to illustrate this concept, there’s a growing need for solutions that close the gap between the physical and virtual worlds—like expressive VTubers, live events that star virtual characters, purchasable dances and greetings for avatars and more. What each of these examples has in common is that they are activated by motion capture.
Motion capture technology records the actions of humans to digitally represent live movement. It takes many forms: suits outfitted with several accelerometers (the same tech that tells your phone whether and how you’re holding it), camera-based tracking of balls or dots on the body, and camera-based tracking powered by artificial intelligence (like a Snapchat filter). With so many ways to record motion in real time, mocap has become more affordable and accessible than ever—and has far-reaching applications in building digital content and experiences.
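To make the suit-based approach concrete: under the hood, a sensor suit reports a local rotation per joint, and software composes those rotations down the skeleton to recover where each body part sits in space (forward kinematics). The sketch below is a minimal, illustrative version of that idea in plain Python—it is not Xsens’ or the Labs.Monks’ actual pipeline, and the function and joint names are hypothetical.

```python
import math

def quat_mul(a, b):
    # Hamilton product of two quaternions given as (w, x, y, z).
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def quat_rotate(q, v):
    # Rotate vector v by unit quaternion q (computes q * v * q^-1).
    qv = (0.0, v[0], v[1], v[2])
    qc = (q[0], -q[1], -q[2], -q[3])
    _, x, y, z = quat_mul(quat_mul(q, qv), qc)
    return (x, y, z)

def forward_kinematics(skeleton, orientations):
    # skeleton: joint -> (parent or None, bone offset in the parent's frame),
    #   listed parent-first so each parent is resolved before its children.
    # orientations: joint -> local rotation quaternion (e.g. from an IMU).
    # Returns joint -> world-space position.
    world_rot, world_pos = {}, {}
    for joint, (parent, offset) in skeleton.items():
        if parent is None:
            world_rot[joint] = orientations[joint]
            world_pos[joint] = offset
        else:
            world_rot[joint] = quat_mul(world_rot[parent], orientations[joint])
            world_pos[joint] = tuple(
                p + d for p, d in zip(world_pos[parent],
                                      quat_rotate(world_rot[parent], offset)))
    return world_pos

# Toy example: a one-bone "arm" attached to a root. Rotating the root
# 90 degrees about the z-axis swings the arm from +x onto +y.
skeleton = {"root": (None, (0.0, 0.0, 0.0)),
            "arm": ("root", (1.0, 0.0, 0.0))}
s = math.sqrt(0.5)  # quaternion for a 90-degree rotation about z
orientations = {"root": (s, 0.0, 0.0, s), "arm": (1.0, 0.0, 0.0, 0.0)}
positions = forward_kinematics(skeleton, orientations)
```

A real suit streams dozens of such joint orientations per frame and additionally fuses gyroscope and magnetometer data to fight drift—but the core math of turning per-sensor rotations into a posed skeleton looks much like this.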
Realizing the different ways that mocap can impact workflows across MediaMonks, the Labs.Monks have experimented with the Xsens MVN Link suit to better understand its accuracy in sensing fine movements. This took shape in a prototype VR experience in which users can play a digital instrument simply by moving their body. Learnings from that experience will help the Labs.Monks assist other teams interested in using mocap.
We investigated mocap pipelines to get a solid idea of how the technology can be applied across a variety of other capabilities, and to discover how we can help teams get the most out of it.
Streamlined Virtual Production
The history of motion capture is intertwined with the history of film, from hand-rotoscoping footage in traditional animation to an actor’s digital embodiment of Gollum in The Lord of the Rings. So it should come as no surprise that one of the most obvious uses of mocap is in virtual production, where it eliminates the need to shoot in front of a green screen in a large studio setting.
“For one of our clients, we are doing a hybrid type of editing that mixes offline capture, VFX and animation,” says Cas de Brouwer, Head of Post Production. “We end up doing a lot of rotoscoping to add in the animation, but with motion capture, we can really benefit.” With tabletop production, for example, motion data captured from a robotic arm-mounted camera can be ported to Adobe After Effects to add in virtual objects and animations that are lined up with the camera’s perspective.
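The "lining up with the camera's perspective" step above comes down to projection math: given the tracked position and rotation of the camera for each frame, a virtual object's 3D anchor point can be converted into 2D pixel coordinates that a compositing layer can follow. The snippet below is an illustrative pinhole-camera sketch, not the team's actual tooling—the function name, parameters and the assumption of an ideal lens (no distortion) are all simplifications for clarity; the actual export into After Effects is omitted.

```python
def project_point(point, cam_pos, cam_rot, focal_px, center):
    # Pinhole projection of a world-space 3D point into pixel coordinates.
    # cam_pos: camera position in world space.
    # cam_rot: 3x3 world-to-camera rotation matrix, given as three row tuples.
    # focal_px: focal length expressed in pixels.
    # center: (u, v) of the image center, e.g. (960, 540) for 1080p.
    rel = [p - c for p, c in zip(point, cam_pos)]
    # Rotate the offset into the camera's frame.
    x = sum(r * v for r, v in zip(cam_rot[0], rel))
    y = sum(r * v for r, v in zip(cam_rot[1], rel))
    z = sum(r * v for r, v in zip(cam_rot[2], rel))
    if z <= 0:
        return None  # point is behind the camera this frame
    u = center[0] + focal_px * x / z
    v = center[1] - focal_px * y / z  # screen y grows downward
    return (u, v)
```

Run per frame against the robotic arm’s recorded camera path, this yields a track of 2D keyframes, which is exactly the kind of data a compositing package can consume to pin a virtual object convincingly into live-action footage.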
Patrick Staud, Chief Creative Technologist, has already experimented with mocap-enhanced virtual production for automotive clients—for example, using mocap in pre-production to capture actors’ movements and then reflect them on the car’s surface in post. But the technology can have a greater impact within fully virtual environments.
In automotive, you have a lot of influencer-style content where you have to touch the product. If done virtually, motion capture lets us interact directly by touching buttons and opening doors.
New Efficiencies in Animation
When it comes to efficiencies that are unlocked by motion capture, animation can benefit substantially. Animation as a capability is often the through line in producing digital experiences that range from original content to immersive metaverse environments. And while manually drawn or rigged animation isn’t going anywhere—the medium thrives on exaggeration and stylization best produced by an artist’s hand, rather than verisimilitude—in some cases, closely mimicking mannerisms is key.
This was true in a virtual performance starring Post Malone that we helped animate to celebrate Pokémon’s 25th anniversary. The singer’s movements were translated into the virtual environment through motion capture. “With live events on hold, this was a great opportunity to use motion capture to create an animated experience to engage our audience,” says Jessica Norton, Executive Producer, Experiential at Media.Monks.
Head of Animation.Monks Thymo van der Vlies notes how representing Post Malone realistically was key, despite his stylized look: “Because he is a celebrity, you want to capture his exact movements—he holds the microphone in a very specific way, for instance,” he says. “Use of motion capture is a huge time saver if we need to capture realistic movement.”
This same need makes it useful for research and pre-visualization: “For one of our projects, a fun part of the process was that we had animators filming each other falling down to capture what the movements looked like,” says Van der Vlies.
Immersive Live Events Combine Fantasy and Reality
Animation and virtual production come together to create live experiences starring virtual characters who can be interacted with in real time. “We’re creating images in ways that in the past would take months of post-production work—and now they are live,” says Director of Creative Solutions Lewis Smithingham, who recently worked with motion capture to bring an animated character into three dimensions for the first time during a live event. “Audiences have a higher bar for quality than ever before, and these tools allow us to deliver them live.”
In addition to breaking the boundaries between fantasy and reality, motion capture used in a live events setting can also bridge time and space. Actors can be recorded in separate locations and brought together digitally—a solution to ongoing pandemic restrictions as well as talent’s busy schedules. Connectivity like 5G ultra-wideband and multi-access edge computing can reduce onsite processing and latency, ensuring performers can interact with one another—and with audiences—in real time, no matter the location.
If something isn't interactive, it's broken. By creating high quality VFX in-camera, we're able to bring the character to life.
Toward Future Innovation
The uses of motion capture are far-reaching, and the Labs.Monks continue to experiment and find ways for the technology to become integrated within our diverse production workflows. At the same time, the team is keen to forecast future innovations and their practical applications. “Labs has an opportunity to take more risks than other teams,” says Eichhorn. “We’re a little more flexible and have the resources to experiment with things. We can figure out if something is viable without having to invest too many resources.”
Their approach of pairing experimentation with collaboration across category teams—including automotive, fashion and sports—has been crucial to keeping our people and our clients on the bleeding edge of technology. As virtualization continues to shape new ways to interact and engage digitally, the Labs.Monks will continue to connect insights and subject matter expertise in the space.