Mention motion capture and the first thing likely to pop into your head is an actor in a black spandex suit studded with rubber balls. That norm, however, may soon be a relic of the past, as software from technology company Move hopes to revolutionise motion capture for AAA and indie studios alike, as well as beyond games.
“We’ve created a platform that allows you to capture authentic, high quality, high fidelity human motion, using just regular video cameras,” says Move co-founder and CEO Tino Millar. “We can capture extremely high quality human motion, and make sure that motion can transfer and run on different avatars or 3D models, and then be used inside all sorts of different game engines to power really interesting user content.”
The captures have been tested to work in popular game engines like Unreal and Unity, as well as Roblox and proprietary engines. In fact, Move has already partnered with Electronic Arts, which gave a demonstration at SIGGRAPH, the annual computer graphics conference, last year in Vancouver. The results were remarkable: the same actor's performance, captured with Move's system, matched the quality and accuracy achieved with an incumbent marker-based system.
The difference in costs and practicalities is what Millar hopes makes Move appealing as well.
“Traditionally, to get high quality human motion you need to put people in suits, including animals and kids, and it’s pretty uncomfortable,” Millar explains. “You have to be in a very controlled environment with the right lighting conditions. If you go to the bathroom, you have to take it off, put it back on, recalibrate everything.”
The process of capturing motion also relies on dozens, sometimes over a hundred, high-end specialist cameras, which then need to be set up under the right conditions in a studio space. Move's technology aims to make capture more accessible, producing high-quality motion with as few as two cameras, which can be off-the-shelf products or even your iPhone (Move launched its iOS app just last month). The idea is also that this can be done anywhere rather than in a studio: an office, your bedroom, even outdoors.
Move’s software uses a combination of advanced AI, computer vision, biomechanics, and physics to track multiple points on the human body, without any markers.
“We make sure that everything is obeying the laws of motion and momentum to make sure you get accurate and authentic capture coming out from the people in the scene, so we can track a single person or multiple people,” Millar adds. “Then we translate that data so you can actually run it inside a games engine or graphics engine.”
One can imagine how beneficial this technology would have been at the height of the pandemic, when studios faced immense challenges adapting to remote capture, with equipment shipped to actors’ homes and living rooms transformed into makeshift capture studios. But Millar had been preoccupied with the idea for years before that, having researched continuum mechanics, computer vision, and 3D simulation for a PhD at Imperial College. The real catalyst, however, was a simpler personal problem after the birth of his son.
“I didn’t have time to go to the gym anymore, and I was also really busy with work, so I started working out from home using apps,” he explains, except he was left wanting from the apps that were available. “I thought, let’s use the technology that I was making during my PhD, so I set up cameras in my living room, I’d work out and I was using the cameras to track my human motion in 3D, and I was trying to track my sets, my reps, and my technique.”
What began as an attempt to track his own motion more accurately, essentially building his own digital coach, eventually evolved into Move, whose software now helps users across mediums digitise their motion.
In the context of motion capture for games, it’s a paradigm shift: not only will it benefit the workflow of studios that already do motion capture, but the reduced cost and practical demands make it more accessible to indie developers, with budgets now in the tens of thousands rather than millions. Move’s ability to capture multiple people at once on the same limited budget can also be an advantage.
“We’ve tested it in Wembley Stadium, for example, where we’re able to track 22 people to capture a football match,” Millar says, adding that Move can also track objects like a ball being kicked around. “We’ve also tested people playing basketball, rock climbing, break dancers, we’ve captured ballet, singing, and fashion.”
Millar sees a big potential for Move in the metaverse, where users don’t just make an avatar that looks like them, but can have an avatar that moves just like them.
“Motion is like a fingerprint, so if you’re to motion capture yourself now and see yourself as a stick man, you can tell it’s you because you can tell how people move,” says Millar. “So you can capture that true essence of people and you get a much more unique representation of how they move inside 3D digital worlds.”
He demonstrates this with a video of his own son recorded outdoors, the capture then transferred onto a robot avatar that replicates the exact same movements (see below). Another example comes from a user testing the iPhone app in beta, who captured a local band’s performance; even though the results are rendered as stick figures, the motions of the singer, guitarist, keyboardist, and drummer all transfer accurately.
This ability to capture unique motion from a wide variety of people also has greater implications for something like sports games. While Millar isn’t able to say which games EA is currently using Move for, the greater freedom and flexibility of the system could mean capturing a whole football team, instead of using a few athletes’ captures to represent every player in the game.
“Or let’s say for the NBA, instead of using semi-professional athletes who’ve been brought into the studio and their motion is translated in the game, you can use existing camera feeds to capture LeBron James’ true motion,” Millar adds. “Now, when you’re playing a basketball game, it has LeBron James’ true motion in there, and it comes from his unique motion.”
That it’s possible to capture movements even from existing or historical footage does, however, raise a question: what happens if the software is used to capture a person’s motion and then deploy it without their consent? It’s a tricky area, especially if those motions are then sold as emotes on a digital storefront, similar to the lawsuits Epic faced over allegedly using copyrighted dance moves in Fortnite. The difference is that you can’t copyright an individual’s movement, although Millar thinks a legal framework for motion licensing and rights will emerge as the industry realises the opportunity.
“Our approach is to ensure the highest fidelity motion that can be extracted and working with the talent and rights owners to make this an official licensed product,” he says. “In our latest investment round, we had a major music label invest, so we’re working with them to essentially capture historical artists’ motion as well as current artists they have, and then bring that motion into 3D worlds.”
Digitising human motion in a way that is accessible to far more people than before remains Move’s mission, along with removing the restrictions of the systems people currently have to deal with.
“Before, motion capture was this kind of painful thing you had to plan out in advance, because renting a studio is costing you tens of thousands per day, and you need to make sure that everyone shows up, and make sure all the equipment’s there,” explains Millar. “But now you can just do this from your living room or from an office so you can be more iterative. You can have an idea in the shower and be like, ‘let’s do the motion capture for that like right now’. You can set it up in less than five minutes, and you just need some iPhones.”