
Easy 3D Markerless Facial and Body Tracking from video

DeepMotion to Blender

#DeepMotion works with #ReadyPlayerMe services or your own character to create animation with markerless #FacialTracking and Body Tracking from video. In this step-by-step walkthrough video, we will bring motion capture clips into Blender and put them to use.

In the streaming era we live in, you're probably wondering how to capture your body motion and drive a 3D version of yourself online. You can do that easily with DeepMotion's Animate 3D, which is the leading solution for AI markerless motion capture.

DeepMotion's user interface is very simple. Organize your clips in the Library.
You can create your custom avatar and 3D models using Ready Player Me and Blender.

Preparing your avatar or custom character

First, you need to create an account on readyplayer.me. Then you can upload a selfie (it is not required, though) to start customizing your model, which is fully compatible with different online communities such as VRChat, Animaze, and many others.

You can then connect your avatar from the ReadyPlayerMe hub site with DeepMotion. There is a live link between your ReadyPlayerMe account and DeepMotion, so any changes to your ReadyPlayer.me avatar are reflected in DeepMotion. No need to jump back and forth; DeepMotion has you covered.

You can also upload your own 3D models from Blender for full- or half-body captures.
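If you prefer to script that export instead of using Blender's File > Export menu, here is a minimal Python sketch using Blender's glTF exporter. The file path and option names reflect recent Blender versions and may differ slightly in yours, so treat it as a starting point rather than the official workflow:

import bpy

# Select your character's armature and meshes first, or adapt this
# to select them by name before exporting.
bpy.ops.export_scene.gltf(
    filepath="//my_character.glb",   # example path, relative to the .blend file
    export_format='GLB',             # single binary .glb file
    use_selection=True,              # export only the selected character
    export_animations=False,         # the motion will come from DeepMotion
    export_yup=True,                 # glTF convention: +Y up
)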

DeepMotion can track facial and body motion from nothing more than an uploaded video!

The resulting .GLB motion capture file is compatible with virtually any 3D software. We'll use Blender for this video, a full 3D open-source software suite for creating and editing animations for internet ads, virtual reality, streaming movies, videogames, and much more. Here's the general walkthrough:
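To give an idea of what the round trip looks like in Blender, here is a small Python sketch that imports a DeepMotion result and lists the actions that came in with it. The file name is just an example, and it assumes the glTF importer's default behaviour of leaving the new objects selected:

import bpy

# Import the DeepMotion .GLB result (example path) into the current scene.
bpy.ops.import_scene.gltf(filepath="/path/to/deepmotion_result.glb")

# The mocap arrives as an action on the imported armature; print what
# came in so you can push it down to the NLA or retarget it afterwards.
for obj in bpy.context.selected_objects:
    if obj.type == 'ARMATURE' and obj.animation_data and obj.animation_data.action:
        print(obj.name, "->", obj.animation_data.action.name)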

Motion capture from video (the technical review)

DeepMotion invited me to review their services from a technical point of view. During this process, the developers quickly jumped in to assist me with every question I had through their forums and email. Overall, if you're running a 3D production of your own, chances are they will answer any complex technical question you have.

As for my own experience with my own custom 3D character, there are a few more things to keep in mind, especially if you're using Blender, before exporting your character to .GLB. One thing you should really be mindful of is the naming of the bones your character will use. Since I knew I would be working with the .GLB format, I followed its specific humanoid bone-naming structure and also added a ROOT bone at (0, 0, 0). You may or may not add this bone at the top of the hierarchy, but I can assure you it will save you trouble when you need to mix NLA clips later in editing. A small sketch of that setup follows below.
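As an illustration of that ROOT bone tip (my own convention, not something DeepMotion requires), here is a short Blender Python sketch that adds a ROOT bone at the origin and parents the existing top-level bones to it. The armature is assumed to be the active object, and all names are examples:

import bpy

arm = bpy.context.active_object          # assumes your armature is active
bpy.ops.object.mode_set(mode='EDIT')

edit_bones = arm.data.edit_bones
root = edit_bones.new("ROOT")
root.head = (0.0, 0.0, 0.0)              # ROOT sits exactly at the origin
root.tail = (0.0, 0.2, 0.0)              # bones need a non-zero length

# Re-parent any bone that currently has no parent (e.g. "Hips") to ROOT.
for bone in edit_bones:
    if bone.parent is None and bone.name != "ROOT":
        bone.parent = root

bpy.ops.object.mode_set(mode='OBJECT')

With every clip sharing that ROOT at the origin, NLA strips line up on the same reference point, which is exactly what makes mixing them later far less painful.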

“At the core of game or movie making, the previz stage is probably where the most reviews and changes happen. I see DeepMotion’s facial and body tracking as a dynamic solution: upload your character with the motion video you need, act out the scene, and share the file with your team anywhere in the world.”

Pierre Schiller

I will be uploading a new video specifically to address that and other key points, so don't miss out: click subscribe and hit the notification bell, and you'll know when the next video is out. Meanwhile, check out how easy it is to use facial and body motion capture in Blender with DeepMotion:

