A year ago, as a Blender Foundation Certified Trainer and experienced 3D animator, I was invited to thoroughly test the Radical platform for AI motion tracking. I am glad to have collaborated with them to bring Blender into their workflow and help generate correct rigging solvers for the AI.
Do you need #MotionCapture data in #Blender? This technical review walks you through the basic process: generating .fbx files from your own videos (including phone footage), importing them into Blender, and animating with Rigify through the NLA editor.
Radical website: https://getrad.co
Motion Capture data video: https://getrad.co/scan/686850
Importing a Mocap rig
While providing technical assistance to Radical's tech team, we went over many technical considerations needed to make Blender work seamlessly with the tracked mocaps. I will not be listing them here, as you'll understand this evaluation is reserved for the company. All I can tell you is that the future is bright for AI-generated, video-tracked mocap in Blender.
I will, however, address one common issue in any mocap scenario that you should always be aware of: “bone rolls”.
We found that the AI rig generation needed to correct the “roll” parameter in the mocap data. This is crucially important in any mocap session, as the tagged rig could be twisted at the wrong angle when the captured data is interpreted. Not only would this mean a lot of technical curve-cleaning work, it would also defeat the very purpose of the AI producing such accurate body tracking if the rig couldn't be tagged at the correct roll angle.
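To see why roll matters, it helps to remember how Blender defines a bone's orientation: the head-to-tail direction is the bone's local Y axis, and the roll value rotates the perpendicular X/Z axes around that Y axis. A wrong roll leaves the bone pointing the right way but twists everything weighted to it. Here is a minimal, hedged sketch in plain Python (no bpy required) using Rodrigues' rotation formula; the vectors and the 90-degree roll are illustrative values, not data from Radical's solver:

```python
import math

def rotate_about_axis(v, axis, angle):
    # Rodrigues' rotation formula: rotate vector v about a unit-length
    # axis by the given angle (radians).
    c, s = math.cos(angle), math.sin(angle)
    dot = sum(a * b for a, b in zip(axis, v))
    cross = (axis[1] * v[2] - axis[2] * v[1],
             axis[2] * v[0] - axis[0] * v[2],
             axis[0] * v[1] - axis[1] * v[0])
    return tuple(vi * c + cr * s + ax * dot * (1 - c)
                 for vi, cr, ax in zip(v, cross, axis))

bone_y = (0.0, 1.0, 0.0)  # head->tail direction: the bone's local Y axis
bone_x = (1.0, 0.0, 0.0)  # perpendicular axis, controlled by the roll value

roll = math.radians(90)   # an illustrative bad roll of 90 degrees
twisted_x = rotate_about_axis(bone_x, bone_y, roll)  # X axis is twisted
same_y = rotate_about_axis(bone_y, bone_y, roll)     # bone direction unchanged
```

The point: `same_y` is identical to `bone_y`, so the skeleton looks correct in a static pose, while `twisted_x` has swung 90 degrees, which is exactly the hidden twist that corrupts deformations and rotation curves. Inside Blender you would inspect or fix this on each `EditBone.roll` in Edit Mode rather than with this standalone math.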
I will leave you with some general considerations you should always look out for when working with mocap data in Blender.
Like and subscribe if you find this article useful. Thanks!