
Exclusive: The future of Motion Capture?


YEI motion capture demo

RedShark talks exclusively to YEI about their groundbreaking camera-less motion capture technology

Until now, accurate motion capture has called for complicated arrays of cameras, lights and markers on motion capture actors. The ultimate aim of motion capture is to map the exact movements of a human body onto a virtual skeleton, which can then be skinned and dressed to make a convincing character.

Avatar is the most widely seen example of the art so far, but the technique made a somewhat controversial major film debut in 2004's The Polar Express. It was controversial because it was the first mass-media example of a psychological effect called the "Uncanny Valley", so called because the more realistic computer animations of humans become, the less plausible they somehow seem. The "Valley" is the gap between what you see and what you feel.

Motion capture is now an intrinsic part of film-making, so advances in the technology are important for the film industry. And sometimes they come from unexpected places. Portsmouth, Ohio, for example.

Yost Engineering Inc. has developed a range of motion capture sensors so stuffed with gyros, accelerometers and all manner of location-sensitive electronics that they can supply position information with absolute and repeatable accuracy - without the need for cameras or special lights. For YEI, motion capture for film is just the start. They're looking at the consumer gaming and virtual reality sectors as well.

Watch the clip, and then read our interview with YEI's 3-Space team.

 

Do you see your extremely accurate motion capture equipment being used in movie making?

Yes. The potential for inertial motion capture technology is tremendous, and it offers many advantages over traditional optical motion capture systems. For example, with optical systems, performances must generally be recorded in a special motion capture studio outfitted with many expensive high-speed cameras. Props and sets must be carefully designed so as not to occlude the camera systems, and must be brought into or custom built in the studio itself. Recording a performance of simple things like climbing stairs, opening and walking through a door, or jumping a hurdle requires building and fabricating these props within the limited space of the optical mocap studio. Even something as simple as a sprinter running all-out for 50 meters becomes difficult to capture in the limited optical capture spaces. With a system like ours, these situations are trivial: simply strap on the sensors and head to the nearest stairs, door, or track and record the data.

The other advantage of inertial systems like these is that the data streams from the sensors in real time. There is little or no need for complex cleanup of recorded optical point-cloud data or other post-processing. Our product is relatively new, but we do have users at major animated motion picture studios using our system for some of their projects.

How did you get into the field of motion capture in the first place?

At our core, we're an R&D company. The YEI 3-Space Sensor systems grew out of our own research needs for dynamic robotics projects we were working on. For a robot to move and balance in a dynamic way, it needs sensors that can accurately aid in that balancing. At the time there were no reasonably priced sensors that gave acceptable performance, so we decided to make our own sensor systems. This led us into motion sensing research, a fascinating area in and of itself, and eventually to high-performance sensor units that are also affordable. Our thinking became: if we need something with the capabilities of our sensors, then others probably do too.

How did you interface it with the Unreal game engine?

The demo uses the Unreal engine, but we have other demos in progress that use Crytek's CryEngine and other engines. With our sensors, interfacing is relatively easy. Since our sensors include fairly sophisticated on-board processing that calculates and outputs a fully qualified, drift-free 3-axis orientation in global space, it is relatively easy to associate this orientation with an in-game object, such as a bone in the skeletal model of the player. Thus, there is little difficult work for the developer to do: simply slave the in-game bone orientations to the orientations produced by the associated sensors, and you have real-time in-game motion capture suitable for testing, recording or interaction.
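To make that "slaving" step concrete, here is a minimal Python sketch of the idea. The names used (SensorStream, Bone, the calibration logic) are illustrative stand-ins, not the actual YEI 3-Space or Unreal Engine APIs; a real integration would read quaternions from the sensor SDK and write them to the engine's skeletal mesh each frame.

```python
# Minimal sketch of "slaving" in-game bone orientations to sensor orientations.
# Everything here (SensorStream, Bone, the calibration step) is an illustrative
# stand-in, not the actual YEI 3-Space or Unreal Engine API.

import math
import random
from dataclasses import dataclass


def quat_multiply(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (
        aw * bw - ax * bx - ay * by - az * bz,
        aw * bx + ax * bw + ay * bz - az * by,
        aw * by - ax * bz + ay * bw + az * bx,
        aw * bz + ax * by - ay * bx + az * bw,
    )


def quat_conjugate(q):
    w, x, y, z = q
    return (w, -x, -y, -z)


@dataclass
class Bone:
    """Stand-in for a bone in the game engine's skeletal model."""
    name: str
    world_rotation: tuple = (1.0, 0.0, 0.0, 0.0)  # identity quaternion


class SensorStream:
    """Stand-in for a sensor streaming drift-free global orientations."""

    def read_orientation(self):
        # A real sensor would return its filtered orientation here; this one
        # just wobbles slightly around identity so the demo produces output.
        angle = random.uniform(-0.05, 0.05)
        return (math.cos(angle / 2.0), 0.0, 0.0, math.sin(angle / 2.0))


# Map each bone of the skeleton to the sensor strapped to that body segment.
rig = [
    (Bone("upper_arm_r"), SensorStream()),
    (Bone("lower_arm_r"), SensorStream()),
]

# Calibration: with the actor in a known pose (e.g. a T-pose), store the inverse
# of each sensor's reading so later readings are expressed relative to that pose.
offsets = {bone.name: quat_conjugate(sensor.read_orientation())
           for bone, sensor in rig}


def update_skeleton():
    """Per-frame update: each bone simply follows its sensor's orientation."""
    for bone, sensor in rig:
        q = sensor.read_orientation()
        bone.world_rotation = quat_multiply(q, offsets[bone.name])


update_skeleton()
for bone, _ in rig:
    print(bone.name, bone.world_rotation)
```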

Do you see a consumer version of this (to replace game controllers)?

This would certainly be an exciting direction for our technology, especially for mass-market applications such as gaming and electronic entertainment, and not just as a controller replacement. For example, since the 1990s there has been a sense that Virtual Reality was about to arrive and revolutionize gaming and entertainment, yet until recently the technology hadn't matured to the point where this was possible. Our commitment is to producing high-accuracy sensors that are priced to allow mass-market applications that were previously impossible with sensing solutions costing thousands of dollars.

What are the unique characteristics of your motion capture technology?

One of the key unique characteristics is that our sensors have zero orientation drift. You can set the sensor in a position and come back a week, a month, or a year later and the orientation will still be correct. Other systems eventually accumulate orientation error arising from gyroscopic drift, but ours will not.
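The interview doesn't describe the on-board algorithm YEI uses to achieve this, but the general principle behind drift-free inertial orientation is sensor fusion: the gyroscope is integrated for smooth, fast response, while absolute references such as the accelerometer's gravity vector (and typically a magnetometer for heading) continuously pull the estimate back, so bias cannot accumulate. The toy single-axis complementary filter below, with made-up numbers, contrasts naive gyro integration with a fused estimate; it illustrates the principle only, not YEI's implementation.

```python
# Toy single-axis complementary filter, illustrating why naive gyro integration
# drifts and why fusing in an absolute reference does not. All numbers are made
# up for the demo; this is not YEI's on-board algorithm.

import random

DT = 0.01          # 100 Hz sample rate
ALPHA = 0.98       # weight given to the integrated gyro between corrections
GYRO_BIAS = 0.001  # rad/s of constant gyro bias, the classic source of drift

true_pitch = 0.3   # the sensor sits motionless at a fixed tilt (radians)

# Assume both estimators start from the correct orientation.
gyro_only = true_pitch   # naive integration of the rate gyro
fused = true_pitch       # complementary-filter estimate

for _ in range(60_000):  # ten minutes of samples
    # Simulated measurements: the gyro should read 0 rad/s but carries bias and
    # noise; the accelerometer gives a noisy but absolute tilt from gravity.
    gyro_rate = GYRO_BIAS + random.gauss(0.0, 0.01)
    accel_pitch = true_pitch + random.gauss(0.0, 0.05)

    # Pure integration accumulates the bias forever: this is gyroscopic drift.
    gyro_only += gyro_rate * DT

    # Complementary filter: integrate the gyro for smooth short-term response,
    # but keep pulling the estimate toward the drift-free absolute reference.
    fused = ALPHA * (fused + gyro_rate * DT) + (1.0 - ALPHA) * accel_pitch

print(f"gyro-only error after 10 min: {gyro_only - true_pitch:+.4f} rad")
print(f"fused error after 10 min:     {fused - true_pitch:+.4f} rad")
```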

We're committed to producing high-quality motion sensing technology that is priced to allow high-volume and mass-market deployments that were previously impossible due to cost. Our sensors have performance characteristics comparable to or better than competitors' sensors costing several thousand dollars.

Our 3-Space Mocap Studio software is free (both as in beer and as in speech) and allows the set-up and recording of motion capture performances for use in animation or as game content. Being both a no-cost download and open-source, the software lets users install and use it as they need, where they need, without worrying about complex licensing requirements. It is therefore easy for a user to start recording and using mocap performances right away, without expensive software. And if someone wants additional features or needs to customize it, they can always dig into the source, add their own features, and share those with others as well.
