Inertial Sensor-Based Touch and Shake Metaphor for Expressive Control of 3D Virtual Avatars.

Patil S, Chintalapalli HR, Kim D, Chai Y - Sensors (Basel) (2015)

Bottom Line: A quaternion-based complementary filter is implemented to reduce noise and drift. Our approach enables the recognition of gestures and estimates gesture variations for continuous interaction. This synthesizes stylistic variations in a 3D virtual avatar, producing motions that are not present in the motion database, using hand gesture sequences from a single inertial motion sensor.


Affiliation: Graduate School of Advanced Imaging Science, Multimedia and Film, Chung-Ang University, Seoul 156-756, Korea. patil.shashidhar@hotmail.com.

ABSTRACT
In this paper, we present an inertial sensor-based touch and shake metaphor for expressive control of a 3D virtual avatar in a virtual environment. An intuitive six degrees-of-freedom wireless inertial motion sensor is used as a gesture and motion control input device with a sensor fusion algorithm. The algorithm enables user hand motions to be tracked in 3D space via magnetic, angular rate, and gravity sensors. A quaternion-based complementary filter is implemented to reduce noise and drift. An algorithm based on dynamic time-warping is developed for efficient recognition of dynamic hand gestures with real-time automatic hand gesture segmentation. Our approach enables the recognition of gestures and estimates gesture variations for continuous interaction. We demonstrate the gesture expressivity using an interactive flexible gesture mapping interface for authoring and controlling a 3D virtual avatar and its motion by tracking user dynamic hand gestures. This synthesizes stylistic variations in a 3D virtual avatar, producing motions that are not present in the motion database, using hand gesture sequences from a single inertial motion sensor.
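
To make the sensor-fusion step concrete, the following is a minimal sketch of a quaternion-based complementary filter of the kind described in the abstract, written in Python with NumPy. The blending gain alpha, the helper functions, and the gravity-only tilt correction (magnetometer yaw correction is omitted here) are illustrative assumptions, not the authors' implementation.

import numpy as np

def quat_mul(q, r):
    # Hamilton product of two quaternions given as (w, x, y, z).
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def normalize(q):
    return q / np.linalg.norm(q)

def complementary_update(q, gyro, accel, dt, alpha=0.02):
    # One filter step: integrate the gyroscope, then pull the predicted
    # orientation toward the accelerometer's gravity estimate.
    #   q     : current sensor-to-world quaternion (w, x, y, z)
    #   gyro  : angular rate in rad/s, shape (3,)
    #   accel : accelerometer reading, shape (3,)
    #   alpha : assumed blending gain; small values trust the gyro more
    omega = np.concatenate(([0.0], gyro))
    q_pred = normalize(q + 0.5 * dt * quat_mul(q, omega))

    # Gravity direction that q_pred implies, expressed in the sensor frame.
    g_world = np.array([0.0, 0.0, 1.0])
    q_conj = q_pred * np.array([1.0, -1.0, -1.0, -1.0])
    g_sensor = quat_mul(quat_mul(q_conj, np.concatenate(([0.0], g_world))), q_pred)[1:]

    # Small corrective rotation from the mismatch with the measured gravity.
    a_norm = accel / np.linalg.norm(accel)
    error = np.cross(g_sensor, a_norm)
    correction = normalize(np.concatenate(([1.0], 0.5 * alpha * error)))
    return normalize(quat_mul(q_pred, correction))

# Hypothetical usage at a 200 Hz sampling rate.
q = np.array([1.0, 0.0, 0.0, 0.0])
for _ in range(5):
    q = complementary_update(q, np.array([0.0, 0.0, 0.1]), np.array([0.0, 0.0, 9.81]), dt=0.005)

In practice such an update would run at the wireless sensor's sampling rate, with a magnetometer-based yaw correction following the same blend-toward-reference pattern.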




Figure 10 (sensors-15-14435-f010): Punching motions of avatar.

Mentions: A rich set of new motion styles can be synthesized depending on user gesture-motion mapping relationships. The avatar’s motion trajectory changes according to the user’s gesture-motion mapping relationship, which alters the style of the motion. Figure 10 shows different styles of the punching motion generated using the gesture patterns provided for each style of motion. All three motion styles in Figure 10b–d were generated using the single example motion in Figure 10a, and mimic the hand gesture patterns of Figure 6d–f for the punching motions.
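
The recognition step that precedes this gesture-to-motion mapping is, per the abstract, based on dynamic time-warping over segmented hand-gesture trajectories. As a rough illustration (not the authors' code), the sketch below scores an incoming segment against stored gesture templates with classic DTW and returns the best label; the template dictionary, feature choice, and distance threshold are hypothetical. The recognized label could then index a gesture-to-motion-style mapping, e.g. selecting among the punching variations of Figure 10b-d.

import numpy as np

def dtw_distance(a, b):
    # Classic O(len(a) * len(b)) dynamic time-warping cost between two
    # gesture trajectories, each an array of shape (T, D) of per-frame
    # features (e.g. orientation or acceleration samples from the sensor).
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])   # local frame distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def recognize(segment, templates, threshold=15.0):
    # Return the best-matching gesture label, or None if no template is
    # close enough. 'templates' maps label -> reference trajectory; the
    # threshold is an assumed tuning parameter.
    scores = {label: dtw_distance(segment, ref) for label, ref in templates.items()}
    best = min(scores, key=scores.get)
    return best if scores[best] < threshold else None

# Hypothetical usage with two synthetic templates and a noisy query.
templates = {
    "jab":  np.cumsum(0.1 * np.random.randn(40, 3), axis=0),
    "hook": np.cumsum(0.1 * np.random.randn(55, 3), axis=0),
}
segment = templates["jab"] + 0.01 * np.random.randn(40, 3)
print(recognize(segment, templates))   # expected to print "jab" in most runs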

