Inertial Sensor-Based Touch and Shake Metaphor for Expressive Control of 3D Virtual Avatars.

Patil S, Chintalapalli HR, Kim D, Chai Y - Sensors (Basel) (2015)

Bottom Line: A quaternion-based complementary filter is implemented to reduce noise and drift. Our approach recognizes gestures and estimates gesture variations for continuous interaction. This synthesizes stylistic variations in the 3D virtual avatar, producing motions not present in the motion database from hand gesture sequences captured with a single inertial motion sensor.

View Article: PubMed Central - PubMed

Affiliation: Graduate School of Advanced Imaging Science, Multimedia and Film, Chung-Ang University, Seoul 156-756, Korea. patil.shashidhar@hotmail.com.

ABSTRACT
In this paper, we present an inertial sensor-based touch and shake metaphor for expressive control of a 3D virtual avatar in a virtual environment. An intuitive six-degrees-of-freedom wireless inertial motion sensor is used as a gesture and motion control input device together with a sensor fusion algorithm. The algorithm enables user hand motions to be tracked in 3D space via magnetic, angular rate, and gravity sensors. A quaternion-based complementary filter is implemented to reduce noise and drift. An algorithm based on dynamic time-warping is developed for efficient recognition of dynamic hand gestures with real-time automatic hand gesture segmentation. Our approach recognizes gestures and estimates gesture variations for continuous interaction. We demonstrate gesture expressivity with an interactive, flexible gesture-mapping interface that authors and controls a 3D virtual avatar and its motion by tracking the user's dynamic hand gestures. This synthesizes stylistic variations in the 3D virtual avatar, producing motions not present in the motion database from hand gesture sequences captured with a single inertial motion sensor.
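As a rough illustration of the sensor-fusion step described in the abstract, the sketch below shows one common form of a quaternion-based complementary filter: gyroscope integration corrected toward the accelerometer's gravity direction. It is a minimal sketch under my own assumptions; the function names, the gain alpha, and the omission of the magnetometer yaw correction are illustrative and do not reproduce the authors' implementation.

```python
import numpy as np

def quat_mult(q, r):
    """Hamilton product of quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

def complementary_update(q, gyro, accel, dt, alpha=0.02):
    """One filter step: gyro propagation plus a small accelerometer tilt correction.

    q     -- current body-to-world orientation estimate, unit quaternion (w, x, y, z)
    gyro  -- angular rate in rad/s, body frame
    accel -- accelerometer reading, body frame (only its direction is used)
    alpha -- fraction of the tilt error corrected per step (small => trust the gyro)
    """
    # High-pass path: integrate the gyroscope (first-order quaternion integration).
    q = q + 0.5 * dt * quat_mult(q, np.array([0.0, *gyro]))
    q = q / np.linalg.norm(q)

    # Low-pass path: pull roll/pitch toward the gravity direction measured by the
    # accelerometer, which bounds the drift accumulated by gyro integration.
    a = accel / np.linalg.norm(accel)              # measured gravity, body frame
    w, x, y, z = q
    g_pred = np.array([2 * (x * z - w * y),        # predicted gravity: R(q)^T [0, 0, 1]
                       2 * (w * x + y * z),
                       w * w - x * x - y * y + z * z])
    err = np.cross(a, g_pred)                      # small-angle tilt error axis (body frame)
    corr = np.array([1.0, *(0.5 * alpha * err)])   # small corrective rotation
    q = quat_mult(q, corr)
    return q / np.linalg.norm(q)
```

Called once per sensor sample, this keeps the fast gyroscope response while the small accelerometer correction cancels long-term roll/pitch drift; the paper additionally fuses the magnetometer to stabilize heading, which this sketch leaves out.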

No MeSH data available.


Related in: MedlinePlus


sensors-15-14435-f009: Kicking motions of avatar.

Mentions: Figure 9 shows three styles of kicking motion generated using the gesture patterns provided for each style of motion. All three motion styles (Figure 9b–d) were generated from a single example motion (Figure 9a) and mimic the hand gesture patterns in Figure 6a–c for the kicking motions. For kicking style 1 (angry) and style 2 (friendly), we selected the right leg and both the right and left forearm joints as candidate joints for extracting the corresponding motion components from the input motion data. For kicking style 3 (frustrated), we selected the head in addition to the style 1 and 2 candidate joints.
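To make the gesture-to-style mapping concrete, here is a hedged sketch of the kind of dynamic time-warping comparison the abstract refers to, together with a style-to-candidate-joint lookup taken loosely from the caption text. The DTW routine is a generic textbook formulation, not the paper's segmentation-aware algorithm, and the style labels, joint names, and feature layout are assumptions for illustration only.

```python
import numpy as np

# Candidate joints per kicking style, paraphrased from the Figure 9 description
# (labels are illustrative, not the paper's skeleton naming).
STYLE_JOINTS = {
    "angry":      ["right_leg", "right_forearm", "left_forearm"],
    "friendly":   ["right_leg", "right_forearm", "left_forearm"],
    "frustrated": ["right_leg", "right_forearm", "left_forearm", "head"],
}

def dtw_distance(seq_a, seq_b):
    """Dynamic time-warping distance between two gesture feature sequences.

    Each sequence is an array of shape (T, D): T samples of a D-dimensional
    feature (e.g. orientation or acceleration from the inertial sensor).
    """
    a = np.asarray(seq_a, dtype=float)
    b = np.asarray(seq_b, dtype=float)
    if a.ndim == 1:
        a = a[:, None]
    if b.ndim == 1:
        b = b[:, None]
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])     # local distance
            cost[i, j] = d + min(cost[i - 1, j],        # insertion
                                 cost[i, j - 1],        # deletion
                                 cost[i - 1, j - 1])    # match
    return cost[n, m]

def classify_gesture(gesture, templates):
    """Return the style whose template is closest to the gesture under DTW."""
    return min(templates, key=lambda style: dtw_distance(gesture, templates[style]))
```

In this hypothetical setup, a segmented hand-gesture sequence would first be classified against per-style templates, and the resulting style label would select which joints of the example kicking motion receive the gesture-driven variation, e.g. STYLE_JOINTS[classify_gesture(g, templates)].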

