Partial Motion Blending and Assembly for Interactive Motion Synthesis

Juang, G.-H., Chen, Y.-J., Peng, J.-Y., Lin, I.-C., Chao, J.-H.

Interactively controlling or editing motion capture data is an intriguing topic in games and animation prototyping. However, data-driven approaches require a large data set to synthesize motions of high variety. In the first part of this paper, we propose a novel partial motion synthesis method that extends the control parameter space, and thus the variety, of limited motion data. We extend the attack range of kicking or punching motions by blending body parts separately and then reassembling them. Users simply assign a target position in the extended parameter space and obtain a motion that hits the desired target. In the second application, we apply partial motion assembly to motion editing. Given a sequence of key postures, our system retrieves resembling partial figures from the data sets. By reassembling the different parts of the character motion and adjusting the motion variance according to the query motion, our synthesized motions preserve the original style and naturalness.
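The per-part blending and reassembly described above can be sketched as follows. This is a minimal illustration, not the paper's actual method: the part names, the pose representation (plain joint-angle vectors), and the linear interpolation are all assumptions made for brevity. A real system would blend joint rotations with quaternion slerp and time-align the source clips first.

```python
import numpy as np

# Hypothetical skeleton decomposition into body parts.
PARTS = ("torso", "left_arm", "right_arm", "left_leg", "right_leg")

def blend_partial(pose_a, pose_b, weights):
    """Blend two poses part by part, then reassemble a full pose.

    pose_a, pose_b: dict mapping part name -> np.ndarray of joint angles
    weights:        dict mapping part name -> blend weight in [0, 1]
                    (0 keeps pose_a's part, 1 takes pose_b's part)
    """
    blended = {}
    for part in PARTS:
        w = weights[part]
        # Linear interpolation per part; a production system would use
        # quaternion slerp on joint rotations instead.
        blended[part] = (1.0 - w) * pose_a[part] + w * pose_b[part]
    return blended

# Example: keep the supporting limbs from motion A, but push the kicking
# leg and torso toward motion B to extend the attack range.
a = {p: np.zeros(3) for p in PARTS}   # toy pose from clip A
b = {p: np.ones(3) for p in PARTS}    # toy pose from clip B
w = {"torso": 0.5, "left_arm": 0.0, "right_arm": 0.0,
     "left_leg": 0.0, "right_leg": 0.8}
pose = blend_partial(a, b, w)
```

Because each part carries its own weight, the extended parameter space is the product of the per-part blend ranges, which is how a small set of recorded kicks can cover a continuum of target positions.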