Temporal Transfer of Locomotion Style

  • Kim, Yejin (SW·Content Research Laboratory, ETRI) ;
  • Kim, Myunggyu (SW·Content Research Laboratory, ETRI) ;
  • Neff, Michael (Department of Computer Science, University of California)
  • Received : 2014.01.29
  • Accepted : 2015.01.02
  • Published : 2015.04.01

Abstract

Timing plays a key role in expressing the qualitative aspects of a character's motion; that is, conveying emotional state, personality, and character role, all potentially without changing spatial positions. Temporal editing of locomotion style is particularly difficult for a novice animator, since observers are not well attuned to the sense of weight and energy conveyed through motion timing, and the interface for adjusting timing is far less intuitive than that for adjusting pose. In this paper, we propose an editing system that effectively captures the timing variations in an example locomotion set and utilizes them for style transfer from one motion to another via both global and upper-body timing transfers. The global timing transfer matches the input motion to the body speed of the selected example motion, while the upper-body timing transfer propagates the sense of movement flow, or succession, through the torso and arms. The transfer process is based on key times detected from the example set and transfers the relative changes in joint rotation of the upper body from a timing source to an input target motion. We demonstrate that the approach is practical in an interactive application: a set of short locomotion cycles can be applied to generate a longer sequence with continuously varied timing.
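
As a rough illustration of the two transfer steps described above, the sketch below shows (1) a global time warp that aligns an input motion's key times with those of an example motion, and (2) an upper-body offset that delays joint rotation changes along the torso-arm chain to suggest succession. This is a minimal sketch under stated assumptions, not the authors' implementation; the piecewise-linear warp, the fixed per-joint lag, and all names (global_timing_transfer, upper_body_timing_transfer, the joint chain) are illustrative.

```python
# Minimal sketch (not the paper's implementation) of the two timing
# transfers described in the abstract. Assumptions: key times (e.g., foot
# strikes) are already detected, the global transfer is a piecewise-linear
# time warp, and succession is approximated by a fixed per-joint frame lag.
import numpy as np

def global_timing_transfer(frame_times, input_keys, example_keys):
    """Warp the input motion's frame times so its detected key times land
    on the example motion's key times, matching the example's body speed
    between keys (piecewise-linear warp; illustrative only)."""
    assert len(input_keys) == len(example_keys)
    return np.interp(frame_times, input_keys, example_keys)

def upper_body_timing_transfer(joint_rotations, chain, lag_frames):
    """Shift each joint's rotation curve along the upper-body chain by a
    growing delay so rotation changes propagate outward through the torso
    and arms ('succession'). The paper transfers relative timing measured
    from the example (timing source) motion; a fixed lag stands in here."""
    out = dict(joint_rotations)
    for depth, joint in enumerate(chain):
        out[joint] = np.roll(joint_rotations[joint], depth * lag_frames, axis=0)
    return out

# Usage on synthetic data: a 2 s walk cycle sampled at 60 fps.
frame_times  = np.linspace(0.0, 2.0, 120)
input_keys   = np.array([0.0, 0.9, 2.0])   # key times detected in the input
example_keys = np.array([0.0, 1.2, 2.0])   # key times detected in the example
warped_times = global_timing_transfer(frame_times, input_keys, example_keys)

chain = ["spine", "shoulder", "elbow", "wrist"]
rotations = {j: np.random.rand(120, 3) for j in chain}   # per-frame Euler angles
lagged = upper_body_timing_transfer(rotations, chain, lag_frames=2)
```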


