Interactive drawing with user's intentions using image segmentation

  • Received : 2018.06.22
  • Accepted : 2018.07.05
  • Published : 2018.08.31

Abstract

This study introduces an interactive drawing system, a tool that allows users to sketch and draw according to their own intentions. The proposed system enables users to express themselves more creatively by letting them reproduce an original idea as a drawing and then transform it with their bodies. Users can actively participate in the production of the artwork by exploring the spectator's unique formative language. In addition, users are given the opportunity to experience a creative process by transforming an arbitrary drawing into various shapes according to their gestures. The interactive drawing system uses segmentation of the drawing image as a way to extend the user's initial drawing idea; using image segmentation, it transforms a two-dimensional drawing into a volume-like, three-dimensional form. In this process, a psychological space is created that can stimulate the user's imagination and onto which objects of desire can be projected. This personification of the drawing gives the user a sense of familiarity with the artwork and a way to indirectly express his or her emotions to others. This suggests that interactive drawing, which has shifted from the transfer of information toward an emotional concept of interaction, can create a cooperative sensory image across the user's time and space and occupy an important position in a multimedia society.
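The abstract does not specify how the drawing image is segmented. As an illustration only, the following minimal Python/OpenCV sketch shows one common way to split a 2D line drawing into connected regions that could then be manipulated independently (for example, extruded or offset to suggest volume). The file name, Otsu thresholding, and connected-component labeling are assumptions for this sketch, not details taken from the paper.

```python
# Illustrative sketch (not the authors' implementation): segment a user's
# line drawing into connected regions with OpenCV.
# Assumes "drawing.png" is a white-background sketch with dark strokes.
import cv2
import numpy as np

img = cv2.imread("drawing.png", cv2.IMREAD_GRAYSCALE)
assert img is not None, "drawing.png not found (hypothetical input path)"

# Binarize: strokes become white (foreground) on black, threshold chosen by Otsu.
_, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

# Close small gaps so enclosed areas of the sketch form connected regions.
kernel = np.ones((3, 3), np.uint8)
closed = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)

# Label each connected region of the drawing.
num_labels, labels = cv2.connectedComponents(closed)

# Each label can now be extracted as a separate mask and transformed
# on its own (e.g., warped by gesture input or extruded into a pseudo-3D form).
for label in range(1, num_labels):
    mask = np.uint8(labels == label) * 255
    cv2.imwrite(f"segment_{label}.png", mask)
```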
