
A Study on Marker-based Detection Method of Object Position using Perspective Projection

  • Park, Minjoo (Department of Computer Engineering, Korea University of Technology and Education) ;
  • Jang, Kyung-Sik (Department of Computer Engineering, Korea University of Technology and Education)
  • Received : 2021.02.17
  • Accepted : 2022.03.10
  • Published : 2022.03.31

Abstract

As a hallmark of the fourth industrial revolution, the smart factory is evolving into the manufacturing plant of the future. As a human-machine interaction tool, augmented reality (AR) helps workers acquire the proficiency needed in smart factories. The valuable data displayed on an AR device must be delivered to users intuitively. Current AR applications used in smart factories lack calibration for user movement, and the visual fiducial markers used for position correction are detected only at close range. This paper demonstrates a marker-based object-position detection method using perspective projection to adjust augmented content so that the user's original perspective is maintained under displacement. New angle, location, and scaling values for the AR content can be calculated by comparing equivalent marker positions in two images. Two experiments were conducted to verify the implementation of the algorithm and its practicality in the smart factory. The markers were detected reliably in both experiments, and applicability to smart factories was verified by producing appropriate displacement values for AR content under various movements.
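The comparison of equivalent marker positions in two images, as described above, can be framed as estimating a perspective transform (homography) between the marker corners before and after the user moves. The following is a minimal NumPy sketch of this idea using the standard Direct Linear Transform, not the authors' implementation; the function names `estimate_homography` and `apply_homography` are illustrative.

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 perspective transform mapping src points to dst
    points via the Direct Linear Transform (DLT).
    src, dst: arrays of shape (N, 2) with N >= 4 correspondences."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null vector of A (smallest singular value).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]          # normalise so that H[2, 2] == 1

def apply_homography(H, pts):
    """Project (N, 2) points through H, returning (N, 2) points."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coords
    proj = pts_h @ H.T
    return proj[:, :2] / proj[:, 2:3]
```

Given the four corners of a fiducial marker detected in the original view (`src`) and in the view after displacement (`dst`), the recovered transform encodes the combined rotation, translation, and scaling that the AR content would need to stay registered with the scene.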

Acknowledgement

This result was supported by the Regional Innovation Strategy (RIS) through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (MOE) (2021RIS-004).

References

  1. A. Syberfeldt, O. Danielsson and P. Gustavsson, "Augmented reality smart glasses in the smart factory: Product evaluation guidelines and review of available products," IEEE Access, vol. 5, pp. 9118-9130, 2017. https://doi.org/10.1109/ACCESS.2017.2703952
  2. H. Kagermann, W. Wahlster, and J. Helbig, "Securing the future of German manufacturing industry: Recommendations for implementing the strategic initiative INDUSTRIE 4.0," ACATECH - German National Academy of Science and Engineering Tech. Rep., 2013.
  3. D. Gorecky, M. Schmitt, M. Loskyll, and D. Zuhlke, "Human-machine-interaction in the industry 4.0 era," in 2014 12th IEEE International Conference on Industrial Informatics (INDIN), pp. 289-294, 2014.
  4. J. Um, J. Popper, and M. Ruskowski, "Modular augmented reality platform for smart operator in production environment," 2018 IEEE Industrial Cyber-Physical Systems (ICPS), pp. 720-725, 2018.
  5. A. Ivaschenko, A. Khorina, and P. Sitnikov, "Accented visualization by augmented reality for smart manufacturing applications," 2018 IEEE Industrial Cyber-Physical Systems (ICPS), pp. 519-522, 2018.
  6. V. Paelke, "Augmented reality in the smart factory: Supporting workers in an industry 4.0. environment," in Proceedings of the 2014 IEEE Emerging Technology and Factory Automation (ETFA), pp. 1-4, 2014.
  7. D. Liu and J. W. Park, "A study on the icon and layout design of device UI in augmented reality," in 2019 2nd World Conference on Mechanical Engineering and Intelligent Manufacturing (WCMEIM), pp. 245-248, 2019.
  8. I. Maly, D. Sedlacek and P. Leitao, "Augmented reality experiments with industrial robot in industry 4.0 environment," 2016 IEEE 14th International Conference on Industrial Informatics (INDIN), pp. 176-181, 2016.
  9. R. M. Haralick, "Determining camera parameters from the perspective projection of a rectangle," Pattern Recognition, vol. 22, pp. 225-230, 1989. https://doi.org/10.1016/0031-3203(89)90071-X
  10. E. van der Kruk and M. M. Reijne, "Accuracy of human motion capture systems for sport applications; state-of-the-art review," European Journal of Sport Science, vol. 18, no. 6, pp. 806-819, 2018. https://doi.org/10.1080/17461391.2018.1463397
  11. T. Wang, Y. Liu, and Y. Wang, "Infrared marker based augmented reality system for equipment maintenance," 2008 International Conference on Computer Science and Software Engineering, pp. 816-819, 2008.
  12. A. Lopez-Ceron, "Accuracy analysis of marker-based 3D visual localization," in XXXVII Jornadas de Automatica Workshop, 2016.
  13. J. Kohler, A. Pagani, and D. Stricker, "Detection and identification techniques for markers used in computer vision," OASIcs-Open Access Series in Informatics, vol. 19, 2011.
  14. H. Subakti and J. Jiang, "A marker-based cyber-physical augmented-reality indoor guidance system for smart campuses," in 2016 IEEE 18th International Conference on High Performance Computing and Communications; IEEE 14th International Conference on Smart City; IEEE 2nd International Conference on Data Science and Systems (HPCC/SmartCity/DSS), pp. 1373-1379, 2016.
  15. G. Yu, Y. Hu, and J. Dai, "TopoTag: A robust and scalable topological fiducial marker system," IEEE Transactions on Visualization and Computer Graphics, 2020.
  16. J. Sattar, E. Bourque, P. Giguere, and G. Dudek, "Fourier tags: Smoothly degradable fiducial markers for use in human-robot interaction," Fourth Canadian Conference on Computer and Robot Vision (CRV '07), pp. 165-174, 2007.
  17. ARToolKit: http://www.hitl.washington.edu/research/sharedspace/download/.
  18. M. Fiala, "ARTag, a fiducial marker system using digital techniques," in 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05), vol. 2, pp. 590-596, 2005.
  19. A. Sagitov, K. Shabalina, R. Lavrenov, and E. Magid, "Comparing fiducial marker systems in the presence of occlusion," in 2017 International Conference on Mechanical, System and Control Engineering (ICMSC), pp. 377-382, 2017.
  20. T. Kawano, Y. Ban, and K. Uehara, "A coded visual marker for video tracking system based on structured image analysis," ISMAR, pp. 262-263, 2003.
  21. M. Fiala, "Designing highly reliable fiducial markers," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 7, pp. 1317-1324, 2010. https://doi.org/10.1109/TPAMI.2009.146
  22. H. Subakti and J. Jiang, "Indoor augmented reality using deep learning for industry 4.0 smart factories," in 2018 IEEE 42nd Annual Computer Software and Applications Conference (COMPSAC), pp. 63-68, 2018.
  23. A. Dell'Acqua, M. Ferrari, M. Marcon, A. Sarti, and S. Tubaro, "Colored visual tags: a robust approach for augmented reality," in IEEE Conference on Advanced Video and Signal Based Surveillance, pp. 423-427, 2005.
  24. M. L. Liu and K. H. Wong, "Pose estimation using four corresponding points," Pattern Recognition Letters, vol. 20, no. 1, pp. 69-74, 1999. https://doi.org/10.1016/S0167-8655(98)00128-7
  25. W. J. Wolfe, D. Mathis, C. W. Sklair, and M. Magee, "The perspective view of three points," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 13, no. 1, pp. 66-73, 1991. https://doi.org/10.1109/34.67632