Human-Understanding Cognitive Computing Technology Research Trends

  • Published : 2022.02.01

Abstract

Human behavior and emotions are shaped by experiences accumulated in the past and present. Realizing artificial intelligence that understands and empathizes with humans requires cognitive computing technology that automatically analyzes the behaviors, habits, and emotions associated with specific situations, drawing on past data and domain knowledge. In this study, we examine the latest research trends in human-understanding cognitive computing technology, which recognizes human behavior and emotions, stores them as experience data, and provides services by analyzing the stored data. We also introduce research on collecting high-quality data in real life, present services for improving physical and mental health, and review key issues in developing these technologies.

Acknowledgement

This research was conducted as part of the Electronics and Telecommunications Research Institute (ETRI) internal research funding program [21ZS1100, Research on Core Technologies for Self-Improving Integrated Artificial Intelligence].
