
A Study on the Reliability Evaluation Index Development for the Information Resources Retained by Institutions: Focusing on Humanities Assets

  • Jeong, Dae-Keun (Department of Library and Information Science, Chonnam National University) ;
  • Noh, Younghee (Department of Library and Information Science, Konkuk University, GLOCAL)
  • Received : 2019.05.03
  • Accepted : 2019.06.23
  • Published : 2019.06.30

Abstract

This study aimed to develop an evaluation index for assessing the reliability of the information resources held by institutions retaining humanities assets, as a foundation for providing a one-stop portal service for humanities assets. To this end, the evaluation index was derived through analysis of previous research, case studies, and expert interviews, and the derived index was then applied to institutions retaining humanities assets to verify its utility. The reliability evaluation index for institutional information resources consisted of two dimensions: the institution's own reliability evaluation index and the institution-provided service and system evaluation index. The institution's own reliability evaluation index comprised institutional authority (25 points), data collection and construction (25 points), data provision (30 points), and appropriateness of data (20 points), for a total of 100 points. The institution-provided service and system evaluation index comprised information quality (25 points), appropriateness (decency) (15 points), accessibility (15 points), tangibility (20 points), form (15 points), and cooperation (10 points), for a total of 100 points. The derived evaluation index was applied to six institutions representing humanities assets to verify its utility. Consequently, the reliability of the information resources retained by the Research Information Sharing Service (RISS) of the Korea Education and Research Information Service (KERIS) turned out to be the highest.
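The two-dimension scoring scheme summarized in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the category maxima are the point values reported in the paper, while the aggregation (summing category scores, each clamped to its maximum) and all function and variable names are assumptions for illustration.

```python
# Hypothetical sketch of the two-dimension reliability scoring scheme.
# Category maxima are the point values reported in the abstract; the
# aggregation method and names are illustrative assumptions.

# Institution's own reliability evaluation index (total: 100 points)
INSTITUTION_INDEX = {
    "institutional authority": 25,
    "data collection and construction": 25,
    "data provision": 30,
    "appropriateness of data": 20,
}

# Institution-provided service and system evaluation index (total: 100 points)
SERVICE_INDEX = {
    "information quality": 25,
    "appropriateness (decency)": 15,
    "accessibility": 15,
    "tangibility": 20,
    "form": 15,
    "cooperation": 10,
}

def total_score(scores: dict, index: dict) -> int:
    """Sum an institution's category scores, clamping each to its maximum."""
    return sum(min(scores.get(category, 0), maximum)
               for category, maximum in index.items())

# Both dimensions sum to 100 points, as stated in the abstract.
assert sum(INSTITUTION_INDEX.values()) == 100
assert sum(SERVICE_INDEX.values()) == 100
```

Under this sketch, an institution evaluated on both dimensions would receive two scores out of 100, one per index, which is consistent with the separate applied results in Tables 11 and 12.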


Fig. 1. Research systems and procedures

Table 1. Differences in scientific area and humanities area for evaluation


Table 2. Evaluation criteria for online information sources at the University of California at Berkeley’s libraries


Table 3. Evaluation items for information sources at Johns Hopkins University’s libraries


Table 4. Evaluation criteria for internet data at Georgetown University’s libraries


Table 5. Evaluation criteria for online information at the University of Oregon’s libraries


Table 6. Analysis of the evaluation index for research achievements of humanities professors


Table 7. Reference by item of reliability evaluation index


Table 8. Matters applied in expert opinions for the reliability preliminary evaluation index


Table 9. Institution's own reliability evaluation index: final


Table 10. Institution-provided service and system reliability evaluation index: final


Table 11. Institution's own reliability evaluation index applied: final


Table 12. Institution-provided service and system reliability evaluation index applied: final


Table 13. Final reliability evaluation


Acknowledgement

Supported by: National Research Foundation of Korea (NRF)

References

  1. Berkeley Library. [n.d.]. Berkeley University Library Evaluating Resources. Retrieved from http://guides.lib.berkeley.edu/evaluating-resources#authority
  2. Centra, J. A. (1977). How universities evaluate faculty performance: A survey of department heads. Retrieved from https://www.ets.org/Media/Research/pdf/GREB-75-05BR.pdf
  3. Chung, Y. K., & Choi, Y. K. (2011). A Study on Faculty Evaluation of Research Achievements in Humanities and Social Sciences. Journal of Information Management, 42(3), 211-233. https://doi.org/10.1633/JIM.2011.42.3.211
  4. Dunsmore, C. (2002). A qualitative study of web-mounted pathfinders created by academic business libraries. Libri, 52(3), 137-156.
  5. Finkenstaedt, T. (1990). Measuring research performance in the humanities. Scientometrics, 19(5-6), 409-417. https://doi.org/10.1007/BF02020703
  6. Fogg, B. J., Kameda, T., Boyd, J., Marshall, J., Sethi, R., Sockol, M., & Trowbridge, T. (2002). Stanford-Makovsky web credibility study 2002: Investigating what makes web sites credible today. Report from the Persuasive Technology Lab. Retrieved from http://credibility.stanford.edu/pdf/Stanford-MakovskyWebCredStudy2002-prelim.pdf.
  7. Fogg, B. J., Soohoo, C., Danielson, D. R., Marable, L., Stanford, J., & Tauber, E. R. (2003, June). How do users evaluate the credibility of Web sites?: a study with over 2,500 participants. In Proceedings of the 2003 conference on Designing for user experiences (pp. 1-15). ACM.
  8. Fogg, B. J., & Tseng, H. (1999, May). The elements of computer credibility. In Proceedings of the SIGCHI conference on Human Factors in Computing Systems (pp. 80-87). ACM. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.83.8354&rep=rep1&type=pdf
  9. Gang, J. Y., Nam, Y. H., & Oh, H. J. (2017). An Evaluation of Web-Based Research Records Archival Information Services and Recommendations for Their Improvement: NTIS vs. NKIS. Journal of Korean Society of Archives and Records Management, 17(3), 139-160. https://doi.org/10.14404/JKSARM.2017.17.3.139
  10. Georgetown Library. [n.d.]. Georgetown University Library Evaluating Internet Resources. Retrieved from https://www.library.georgetown.edu/tutorials/research-guides/evaluating-internet-content
  11. Jackson, R., & Pellack, L. J. (2004). Internet subject guides in academic libraries: An analysis of contents, practices, and opinions. Reference & User Services Quarterly, 43(4), 319-327.
  12. Jackson, R., & Stacy-Bates, K. K. (2016). The enduring landscape of online subject research guides. Reference & User Services Quarterly, 55(3), 212.
  13. Jauch, L. R., & Glueck, W. F. (1975). Evaluation of university professors' research performance. Management Science, 22(1), 66-75. https://doi.org/10.1287/mnsc.22.1.66
  14. Jeong, W. C., & Rieh, H. Y. (2016). Users' Evaluation of Information Services in University Archives. Journal of Korean Society of Archives and Records Management, 16(1), 195-221. https://doi.org/10.14404/JKSARM.2016.16.1.195
  15. Johns Hopkins Libraries. [n.d.]. Johns Hopkins University Libraries Evaluating Your Sources. Retrieved from https://guides.library.oregonstate.edu/c.php?g=286235&p=1906707
  16. Kang, H. I., & Jeong, Y. I. (2002). Measuring Library Online Service Quality: An Application of e-LibQual. Journal of the Korean Society for Information Management, 19(3), 237-261. https://doi.org/10.3743/KOSIM.2002.19.3.237
  17. Keum, J. D., & Weon, J. H. (2012). Establishing control system for the credibility of performance information. The Journal of Korean Policy Studies, 12(1), 59-75.
  18. Kim, D. N., Lee, M. H., & Park, T. G. (2006). Constructing an Evaluation Model for the Professors Academic Achievement in the Humanities. Journal of Educational Evaluation, 19(3), 1-20.
  19. Kim, H. S., Shin, K. J., & Choi, H. Y. (2007). A Study on the Development of Evaluation Framework for Public Portal Information Services. Proceedings of the Korea Contents Association Conference, 5(1), 440-444.
  20. Kim, S. (2012). A Study on the Current State of Online Subject Guides in Academic Libraries. Journal of the Korean Society for Information Management, 29(4), 165-189. https://doi.org/10.3743/KOSIM.2012.29.4.165
  21. Kim, Y. K. (2007a). How Do People Evaluate a Web Sites Credibility. Journal of Korean Library and Information Science Society, 38(3), 53-72. https://doi.org/10.16981/kliss.38.3.200709.53
  22. Kim, Y. K. (2007b). A Study on the Influence of Factors That Makes Web Sites Credible. Journal of the Korean Society for Library and Information Science, 41(4), 93-111. https://doi.org/10.4275/KSLIS.2007.41.4.093
  23. Kim, Y. K. (2011). Comparative Study on Criteria for Evaluation of Internet Information. The Journal of Humanities, 27, 87-109.
  24. Lee, I. Y. (2005). A Study on the Evaluation System of Research Institutes. The Journal of Educational Administration. 23(4), 343-364.
  25. Lee, L. J. (2016). A Study on the Improvement Strategies of Moral Education Using Humanities (Doctoral dissertation). Seoul National University, Seoul, Korea.
  26. Lee, Y. H. (2017). A New Evaluating System for Academic Books on Humanities and Social Sciences in Korea. Journal of the Korea Contents Association, 17(3), 624-632. https://doi.org/10.5392/JKCA.2017.17.03.624
  27. Moed, H. F. (2008, December). Research assessment in social sciences and humanities. In ECOOM Colloquium Antwerp. Retrieved from https://www.ecoom.be/sites/ecoom.be/files/downloads/1%20Lecture%20Moed%20Ecoom%20Antwerp%209%20Dec%202011%20SSH%20aangepast%20(2).pdf
  28. Noh, Y., & Jeong, D. (2017). A Study to Develop and Apply Evaluation Factors for Subject Guides in South Korea. The Journal of Academic Librarianship, 43(5), 423-433. https://doi.org/10.1016/j.acalib.2017.02.002
  29. Oregon Libraries. [n.d.]. University of Oregon Libraries Guidelines for evaluating sources. Retrieved from http://diy.library.oregonstate.edu/guidelines-evaluating-sources
  30. Park, C. K. (2014). Evaluation in the Humanities: A Humanist Perspective. In/Outside, 37, 84-109.
  31. Park, N. G. (2006). Analysis of the evaluation status of teaching professions by university and development of assessment model of teaching achievement. Seoul: Ministry of Education & Human Resources Development.
  32. Skolnik, M. (2000). Does counting publications provide any useful information about academic performance?. Teacher Education Quarterly, 27(2), 15-25.
  33. Song, H. H. (2011). Problems on current humanities journal assessment system and the alternatives. Studies of Korean & Chinese Humanities, 34, 457-481.
  34. Standler, B. R. (2004). Evaluation Credibility of Information on the Internet. Retrieved from http://www.rbs0.com/credible.pdf
  35. University of Queensland Library. [n.d.]. UQ Library Evaluate Information You Find. Retrieved from https://web.library.uq.edu.au/research-tools-techniques/search-techniques/evaluate-information-you-find
  36. Woo, B. K., Jeon, I. D., & Kim, S. S. (2006). The Effects of the Academic Research Evaluation System and the Research Achievements in Developed Countries. ICASE Magazine, 12(4), 21-32.
  37. Yoon, S. W. (1996). Reliability Analysis. Seoul: Jayu academy.