Assessing the Validity of the Preclinical Objective Structured Clinical Examination Using Messick's Validity Framework

  • Lee, Hye-Yoon (Department of Medical Education, Pusan National University School of Medicine) ;
  • Yune, So-Jung (Department of Medical Education, Pusan National University School of Medicine) ;
  • Lee, Sang-Yeoup (Department of Medical Education, Pusan National University School of Medicine) ;
  • Im, Sunju (Department of Medical Education, Pusan National University School of Medicine)
  • Received : 2021.02.08
  • Accepted : 2021.07.08
  • Published : 2021.10.31

Abstract

Students must be familiar with clinical skills before starting clinical practice, both to ensure patient safety and to enable efficient learning. In Korea, however, performance is tested mainly in the third or fourth year of medical school, and no studies applying a validity framework to a preclinical performance test have been reported. We analyzed the validity of a performance test administered to second-year students, organizing the evidence into content, response process, internal structure, relationships with other variables, and consequences according to Messick's framework. Content validity was supported by developing cases according to a predetermined blueprint. The quality of the response process was controlled by training and calibrating raters. Regarding internal structure, (1) reliability estimated by generalizability theory was acceptable (coefficients of 0.724 and 0.786 for day 1 and day 2, respectively), and (2) related domains correlated appropriately, whereas the clinical performance examination (CPX) and objective structured clinical examination (OSCE) were more weakly related to each other. OSCE/CPX scores correlated with other variables, especially grade point average and oral structured examination scores. The consequences of the assessment were that it (1) led students to learn clinical skills and study on their own, while causing excessive stress for less motivated students; (2) reminded educators of the need to apply practical teaching methods and to give feedback on test results; and (3) gave faculty an opportunity to consider developing support programs. The blueprint should be developed more precisely according to students' level, and the validity of the response process should be verified with statistical methods.
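For reference, the generalizability coefficients reported above follow the standard definition from generalizability theory (reference 12). The following is a minimal sketch assuming a fully crossed persons × cases × raters (p × c × r) design; the design itself is an assumption, as the abstract does not state the actual measurement design:

    E\rho^2 = \frac{\sigma_p^2}{\sigma_p^2 + \sigma_\delta^2},
    \qquad
    \sigma_\delta^2 = \frac{\sigma_{pc}^2}{n_c} + \frac{\sigma_{pr}^2}{n_r} + \frac{\sigma_{pcr,e}^2}{n_c n_r}

where \sigma_p^2 is the score variance attributable to true differences among examinees, \sigma_\delta^2 is the relative error variance, and n_c and n_r are the numbers of cases and raters per examinee. On this reading, coefficients of 0.724 and 0.786 indicate that roughly 72% and 79% of the observed score variance on the two examination days reflects true differences among examinees.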

Keywords

Funding

This work was supported by a 2-Year Research Grant of Pusan National University.

References

  1. Kim BS, Lee YM, Ahn DS, Park JY. Evaluation of introduction to clinical medicine by objective structured clinical examination. Korean J Med Educ. 2001;13(2):289-98. https://doi.org/10.3946/kjme.2001.13.2.289
  2. Godefrooij MB, Diemers AD, Scherpbier AJ. Students' perceptions about the transition to the clinical phase of a medical curriculum with preclinical patient contacts; a focus group study. BMC Med Educ. 2010;10:28. https://doi.org/10.1186/1472-6920-10-28
  3. Graham R, Zubiaurre Bitzer LA, Anderson OR. Reliability and predictive validity of a comprehensive preclinical OSCE in dental education. J Dent Educ. 2013;77(2):161-7. https://doi.org/10.1002/j.0022-0337.2013.77.2.tb05458.x
  4. Cook DA, Brydges R, Ginsburg S, Hatala R. A contemporary approach to validity arguments: a practical guide to Kane's framework. Med Educ. 2015;49(6):560-75. https://doi.org/10.1111/medu.12678
  5. Cook DA, Hatala R. Validation of educational assessments: a primer for simulation and beyond. Adv Simul (Lond). 2016;1:31. https://doi.org/10.1186/s41077-016-0033-y
  6. Messick S. Standards of validity and the validity of standards in performance assessment. Educ Meas Issues Pract. 1995;14(4):5-8. https://doi.org/10.1111/j.1745-3992.1995.tb00881.x
  7. American Educational Research Association; American Psychological Association; National Council on Measurement in Education. Standards for educational and psychological testing. Washington (DC): American Educational Research Association; 2014.
  8. Downing SM. Validity: on meaningful interpretation of assessment data. Med Educ. 2003;37(9):830-7. https://doi.org/10.1046/j.1365-2923.2003.01594.x
  9. Cook DA, Zendejas B, Hamstra SJ, Hatala R, Brydges R. What counts as validity evidence?: examples and prevalence in a systematic review of simulation-based assessment. Adv Health Sci Educ Theory Pract. 2014;19(2):233-50. https://doi.org/10.1007/s10459-013-9458-4
  10. Borgersen NJ, Naur TM, Sorensen SM, Bjerrum F, Konge L, Subhi Y, et al. Gathering validity evidence for surgical simulation: a systematic review. Ann Surg. 2018;267(6):1063-8. https://doi.org/10.1097/SLA.0000000000002652
  11. Pugh D, Hamstra SJ, Wood TJ, Humphrey-Murto S, Touchie C, Yudkowsky R, et al. A procedural skills OSCE: assessing technical and non-technical skills of internal medicine residents. Adv Health Sci Educ Theory Pract. 2015;20(1):85-100. https://doi.org/10.1007/s10459-014-9512-x
  12. Brennan RL. Generalizability theory. New York (NY): Springer-Verlag; 2001.
  13. Dancey CP, Reidy J. Statistics without maths for psychology. 4th ed. Harlow: Pearson Prentice Hall; 2004.
  14. Liao SC, Hunt EA, Chen W. Comparison between inter-rater reliability and inter-rater agreement in performance assessment. Ann Acad Med Singap. 2010;39(8):613-8.
  15. Kim SH, Ko JK, Park JH. Effect of emotional intelligence on patient-physician interaction scores of clinical performance examination. Korean J Med Educ. 2011;23(3):159-65. https://doi.org/10.3946/kjme.2011.23.3.159
  16. Kiyohara LY, Kayano LK, Kobayashi ML, Alessi MS, Yamamoto MU, Yunes-Filho PR, et al. The patient-physician interactions as seen by undergraduate medical students. Sao Paulo Med J. 2001;119(3):97-100. https://doi.org/10.1590/S1516-31802001000300002
  17. Downing SM, Yudkowsky R. Assessment in health professions education. New York (NY): Routledge; 2009.
  18. Iramaneerat C, Yudkowsky R, Myford CM, Downing SM. Quality control of an OSCE using generalizability theory and many-faceted Rasch measurement. Adv Health Sci Educ Theory Pract. 2008;13(4):479-93. https://doi.org/10.1007/s10459-007-9060-8
  19. Wimmers PF, Fung CC. The impact of case specificity and generalisable skills on clinical performance: a correlated traits-correlated methods approach. Med Educ. 2008;42(6):580-8. https://doi.org/10.1111/j.1365-2923.2008.03089.x
  20. Bakhsh TM, Sibiany AM, Al-Mashat FM, Meccawy AA, Al-Thubaity FK. Comparison of students' performance in the traditional oral clinical examination and the objective structured clinical examination. Saudi Med J. 2009;30(4):555-7.
  21. Remmen R, Scherpbier A, Denekens J, Derese A, Hermann I, Hoogenboom R, et al. Correlation of a written test of skills and a performance based test: a study in two traditional medical schools. Med Teach. 2001;23(1):29-32. https://doi.org/10.1080/0142159002005541
  22. Dadgar SR, Saleh A, Bahador H, Baradaran HR. OSCE as a tool for evaluation of practical semiology in comparison to MCQ & oral examination. J Pak Med Assoc. 2008;58(9):506-7.
  23. Cilliers FJ, Schuwirth LW, Herman N, Adendorff HJ, van der Vleuten CP. A model of the pre-assessment learning effects of summative assessment in medical education. Adv Health Sci Educ Theory Pract. 2012;17(1):39-53. https://doi.org/10.1007/s10459-011-9292-5
  24. Yune SJ, Lee SY, Im S. How do medical students prepare for examinations: pre-assessment cognitive and meta-cognitive activities. Korean Med Educ Rev. 2019;21(1):51-8. https://doi.org/10.17496/KMER.2019.21.1.51
  25. Kim JH. The effect of remedial precepted video review on clinical performance examination scores. Korean Med Educ Rev. 2012;14(1):51-6. https://doi.org/10.17496/KMER.2012.14.1.051