Reliability of Standardized Patients as Raters in Objective Structured Clinical Examination


  • Son, Hee-Jeong (Department of Anesthesiology, School of Medicine, Kangwon National University) ;
  • Moon, Joong-Bum (Emergency Medicine, School of Medicine, Kangwon National University) ;
  • Lee, Hyang-Ah (Obstetrics & Gynecology, School of Medicine, Kangwon National University) ;
  • Roh, Hye-Rin (Surgery, School of Medicine, Kangwon National University)
  • Received : 2010.12.01
  • Accepted : 2011.01.13
  • Published : 2011.01.31

Abstract

The purpose of this study was to investigate whether standardized patients (SPs) can serve as reliable examiners in the Objective Structured Clinical Examination (OSCE). Four SPs and four faculty members, each with more than two years of OSCE scoring experience, were selected, and for each station two faculty members and two SPs were designated as raters. The SPs were trained to assess two technical skills, male Foley catheter insertion and wound dressing, for a total of 8 hours (4 hours per topic). The training covered the definition, method, cautions, and complications of each procedural skill, using theoretical lectures, video learning, faculty demonstration, and practical training on mannequins. On the day before the OSCE, all eight raters were standardized for an hour through simulated scoring of videos from previous examinations. Each assessment form consisted of 14 checklist items and 1 global rating. The allotted time per student was 5 minutes for each station plus 2 minutes for rating. The scores from the faculty and the SPs were compared and analyzed with the GENOVA program. The overall generalizability coefficient (G coefficient) across the two cases was 0.839, and the reliability of the raters was high at 0.946. The inter-rater agreement between the faculty group and the SP group was 0.949 for the checklist and 0.908 for the global rating. Therefore, given appropriate training, SPs can serve as raters in an OSCE of procedural skills.

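The two statistics reported above, the generalizability coefficient and the agreement between the faculty and SP rater groups, can be illustrated with a small sketch. This is not the study's GENOVA analysis: it uses hypothetical checklist totals, a simple one-facet (persons x raters) crossed design, and assumes Pearson correlation as the agreement measure, since the abstract does not specify which agreement statistic was used.

```python
def variance_components(scores):
    """ANOVA variance components for a crossed persons x raters design."""
    n_p, n_r = len(scores), len(scores[0])
    grand = sum(map(sum, scores)) / (n_p * n_r)
    p_means = [sum(row) / n_r for row in scores]
    r_means = [sum(scores[p][r] for p in range(n_p)) / n_p for r in range(n_r)]
    ss_p = n_r * sum((m - grand) ** 2 for m in p_means)
    ss_r = n_p * sum((m - grand) ** 2 for m in r_means)
    ss_total = sum((x - grand) ** 2 for row in scores for x in row)
    ss_pr = ss_total - ss_p - ss_r            # residual (person x rater interaction)
    ms_p = ss_p / (n_p - 1)
    ms_pr = ss_pr / ((n_p - 1) * (n_r - 1))
    var_p = max((ms_p - ms_pr) / n_r, 0.0)    # person (true-score) variance
    return var_p, ms_pr, n_r

def g_coefficient(scores):
    """Relative G coefficient: person variance over person plus relative error."""
    var_p, ms_pr, n_r = variance_components(scores)
    return var_p / (var_p + ms_pr / n_r)

def pearson(xs, ys):
    """Pearson correlation, used here as an assumed agreement measure."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

# Hypothetical checklist totals: 6 students x 4 raters
# (raters 0-1 = faculty pair, raters 2-3 = SP pair).
scores = [
    [12, 13, 12, 13],
    [10, 10, 11, 10],
    [14, 14, 13, 14],
    [ 8,  9,  8,  8],
    [11, 12, 11, 12],
    [13, 13, 14, 13],
]
g = g_coefficient(scores)
faculty = [(row[0] + row[1]) / 2 for row in scores]
sps = [(row[2] + row[3]) / 2 for row in scores]
agreement = pearson(faculty, sps)
print(f"G coefficient: {g:.3f}, faculty-SP agreement: {agreement:.3f}")
```

With these made-up scores the students differ far more than the raters disagree, so both values come out high, mirroring the pattern the study reports.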
