The Analysis of Pre-Service Biology Teachers' Natural Selection Conceptions in Multiple-Choice and Open-Response Instruments

  • Received: 2011.03.31
  • Accepted: 2011.09.27
  • Published: 2011.10.31

Abstract

Teachers use explanations to communicate important scientific ideas to students. Consequently, all biology teachers should be evaluated to determine how effective they are at constructing and communicating biological explanations. Open-response questions are required to detect pre-service biology teachers' abilities to communicate robust and accurate scientific explanations. Nevertheless, multiple-choice questions are typically preferred by educators because of the common drawbacks of open-response instruments, such as scoring time, inter-rater scoring disagreements, and delayed feedback to test takers. This study aims to measure pre-service biology teachers' competence in building scientific explanations and to investigate how accurately multiple-choice questions predict the results of open-response questions. One hundred twenty-four pre-service biology teachers participated in the study and were administered 20 multiple-choice items and three open-response items designed to measure the accuracy and quality of their explanations of evolutionary change. The results demonstrated that pre-service teachers displayed higher competence when tested with multiple-choice items than when tested with open-response items. Moreover, scores derived from multiple-choice items poorly predicted the scores derived from open-response items. Multiple-choice items were also found to be poor measures of the consistency, purity, and abundance of conceptual elements in teachers' evolutionary explanations. Additionally, many teachers held mixed models composed of both scientific and naive ideas, which were difficult to detect using multiple-choice formats. Overall, the study indicates that multiple-choice formats are poorly suited to measuring several aspects of biology teachers' knowledge of evolution, including their ability to generate scientific explanations. This study suggests that open-response items should be used in teacher education programs to assess pre-service teachers' explanatory competency before they are permitted to teach science to children.

Biology teachers teach students by using their own writing and speech to transform knowledge into logically structured scientific explanations. Open-response assessments are therefore needed to verify pre-service biology teachers' ability to construct scientific explanations. In practice, however, multiple-choice assessments are often substituted for them because of problems inherent in open-response testing, such as inter-rater agreement and delayed feedback. This study investigated two questions: pre-service biology teachers' competence in constructing scientific explanations of evolution, and the extent to which a multiple-choice evolution concept test predicts open-response test results. One hundred twenty-four pre-service biology teachers participated, and all of them were administered a 20-item multiple-choice evolution concept instrument and three open-response items based on real-world cases. The results showed that the pre-service teachers performed better on the multiple-choice instrument and that, except for the top-ranked participants (top 0-25%), the multiple-choice instrument did not predict open-response performance. When the consistency of the evolutionary explanations, their purity (freedom from misconceptions), and the amount of scientific explanation were measured, the multiple-choice results did not distinguish among the groups other than the top group. The results also confirmed that the teachers' ability to construct scientific explanations was relatively low compared with their amount of knowledge, and that some pre-service teachers held unrefined scientific explanations, such as ones that mixed misconceptions with scientific concepts. These findings show that open-response assessment can provide more information than multiple-choice assessment when evaluating pre-service biology teachers' conceptions, and they suggest that teacher education programs for pre-service biology teachers should focus more strongly on developing the ability to construct scientific explanations.
