The Development of a Distractor-Based Two-Tier Instrument to Assess Conceptual Understanding Levels and Student Misconceptions in Explaining Redox Reactions
Abstract
Fifteen distractor-based two-tier multiple-choice items were developed as a diagnostic instrument to evaluate the level of conceptual understanding and the structure of students' misconceptions in explaining redox reactions. Questions at the first tier (Q1) assess students' level of knowledge, and questions at the second tier (Q2) assess their level of reasoning. The instrument was administered to 1150 participants, all 11th-grade students from eight senior high schools in the eastern part of Indonesia. The collected data were analyzed using the Rasch model approach. The results of this study provide diagnostic and summative information on the progression of student learning outcomes, as well as empirical evidence for the validity and reliability of the measurement. In addition, comparing the item measures of Q1 with those of Q2 showed that students' level of knowledge is not always proportional to their level of reasoning; in some cases, the level of knowledge was lower than the level of reasoning, and vice versa. Investigation using option probability curves revealed students' misconceptions and inconsistencies regarding the concepts of reduction, oxidation, and oxidation numbers. This result helps explain why students have difficulty interpreting and converting redox reaction equations.
https://doi.org/10.26803/ijlter.18.9.12
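As a rough illustration of the analysis pipeline the abstract describes, the sketch below (Python/NumPy, on synthetic data standing in for the 1150 students' dichotomously scored two-tier responses) computes crude PROX-style Rasch logit measures for the paired Q1/Q2 tiers and a simple option-share table by ability group, approximating an option probability curve. The response matrix, option codes, and the closed-form PROX estimates are all illustrative assumptions; the study's actual measures would come from iterative Rasch estimation software, not this shortcut.

import numpy as np

# --- Hypothetical data, for illustration only ---------------------------
# 1150 students x 15 two-tier items. Each item has a Q1 (knowledge) and a
# Q2 (reasoning) tier, scored 1/0; `choices` records which option (0-3)
# each student picked on one example item, for the option-curve analysis.
rng = np.random.default_rng(0)
scores = rng.integers(0, 2, size=(1150, 30))   # columns 0-14: Q1, 15-29: Q2
choices = rng.integers(0, 4, size=1150)        # option picked on one item

# --- Rough Rasch (PROX-style) measures in logits ------------------------
# A crude closed-form approximation, not the iterative joint maximum
# likelihood procedure used by dedicated Rasch software.
p_item = scores.mean(axis=0).clip(1e-3, 1 - 1e-3)
difficulty = np.log((1 - p_item) / p_item)
difficulty -= difficulty.mean()                # centre the item scale at 0
p_person = scores.mean(axis=1).clip(1e-3, 1 - 1e-3)
ability = np.log(p_person / (1 - p_person))

# Knowledge vs. reasoning: compare the paired tier measures per item.
q1, q2 = difficulty[:15], difficulty[15:]
print("Q2 - Q1 (logits):", np.round(q2 - q1, 2))

# --- Option probability curve for one item ------------------------------
# Bin students by ability; within each bin, the share choosing each
# option traces how distractor popularity changes with proficiency.
bins = np.quantile(ability, np.linspace(0, 1, 6))
group = np.digitize(ability, bins[1:-1])       # five ability groups, 0-4
for g in range(5):
    share = np.bincount(choices[group == g], minlength=4) / (group == g).sum()
    print(f"ability group {g}: option shares {np.round(share, 2)}")

A distractor whose share rises and then falls across the ability groups is the signature pattern the abstract's misconception analysis looks for: an option that is most attractive to students at an intermediate level of understanding.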