Using Item Option Characteristics Curve (IOCC) to Unfold Misconception on Chemical Reaction

Authors

  • Hilman Qudratuddarsi, Universitas Sulawesi Barat
  • Nurhikma Ramadhana, Universitas Sulawesi Barat
  • Nor Indriyanti, Universitas Sulawesi Barat
  • Ayu Indayanti Ismail, Universitas Sulawesi Barat

DOI:

https://doi.org/10.14421/jtcre.2024.62-04

Keywords:

assessment, Rasch model, misconception, chemical reaction

Abstract

Misconceptions can significantly hinder the learning process. To address this, various diagnostic instruments, such as two-tier (2TMC), three-tier, and four-tier multiple-choice questions, have been introduced. However, as the number of tiers increases, identifying misconceptions becomes more complex. This study therefore employs the Item Option Characteristics Curve (IOCC) to identify misconceptions by calculating the probability of each option being selected across ability levels. The Representational Systems and Chemical Reactions Diagnostic Instrument (RSCRDI) was administered to 185 pre-service teachers across three universities in Indonesia. The data were analyzed using Winsteps software to generate the IOCC for each item. The analysis revealed that each item in the phenomenon and reasoning tiers contains distractors strong enough to draw pre-service chemistry teachers away from the correct option. While the alternative answers identified using traditional methods (commonly used since the introduction of the 2TMC) were mostly similar to those identified by the IOCC, the IOCC provided more detailed insights. Specifically, it highlighted unexpected curves after 0 logits, identified less effective distractors, and revealed inconsistencies in the most influential distractors. These findings suggest that the IOCC provides richer, more detailed information and can serve as a valuable alternative framework for analyzing 2TMC items to unfold misconceptions.
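The IOCC idea described in the abstract can be approximated empirically: bin respondents by their Rasch ability estimate (in logits) and compute, for each answer option of an item, the proportion of respondents in each bin who chose it. A distractor whose curve stays high (or rises) beyond 0 logits would then be flagged for closer inspection. The following is a minimal illustrative sketch; all function names and the toy data are assumptions for illustration, not taken from the study or from Winsteps.

```python
# Illustrative sketch of an empirical item option characteristic curve (IOCC):
# for one multiple-choice item, bin respondents by ability (logits) and
# compute the proportion selecting each option within each ability bin.
# All names and the synthetic data below are hypothetical.

from collections import defaultdict

def iocc(abilities, choices, options, bin_edges):
    """Return {option: [proportion chosen per ability bin]} for one item."""
    n_bins = len(bin_edges) - 1
    counts = [defaultdict(int) for _ in range(n_bins)]
    totals = [0] * n_bins
    for theta, choice in zip(abilities, choices):
        for b in range(n_bins):
            if bin_edges[b] <= theta < bin_edges[b + 1]:
                counts[b][choice] += 1
                totals[b] += 1
                break
    return {opt: [counts[b][opt] / totals[b] if totals[b] else 0.0
                  for b in range(n_bins)]
            for opt in options}

# Toy data: low-ability respondents favour distractor "B",
# high-ability respondents favour the key "A".
abilities = [-2.0, -1.5, -1.0, -0.5, 0.5, 1.0, 1.5, 2.0]
choices   = ["B",  "B",  "B",  "A",  "A", "A", "B", "A"]
curves = iocc(abilities, choices, options=["A", "B"], bin_edges=[-3, 0, 3])
# curves["B"] falls with ability, as expected of a well-behaved distractor;
# a curve that stays flat or rises after 0 logits would warrant inspection.
```

This is only a coarse empirical analogue of the model-based curves Winsteps produces, but it conveys the diagnostic logic: options are compared as functions of ability rather than by raw frequency alone.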

Downloads

Download data is not yet available.

References

Ardiansah, Masykuri, M., & Rahardjo, S. B. (2018). Senior high school students’ need analysis of three-tier multiple choice (3TMC) diagnostic test about acid-base and solubility equilibrium. Journal of Physics: Conference Series, 1–8. https://doi.org/10.1088/1742-6596/1022/1/012033

Aretz, S., Borowski, A., & Schmeling, S. (2012). The role of confidence in ordered multiple-choice items about the universe's expansion. ESERA 2017 Conference, Dublin City University, Ireland, 3–5.

Boone, W. J., Staver, J. R., & Yale, M. S. (2014). Rasch analysis in the human sciences. Springer Netherlands.

Caleon, I. S., & Subramaniam, R. (2010). Do students know what they know and what they don’t know? Using a four-tier diagnostic test to assess the nature of students’ alternative conceptions. Research in Science Education, 40(3), 313–337. https://doi.org/10.1007/s11165-009-9122-4

Chan, M., & Subramaniam, R. (2020). Validation of a Science Concept Inventory by Rasch Analysis. In Rasch Measurement (pp. 159–178). Springer Singapore. https://doi.org/10.1007/978-981-15-1800-3_9

Chandrasegaran, A. L., Treagust, D. F., & Mocerino, M. (2007). The development of a two-tier multiple-choice diagnostic instrument for evaluating secondary school students’ ability to describe and explain chemical reactions using multiple levels of representation. Chemistry Education Research and Practice, 8(3), 293. https://doi.org/10.1039/b7rp90006f

Chandrasegaran, A. L., Treagust, D. F., & Mocerino, M. (2009). Emphasizing multiple levels of representation to enhance students’ understanding of the changes occurring during chemical reaction. Journal of Chemical Education, 86(12), 1433–1436.

Chandrasegaran, A. L., Treagust, D. F., & Mocerino, M. (2011). Facilitating high school students’ use of multiple representations to describe and explain simple chemical reactions. Teaching Science, 57(4), 13–19.

Chew, S. L., & Cerbin, W. J. (2021). The cognitive challenges of effective teaching. Journal of Economic Education, 52(1), 17–40. https://doi.org/10.1080/00220485.2020.1845266

Delgado-Rico, E., Carretero-Dios, H., & Ruch, W. (2012). Content validity evidences in test development: An applied perspective. International Journal of Clinical and Health Psychology, 12(3), 449–460.

Ding, L., & Beichner, R. (2009). Approaches to data analysis of multiple-choice questions. Physical Review Special Topics - Physics Education Research, 5, 1–17. https://doi.org/10.1103/PhysRevSTPER.5.020103

Fetherstonhaugh, T., & Treagust, D. F. (1992). Students’ understanding of light and its properties: Teaching to engender conceptual change. Science Education, 76(6), 653–672. https://doi.org/10.1002/sce.3730760606

Fraenkel, J. R., Wallen, N. E., & Hyun, H. H. (2012). How to design and evaluate research in education (8th ed.). McGraw-Hill.

Fulmer, G. W., Chu, H.-E., Treagust, D. F., & Neumann, K. (2015). Is it harder to know or to reason? Analyzing two-tier science assessment items using the Rasch measurement model. Asia-Pacific Science Education, 1(1), 1. https://doi.org/10.1186/s41029-015-0005-x

Gurel, D. K., Eryilmaz, A., & McDermott, L. C. (2015). A review and comparison of diagnostic instruments to identify students’ misconceptions in science. Eurasia Journal of Mathematics, Science and Technology Education, 11(5), 989–1008. https://doi.org/10.12973/eurasia.2015.1369a

Herrmann-Abell, C. F., & DeBoer, G. E. (2011). Using distractor-driven standards-based multiple-choice assessments and Rasch modeling to investigate hierarchies of chemistry misconceptions and detect structural problems with individual items. Chemistry Education Research and Practice, 12(2), 184–192. https://doi.org/10.1039/c1rp90023d

Herrmann-Abell, C. F., & DeBoer, G. E. (2016). Using Rasch modeling and option probability curves to diagnose students’ misconceptions. Paper presented at the 2016 AERA Annual Meeting, Washington, DC, 1–12.

Hidayat, R., Idris, W. I. W., Qudratuddarsi, H., & Rahman, M. N. A. (2021). Validation of the Mathematical Modeling Attitude Scale for Malaysian Mathematics Teachers. Eurasia Journal of Mathematics, Science and Technology Education, 17(12). https://doi.org/10.29333/EJMSTE/11375

Hidayat, R., Qudratuddarsi, H., Mazlan, N. H., & Mohd Zeki, M. Z. (2021). Evaluation of a test measuring mathematical modelling competency for Indonesian college students. Journal of Nusantara Studies (JONUS), 6(2), 133–155. https://doi.org/10.24200/jonus.vol6iss2pp133-155

Hoe, K. Y., & Subramaniam, R. (2016). On the prevalence of alternative conceptions on acid-base chemistry among secondary students: Insights from cognitive and confidence measures. Chemistry Education Research and Practice, 17(2), 263–282. https://doi.org/10.1039/c5rp00146c

Liampa, V., Malandrakis, G. N., Papadopoulou, P., & Pnevmatikos, D. (2017). Development and evaluation of a three-tier diagnostic test to assess undergraduate primary teachers’ understanding of ecological footprint. Research in Science Education. https://doi.org/10.1007/s11165-017-9643-1

Lin, J., Chu, K., & Meng, Y. (2010). Distractor rationale taxonomy: Diagnostic assessment of reading with ordered multiple-choice items. American Educational Research Association, 1–15.

Liu, O. L., Lee, H., & Linn, M. C. (2011). An investigation of explanation multiple-choice items in science assessment. Educational Assessment, 16(3), 164–184. https://doi.org/10.1080/10627197.2011.611702

Masson, S., Potvin, P., Riopel, M., & Foisy, L. B. (2014). Differences in Brain Activation Between Novices and Experts in Science During a Task Involving a Common Misconception in Electricity. Mind, Brain, and Education, 8(1), 44–55. https://doi.org/10.1111/mbe.12043

Park, M., & Liu, X. (2019). An investigation of item difficulties in energy aspects across biology, chemistry, environmental science, and physics. Research in Science Education.

Qudratuddarsi, H., Hidayat, R., Shah, R. L. Z. binti R. M., Nasir, N., Imami, M. K. W., & Nor, R. bin M. (2022). Rasch Validation of Instrument Measuring Gen-Z Science, Technology, Engineering, and Mathematics (STEM) Application in Teaching during the Pandemic. International Journal of Learning, Teaching and Educational Research, 21(6), 104–121. https://doi.org/10.26803/ijlter.21.6.7

Qudratuddarsi, H., Sathasivam, R. V., & Hutkemri, A. (2019). Difficulties and correlation between phenomenon and reasoning tier of multiple-choice questions: A survey study (Vol. 3).

Raykov, T., & Marcoulides, G. A. (2011). Introduction to psychometric theory (1st ed.). Routledge/Taylor & Francis Group.

Resbiantoro, G., Setiani, R., & Dwikoranto. (2022). A Review of Misconception in Physics: The Diagnosis, Causes, and Remediation. Journal of Turkish Science Education, 19(2), 403–427. https://doi.org/10.36681/tused.2022.128

Sadhu, S., & Laksono, E. W. (2018). Development and validation of an integrated assessment for measuring critical thinking and chemical literacy in chemical equilibrium. International Journal of Instruction, 11(3), 557–572. https://doi.org/10.12973/iji.2018.11338a

Shtulman, A., & Valcarcel, J. (2012). Scientific knowledge suppresses but does not supplant earlier intuitions. Cognition, 124(2), 209–215. https://doi.org/10.1016/j.cognition.2012.04.005

Shultz, K. S., Whitney, D. J., & Zickar, M. J. (2014). Measurement theory in action: Case studies and exercises (2nd ed.). Routledge/Taylor & Francis Group.

Sreenivasulu, B., & Subramaniam, R. (2013). University students’ understanding of chemical thermodynamics. International Journal of Science Education, 35(4), 601–635. https://doi.org/10.1080/09500693.2012.683460

Sreenivasulu, B., & Subramaniam, R. (2014). Exploring undergraduates’ understanding of transition metals chemistry with the use of cognitive and confidence measures. Research in Science Education, 44(6), 801–828. https://doi.org/10.1007/s11165-014-9400-7

Wild, D., Eremenco, S., Mear, I., Martin, M., Houchin, C., Gawlicki, M., Hareendran, A., Wiklund, I., Chong, L. Y., Cohen, L., & Molsen, E. (2009). Multinational trials—recommendations on the translations required, approaches to using the same language in different countries, and the approaches to support pooling the data: The ISPOR patient-reported outcomes translation and linguistic. Value in Health, 12(4), 430–440. https://doi.org/10.1111/j.1524-4733.2008.00471.x

Wind, S. A., & Gale, J. D. (2015). Diagnostic opportunities using rasch measurement in the context of a misconceptions-based physical science assessment. Science Education, 99(4), 721–741. https://doi.org/10.1002/sce.21172

Xiao, Y., Han, J., Koenig, K., Xiong, J., & Bao, L. (2018). Multilevel Rasch modeling of two-tier multiple choice test: A case study using Lawson’s classroom test of scientific reasoning. Physical Review Physics Education Research, 14(2), 020104. https://doi.org/10.1103/PhysRevPhysEducRes.14.020104

Yan, Y. K., & Subramaniam, R. (2018). Using a multi-tier diagnostic test to explore the nature of students’ alternative conceptions on reaction. Chemistry Education Research and Practice, 19, 213–226. https://doi.org/10.1039/c7rp00143f

Published

2024-10-18

How to Cite

Qudratuddarsi, H., Ramadhana, N., Indriyanti, N., & Indayanti Ismail, A. (2024). Using Item Option Characteristics Curve (IOCC) to Unfold Misconception on Chemical Reaction. Journal of Tropical Chemistry Research and Education, 6(2), 105–118. https://doi.org/10.14421/jtcre.2024.62-04

Issue

Section

Articles