Rasch Modeling: A Multiple Choice Chemistry Test

Atiek Winarti(1*), Al Mubarak(2)

(1) Faculty of Teacher Training and Education, Universitas Lambung Mangkurat
(2) Faculty of Teacher Training and Education, Universitas Lambung Mangkurat
(*) Corresponding Author

Abstract

The study aimed to reveal the difficulty level of the items of a chemistry test and their fit to the Rasch model. Beyond detecting item quality, the Rasch model also exposes students' response patterns, so the analysis can indicate how well the instrument functions as an assessment of chemistry learning. Twenty multiple-choice questions on chemical bonding material were analyzed using WINSTEPS 3.73. The sample consisted of 200 senior high school students in Banjarmasin, Indonesia. The results revealed that the average item measure was 0.00, with the most difficult item at a measure of 4.64. Item Q10 conformed to the model, with outlier/misfit statistics of MNSQ = +0.97, ZSTD = -0.2, and Pt Mean Corr = +0.58. In other words, assessment of learning through test techniques such as multiple-choice items analyzed with the Rasch model is an effective way for teachers to review students' progress in the learning process, to guide the design of chemistry learning strategies, and to identify students' understanding of chemistry material.
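For readers less familiar with the statistics reported above, the sketch below illustrates, under simplifying assumptions, how Rasch item measures and outfit mean-squares (MNSQ) can be estimated from dichotomously scored responses. It is a minimal Python illustration only, not the WINSTEPS procedure; the function names (rasch_prob, jmle, outfit_mnsq) and the plain gradient-ascent estimation are illustrative choices, not part of the study.

import numpy as np

def rasch_prob(theta, delta):
    # Rasch probability of a correct response for person ability theta
    # and item difficulty delta (both in logits).
    return 1.0 / (1.0 + np.exp(-(theta - delta)))

def jmle(responses, n_iter=200, lr=0.05):
    # Very small joint maximum-likelihood sketch for dichotomous Rasch data.
    # responses: persons x items array of 0/1 scores.
    # Note: persons with all-correct or all-wrong scores need special
    # handling in real software; they are ignored here for brevity.
    n_persons, n_items = responses.shape
    theta = np.zeros(n_persons)   # person abilities (logits)
    delta = np.zeros(n_items)     # item difficulties (logits)
    for _ in range(n_iter):
        p = rasch_prob(theta[:, None], delta[None, :])
        resid = responses - p
        theta += lr * resid.sum(axis=1)   # gradient ascent on person abilities
        delta -= lr * resid.sum(axis=0)   # gradient ascent on item difficulties
        delta -= delta.mean()             # centre item measures at 0.00 logits
    return theta, delta

def outfit_mnsq(responses, theta, delta):
    # Outfit mean-square per item: mean of squared standardized residuals.
    p = rasch_prob(theta[:, None], delta[None, :])
    var = p * (1.0 - p)
    z2 = (responses - p) ** 2 / var
    return z2.mean(axis=0)

Applied to a 200 x 20 array of 0/1 responses like the data described in the abstract, this returns item difficulties centred at 0.00 logits and one outfit MNSQ per item; values near 1.0, as reported for Q10, indicate good fit. WINSTEPS applies additional corrections and reports further statistics (ZSTD, point-measure correlation), so its numbers will differ from this sketch.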

Keywords

Rasch model; multiple choice; chemical bonding
