Rasch analysis and validity of the construct understanding of the nature of models in Spanish-speaking students

Jose M. Oliva 1 * , Ángel Blanco 2
1 Department of Didactics, University of Cádiz, Cádiz, SPAIN
2 Department of Science Education, University of Málaga, Málaga, SPAIN
* Corresponding Author
EUR J SCI MATH ED, Volume 11, Issue 2, pp. 344-359. https://doi.org/10.30935/scimath/12651
Published Online: 17 November 2022, Published: 01 April 2023
OPEN ACCESS

ABSTRACT

A questionnaire was recently developed for use with Spanish-speaking students, and evidence of the construct's internal validity has been provided by means of structural equation modelling. In this paper, two research questions were considered: (i) What new evidence does application of the Rasch model provide regarding the validity of this construct? (ii) What cutoffs should be applied to the constructed scales in order to differentiate between acceptable and insufficient levels of the construct being measured? Participants were 1,272 Spanish students at both high-school and college level. The instrument is a pencil-and-paper questionnaire written in Spanish, comprising 20 items (5-point Likert-type scale) distributed evenly across four scales: beyond exact replicas, purpose of models, multiple models, and changing models. Students’ responses were coded on an ordinal scale from zero to four. We then conducted a Rasch analysis using both a multidimensional approach and a consecutive unidimensional approach for each dimension. The data provided new evidence regarding the internal validity of the four scales of the questionnaire, and the Rasch analysis also allowed us to establish cutoffs for the constructed scales. The evidence provided by this and the previous study suggests that the questionnaire may be useful as a diagnostic tool when applied to groups or populations of students. In addition, the identified cutoffs could, hypothetically, serve to differentiate between students with an adequate versus an insufficient understanding of the nature of models.
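For readers unfamiliar with Rasch modelling of polytomous (0-4) responses, the sketch below is a minimal illustration of how category probabilities follow from a person's ability and an item's step difficulties under a partial credit formulation. It is not the authors' actual analysis, and the threshold values are invented for illustration only.

```python
import math

def pcm_probabilities(theta, thresholds):
    """Category probabilities for one polytomous item under the
    partial credit model. `theta` is the person ability (logits),
    `thresholds` are the step difficulties delta_1..delta_m;
    response categories run 0..m."""
    # Cumulative sums of (theta - delta_k); category 0 contributes 0.
    cum = [0.0]
    for delta in thresholds:
        cum.append(cum[-1] + (theta - delta))
    exps = [math.exp(c) for c in cum]
    total = sum(exps)
    return [e / total for e in exps]

# A person of average ability (theta = 0) on a 5-category Likert item
# with illustrative, symmetric step difficulties (hypothetical values):
probs = pcm_probabilities(0.0, [-1.5, -0.5, 0.5, 1.5])
print([round(p, 3) for p in probs])  # five probabilities summing to 1
```

In a scale built this way, a cutoff on the latent (logit) metric can then be mapped back to an expected raw score, which is one common route to separating "adequate" from "insufficient" levels of a construct.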

CITATION

Oliva, J. M., & Blanco, Á. (2023). Rasch analysis and validity of the construct understanding of the nature of models in Spanish-speaking students. European Journal of Science and Mathematics Education, 11(2), 344-359. https://doi.org/10.30935/scimath/12651
