GRCI: An investigation into the feasibility of a General Relativity Concept Inventory

Mark A. J. Parker 1*, Holly Hedgeland 2, Nicholas St. J. Braithwaite 3, Sally E. Jordan 1
1 School of Physical Sciences, The Open University, Walton Hall, Milton Keynes, MK7 6AA, UNITED KINGDOM
2 Clare Hall, University of Cambridge, Herschel Road, Cambridge, CB3 9AL, UNITED KINGDOM
3 Faculty of Science, Technology, Engineering & Mathematics, The Open University, Walton Hall, Milton Keynes, MK7 6AA, UNITED KINGDOM
* Corresponding Author
EUR J SCI MATH ED, Volume 12, Issue 4, pp. 489-501. https://doi.org/10.30935/scimath/15018
Published Online: 31 August 2024, Published: 01 October 2024

ABSTRACT

This study outlines the early-stage development of a free-response General Relativity Concept Inventory (GRCI), an educational instrument designed to test for conceptual understanding of General Relativity. Data were collected by having 26 participants from General Relativity courses work through the questions on the GRCI; interviews were then conducted with four of these participants to gain further insight into their experience of working through the instrument. The written responses showed that participants were proficient at answering questions requiring mathematical thought processes, but less so at answering questions requiring conceptual and physical thought processes. In the interviews, participants reported that free-response questions were appropriate for testing conceptual understanding of General Relativity, identified that General Relativity has both physical interpretations and mathematical constructs and that understanding the theory requires both, and suggested that the GRCI could serve a formative purpose in a teaching context. The study was proof-of-concept in scope, aiming to highlight points pertinent to the feasibility and development of the GRCI; further work to investigate these points is encouraged.

CITATION

Parker, M. A. J., Hedgeland, H., Braithwaite, N. S. J., & Jordan, S. E. (2024). GRCI: An investigation into the feasibility of a General Relativity Concept Inventory. European Journal of Science and Mathematics Education, 12(4), 489-501. https://doi.org/10.30935/scimath/15018
