Student Reaction to a Modified Force Concept Inventory: The Impact of Free-Response Questions and Feedback

Mark A. J. Parker 1*, Holly Hedgeland 2, Nicholas Braithwaite 1, Sally Jordan 1
1 The Open University, UK
2 University of Cambridge, UK
* Corresponding Author
European Journal of Science and Mathematics Education, Volume 10, Issue 3, pp. 310-323. https://doi.org/10.30935/scimath/11882
Published: 08 March 2022

ABSTRACT

The study investigated student reaction to the alternative mechanics survey (AMS), a modified force concept inventory that used automatically marked free-response questions and offered limited feedback to students after their answers had been submitted. Eight participants were observed while completing the AMS and were then interviewed to gain insight into what had been observed; the resulting data set was analyzed using thematic analysis. This revealed six key themes: “use of free-response questions supported deep learning”, “interpretation of the AMS instructions affected answer length”, “the idea of being marked by a computer did not affect answer structure”, “participant reaction to the usability of the AMS was mostly positive”, “reactions to the AMS depended upon what participants thought it was for”, and “limited feedback was a useful addition to the AMS”. Participants gave answers of differing length, guided by the question wording as well as by their own preferences. Participants valued being given feedback on their performance, and they reacted positively to the free-response questions and could see potential in this question type, opening up possibilities for the use of automatically marked free-response questions in concept inventories in the future.
