
Developing and Validating a Computational Thinking Assessment Instrument for Introductory Physics

ORAL

Abstract

Computational thinking in physics (CTIP) is an essential skill contributing to physics literacy. In a previous study in which we interviewed physicists in industry and academia (N=26), all participants believed that computation should be integrated into introductory mechanics courses. Many institutions have integrated computation into their courses, but the curricula vary with available resources and instructor time, among other factors. Ultimately, to support the integration of computation into introductory physics, we will need a means of assessing CTIP. Our goal is to develop an assessment of computational thinking in physics. Based on our previous work, we identified three approaches to assessing CTIP: comprehension (reading and commenting code), skills (having students edit or create a program), and attitudes (tracking how attitudes toward CTIP change over time). We begin by developing an assessment that tests students' comprehension of CTIP. The assessment uses the VPython/Python computational language, as it is the most common in introductory physics. The assessment design is principally multiple choice to simplify analysis for users. We present preliminary results from administering the assessment to a pilot group of students, with the intent of further validating the assessment tool.
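For context, a comprehension item asks students to read and annotate a short VPython program of the kind used in introductory mechanics. The sketch below is a hypothetical illustration of such a program (a projectile updated with an Euler-Cromer loop), not an actual item from the assessment.

```python
# Hypothetical example of a short program a comprehension item might
# ask students to read and comment on; not drawn from the assessment.
from vpython import sphere, vector, rate

ball = sphere(pos=vector(0, 0, 0), radius=0.1)
m = 0.5                            # mass (kg)
v = vector(4, 6, 0)                # initial velocity (m/s)
g = vector(0, -9.8, 0)             # gravitational field (N/kg)
dt = 0.01                          # time step (s)
t = 0

while ball.pos.y >= 0:             # run until the ball returns to y = 0
    rate(100)                      # cap the animation at 100 steps per second
    F = m * g                      # net force: gravity only
    v = v + (F / m) * dt           # update velocity from the net force
    ball.pos = ball.pos + v * dt   # update position with the new velocity
    t = t + dt
```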

Presenters

  • Justin Gambrell

    Drexel University

Authors

  • Justin Gambrell

    Drexel University

  • Eric Brewe

    Drexel University