Adaptive testing using Maple TA is an effective and consistent method for deploying algorithmically graded, high-fidelity student assessments of complex, higher-level problem solving
ORAL
Abstract
Adaptive testing using Maple TA has been carried out to create algorithmically graded, high-fidelity student assessments. While multiple-choice questions can be graded quickly, giving students immediate feedback upon completion of an assessment, they are not effective for testing complex problem-solving situations. Write-on problems are better suited to assessing higher-level problem solving, but these hand-graded questions can be time-consuming and inconsistent to score, thereby delaying student feedback. In the present work, adaptive testing has been employed at the question level, rather than the exam level, using Maple TA's sequential Adaptive Questions. If a question is answered incorrectly, the student is asked a series of sub-problems designed to identify the student's weak areas, and appropriate levels of partial credit are applied automatically. Results showed that these scores were comparable to hand-graded scores, indicating that the current application generally did not inadvertently provide solution hints to the students during the assessment. Response-specific feedback, though not yet implemented, is also possible. This work builds on the use of adaptive assessments for homework using Adaptive Questions, a precursor to Maple TA's more powerful branching "Adaptive Assignments".
Presenters
-
Justin Des Yarrington
Georgia Institute of Technology, Brigham Young University - Idaho
Authors
-
Justin Des Yarrington
Georgia Institute of Technology, Brigham Young University - Idaho
-
Russell Daines
Brigham Young University - Idaho, Pennsylvania State University