PERC 2025 Abstract Detail Page
| Abstract Title: | Using an LLM to Investigate Students' Explanations on Conceptual Physics Questions |
|---|---|
| Abstract Type: | Contributed Poster Presentation |
| Abstract: | Analyzing students' written solutions to physics questions is a major area in PER. However, gauging student understanding in college courses is bottlenecked by large class sizes, which limit assessments to multiple-choice (MC) format for ease of grading. Although sufficient for quantifying scientifically correct conceptions, MC assessments do not uncover students' deeper ways of understanding physics. Large language models (LLMs) offer a promising approach for assessing students' written responses at scale. Our study (1) used an LLM, validated by human graders, to classify students' written explanations to three questions on the Energy and Momentum Conceptual Survey as correct or incorrect, and (2) organized students' incorrect explanations into emergent categories. We found that the LLM (GPT-4o) can fairly assess students' explanations, comparably to human graders (0-3% discrepancy). Furthermore, the categories of incorrect explanations differed from the corresponding MC distractors, making different and deeper conceptions accessible to educators. |
| Footnote: | Supported in part by U.S. National Science Foundation Grants 2300645 and 2111138. Opinions expressed are of the authors and not of the Foundation. |
| Session Time: | Poster Session C |
| Poster Number: | C-6 |
| Contributed Paper Record: | Contributed Paper Information |
| Contributed Paper Download: | Download Contributed Paper |
| Author/Organizer Information | |
| Primary Contact: | Sean Savage, Purdue University, West Lafayette, IN 47906, Phone: 484-387-9176 |
| Co-Author(s) and Co-Presenter(s): | N. Sanjay Rebello, Purdue University |
| Contributed Poster | |
| Contributed Poster: | Download the Contributed Poster |