Materials Similar to An Introduction to Classical Test Theory as Applied to Conceptual Multiple-choice Tests
- 51%: Item response theory analysis of the mechanics baseline test
- 48%: Multiple-choice test of energy and momentum concepts
- 48%: Approaches to data analysis of multiple-choice questions
- 46%: Improving test security and efficiency of computerized adaptive testing for the Force Concept Inventory
- 46%: Classical test theory and item response theory comparison of the brief electricity and magnetism assessment and the conceptual survey of electricity and magnetism
- 46%: Analyzing the dimensionality of the Energy and Momentum Conceptual Survey using Item Response Theory
- 45%: Development and validation of a conceptual multiple-choice survey instrument to assess student understanding of introductory thermodynamics
- 44%: Mechanical waves conceptual survey: Its modification and conversion to a standard multiple-choice test
- 44%: A Classical Test Theory Analysis of the Light and Spectroscopy Concept Inventory National Study Data Set
- 44%: Motivations for using the item response theory nominal response model to rank responses to multiple-choice items
- 43%: Force Concept Inventory-based multiple-choice test for investigating students' representational consistency
- 43%: Examining students' understanding of electrical circuits through multiple-choice testing and interviews
- 42%: Test equity in developing short version conceptual inventories: A case study on the Conceptual Survey of Electricity and Magnetism
- 41%: Multilevel Rasch modeling of two-tier multiple choice test: A case study using Lawson's classroom test of scientific reasoning
- 41%: Development of a multiple-choice problem-solving categorization test for assessment of student knowledge structure
- 41%: Analyzing Multiple-Choice-Multiple-Response Items Using Item Response Theory
- 40%: FCI-based Multiple Choice Test for Investigating Students' Representational Coherence
- 38%: Validity evaluation of the Lawson classroom test of scientific reasoning
- 38%: Quantitatively ranking incorrect responses to multiple-choice questions using item response theory