This study explores the relationship between students’ missing responses on a large-scale assessment and their cognitive skill profiles and characteristics. Data from the 48 multiple-choice items on the 2006 Ontario Secondary School Literacy Test (OSSLT), a high school graduation requirement, were analyzed using the item response theory (IRT) three-parameter logistic model and the Reduced Reparameterized Unified Model, a cognitive diagnostic model. Missing responses were analyzed by item and by student. Item-level analyses examined the relationships among item difficulty, item order, the literacy skills targeted by the item, the cognitive skills required by the item, the percentage of students not answering the item, and other item features. Student-level analyses examined the relationships among students’ missing responses, overall performance, cognitive skill mastery profiles, and characteristics such as gender and home language. The results of this study have implications for test designers who seek to improve provincial large-scale assessments, and for teachers who seek to help students strengthen their cognitive skills and develop test-taking strategies.