Performance Variations Across Response Formats on Reading Comprehension Assessments

dc.creator: Collins, Alyson Alexander
dc.date.accessioned: 2020-08-21T21:15:39Z
dc.date.available: 2015-09-29
dc.date.issued: 2015-04-02
dc.identifier.uri: https://etd.library.vanderbilt.edu/etd-03192015-153652
dc.identifier.uri: http://hdl.handle.net/1803/10902
dc.description.abstract: Findings of recent studies suggest that variations in performance across reading comprehension tests may be a product of differences among assessment dimensions (e.g., response format, genre) or child skills (e.g., Francis, Fletcher, Catts, & Tomblin, 2005; Keenan, 2013). The purpose of the current study was to investigate sources of variation in reading comprehension for three response formats (i.e., open-ended questions, multiple choice, retell) in relation to text genres and child skills. Participants included 79 fourth graders recruited from six classrooms within one elementary school. All participants read six passages (three narrative and three expository texts) from the Qualitative Reading Inventory-Fifth Edition (QRI-5; Leslie & Caldwell, 2011) and completed a brief comprehension assessment for each passage, with response format varying across assessments. In addition, measures of word reading, linguistic and cognitive skills, and learning strategies were administered to each student across two 60-min testing sessions. Item-response crossed random effects models revealed statistically significant differences between open-ended and multiple-choice questions. Moreover, across the three response formats, five covariates were statistically significant predictors of reading comprehension: (a) genre, (b) listening comprehension, (c) working memory, (d) attention, and (e) word recognition. Further exploratory analyses identified three two-way interactions: (a) Response Format (i.e., open-ended and multiple-choice questions) × Genre, (b) Response Format × Listening Comprehension, and (c) Response Format × Attention. Results of this study offer evidence that the use of different response formats may lead to variations in student performance across reading comprehension tests. Given these findings, directions for future research and implications for using comprehension tests in research, policy, and practice are discussed.
dc.format.mimetype: application/pdf
dc.subject: reading comprehension
dc.subject: assessment
dc.subject: listening comprehension
dc.subject: attention
dc.subject: working memory
dc.subject: word reading
dc.title: Performance Variations Across Response Formats on Reading Comprehension Assessments
dc.type: dissertation
dc.contributor.committeeMember: Marcia A. Barnes, Ph.D.
dc.contributor.committeeMember: Douglas Fuchs, Ph.D.
dc.contributor.committeeMember: Lynn S. Fuchs, Ph.D.
dc.type.material: text
thesis.degree.name: PHD
thesis.degree.level: dissertation
thesis.degree.discipline: Special Education
thesis.degree.grantor: Vanderbilt University
local.embargo.terms: 2015-09-29
local.embargo.lift: 2015-09-29
dc.contributor.committeeChair: Donald L. Compton, Ph.D.
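
The analysis named in the abstract uses item-response crossed random effects models, in which each scored item response carries random effects for both the student and the item. Below is a minimal sketch of one way to fit such a model in Python with statsmodels; the data file, column names (correct, fmt, genre, student, item, listening, memory, attention, word_rec), and the linear specification are all illustrative assumptions, not the dissertation's actual analysis.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical long-format data: one row per student-item response.
    # Assumed columns: correct (item score), fmt (response format),
    # genre, student and item identifiers, and child-skill covariates.
    df = pd.read_csv("responses.csv")

    # statsmodels fits crossed random effects by treating the whole
    # dataset as a single group and declaring each crossed factor
    # (here student and item) as a variance component.
    df["all"] = 1
    vc = {"student": "0 + C(student)", "item": "0 + C(item)"}

    model = smf.mixedlm(
        "correct ~ C(fmt) * C(genre) + listening + memory + attention + word_rec",
        data=df,
        groups="all",
        re_formula="0",  # no group-level random intercept; the random
                         # structure comes from the crossed components
        vc_formula=vc,
    )
    result = model.fit()
    print(result.summary())

A logistic crossed random effects model would be closer to standard item-response analyses of binary item scores; the linear version above is used only to keep the sketch short. The fmt-by-genre interaction term illustrates one of the three two-way interactions the abstract reports.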

