NOTE: Beginning with the 2017 assessment, NAEP reading results are from a digitally based assessment; prior to 2017, results were from a paper-and-pencil assessment. Detail may not sum to totals because of rounding. Although the estimates (e.g., average scores or percentages) are shown as rounded numbers in the chart, the positions of the data points in the graphics are based on the unrounded numbers. Unrounded numbers were also used to calculate differences between estimates and to perform the statistical comparison tests. Not all apparent differences between estimates are statistically significant.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2017 Reading Assessment.
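To make the rounding note concrete, the brief Python sketch below uses invented scores; it illustrates only the arithmetic, not NAEP's reporting tools.

# Tiny illustration with invented scores: the rounded values shown in a
# chart can suggest a different gap than the unrounded difference that is
# actually tested for statistical significance.
score_a, score_b = 267.44, 265.61      # hypothetical unrounded average scores
print(round(score_a), round(score_b))  # displayed as 267 and 266, so the gap looks like 1
print(round(score_a - score_b, 1))     # the comparison uses the unrounded gap, 1.8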
As part of the 2017 NAEP reading assessment, students, teachers, and school administrators answered survey questionnaires. These questionnaires provide information about students' educational experiences and about factors related to students' learning both in and outside of the classroom. Selected results are highlighted below for students' access to and use of digital technology in school, classroom learning and instruction for reading, and students' enjoyment of complex problems. You can also explore the grade 4 and grade 8 student, teacher, and school questionnaires or use the NAEP Data Explorer to view other results from the questionnaires.
Findings are presented for individual survey questions as well as for indices composed of a set of related survey questions that measure particular topics of interest. Learn more about the development of NAEP survey questionnaire indices.
Interpreting the results
The highlighted findings demonstrate the range of information available from the 2017 NAEP reading survey questionnaires. They do not provide a complete picture of students' learning experiences in and outside of school. The NAEP data can be explored further using the NAEP Data Explorer.
NAEP survey questionnaire responses provide additional information for understanding NAEP performance results. Although students' performance is compared across student, teacher, and school characteristics and educational experiences, these results cannot be used to establish a cause-and-effect relationship between those characteristics or experiences and student achievement. NAEP is not designed to identify the causes of performance differences, so results must be interpreted with caution. Many factors may influence average student achievement, including local educational policies and practices, the quality of teachers, available resources, and the demographic characteristics of the student body. Such factors may change over time and vary among student groups.
NAEP reports results using widely accepted statistical standards; findings are reported based on a statistical significance level set at .05, with appropriate adjustments for multiple comparisons. Students are always the unit of analysis when reporting NAEP survey questionnaire responses. The percentages shown are weighted and represent students who, or students whose teachers or school administrators, indicated a specific response on the survey questionnaire. Some student responses are missing because students could not be linked to their teacher's or school administrator's responses, or because of nonresponse from students, teachers, or school administrators. The denominator of the percentages presented excludes all students with missing information for the analysis. To find missing data rates, use the NAEP Data Explorer.
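As an illustration of how a weighted percentage and a simple two-group comparison might be computed under these conventions, the sketch below uses entirely hypothetical data. The weights, response categories, estimates, and standard errors are invented, and the normal-approximation test stands in for NAEP's more involved procedures (jackknife variance estimation from replicate weights and its documented adjustment for multiple comparisons).

# A minimal, illustrative sketch -- not NAEP's estimation code. All values
# below are invented, and the comparison uses a simple normal approximation.
from math import erf, sqrt

import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical student-level data: a sampling weight and a questionnaire
# response, where "missing" stands for nonresponse or an unlinkable record.
n = 1200
weights = rng.uniform(0.5, 2.0, size=n)
responses = rng.choice(["every day", "less often", "missing"],
                       size=n, p=[0.40, 0.50, 0.10])

# Weighted percentage of students selecting "every day", excluding students
# with missing information from the denominator, as described above.
reported = responses != "missing"
p_hat = (weights[reported & (responses == "every day")].sum()
         / weights[reported].sum())
print(f"Weighted percentage: {100 * p_hat:.1f}")

# Comparing two groups' unrounded estimates with a two-sided test at the
# .05 level. The estimates and standard errors are placeholders; NAEP
# derives its standard errors from replicate weights.
p1, se1 = 0.412, 0.011
p2, se2 = 0.379, 0.013
z = (p1 - p2) / sqrt(se1**2 + se2**2)
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal p-value
print(f"z = {z:.2f}, p = {p_value:.3f}, significant at .05: {p_value < 0.05}")

In this made-up example, the two rounded percentages would look clearly different in a chart, yet the unadjusted p-value falls just above .05, echoing the point that not all apparent differences between estimates are statistically significant.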