About the NAEP Mathematics Assessment

The National Assessment of Educational Progress (NAEP) is a congressionally mandated project administered by the National Center for Education Statistics (NCES) within the U.S. Department of Education. It is the largest continuing and nationally representative assessment of what our nation's students know and can do in selected subjects, first administered by NCES in 1969 to measure student achievement nationally. The NAEP mathematics assessment at grades 4 and 8 is a digitally based assessment administered on tablets. It measures students' knowledge and skills in mathematics and their ability to solve problems in mathematical and real-world contexts. Results are reported for the nation overall, for states and jurisdictions, and for districts participating in the Trial Urban District Assessment (TUDA).

NAEP Digitally Based Mathematics Assessment

The NAEP mathematics assessments at grades 4 and 8 were administered on tablets supplied by NCES using a secure, local NAEP network. This allowed the NAEP administrators to create a stable administration environment by bringing in their own equipment that would not be influenced by school-based equipment or school internet connectivity, thereby maintaining consistency across the assessed schools. Students were able to interact with the tablets via touchscreen, with an attached keyboard, or using a stylus provided by NCES. The digitally based mathematics assessment provided students with a variety of onscreen tools, including an equation editor for entering numbers and expressions using the correct mathematical symbols; a scratchwork tool for annotating figures, performing computations, drawing diagrams, and highlighting portions of a question; and a calculator. The onscreen calculator was available to students on approximately 30 percent of the test questions at both grades 4 and 8. At the beginning of the assessment session, students viewed an interactive tutorial that provided the information needed to take the assessment on tablet; for example, it explained how to progress through questions, how to indicate answers for multiple-choice questions, and how to use onscreen tools effectively when answering questions. The interactive nature of the tutorial allowed students to familiarize themselves with the digital delivery system before beginning the actual assessment. See how the mathematics digitally based assessment was presented to students.

The previous NAEP mathematics assessment in 2017 was administered for the first time as a digitally based assessment (DBA) at grades 4 and 8; prior to 2017, paper-based assessments (PBA) were administered. A multi-step process was used for the transition from PBA to DBA in order to preserve trend lines that show student performance over time. The transition process involved administering the assessment in both the DBA and PBA formats to randomly equivalent groups of students in 2017. The results from the digitally based assessments can therefore be compared to those from previous years, showing how students’ performance in mathematics has changed over time. See more on the NAEP transition.

The digitally based mathematics assessment was designed to continue reporting trends in student performance dating back to 1990, while keeping pace with the new generation of classroom environments in which digital technology has increasingly become a part of students' learning. The 2017 assessment content was developed with the same mathematics framework used to develop the 2015 paper-based assessment. The 2017 assessment was composed of questions adapted from the previous paper-based assessment along with new questions developed to take advantage of the digital delivery system.

At grades 4 and 8, approximately two-thirds of the questions from the 2015 paper-based assessment were adapted for the 2017 digitally based assessment. The previously used paper-based assessment questions were adapted to fit a tablet screen, but the mathematical content was not changed. The goal of adapting questions was to retain the same measurement targets as the original version of the question. At each grade, six of the ten assessment blocks used only questions that had been adapted from the 2015 paper-based assessment and were assembled to be as similar as possible to the corresponding paper-based blocks. The other four blocks consisted of new questions developed for digital administration.

In addition to the digitally based assessment, random subsamples of students were administered the complete 2015 paper-based version of the assessment in 2017 and 2019. NCES administered the assessment in both modes—paper-based and digitally based—in all the sampled schools to investigate potential differences in performance between students taking the assessment on a tablet and students taking the paper-based assessment. In schools with fewer than 21 students, however, all students were assigned to a single mode, either digitally based or paper-based. Each participating student took the assessment in only one mode. See how mathematics questions looked in the paper-based version of the grade 4 and grade 8 assessments and how the same questions appeared in the digitally based version.
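The assignment rule described above can be illustrated with a short sketch. This is a hypothetical simplification, not the actual NAEP sampling procedure: the function name, the school/student data structure, and the random split are all assumptions made for illustration; only the small-school cutoff of 21 students comes from the description above.

```python
import random

def assign_modes(schools, small_school_cutoff=21, seed=0):
    """Illustrative mode assignment (hypothetical sketch, not NAEP's
    operational procedure): schools with fewer than the cutoff number
    of students are assigned a single mode for all students; larger
    schools have students split randomly between the two modes."""
    rng = random.Random(seed)
    assignments = {}
    for school, students in schools.items():
        if len(students) < small_school_cutoff:
            # Small school: every student takes the same mode.
            mode = rng.choice(["digital", "paper"])
            assignments.update({s: mode for s in students})
        else:
            # Larger school: split students randomly between modes.
            shuffled = students[:]
            rng.shuffle(shuffled)
            half = len(shuffled) // 2
            assignments.update({s: "digital" for s in shuffled[:half]})
            assignments.update({s: "paper" for s in shuffled[half:]})
    return assignments
```

Under this sketch, a school of 30 students would contribute students to both modes, while a school of 10 students would contribute to only one, consistent with each student taking the assessment in a single mode.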

After the administration of the assessment, NCES conducted rigorous analyses of the data and aligned the 2017 results to previous assessment years using a two-step process.

  • First, common item linking was used to calculate the trend line from 2015 to 2017 based on the paper-based assessment results. This kind of linking was possible because the majority of 2017 assessment questions were also administered in 2015 and showed the same statistical properties.
  • Second, common population linking was used to align the 2017 paper-based assessment results with the 2017 digital assessment results. This kind of linking was possible because the samples of students for each assessment mode were randomly equivalent; that is, each random sample included students from the same schools, ensuring that the students' educational experiences and characteristics were equivalent.
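The second step can be sketched in simplified form. NAEP's operational linking relies on IRT-based scaling, not the simple mean/sigma transformation below; this sketch only illustrates the underlying idea of common population linking, namely rescaling scores from one mode so that their distribution matches that of the randomly equivalent sample in the other mode. The function name and score lists are assumptions made for illustration.

```python
import statistics

def common_population_link(digital_scores, paper_scores):
    """Simplified mean/sigma linking sketch (hypothetical; NAEP uses
    IRT-based methods): rescale digital-mode scores so their mean and
    standard deviation match those of the paper-mode sample, which is
    assumed to be randomly equivalent to the digital-mode sample."""
    mu_d = statistics.mean(digital_scores)
    sd_d = statistics.pstdev(digital_scores)
    mu_p = statistics.mean(paper_scores)
    sd_p = statistics.pstdev(paper_scores)
    # Linear transformation x -> a*x + b matching mean and SD.
    a = sd_p / sd_d
    b = mu_p - a * mu_d
    return [a * x + b for x in digital_scores]
```

Because the two samples are randomly equivalent, any remaining difference in their score distributions is attributed to the mode of administration rather than to the students, which is what justifies aligning one distribution to the other.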

Once the common population linking aligned the digital results to the paper results on the national level, the analyses evaluated whether the linking allowed for fair and meaningful comparisons for national student groups as well as for states and districts. These evaluations supported making trend comparisons between the digital assessment and previous paper-based assessments for subgroups, states, and districts.

These analyses—common item linking based on paper results and common population linking of paper results to digital results—enabled NCES to successfully maintain the mathematics trend line while transitioning to the digital assessment in 2017 and continuing with the 2019 digital assessment.