Sample Scenario-Based Tasks and Discrete Questions
In the NAEP TEL assessment, students were tested using computer simulations of technology and engineering problem-solving tasks set in a variety of real-world contexts. Through interaction with these multimedia scenario-based tasks, students used an assortment of tools and applied their TEL knowledge and skills to solve problems across the content areas and practices.
The Andromeda task was administered in the 2014 and 2018 TEL assessments. The tasks Chicago, Bike Lanes, Iguana Home, and Recreation Center were administered in the 2014 TEL assessment; these tasks were subsequently made available to the public and were not administered in the 2018 assessment.
- Some tasks measured student performance in one content area and practice, while other tasks measured more than one content area or practice.
- The assessment included long tasks (about 30 minutes) and short tasks (about 10 to 20 minutes).
- Tasks were designed to be accessible to all students so they could progress through each task to completion and demonstrate their TEL knowledge and skills.
The TEL assessment also included interactive discrete questions that were not part of a scenario. See examples of discrete questions below.
Explore how selected items are mapped on the NAEP TEL scale by using Item Maps.
Examples of questions from the 2018 NAEP TEL assessment, including how students performed on them, are presented below. Each example contains a brief description of the question, the format type (selected response or constructed response), the content area, the practice, the measured skill, and student performance. The percentage of students who answered a selected-response question correctly or who received full credit for their answer to a constructed-response question is also displayed.